hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
80876433c00c1432de05cd73b62931873aa46756 | 77 | py | Python | test/integration/samples_in/escaped_dq.py | Inveracity/flynt | b975b6f61893d5db1114d68fbb5d212c4e11aeb8 | [
"MIT"
] | 487 | 2019-06-10T17:44:56.000Z | 2022-03-26T01:28:19.000Z | test/integration/samples_in/escaped_dq.py | Inveracity/flynt | b975b6f61893d5db1114d68fbb5d212c4e11aeb8 | [
"MIT"
] | 118 | 2019-07-03T12:26:39.000Z | 2022-03-06T22:40:17.000Z | test/integration/samples_in/escaped_dq.py | Inveracity/flynt | b975b6f61893d5db1114d68fbb5d212c4e11aeb8 | [
"MIT"
] | 25 | 2019-07-10T08:39:58.000Z | 2022-03-03T14:44:15.000Z | a, b, c = 1, 2, 3
t1 = "=== Images[{}]: {} === Rest: \"{}\"".format(a, b, c)
| 25.666667 | 58 | 0.350649 | 13 | 77 | 2.076923 | 0.769231 | 0.148148 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 0.220779 | 77 | 2 | 59 | 38.5 | 0.383333 | 0 | 0 | 0 | 0 | 0 | 0.38961 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
80baeb6a19ee888c850828def10e937432c5c76e | 58 | py | Python | venv/Lib/site-packages/pynance/tst/unit/tech/__init__.py | LeonardoHMS/imobi | 6b2b97a05df67ea7d493f7b601382f65c6629cc2 | [
"MIT"
] | 35 | 2015-03-12T04:16:14.000Z | 2020-12-17T18:10:15.000Z | venv/Lib/site-packages/pynance/tst/unit/tech/__init__.py | LeonardoHMS/imobi | 6b2b97a05df67ea7d493f7b601382f65c6629cc2 | [
"MIT"
] | 31 | 2015-03-16T21:31:04.000Z | 2021-01-26T00:12:34.000Z | venv/Lib/site-packages/pynance/tst/unit/tech/__init__.py | LeonardoHMS/imobi | 6b2b97a05df67ea7d493f7b601382f65c6629cc2 | [
"MIT"
] | 18 | 2015-09-30T10:40:26.000Z | 2021-01-25T21:20:44.000Z | # make subdirectory a module so that nose will find tests
| 29 | 57 | 0.793103 | 10 | 58 | 4.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189655 | 58 | 1 | 58 | 58 | 0.978723 | 0.948276 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
80e0bdf830bfc645d52186c0afe6434e6e40f7f7 | 322 | py | Python | lemon_markets/config.py | LinusReuter/lemon-markets-api-acsess | 953619bb8caf9f8523ec41960816881874f55344 | [
"MIT"
] | 10 | 2021-08-10T08:35:48.000Z | 2022-03-30T10:52:16.000Z | lemon_markets/config.py | LinusReuter/lemon-markets-api-acsess | 953619bb8caf9f8523ec41960816881874f55344 | [
"MIT"
] | 2 | 2021-10-17T11:48:53.000Z | 2021-12-16T17:29:05.000Z | lemon_markets/config.py | LinusReuter/lemon-markets-api-acsess | 953619bb8caf9f8523ec41960816881874f55344 | [
"MIT"
] | 8 | 2021-08-05T11:28:50.000Z | 2022-03-26T13:43:12.000Z | # undocumented on rtd
DEFAULT_PAPER_REST_API_URL: str = "https://paper.lemon.markets/rest/v1/"
DEFAULT_PAPER_DATA_REST_API_URL: str = "https://paper-data.lemon.markets/v1/"
DEFAULT_MONEY_REST_API_URL: str = ""
DEFAULT_MONEY_DATA_REST_API_URL: str = ""
DEFAULT_AUTH_API_URL: str = "https://auth.lemon.markets/oauth2/token"
| 46 | 77 | 0.78882 | 52 | 322 | 4.480769 | 0.346154 | 0.128755 | 0.193133 | 0.223176 | 0.403433 | 0.197425 | 0 | 0 | 0 | 0 | 0 | 0.010067 | 0.074534 | 322 | 6 | 78 | 53.666667 | 0.771812 | 0.059006 | 0 | 0 | 0 | 0 | 0.368771 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
80e957e9a732986209a1052d5285183cc3b82191 | 82 | py | Python | classic_FizzBuzz.py | ttglennhall/simple_dev_python | 68c199255beba0053c7c9b905dca7ea5ec919c66 | [
"MIT"
] | null | null | null | classic_FizzBuzz.py | ttglennhall/simple_dev_python | 68c199255beba0053c7c9b905dca7ea5ec919c66 | [
"MIT"
] | null | null | null | classic_FizzBuzz.py | ttglennhall/simple_dev_python | 68c199255beba0053c7c9b905dca7ea5ec919c66 | [
"MIT"
] | null | null | null | for n in range(1, 101):
print("Fizz" * (n % 3 == 0) + "Buzz" * (n % 5 == 0) or n) | 41 | 58 | 0.45122 | 17 | 82 | 2.176471 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 0.268293 | 82 | 2 | 58 | 41 | 0.483333 | 0 | 0 | 0 | 0 | 0 | 0.096386 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
80eec2c1a72c3774cd992eedbaa86751cf61154b | 46,234 | py | Python | Modules/3_classification/sen1_cnntrain_uil.py | zhu-xlab/So2Sat-LCZ-Classification-Demo | c675b98852d76525cfb7879880fe720f354de11b | [
"MIT"
] | 7 | 2021-11-25T13:07:32.000Z | 2022-03-21T07:33:40.000Z | Modules/3_classification/sen1_cnntrain_uil.py | zhu-xlab/So2Sat-LCZ-Classification-Demo | c675b98852d76525cfb7879880fe720f354de11b | [
"MIT"
] | null | null | null | Modules/3_classification/sen1_cnntrain_uil.py | zhu-xlab/So2Sat-LCZ-Classification-Demo | c675b98852d76525cfb7879880fe720f354de11b | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Tue May 22 15:09:53 2018
@author: hu jingliang
"""
# modified by jingliang, 07.09.2018 11:52 a.m. added feature extraction of real and imaginary part of unfiltered data
# Last modified: 09.07.2018 14:41:00 Jingliang Hu
# handling the situation that data does not cover all labels; changed function 'getCoordinate'
import os
import glob
import numpy as np
from osgeo import gdal
import sys
import h5py
gdal.UseExceptions()
#from __future__ import print_function
def cityListFromFolder(cityListFolder):
# cityListFolder = '/media/sf_So2Sat/auxdata/citylists/citylistLCZ9'
# cityListFolder = '/media/sf_So2Sat/auxdata/citylists/citylistOne'
cityList = glob.glob(cityListFolder+'/*')
if type(cityList) is list:
for i in range(0,len(cityList)):
cityList[i] = cityList[i].split('/')[-1]
elif type(cityList) is str:
cityList = cityList.split('/')[-1]
else:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("ERROR: reading city list failed")
print("DIRECTORY: " + cityListFolder)
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
return cityList
def getPathOfCity(dpath):
# dpath = '/media/sf_So2Sat/data/massive_downloading'
cityPath = glob.glob(dpath+'/*')
i = 0
while i < len(cityPath):
cpath = cityPath[i]
temp = cityPath[i].split('/')[-1].split('_')
i = i + 1
if not os.path.isdir(cpath) or len(temp)!=3:
#print cpath
            print('The directory is either not a directory or is not named in the standard format; it has been removed')
cityPath.remove(cpath)
i = i - 1
return cityPath
def dataOfCityList(cityList,dataStorePath):
# dataStorePath = '/media/sf_So2Sat/data'
dataOfCity = getPathOfCity(dataStorePath)
unfiltMosaicDataOfCity = []
if type(cityList) is list:
for i in range(0,len(cityList)):
for j in range(0,len(dataOfCity)):
if cityList[i].replace('_','') in dataOfCity[j].split('/')[-1].lower():
temp = glob.glob(dataOfCity[j]+'/mosaic_unfilt_dat/*/mosaic.tif')
for name in temp:
unfiltMosaicDataOfCity.append(name)
break
elif j == len(dataOfCity)-1:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("WARNING:")
print('Data for city of '+cityList[i]+' not found')
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
elif type(cityList) is str:
for j in range(0,len(dataOfCity)):
if cityList.replace('_','') in dataOfCity[j].split('/')[-1].lower():
temp = glob.glob(dataOfCity[j]+'/mosaic_unfilt_dat/*/mosaic.tif')
for name in temp:
unfiltMosaicDataOfCity.append(name)
break
elif j == len(dataOfCity)-1:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("WARNING:")
print('Data for city of '+cityList+' not found')
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
else:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("ERROR: error in given city list")
print("DIRECTORY: " + cityList)
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
return unfiltMosaicDataOfCity
def getPathOfTime(cpath):
tempPath = cpath+'/mosaic_unfilt_dat'
city = cpath.split('/')[-1]
if not os.path.exists(tempPath):
tempPath = tempPath.replace('mosaic_unfilt_dat','geocoded_subset_unfilt_dat')
timePath = glob.glob(tempPath + '/*')
if len(timePath)==0:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
        print("ERROR: cannot find preprocessed unfiltered data for the city of: " + city)
print("DIRECTORY: " + tempPath)
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
return timePath
def getPath2Data(tpath):
tifPath = glob.glob(tpath+'/*.tif')
if len(tifPath)==0:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
        print("ERROR: cannot find data")
print("DIRECTORY: " + tpath)
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
if len(tifPath) > 1:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
        print("ERROR: more than one data file found for a city at a time period")
        print("DIRECTORY: " + str(tifPath))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
return tifPath[0]
def getImagCoord(dir2DataGeotiff,xWorld,yWorld):
    # find the image row and column for a given geographic coordinate. The coordinate should be in WGS 84/UTM
try:
dataTif = gdal.Open(dir2DataGeotiff)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
        print("ERROR: the given data geotiff cannot be opened by GDAL")
        print("DIRECTORY: " + dir2DataGeotiff)
        print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
geoInfoData = dataTif.GetGeoTransform()
col_dat = np.round((xWorld - geoInfoData[0])/geoInfoData[1])
row_dat = np.round((geoInfoData[3] - yWorld)/np.abs(geoInfoData[5]))
return col_dat,row_dat
def getCoordinate(gtPath,dataPath):
# find the data corresponds to label location, based on the geo-coordinate
try:
labelTif = gdal.Open(gtPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
        print("ERROR: the given ground truth geotiff cannot be opened by GDAL")
        print("DIRECTORY: " + gtPath)
        print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
try:
dataTif = gdal.Open(dataPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
        print("ERROR: the given data geotiff cannot be opened by GDAL")
        print("DIRECTORY: " + dataPath)
        print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
# read label data and find data corresponds to label location
label = labelTif.ReadAsArray()
label[label<0] = 0
label[label>100] = label[label>100]-90
row_lab,col_lab = np.where(label>0)
geoInfoLabel = labelTif.GetGeoTransform()
geoInfoData = dataTif.GetGeoTransform()
xWorld = geoInfoLabel[0] + col_lab * geoInfoLabel[1];
yWorld = geoInfoLabel[3] + row_lab * geoInfoLabel[5];
col_dat = np.round((xWorld - geoInfoData[0])/geoInfoData[1])
row_dat = np.round((geoInfoData[3] - yWorld)/np.abs(geoInfoData[5]))
outDataBoundary = np.where(col_dat<0)
if outDataBoundary[0].size>0:
print('West part of ground truth is not covered by data')
row_dat = np.delete(row_dat,outDataBoundary,0)
col_dat = np.delete(col_dat,outDataBoundary,0)
row_lab = np.delete(row_lab,outDataBoundary,0)
col_lab = np.delete(col_lab,outDataBoundary,0)
xWorld = np.delete(xWorld,outDataBoundary,0)
yWorld = np.delete(yWorld,outDataBoundary,0)
outDataBoundary = np.where(row_dat<0)
if outDataBoundary[0].size>0:
print('North part of ground truth is not covered by data')
row_dat = np.delete(row_dat,outDataBoundary,0)
col_dat = np.delete(col_dat,outDataBoundary,0)
row_lab = np.delete(row_lab,outDataBoundary,0)
col_lab = np.delete(col_lab,outDataBoundary,0)
xWorld = np.delete(xWorld,outDataBoundary,0)
yWorld = np.delete(yWorld,outDataBoundary,0)
outDataBoundary = np.where(row_dat>dataTif.RasterYSize)
if outDataBoundary[0].size>0:
print('South part of ground truth is not covered by data')
row_dat = np.delete(row_dat,outDataBoundary,0)
col_dat = np.delete(col_dat,outDataBoundary,0)
row_lab = np.delete(row_lab,outDataBoundary,0)
col_lab = np.delete(col_lab,outDataBoundary,0)
xWorld = np.delete(xWorld,outDataBoundary,0)
yWorld = np.delete(yWorld,outDataBoundary,0)
outDataBoundary = np.where(col_dat>dataTif.RasterXSize)
if outDataBoundary[0].size>0:
print('East part of ground truth is not covered by data')
row_dat = np.delete(row_dat,outDataBoundary,0)
col_dat = np.delete(col_dat,outDataBoundary,0)
row_lab = np.delete(row_lab,outDataBoundary,0)
col_lab = np.delete(col_lab,outDataBoundary,0)
xWorld = np.delete(xWorld,outDataBoundary,0)
yWorld = np.delete(yWorld,outDataBoundary,0)
return col_dat,row_dat,xWorld,yWorld,row_lab,col_lab
def boxcar(data,halfwin):
shiftInter = np.linspace(-halfwin,halfwin,num=2*halfwin+1)
res = np.zeros(data.shape)
for i in shiftInter:
for j in shiftInter:
res = res + np.roll(np.roll(data,int(j),axis=2),int(i),axis=1)
res = res/np.square((2*halfwin+1))
return res
def dBFeatStat(data,datamask):
temp = data[datamask]
if np.sum(temp==0)>0:
        print('FOUND INTENSITY EQUALS ZERO AND SET TO EPS')
print('number of zeros: '+ str(np.sum(temp==0)))
temp[temp==0] = np.finfo(temp.dtype).eps
temp = 10*np.log10(temp)
dataRes = np.zeros(data.shape)
dataRes[datamask] = temp
dataRes[datamask==False] = np.finfo(data.dtype).min
stat = {}
stat['min'] = np.min(temp[temp>10*np.log10(np.finfo(temp.dtype).eps)])
stat['max'] = np.max(temp)
bins = np.arange(-60, 52, 3)
#print(np.max(temp))
#print(np.min(temp))
freq,_ = np.histogram(temp,bins)
stat['freq'] = freq
stat['bins'] = bins
stat['mean'] = np.mean(temp)
stat['std'] = np.std(temp)
return dataRes, stat
def dBFeatNorm(data,datamask,flag=2):
temp = data[datamask]
temp = 10*np.log10(temp)
if flag==1:
valRange = np.max(temp[:])-np.min(temp[:])
temp = (temp - np.min(temp[:]))/valRange
elif flag==2:
# stretching data after clipping the left and right tails of the pdf by 2%
# Gamba, Paolo, Massimilano Aldrighi, and Mattia Stasolla. "Robust extraction of urban area extents in HR and VHR SAR images." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 4.1 (2011): 27-34.
maxV = np.max(temp)
minV = np.min(temp)
#clip = (maxV-minV)*0.02
#temp[temp<(minV+clip)] = (minV+clip)
#temp[temp>(maxV-clip)] = (maxV-clip)
temp = (temp-(minV))/(maxV-minV)
elif flag==3:
maxV = np.max(temp)
minV = np.min(temp)
clip = (maxV-minV)*0.04
strSca = maxV - clip
alignPnt = np.median(temp[temp>strSca])
temp = temp + (100 - alignPnt)
dataRes = np.zeros(data.shape)
dataRes[datamask] = temp
dataRes[datamask==False] = np.min(temp[:])
return dataRes
def dBFeatNorm_Gamba(data):
# stretching data after clipping the left and right tails of the pdf by 2%
# Gamba, Paolo, Massimilano Aldrighi, and Mattia Stasolla. "Robust extraction of urban area extents in HR and VHR SAR images." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 4.1 (2011): 27-34.
datamask = data!=0
temp = data[datamask]
temp = 10*np.log10(temp)
maxV = np.max(temp)
minV = np.min(temp)
clip = (maxV-minV)*0.02
temp[temp<(minV+clip)] = (minV+clip)
temp[temp>(maxV-clip)] = (maxV-clip)
dataRes = np.zeros(data.shape)
dataRes[datamask] = temp
dataRes[datamask==False] = np.min(temp[:])
return dataRes
def dBFeatNorm_strongScatter(data):
datamask = data!=0
temp = data[datamask]
temp = 10*np.log10(temp)
strSca = np.percentile(temp,96)
alignPnt = np.median(temp[temp>strSca])
temp = temp + (100 - alignPnt)
dataRes = np.zeros(data.shape)
dataRes[datamask] = temp
dataRes[datamask==False] = np.min(temp[:])
return dataRes
def realImagFeatStat(data,datamask):
temp = data[datamask]
dataRes = np.zeros(data.shape)
dataRes[datamask] = temp
dataRes[datamask==False] = np.finfo(data.dtype).min
stat = {}
rangeTemp = temp
bins = np.arange(np.min(rangeTemp), np.max(rangeTemp), np.ptp(rangeTemp)/100)
freq,_ = np.histogram(rangeTemp,bins)
stat['freq'] = freq
stat['bins'] = bins
stat['mean'] = np.mean(rangeTemp)
stat['std'] = np.std(rangeTemp)
return dataRes, stat
def featStat(data,datamask):
temp = data[datamask]
if np.sum(temp==0)>0:
        print('FOUND INTENSITY EQUALS ZERO AND SET TO EPS')
print('number of zeros: '+ str(np.sum(temp==0)))
temp[temp==0] = np.finfo(temp.dtype).eps
dataRes = np.zeros(data.shape)
dataRes[datamask] = temp
dataRes[datamask==False] = np.finfo(data.dtype).min
stat = {}
rangeTemp = temp[temp!=np.finfo(temp.dtype).eps]
rangeTemp = rangeTemp[np.isnan(rangeTemp)==False]
bins = np.arange(np.min(rangeTemp), np.max(rangeTemp), np.ptp(rangeTemp)/100)
freq,_ = np.histogram(rangeTemp,bins)
stat['freq'] = freq
stat['bins'] = bins
stat['mean'] = np.mean(rangeTemp)
stat['std'] = np.std(rangeTemp)
return dataRes, stat
def featNorm(data,datamask):
temp = data[datamask]
valRange = np.max(temp[:])-np.min(temp[:])
temp = (temp - np.min(temp[:]))/valRange
dataRes = np.zeros(data.shape)
dataRes[datamask] = temp
dataRes[datamask==False] = np.min(temp[:])
return dataRes
def getData(dataPath, featSelection = [0,0,0,0,0,0,0,0,0,0,0]):
    # featSelection: a 1 by 11 vector indicating which features should be extracted.
    #
    # e.g. [1,1,0,1,1,1,0,0,0,0,0] means extract 5 features: VH_dB_un, VV_dB_un, VH_real, VH_imag and VV_real
    # feature selection currently supports the following 11 features:
# ------------------------------------------------------------------------------------
# Feature | Shortcuts | delete code | preserving code
# ------------------------------------------------------------------------------------
# unfiltered VH in dB | VH_dB_un | 0 | 1
# unfiltered VV in dB | VV_dB_un | 0 | 1
# boxcar coherence | COH_boxcar | 0 | 1
# VH real part | VH_real | 0 | 1
# VH imag part | VH_imag | 0 | 1
# VV real part | VV_real | 0 | 1
# VV imag part | VV_imag | 0 | 1
# lee-filtered VH in dB | VH_dB_lee | 0 | 1
# lee-filtered VV in dB | VV_dB_lee | 0 | 1
# lee-filtered coherence | COH_lee | 0 | 1
# lee-filtered relative phase | PHA_lee | 0 | 1
# -------------------------------------------------------------------------------------
#
#
# dataPath = '/data/hu/LCZ42_SEN1/LCZ42_23052_LosAngeles-LongBeach-SantaAna/mosaic_unfilt_dat/201706/mosaic.tif'
# featSelection = [1,1,0,1,1,1,0]
# if featSelection = [0,0,0,0,0,0,0] return data mask and unfiltered data
# generate data mask from lee-filtered data
maskPath = dataPath.replace('unfilt_','')
try:
print(maskPath)
maskTif = gdal.Open(maskPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
        print("ERROR: the given data geotiff cannot be opened by GDAL")
        print("DIRECTORY: " + maskPath)
        print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
data = maskTif.ReadAsArray()
# data mask extraction
datamask = (data[0,:,:]+data[3,:,:])>0
del(maskTif)
del(data)
# initial a dictionary to save maximum, minimum, histogram of intensity bands
stat = {}
# unfiltered data need to be loaded
if np.sum(featSelection[:7]) > 0:
try:
dataTif = gdal.Open(dataPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
            print("ERROR: the given data geotiff cannot be opened by GDAL")
            print("DIRECTORY: " + dataPath)
            print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
# read data
data = dataTif.ReadAsArray()
data[np.isnan(data)] = 0
VH_real = data[0,:,:].copy()
VH_imag = data[1,:,:].copy()
VV_real = data[2,:,:].copy()
VV_imag = data[3,:,:].copy()
# convert complex form into covariance matrix form
tempData = np.zeros(data.shape)
tempData[0,:,:] = data[0,:,:]*data[0,:,:] + data[1,:,:]*data[1,:,:]
tempData[1,:,:] = data[2,:,:]*data[2,:,:] + data[3,:,:]*data[3,:,:]
tempData[2,:,:] = data[0,:,:]*data[2,:,:] + data[1,:,:]*data[3,:,:]
tempData[3,:,:] = data[1,:,:]*data[2,:,:] - data[0,:,:]*data[3,:,:]
data = tempData
del(tempData)
# get intensity of VV, VH; get boxcar coherence
VH_dB_un = data[0,:,:]
VV_dB_un = data[1,:,:]
data = boxcar(data,1)
VH_dB_un[VH_dB_un==0] = data[0,VH_dB_un==0]
VV_dB_un[VV_dB_un==0] = data[1,VV_dB_un==0]
COH_boxcar = np.sqrt(np.add(np.square(data[2,:,:]),np.square(data[3,:,:])))/np.sqrt(data[1,:,:]*data[0,:,:])
del(data)
# convert intensity into dB, and normalization
if featSelection[0]==1:
#print(' unfiltered VH:')
VH_dB_un, stat['VHdB'] = dBFeatStat(VH_dB_un,datamask)
else:
del(VH_dB_un)
if featSelection[1]==1:
#print(' unfiltered VV:')
VV_dB_un, stat['VVdB'] = dBFeatStat(VV_dB_un,datamask)
else:
del(VV_dB_un)
if featSelection[2]==1:
#print(' boxcar coherence:')
COH_boxcar, stat['coh'] = featStat(COH_boxcar,datamask)
else:
del(COH_boxcar)
if featSelection[3]==1:
#print(' VH real part:')
VH_real, stat['VH_real'] = realImagFeatStat(VH_real,datamask)
else:
del(VH_real)
if featSelection[4]==1:
#print(' VH imaginary part:')
VH_imag, stat['VH_imag'] = realImagFeatStat(VH_imag,datamask)
else:
del(VH_imag)
if featSelection[5]==1:
#print(' VV real part:')
VV_real, stat['VV_real'] = realImagFeatStat(VV_real,datamask)
else:
del(VV_real)
if featSelection[6]==1:
#print(' VV imaginary part:')
VV_imag, stat['VV_imag'] = realImagFeatStat(VV_imag,datamask)
else:
del(VV_imag)
del(dataTif)
if np.sum(featSelection[7:]) > 0:
# lee-filtered data need to be loaded
dataPath = dataPath.replace('unfilt_','')
try:
dataTif = gdal.Open(dataPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
            print("ERROR: the given data geotiff cannot be opened by GDAL")
            print("DIRECTORY: " + dataPath)
            print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
data = dataTif.ReadAsArray()
data = np.stack((data[0,:,:],data[3,:,:],data[1,:,:],data[2,:,:]))
if featSelection[7]==1:
#print(' LEE filtered VH:')
VH_dB_lee, stat['VH_dB_lee'] = dBFeatStat(data[0,:,:],datamask)
if featSelection[8]==1:
#print(' LEE filtered VV:')
VV_dB_lee, stat['VV_dB_lee'] = dBFeatStat(data[1,:,:],datamask)
if featSelection[9]==1:
#print(' LEE coherence:')
COH_lee = np.sqrt(np.add(np.square(data[2,:,:]),np.square(data[3,:,:])))/np.sqrt(data[1,:,:]*data[0,:,:])
COH_lee, stat['COH_lee'] = featStat(COH_lee,datamask)
if featSelection[10]==1:
PHA_lee = np.cos(np.arctan2(data[3,:,:],data[2,:,:]))
PHA_lee, stat['PHA_lee'] = featStat(PHA_lee,datamask)
data = np.zeros([np.sum(featSelection),data.shape[1],data.shape[2]])
orderChl = 0
if featSelection[0]==1:
data[orderChl,:,:] = VH_dB_un
orderChl = orderChl + 1
del(VH_dB_un)
if featSelection[1]==1:
data[orderChl,:,:] = VV_dB_un
orderChl = orderChl + 1
del(VV_dB_un)
if featSelection[2]==1:
data[orderChl,:,:] = COH_boxcar
orderChl = orderChl + 1
del(COH_boxcar)
if featSelection[3]==1:
data[orderChl,:,:] = VH_real
orderChl = orderChl + 1
del(VH_real)
if featSelection[4]==1:
data[orderChl,:,:] = VH_imag
orderChl = orderChl + 1
del(VH_imag)
if featSelection[5]==1:
data[orderChl,:,:] = VV_real
orderChl = orderChl + 1
del(VV_real)
if featSelection[6]==1:
data[orderChl,:,:] = VV_imag
orderChl = orderChl + 1
del(VV_imag)
if featSelection[7]==1:
data[orderChl,:,:] = VH_dB_lee
orderChl = orderChl + 1
del(VH_dB_lee)
if featSelection[8]==1:
data[orderChl,:,:] = VV_dB_lee
orderChl = orderChl + 1
del(VV_dB_lee)
if featSelection[9]==1:
data[orderChl,:,:] = COH_lee
orderChl = orderChl + 1
del(COH_lee)
if featSelection[10]==1:
data[orderChl,:,:] = PHA_lee
orderChl = orderChl + 1
del(PHA_lee)
return data,datamask,stat
def getPath2LCZTIFF(dataPath, labelPath):
city = dataPath.split('/')[-4].split('_')[-1].lower()
gtPath = labelPath + '/' + city + '/' + city + '_lcz_GT.tif'
return gtPath
def getPath2LCZ42v3(dataPath,labelPath):
city = dataPath.split('/')[-4].split('_')[-1].lower()
gtPath = labelPath + '/' + city + '_lcz_GT.tif'
return gtPath
def urbanPCAAlign(data,datamask,stat):
# dB thresholding
dBThreshold = -5
# find the channel of lee filtered vh in dB
k = stat.keys()
vhLeeIdx = 0
if 'VHdB' in k:
vhLeeIdx = vhLeeIdx + 1
if 'VVdB' in k:
vhLeeIdx = vhLeeIdx + 1
if 'coh' in k:
vhLeeIdx = vhLeeIdx + 1
if 'VH_real' in k:
vhLeeIdx = vhLeeIdx + 1
if 'VH_imag' in k:
vhLeeIdx = vhLeeIdx + 1
if 'VV_real' in k:
vhLeeIdx = vhLeeIdx + 1
if 'VV_imag' in k:
vhLeeIdx = vhLeeIdx + 1
VHdBLee = data[vhLeeIdx,:,:]
# masking data of urban
mask = (datamask*1*((VHdBLee>dBThreshold)*1))==1
urbanData = np.transpose(data[:,mask])
# shifting data by median
featMedian = np.median(urbanData,axis=0)
featMedian = np.tile(featMedian,urbanData.shape[0])
featMedian = np.reshape(featMedian,urbanData.shape)
urbanData = urbanData - featMedian
# pca projections
from sklearn.decomposition import PCA
pca = PCA()
pca.fit(urbanData)
# projection data of the whole city
# step 1: median shift
dn,rw,cl = data.shape
data = np.transpose(data,(1,2,0))
data = np.reshape(data,(rw*cl,dn))
featMedian = featMedian[0,:]
featMedian = np.tile(featMedian,data.shape[0])
featMedian = np.reshape(featMedian,data.shape)
data = data - featMedian
# step 2: projection
dataPCA = pca.transform(data)
dataPCA = np.reshape(dataPCA,(rw,cl,dn))
dataPCA = np.transpose(dataPCA,(2,0,1))
    print(dataPCA.shape)
statPCA = []
for i in range(0,dataPCA.shape[0]):
statPCA.append({'mean':np.mean(dataPCA[i,:,:]),'std':np.std(dataPCA[i,:,:]),'median':np.median(dataPCA[i,:,:])})
return dataPCA, statPCA
def getDataNormMStd(dataPath, featSelection = [0,0,0,0,0,0,0,0,0,0,0]):
    # featSelection: a 1 by 11 vector indicating which features should be extracted.
    #
    # e.g. [1,1,0,1,1,1,0,0,0,0,0] means extract 5 features: VH_dB_un, VV_dB_un, VH_real, VH_imag and VV_real
    # feature selection currently supports the following 11 features:
# ------------------------------------------------------------------------------------
# Feature | Shortcuts | delete code | preserving code
# ------------------------------------------------------------------------------------
# unfiltered VH in dB | VH_dB_un | 0 | 1
# unfiltered VV in dB | VV_dB_un | 0 | 1
# boxcar coherence | COH_boxcar | 0 | 1
# VH real part | VH_real | 0 | 1
# VH imag part | VH_imag | 0 | 1
# VV real part | VV_real | 0 | 1
# VV imag part | VV_imag | 0 | 1
# lee-filtered VH in dB | VH_dB_lee | 0 | 1
# lee-filtered VV in dB | VV_dB_lee | 0 | 1
# lee-filtered coherence | COH_lee | 0 | 1
# lee-filtered relative phase | PHA_lee | 0 | 1
# -------------------------------------------------------------------------------------
#
#
# dataPath = '/data/hu/LCZ42_SEN1/LCZ42_23052_LosAngeles-LongBeach-SantaAna/mosaic_unfilt_dat/201706/mosaic.tif'
# featSelection = [1,1,0,1,1,1,0,0,0,0,0]
# if featSelection = [0,0,0,0,0,0,0,0,0,0,0], return the data mask and the unfiltered data
# generate data mask from lee-filtered data
maskPath = dataPath.replace('unfilt_','')
try:
#print(maskPath)
maskTif = gdal.Open(maskPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("ERROR: the given data GeoTIFF cannot be opened by GDAL")
print("DIRECTORY: " + dataPath)
print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
data = maskTif.ReadAsArray()
# data mask extraction
datamask = (data[0,:,:]+data[3,:,:])>0
del(maskTif)
del(data)
# initialize a dictionary to store the statistics (e.g. mean and std) of the selected bands
stat = {}
# unfiltered data needs to be loaded
if np.sum(featSelection[:7]) > 0:
try:
dataTif = gdal.Open(dataPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("ERROR: the given data GeoTIFF cannot be opened by GDAL")
print("DIRECTORY: " + dataPath)
print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
# read data
data = dataTif.ReadAsArray()
data[np.isnan(data)] = 0
VH_real = data[0,:,:].copy()
VH_imag = data[1,:,:].copy()
VV_real = data[2,:,:].copy()
VV_imag = data[3,:,:].copy()
# convert complex form into covariance matrix form
tempData = np.zeros(data.shape)
tempData[0,:,:] = data[0,:,:]*data[0,:,:] + data[1,:,:]*data[1,:,:]
tempData[1,:,:] = data[2,:,:]*data[2,:,:] + data[3,:,:]*data[3,:,:]
tempData[2,:,:] = data[0,:,:]*data[2,:,:] + data[1,:,:]*data[3,:,:]
tempData[3,:,:] = data[1,:,:]*data[2,:,:] - data[0,:,:]*data[3,:,:]
data = tempData
del(tempData)
# get intensity of VV, VH; get boxcar coherence
VH_dB_un = data[0,:,:]
VV_dB_un = data[1,:,:]
data = boxcar(data,1)
VH_dB_un[VH_dB_un==0] = data[0,VH_dB_un==0]
VV_dB_un[VV_dB_un==0] = data[1,VV_dB_un==0]
COH_boxcar = np.sqrt(np.add(np.square(data[2,:,:]),np.square(data[3,:,:])))/np.sqrt(data[1,:,:]*data[0,:,:])
del(data)
# convert intensity to dB and compute the per-band normalization statistics
if featSelection[0]==1:
#print(' unfiltered VH:')
VH_dB_un, stat['VHdB'] = dBFeatStat(VH_dB_un,datamask)
else:
del(VH_dB_un)
if featSelection[1]==1:
#print(' unfiltered VV:')
VV_dB_un, stat['VVdB'] = dBFeatStat(VV_dB_un,datamask)
else:
del(VV_dB_un)
if featSelection[2]==1:
#print(' boxcar coherence:')
COH_boxcar, stat['coh'] = featStat(COH_boxcar,datamask)
else:
del(COH_boxcar)
if featSelection[3]==1:
#print(' VH real part:')
VH_real, stat['VH_real'] = realImagFeatStat(VH_real,datamask)
else:
del(VH_real)
if featSelection[4]==1:
#print(' VH imaginary part:')
VH_imag, stat['VH_imag'] = realImagFeatStat(VH_imag,datamask)
else:
del(VH_imag)
if featSelection[5]==1:
#print(' VV real part:')
VV_real, stat['VV_real'] = realImagFeatStat(VV_real,datamask)
else:
del(VV_real)
if featSelection[6]==1:
#print(' VV imaginary part:')
VV_imag, stat['VV_imag'] = realImagFeatStat(VV_imag,datamask)
else:
del(VV_imag)
del(dataTif)
if np.sum(featSelection[7:]) > 0:
# lee-filtered data needs to be loaded
dataPath = dataPath.replace('unfilt_','')
try:
dataTif = gdal.Open(dataPath)
except RuntimeError as e:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("ERROR: the given data GeoTIFF cannot be opened by GDAL")
print("DIRECTORY: " + dataPath)
print("GDAL EXCEPTION: " + str(e))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
data = dataTif.ReadAsArray()
data = np.stack((data[0,:,:],data[3,:,:],data[1,:,:],data[2,:,:]))
if featSelection[7]==1:
#print(' LEE filtered VH:')
VH_dB_lee, stat['VH_dB_lee'] = dBFeatStat(data[0,:,:],datamask)
if featSelection[8]==1:
#print(' LEE filtered VV:')
VV_dB_lee, stat['VV_dB_lee'] = dBFeatStat(data[1,:,:],datamask)
if featSelection[9]==1:
#print(' LEE coherence:')
COH_lee = np.sqrt(np.add(np.square(data[2,:,:]),np.square(data[3,:,:])))/np.sqrt(data[1,:,:]*data[0,:,:])
COH_lee, stat['COH_lee'] = featStat(COH_lee,datamask)
if featSelection[10]==1:
PHA_lee = np.cos(np.arctan2(data[3,:,:],data[2,:,:]))
PHA_lee, stat['PHA_lee'] = featStat(PHA_lee,datamask)
data = np.zeros([np.sum(featSelection),data.shape[1],data.shape[2]])
orderChl = 0
if featSelection[0]==1:
data[orderChl,:,:] = (VH_dB_un-stat['VHdB']['mean'])/stat['VHdB']['std']
orderChl = orderChl + 1
del(VH_dB_un)
if featSelection[1]==1:
data[orderChl,:,:] = (VV_dB_un-stat['VVdB']['mean'])/stat['VVdB']['std']
orderChl = orderChl + 1
del(VV_dB_un)
if featSelection[2]==1:
data[orderChl,:,:] = (COH_boxcar-stat['coh']['mean'])/stat['coh']['std']
orderChl = orderChl + 1
del(COH_boxcar)
if featSelection[3]==1:
data[orderChl,:,:] = (VH_real-stat['VH_real']['mean'])/stat['VH_real']['std']
orderChl = orderChl + 1
del(VH_real)
if featSelection[4]==1:
data[orderChl,:,:] = (VH_imag-stat['VH_imag']['mean'])/stat['VH_imag']['std']
orderChl = orderChl + 1
del(VH_imag)
if featSelection[5]==1:
data[orderChl,:,:] = (VV_real-stat['VV_real']['mean'])/stat['VV_real']['std']
orderChl = orderChl + 1
del(VV_real)
if featSelection[6]==1:
data[orderChl,:,:] = (VV_imag-stat['VV_imag']['mean'])/stat['VV_imag']['std']
orderChl = orderChl + 1
del(VV_imag)
if featSelection[7]==1:
data[orderChl,:,:] = (VH_dB_lee-stat['VH_dB_lee']['mean'])/stat['VH_dB_lee']['std']
orderChl = orderChl + 1
del(VH_dB_lee)
if featSelection[8]==1:
data[orderChl,:,:] = (VV_dB_lee-stat['VV_dB_lee']['mean'])/stat['VV_dB_lee']['std']
orderChl = orderChl + 1
del(VV_dB_lee)
if featSelection[9]==1:
data[orderChl,:,:] = (COH_lee-stat['COH_lee']['mean'])/stat['COH_lee']['std']
orderChl = orderChl + 1
del(COH_lee)
if featSelection[10]==1:
data[orderChl,:,:] = (PHA_lee-stat['PHA_lee']['mean'])/stat['PHA_lee']['std']
orderChl = orderChl + 1
del(PHA_lee)
return data,datamask,stat
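For reference, the covariance-matrix bands built above (|VH|^2, |VV|^2, and the real/imaginary parts of the VH·VV* cross product) can be checked against complex arithmetic directly. A hedged sketch on synthetic single-look data (values are made up); it also shows why the boxcar averaging matters: without it the coherence magnitude is identically 1:

```python
import numpy as np

rng = np.random.default_rng(1)
vh = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))
vv = rng.normal(size=(8, 8)) + 1j * rng.normal(size=(8, 8))

# band layout used above: |VH|^2, |VV|^2, Re(VH*conj(VV)), Im(VH*conj(VV))
c11 = vh.real**2 + vh.imag**2
c22 = vv.real**2 + vv.imag**2
c12_re = vh.real * vv.real + vh.imag * vv.imag
c12_im = vh.imag * vv.real - vh.real * vv.imag

cross = vh * np.conj(vv)
assert np.allclose(c11, np.abs(vh) ** 2)
assert np.allclose(c12_re, cross.real)
assert np.allclose(c12_im, cross.imag)

# same form as the COH_boxcar expression; on unaveraged single-look data the
# magnitude is identically 1, which is why the boxcar filter is applied first
coh = np.sqrt(c12_re**2 + c12_im**2) / np.sqrt(c11 * c22)
assert np.allclose(coh, 1.0)
```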
def getTrainData(dataPath, gtPath, featSelection = [0,0,0,0,0,0,0,0,0,0,0], patchSize = 32):
# featSelection: a 1 by 11 vector indicating which features should be extracted.
#
# e.g. [1,1,0,0,0,0,0,1,1,1,0] means extract 5 features: VH_dB_un, VV_dB_un, VH_dB_lee, VV_dB_lee and COH_lee
# feature selection currently supports the following 11 features:
# ------------------------------------------------------------------------------------
# Feature | Shortcuts | delete code | preserving code
# ------------------------------------------------------------------------------------
# unfiltered VH in dB | VH_dB_un | 0 | 1
# unfiltered VV in dB | VV_dB_un | 0 | 1
# boxcar coherence | COH_boxcar | 0 | 1
# VH real part | VH_real | 0 | 1
# VH imag part | VH_imag | 0 | 1
# VV real part | VV_real | 0 | 1
# VV imag part | VV_imag | 0 | 1
# lee-filtered VH in dB | VH_dB_lee | 0 | 1
# lee-filtered VV in dB | VV_dB_lee | 0 | 1
# lee-filtered coherence | COH_lee | 0 | 1
# lee-filtered relative phase | PHA_lee | 0 | 1
# -------------------------------------------------------------------------------------
#
#
# dataPath = '/media/sf_So2Sat/data/LCZ42_22606_Zurich/mosaic_unfilt_dat/201612/mosaic.tif'
# gtPath = '/media/sf_So2Sat/LCZ42/zurich/zurich_lcz_GT.tif'
# featSelection = [0,0,0,0,1,1,0,0,0,0,0]
if np.sum(featSelection) == 0:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("ERROR: No feature is selected")
print("ERROR: featSelection = "+str(featSelection))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
# get data and nodata mask
data, mask, stat = getData(dataPath, featSelection)
# eliminate the data on the boundary
mask[:patchSize // 2, :] = False
mask[-(patchSize // 2):, :] = False
mask[:, :patchSize // 2] = False
mask[:, -(patchSize // 2):] = False
col_dat,row_dat,xWorld,yWorld,row_lab,col_lab = getCoordinate(gtPath,dataPath)
coord = np.stack((row_dat,col_dat,xWorld,yWorld,row_lab,col_lab),axis=1)
del col_dat,row_dat,xWorld,yWorld,row_lab,col_lab
# read label data and find data corresponds to label location
labelTif = gdal.Open(gtPath)
label = labelTif.ReadAsArray()
label[label<0] = 0
label[label>100] = label[label>100]-90
# delete those labels on boundary or in no data area
coord = coordInMask(coord, mask, patchSize)
row_dat = coord[:,0].astype(int)
col_dat = coord[:,1].astype(int)
row_lab = coord[:,4].astype(int)
col_lab = coord[:,5].astype(int)
# cutting patches
trainDat = np.zeros([len(row_dat),patchSize,patchSize,data.shape[0]])
trainLab = np.zeros([len(row_dat),17])
for idx in range(0,len(row_dat)):
trainDat[idx,:,:,:] = np.transpose(data[:,int(row_dat[idx]-patchSize/2):int(row_dat[idx]+patchSize/2),int(col_dat[idx]-patchSize/2):int(col_dat[idx]+patchSize/2)],(1,2,0))
trainLab[idx,int(label[row_lab[idx],col_lab[idx]]-1)] = 1
return trainDat, trainLab, coord, stat
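The label assignment above writes a one-hot row per patch from 1-based class codes into a 17-column matrix. A standalone sketch of that encoding (the label values are made up for illustration):

```python
import numpy as np

labels = np.array([3, 1, 17])            # hypothetical 1-based LCZ class codes
one_hot = np.zeros((len(labels), 17))
one_hot[np.arange(len(labels)), labels - 1] = 1   # shift to 0-based columns

assert one_hot.shape == (3, 17)
assert one_hot.sum() == 3                # exactly one hot entry per row
assert one_hot[0, 2] == 1 and one_hot[2, 16] == 1
```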
def cutPatches(imagRow,imagCol,data,patchSize):
# this function cut patches from data
# Input
# - imagRow -- image row location of patch centers
# - imagCol -- image col location of patch centers
# - data -- data to be cut [dn,rw,cl]
# - patchSize -- size of patch
#
# Output
# - dat -- data patches in [nb,rw,cl,dn]; nb: number of patches, rw: row of pathces, cl: cl of patches, dn: dimension of patches
dat = np.zeros([len(imagRow),patchSize,patchSize,data.shape[0]])
for idx in range(0,len(imagRow)):
dat[idx,:,:,:] = np.transpose(data[:,int(imagRow[idx]-patchSize/2):int(imagRow[idx]+patchSize/2),int(imagCol[idx]-patchSize/2):int(imagCol[idx]+patchSize/2)],(1,2,0))
return dat
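A usage sketch of the patch-cutting pattern above, on a hypothetical 3-band image; the transpose turns each (bands, rows, cols) window into the (rows, cols, bands) layout expected downstream:

```python
import numpy as np

data = np.arange(3 * 64 * 64, dtype=float).reshape(3, 64, 64)  # toy 3-band image
rows, cols = np.array([20, 32]), np.array([20, 40])            # patch centers
patchSize = 8
half = patchSize // 2

dat = np.zeros([len(rows), patchSize, patchSize, data.shape[0]])
for idx in range(len(rows)):
    dat[idx] = np.transpose(
        data[:, rows[idx] - half:rows[idx] + half,
                cols[idx] - half:cols[idx] + half],
        (1, 2, 0))

assert dat.shape == (2, 8, 8, 3)
# the centre pixel of the first patch matches the source image
assert dat[0, half, half, 0] == data[0, rows[0], cols[0]]
```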
def getTrainTestData(dataPath, trainMapPath, testMapPath, featSelection = [1,1,1,1,1,1,1,1,1,1,1], patchSize = 32):
# featSelection: a 1 by 11 vector indicating which features should be extracted.
#
# e.g. [1,1,0,0,0,0,0,1,1,1,0] means extract 5 features: VH_dB_un, VV_dB_un, VH_dB_lee, VV_dB_lee and COH_lee
# feature selection currently supports the following 11 features:
# ------------------------------------------------------------------------------------
# Feature | Shortcuts | delete code | preserving code
# ------------------------------------------------------------------------------------
# unfiltered VH in dB | VH_dB_un | 0 | 1
# unfiltered VV in dB | VV_dB_un | 0 | 1
# boxcar coherence | COH_boxcar | 0 | 1
# VH real part | VH_real | 0 | 1
# VH imag part | VH_imag | 0 | 1
# VV real part | VV_real | 0 | 1
# VV imag part | VV_imag | 0 | 1
# lee-filtered VH in dB | VH_dB_lee | 0 | 1
# lee-filtered VV in dB | VV_dB_lee | 0 | 1
# lee-filtered coherence | COH_lee | 0 | 1
# lee-filtered relative phase | PHA_lee | 0 | 1
# -------------------------------------------------------------------------------------
#
#
# dataPath = '/media/sf_So2Sat/data/LCZ42_22606_Zurich/mosaic_unfilt_dat/201612/mosaic.tif'
# gtPath = '/media/sf_So2Sat/LCZ42/zurich/zurich_lcz_GT.tif'
# featSelection = [0,0,0,0,1,1,0,0,0,0,0]
if np.sum(featSelection) == 0:
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
print("ERROR: No feature is selected")
print("ERROR: featSelection = "+str(featSelection))
print("!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!")
sys.exit(1)
# get data and data mask
data, mask, stat = getData(dataPath, featSelection)
# eliminate the data on the boundary
mask[:patchSize // 2, :] = False
mask[-(patchSize // 2):, :] = False
mask[:, :patchSize // 2] = False
mask[:, -(patchSize // 2):] = False
# load training data based on training map
col_dat,row_dat,xWorld,yWorld,row_lab,col_lab = getCoordinate(trainMapPath,dataPath)
trainCoord = np.stack((row_dat,col_dat,xWorld,yWorld,row_lab,col_lab),axis=1)
del col_dat,row_dat,xWorld,yWorld,row_lab,col_lab
# read label data and find data corresponds to label location
labelTif = gdal.Open(trainMapPath)
label = labelTif.ReadAsArray()
label[label<0] = 0
label[label>100] = label[label>100]-90
# delete those labels on boundary or in no data area
trainCoord = coordInMask(trainCoord, mask, patchSize)
row_dat = trainCoord[:,0].astype(int)
col_dat = trainCoord[:,1].astype(int)
row_lab = trainCoord[:,4].astype(int)
col_lab = trainCoord[:,5].astype(int)
# cutting patches
trainDat = np.zeros([len(row_dat),patchSize,patchSize,data.shape[0]])
trainLab = np.zeros([len(row_dat),17])
for idx in range(0,len(row_dat)):
trainDat[idx,:,:,:] = np.transpose(data[:,int(row_dat[idx]-patchSize/2):int(row_dat[idx]+patchSize/2),int(col_dat[idx]-patchSize/2):int(col_dat[idx]+patchSize/2)],(1,2,0))
trainLab[idx,int(label[row_lab[idx],col_lab[idx]]-1)] = 1
del(labelTif)
# load testing data based on the testing map
col_dat,row_dat,xWorld,yWorld,row_lab,col_lab = getCoordinate(testMapPath,dataPath)
testCoord = np.stack((row_dat,col_dat,xWorld,yWorld,row_lab,col_lab),axis=1)
del col_dat,row_dat,xWorld,yWorld,row_lab,col_lab
# read label data and find data corresponds to label location
labelTif = gdal.Open(testMapPath)
label = labelTif.ReadAsArray()
label[label<0] = 0
label[label>100] = label[label>100]-90
# delete those labels on boundary or in no data area
testCoord = coordInMask(testCoord,mask,patchSize)
row_dat = testCoord[:,0].astype(int)
col_dat = testCoord[:,1].astype(int)
row_lab = testCoord[:,4].astype(int)
col_lab = testCoord[:,5].astype(int)
# cutting patches
testDat = np.zeros([len(row_dat),patchSize,patchSize,data.shape[0]])
testLab = np.zeros([len(row_dat),17])
for idx in range(0,len(row_dat)):
testDat[idx,:,:,:] = np.transpose(data[:,int(row_dat[idx]-patchSize/2):int(row_dat[idx]+patchSize/2),int(col_dat[idx]-patchSize/2):int(col_dat[idx]+patchSize/2)],(1,2,0))
testLab[idx,int(label[row_lab[idx],col_lab[idx]]-1)] = 1
return trainDat, trainLab, trainCoord, testDat, testLab, testCoord, stat
def coordInMask(coord,datamask,patchSize):
out_coord = coord.copy()
# delete label on northern boundary
row_dat = out_coord[:,0].copy()
idxBoundary = np.where(row_dat<np.floor(patchSize/2))
out_coord = np.delete(out_coord,idxBoundary,0)
# delete label on southern boundary
row_dat = out_coord[:,0].copy()
idxBoundary = np.where(row_dat>(datamask.shape[0]-np.floor(patchSize/2)-1))
out_coord = np.delete(out_coord,idxBoundary,0)
# delete label on western boundary
col_dat = out_coord[:,1].copy()
idxBoundary = np.where(col_dat<np.floor(patchSize/2))
out_coord = np.delete(out_coord,idxBoundary,0)
# delete label on eastern boundary
col_dat = out_coord[:,1].copy()
idxBoundary = np.where(col_dat>(datamask.shape[1]-np.floor(patchSize/2)-1))
out_coord = np.delete(out_coord,idxBoundary,0)
# delete label in no data area
row_dat = out_coord[:,0].copy()
col_dat = out_coord[:,1].copy()
idx = len(row_dat)-1
for row_val in reversed(row_dat):
if not datamask[int(row_val-patchSize/2):int(row_val+patchSize/2),int(col_dat[idx]-patchSize/2):int(col_dat[idx]+patchSize/2)].all():
row_dat = np.delete(row_dat,idx,0)
col_dat = np.delete(col_dat,idx,0)
out_coord = np.delete(out_coord,idx,0)
idx = idx - 1
return out_coord
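The four boundary checks above can also be expressed as a single vectorized mask over the row/column columns of coord, which is equivalent to the repeated np.delete calls (shapes and coordinates below are made up for illustration):

```python
import numpy as np

patchSize = 32
half = patchSize // 2
mask_shape = (200, 300)                  # hypothetical datamask.shape
# columns 0 and 1 hold the data row / data col, as in coord above
coord = np.array([[10., 50.], [100., 150.], [195., 10.], [100., 295.]])

rows, cols = coord[:, 0], coord[:, 1]
keep = ((rows >= half) & (rows <= mask_shape[0] - half - 1) &
        (cols >= half) & (cols <= mask_shape[1] - half - 1))
filtered = coord[keep]
assert filtered.shape == (1, 2)          # only [100, 150] survives
```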
def readH5DataPatches(cityList,dataStorePath,featSelection = [0,1,2,7,8,9,10]):
# cityList = ['vancouver']
# dataStorePath = '/data/hu/LCZ42_SEN1_H5_RANDOM_Feat11'
# read patch size
fid = h5py.File(dataStorePath + '/' + cityList[0] + '/' + cityList[0] + '.h5','r')
_,patchSize,_,_ = fid['x_tra'].shape
del fid
# initialize the accumulators with a single placeholder entry
x_tra = np.empty(shape=[1,patchSize,patchSize,len(featSelection)])
y_tra = np.zeros([1,17])
# read data of the city list
for i in range(0,len(cityList)):
city = cityList[i]
print('Loading data of the city: '+city)
h5filePath = dataStorePath + '/' + city + '/' + city + '.h5'
fid = h5py.File(h5filePath,'r')
if fid['x_test'].shape[0]==0:
x_tmp = fid['x_tra'][:,:,:,featSelection]
y_tmp = fid['y_tra'][:]
else:
tempA = fid['x_tra'][:,:,:,featSelection]
tempB = fid['x_test'][:,:,:,featSelection]
x_tmp = np.concatenate((tempA,tempB),axis=0)
del tempA, tempB
tempA = fid['y_tra'][:]
tempB = fid['y_test'][:]
y_tmp = np.concatenate((tempA,tempB),axis=0)
del tempA,tempB
fid.close()
del fid
x_tra = np.concatenate((x_tra,x_tmp),axis=0)
y_tra = np.concatenate((y_tra,y_tmp),axis=0)
# delete the first element which is used to initialize x_tra and y_tra
x_tra = np.delete(x_tra, (0), axis=0)
y_tra = np.delete(y_tra, (0), axis=0)
return x_tra, y_tra
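readH5DataPatches grows x_tra by repeated np.concatenate, which re-copies the accumulated array on every iteration and needs a dummy first row that is deleted at the end. An alternative sketch: collect the per-city chunks in a list and concatenate once (chunk contents are illustrative):

```python
import numpy as np

chunks = [np.ones((2, 4)), np.zeros((3, 4)), np.full((1, 4), 5.0)]

# one concatenate at the end: no dummy first row to delete afterwards,
# and no quadratic re-copying of the accumulated array
x_tra = np.concatenate(chunks, axis=0)
assert x_tra.shape == (6, 4)
```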
from django.apps import AppConfig
class FCMDevicesConfig(AppConfig):
name = "fcm_devices"
verbose_name = "FCM Devices"
from django.apps import AppConfig
class SpacyalConfig(AppConfig):
name = 'spacyal'
from django.apps import AppConfig
class EllinghamConfig(AppConfig):
name = 'ellingham'
from dataclasses import dataclass
from ipaddress import IPv4Address
@dataclass(frozen=True)
class Address:
"""A Google Cloud address."""
name: str
address: IPv4Address
"""Test cases for the __main__ module."""
# import pytest
# from click.testing import CliRunner
# from release_trader import __main__
# @pytest.fixture
# def runner() -> CliRunner:
# """Fixture for invoking command-line interfaces."""
# return CliRunner()
# def test_main_succeeds(runner: CliRunner) -> None:
# """It exits with a status code of zero."""
# result = runner.invoke(__main__.main)
# assert result.exit_code == 0
def test_placeholder():
"""Docstring."""
pass
# Generated by Django 2.2 on 2020-09-17 21:26
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('weather_api', '0002_auto_20200917_1409'),
]
operations = [
migrations.RemoveField(
model_name='city',
name='description',
),
migrations.RemoveField(
model_name='city',
name='icon',
),
migrations.RemoveField(
model_name='city',
name='temperature',
),
]
from django import template
register = template.Library()
@register.filter()
def xsd_field_type(field):
return field.field.__class__.__name__.lower().replace('field', '')
class DatasetLoader():
"""An abstract base class for Dataset loaders."""
def get_dataset_name(self):
"""Return a string with the name of this dataset."""
raise NotImplementedError("Must be implemented in subclass.")
def get_sync_recordings(self, dataset):
"""Return a tuple (list of SyncRecordings, list of RecordingSequences)."""
raise NotImplementedError("Must be implemented in subclass.")
def get_recordings(self, sync_rec):
"""Return a dict of {device: Recording}"""
raise NotImplementedError("Must be implemented in subclass.")
def get_tseries_data(self, tseries):
"""Return a numpy array of the data in the tseries."""
raise NotImplementedError("Must be implemented in subclass.")
def load_stimulus(self, recording):
"""Return an instance of stimuli.Stimulus"""
raise NotImplementedError("Must be implemented in subclass.")
def load_stimulus_items(self, recording):
"""Return a list of Stimulus instances.
Used with LazyLoadStimulus to parse stimuli when they are needed."""
raise NotImplementedError("Must be implemented in subclass.")
def load_test_pulse(self, recording):
"""Return a PatchClampTestPulse."""
raise NotImplementedError("Must be implemented in subclass.")
def find_nearest_test_pulse(self, recording):
raise NotImplementedError("Must be implemented in subclass.")
def get_baseline_regions(self, recording):
raise NotImplementedError("Must be implemented in subclass.")
from web3.utils.encoding import (
to_decimal,
)
from web3.utils.functional import (
apply_formatters_to_return,
)
class Testing(object):
def __init__(self, web3):
self.web3 = web3
def timeTravel(self, timestamp):
return self.web3._requestManager.request_blocking("testing_timeTravel", [timestamp])
def mine(self, num_blocks=1):
return self.web3._requestManager.request_blocking("evm_mine", [num_blocks])
@apply_formatters_to_return(to_decimal)
def snapshot(self):
return self.web3._requestManager.request_blocking("evm_snapshot", [])
def reset(self):
return self.web3._requestManager.request_blocking("evm_reset", [])
def revert(self, snapshot_idx=None):
if snapshot_idx is None:
return self.web3._requestManager.request_blocking("evm_revert", [])
else:
return self.web3._requestManager.request_blocking("evm_revert", [snapshot_idx])
# https://atcoder.jp/contests/abc143/tasks/abc143_a
A, B = map(int, input().split())
print(max(0, A - B * 2))
from usaspending_api.etl.transaction_loaders.data_load_helpers import capitalize_if_string, false_if_null
def test_capitalize_if_string():
assert capitalize_if_string("bob4") == "BOB4"
assert capitalize_if_string(7) == 7
assert capitalize_if_string(None) is None
def test_false_if_null():
assert false_if_null("true") == "true"
assert false_if_null("false") == "false"
assert false_if_null("n") == "n"
assert false_if_null(True)
assert not false_if_null(False)
assert not false_if_null(None)
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Mon Jul 13 16:28:22 2020
@author: dnb3k
"""
fda | 12.333333 | 35 | 0.603604 | 19 | 111 | 3.526316 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164835 | 0.18018 | 111 | 9 | 36 | 12.333333 | 0.571429 | 0.855856 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
45c2e538ea0219759a38973a6e4f34e146406c4b | 42 | py | Python | tests/resources/system_tests/test_breakpoints/loopy.py | rozuur/ptvsd | 046fd0f054b2eed91ec5df02e5f36151b71e36b1 | [
"MIT"
] | null | null | null | tests/resources/system_tests/test_breakpoints/loopy.py | rozuur/ptvsd | 046fd0f054b2eed91ec5df02e5f36151b71e36b1 | [
"MIT"
] | null | null | null | tests/resources/system_tests/test_breakpoints/loopy.py | rozuur/ptvsd | 046fd0f054b2eed91ec5df02e5f36151b71e36b1 | [
"MIT"
] | null | null | null | a = 1
b = 2
for i in range(10):
c = a
| 8.4 | 19 | 0.452381 | 11 | 42 | 1.727273 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 0.404762 | 42 | 4 | 20 | 10.5 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
afdb1b81e70dc9c7355cc4be98722d1638fa2d10 | 355 | py | Python | todo/api/viewsets.py | ajoyoommen/zerrenda | de8eb722358318603e4f303eb2abb0b4ffa46561 | [
"MIT"
] | null | null | null | todo/api/viewsets.py | ajoyoommen/zerrenda | de8eb722358318603e4f303eb2abb0b4ffa46561 | [
"MIT"
] | null | null | null | todo/api/viewsets.py | ajoyoommen/zerrenda | de8eb722358318603e4f303eb2abb0b4ffa46561 | [
"MIT"
] | null | null | null | from rest_framework import viewsets
from .. import models
from . import serializers
class ListViewSet(viewsets.ModelViewSet):
serializer_class = serializers.ListSerializer
queryset = models.List.objects.all()
class ItemViewSet(viewsets.ModelViewSet):
serializer_class = serializers.ItemSerializer
queryset = models.Item.objects.all()
| 23.666667 | 49 | 0.785915 | 37 | 355 | 7.459459 | 0.513514 | 0.072464 | 0.217391 | 0.253623 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.138028 | 355 | 14 | 50 | 25.357143 | 0.901961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
aff56e320c9a5560889b5d24e354d5af7ac0afb3 | 44 | py | Python | 2741/code.py | ToTo-Mo/BaekJoon | e7540448a810e350b461b1300dd47bda58bfcaeb | [
"Apache-2.0"
] | null | null | null | 2741/code.py | ToTo-Mo/BaekJoon | e7540448a810e350b461b1300dd47bda58bfcaeb | [
"Apache-2.0"
] | null | null | null | 2741/code.py | ToTo-Mo/BaekJoon | e7540448a810e350b461b1300dd47bda58bfcaeb | [
"Apache-2.0"
] | null | null | null | for i in range(int(input())):
print(i+1) | 22 | 29 | 0.590909 | 9 | 44 | 2.888889 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027778 | 0.181818 | 44 | 2 | 30 | 22 | 0.694444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
b309815af192f5145648c7be05db1cc066ef7f1f | 75 | py | Python | tests/models/__init__.py | dnsosa/drug-lit-contradictory-claims | c03faa7269050344b631b12302214a3175384e98 | [
"MIT"
] | null | null | null | tests/models/__init__.py | dnsosa/drug-lit-contradictory-claims | c03faa7269050344b631b12302214a3175384e98 | [
"MIT"
] | 18 | 2020-07-18T14:28:07.000Z | 2021-02-01T08:36:35.000Z | tests/models/__init__.py | dnsosa/drug-lit-contradictory-claims | c03faa7269050344b631b12302214a3175384e98 | [
"MIT"
] | 6 | 2020-07-14T05:06:02.000Z | 2020-09-05T16:46:33.000Z | # -*- coding: utf-8 -*-
"""Unit tests for contradictory_claims/models."""
| 18.75 | 49 | 0.64 | 9 | 75 | 5.222222 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015385 | 0.133333 | 75 | 3 | 50 | 25 | 0.707692 | 0.88 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
b32d03b23d5e02cc87599720596daf5db6a5d3af | 141 | py | Python | base/models/skipped_entry.py | sudoguy/dtf_bot | 424172c527d27f8ccee412d497e5e82ca97e84a0 | [
"MIT"
] | 2 | 2020-03-14T08:20:46.000Z | 2021-03-19T06:47:39.000Z | base/models/skipped_entry.py | sudoguy/dtf_bot | 424172c527d27f8ccee412d497e5e82ca97e84a0 | [
"MIT"
] | 9 | 2019-08-06T02:01:35.000Z | 2022-02-10T07:50:47.000Z | base/models/skipped_entry.py | sudoguy/dtf_bot | 424172c527d27f8ccee412d497e5e82ca97e84a0 | [
"MIT"
] | null | null | null | from django.db import models
from .base import BaseModel
class SkippedEntry(BaseModel):
id = models.BigIntegerField(primary_key=True)
| 17.625 | 49 | 0.787234 | 18 | 141 | 6.111111 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141844 | 141 | 7 | 50 | 20.142857 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
b35db4c338e9600704b061c4a7c04ad8782ab67a | 184 | py | Python | notifier.py | YoussefEmad99/Real-Time-Whatsapp-Controller-Bot-For-Productivity | 8399a40a36222fad62b0026795ace9fdbeac6e29 | [
"MIT"
] | null | null | null | notifier.py | YoussefEmad99/Real-Time-Whatsapp-Controller-Bot-For-Productivity | 8399a40a36222fad62b0026795ace9fdbeac6e29 | [
"MIT"
] | null | null | null | notifier.py | YoussefEmad99/Real-Time-Whatsapp-Controller-Bot-For-Productivity | 8399a40a36222fad62b0026795ace9fdbeac6e29 | [
"MIT"
] | null | null | null | from win10toast import ToastNotifier
def notify(title, description):
notifier = ToastNotifier()
try:
notifier.show_toast(title, description)
    except Exception:
pass
| 18.4 | 47 | 0.684783 | 18 | 184 | 6.944444 | 0.777778 | 0.256 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014388 | 0.244565 | 184 | 9 | 48 | 20.444444 | 0.884892 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0.142857 | 0.142857 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
b3780de3a6b1454063470cfeca7ea6d8c4eb2adf | 211 | py | Python | kedro/extras/datasets/yaml/__init__.py | daniel-falk/kedro | 19187199339ddc4a757aaaa328f319ec4c1e452a | [
"Apache-2.0"
] | 2,047 | 2022-01-10T15:22:12.000Z | 2022-03-31T13:38:56.000Z | kedro/extras/datasets/yaml/__init__.py | daniel-falk/kedro | 19187199339ddc4a757aaaa328f319ec4c1e452a | [
"Apache-2.0"
] | 170 | 2022-01-10T12:44:31.000Z | 2022-03-31T17:01:24.000Z | kedro/extras/datasets/yaml/__init__.py | daniel-falk/kedro | 19187199339ddc4a757aaaa328f319ec4c1e452a | [
"Apache-2.0"
] | 112 | 2022-01-10T19:15:24.000Z | 2022-03-30T11:20:52.000Z | """``AbstractDataSet`` implementation to load/save data from/to a YAML file."""
__all__ = ["YAMLDataSet"]
from contextlib import suppress
with suppress(ImportError):
from .yaml_dataset import YAMLDataSet
| 23.444444 | 79 | 0.758294 | 25 | 211 | 6.2 | 0.72 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137441 | 211 | 8 | 80 | 26.375 | 0.851648 | 0.345972 | 0 | 0 | 0 | 0 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
2fde2d99b5bc790d23139a27f1ccaea0a3368161 | 528 | py | Python | pywps/inout/__init__.py | janpisl/pywps | 73a1835359f0503e08fb007d75de699bf3cf29ed | [
"MIT"
] | null | null | null | pywps/inout/__init__.py | janpisl/pywps | 73a1835359f0503e08fb007d75de699bf3cf29ed | [
"MIT"
] | null | null | null | pywps/inout/__init__.py | janpisl/pywps | 73a1835359f0503e08fb007d75de699bf3cf29ed | [
"MIT"
] | null | null | null | ##################################################################
# Copyright 2018 Open Source Geospatial Foundation and others #
# licensed under MIT, Please consult LICENSE.txt for details #
##################################################################
from pywps.inout.inputs import LiteralInput, ComplexInput, BoundingBoxInput
from pywps.inout.outputs import LiteralOutput, ComplexOutput, BoundingBoxOutput
from pywps.inout.formats import Format, FORMATS, get_format
from pywps.inout.basic import UOM
| 52.8 | 80 | 0.596591 | 48 | 528 | 6.541667 | 0.708333 | 0.11465 | 0.178344 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008565 | 0.11553 | 528 | 9 | 81 | 58.666667 | 0.663812 | 0.231061 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
2fdf1820e17e98db46a3eaee345874b23804f827 | 980 | py | Python | prometheus_configurator/manager.py | wikimedia/cloud-metricsinfra-prometheus-configurator | 9d95483065829cc43c155bccb60a19060c45d95e | [
"BSD-3-Clause"
] | null | null | null | prometheus_configurator/manager.py | wikimedia/cloud-metricsinfra-prometheus-configurator | 9d95483065829cc43c155bccb60a19060c45d95e | [
"BSD-3-Clause"
] | null | null | null | prometheus_configurator/manager.py | wikimedia/cloud-metricsinfra-prometheus-configurator | 9d95483065829cc43c155bccb60a19060c45d95e | [
"BSD-3-Clause"
] | null | null | null | import logging
import requests
import prometheus_configurator
logger = logging.getLogger(__name__)
class PrometheusManagerClient:
def __init__(self, base_url: str):
self.base_url = base_url
self.session = requests.Session()
self.session.headers['User-Agent'] = (
f'prometheus_configurator/{prometheus_configurator.__version__} '
+ f'python-requests/{requests.__version__}'
)
def get(self, url, **kwargs):
url = f"{self.base_url}/{url.lstrip('/')}"
logger.debug(f'performing http GET request to {url}')
return self.session.get(url, **kwargs).json()
def get_projects(self):
# TODO: shard projects across multiple prometheus instances? see T286301
return self.get('/v1/projects')
def get_project_details(self, project_id: int):
return self.get(f'/v1/projects/{project_id}')
def get_contact_groups(self):
return self.get('/v1/contact-groups')
| 29.69697 | 80 | 0.664286 | 117 | 980 | 5.307692 | 0.410256 | 0.045089 | 0.05314 | 0.048309 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011688 | 0.214286 | 980 | 32 | 81 | 30.625 | 0.794805 | 0.071429 | 0 | 0 | 0 | 0 | 0.257709 | 0.172907 | 0 | 0 | 0 | 0.03125 | 0 | 1 | 0.227273 | false | 0 | 0.136364 | 0.136364 | 0.590909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
2ffbd3b574a76958bc4e960a7affc2f7e99767d7 | 261 | py | Python | src/vsc/model/rand_obj_model_if.py | cmarqu/pyvsc | c7ff708256b7cdce0eccab8b7d6e2037edbdc5fa | [
"Apache-2.0"
] | null | null | null | src/vsc/model/rand_obj_model_if.py | cmarqu/pyvsc | c7ff708256b7cdce0eccab8b7d6e2037edbdc5fa | [
"Apache-2.0"
] | null | null | null | src/vsc/model/rand_obj_model_if.py | cmarqu/pyvsc | c7ff708256b7cdce0eccab8b7d6e2037edbdc5fa | [
"Apache-2.0"
] | null | null | null | '''
Created on Feb 29, 2020
@author: ballance
'''
class RandObjModelIF(object):
"""Implements a callback interface to notify about randomization phases"""
def do_pre_randomize(self):
pass
def do_post_randomize(self):
pass | 18.642857 | 78 | 0.659004 | 31 | 261 | 5.419355 | 0.83871 | 0.059524 | 0.202381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030612 | 0.249042 | 261 | 14 | 79 | 18.642857 | 0.826531 | 0.425287 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.4 | false | 0.4 | 0 | 0 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 4 |
640117648c74550d785e27aaf1e262be72109fe7 | 34,226 | py | Python | scripts/graphs.py | ucbrise/waldo | d435d7f447a36b5d000f45ff1b0c73747c1d7aa2 | [
"Apache-2.0"
] | 7 | 2021-12-19T11:31:06.000Z | 2022-03-01T04:19:51.000Z | scripts/graphs.py | ucbrise/waldo | d435d7f447a36b5d000f45ff1b0c73747c1d7aa2 | [
"Apache-2.0"
] | null | null | null | scripts/graphs.py | ucbrise/waldo | d435d7f447a36b5d000f45ff1b0c73747c1d7aa2 | [
"Apache-2.0"
] | null | null | null | # Natacha Crooks - ncrooks@berkeley.edu - 2020
# This is an example script that plots all graphs for a particular paper.
# This script assumes that all experiments are located in expData, and
# makes use of the plotting library in graph_util
import shieldExperiment
import sys
sys.path.append("util/")
from ssh_util import *
from ec2_util import *
import boto.ec2
from compile_util import *
from prop_util import *
from math_util import *
from graph_util import *
import matplotlib
matplotlib.use('Agg')
import numpy as np
import matplotlib.pyplot as plt
# Script to plot all graphs. Needs to be updated when add new experiments
expData = "../experiments/results/current-results/"
plotTPC = False
plotSmallbank = False
plotFreeHealth = False
plotStrideDurability = False
plotCheckpointFrequency = True
plotParallelOram = True
plotStrideSize = True
plotWriteOpt = False
plotWriteBack = True
plotWriteBackStride = True
plotApplications = True
plotApplicationsSlides = True
plotBatchSucks = True
def aggregateDataThroughput(folder, dataPath, output,
outputStd, legend = None):
x = len(dataPath)
y = len(dataPath[0])
if (legend):
dat = np.zeros((y,x+1))
datStd = np.zeros((y,x+1))
startIndex = 1
for j in range(0,y):
dat[j][0] = legend[j]
datStd[j][0] = legend[j]
else:
dat = np.zeros((y,x))
datStd = np.zeros((y,x))
startIndex = 0
for i in range(0,x):
for j in range(0,y):
name = folder + "/" + dataPath[i][j]
print(name)
data = np.atleast_2d(np.loadtxt(name))
nbRepetitions = data.shape[0]
reps = np.zeros(nbRepetitions)
for r in range(0,nbRepetitions):
print(name)
reps[r] = (data[r][11])
print(nbRepetitions)
print(reps)
dat[j][i+startIndex] = np.mean(reps)
datStd[j][i+startIndex] = np.std(reps)
print(np.std(reps))
np.savetxt(output, dat, delimiter=' ')
np.savetxt(outputStd, datStd, delimiter=' ')
def measureIncrease(dataPath, output, legend = None):
print "Measure Increase"
data = np.atleast_2d(np.loadtxt(dataPath))
print(data)
# Each column holds one data series; row 0 is the baseline
nbRows = data.shape[0]
nbCols = data.shape[1]
out= np.zeros((nbRows,nbCols))
if (legend):
index = 1
for i in range(0, nbRows):
out[i][0]=data[i][0]
else:
index = 0
for j in range(index,nbCols):
baseline = data[0][j]
for k in range (1, nbRows):
increase = data[k][j]/baseline
out[k][j]=increase
np.savetxt(output, out, delimiter=' ')
def aggregateDataLatency(folder, dataPath, output,
outputStd, legend = None):
x = len(dataPath)
y = len(dataPath[0])
if (legend):
dat = np.zeros((y,x+1))
datStd = np.zeros((y,x+1))
startIndex = 1
for j in range(0,y):
dat[j][0] = legend[j]
datStd[j][0] = legend[j]
else:
dat = np.zeros((y,x))
datStd = np.zeros((y,x))
startIndex = 0
for i in range(0,x):
for j in range(0,y):
name = folder + "/" + dataPath[i][j]
data = np.atleast_2d(np.loadtxt(name))
nbRepetitions = data.shape[0]
reps = np.zeros(nbRepetitions)
for r in range(0,nbRepetitions):
reps[r] = (data[r][1])
dat[j][i+startIndex] = np.mean(reps)
datStd[j][i+startIndex] = np.std(reps)
np.savetxt(output, dat, delimiter=' ')
np.savetxt(outputStd, datStd, delimiter=' ')
def main():
print "Durability: # of Strides"
if (plotStrideDurability):
folder = expData + "/" + "oram/durability/nb-strides"
outputThroughput = folder + "/aggT.dat"
outputLatency = folder + "/aggL.dat"
outputThroughputStd = folder + "/aggTStd.dat"
outputLatencyStd = folder + "/aggLStd.dat"
barNames = ["1", "2", "4", "6", "8"]
datasetNames = ["Server NoDur", "Server Dur", "Dynamo NoDur", "Dynamo Dur" ]
data = [
[
"oram-server-nb-stride-1-not-durable/results.dat",
"oram-server-nb-stride-2-not-durable/results.dat",
"oram-server-nb-stride-4-not-durable/results.dat",
"oram-server-nb-stride-6-not-durable/results.dat",
"oram-server-nb-stride-8-not-durable/results.dat",
],
[
"oram-server-nb-stride-1-durable/results.dat",
"oram-server-nb-stride-2-durable/results.dat",
"oram-server-nb-stride-4-durable/results.dat",
"oram-server-nb-stride-6-durable/results.dat",
"oram-server-nb-stride-8-durable/results.dat",
],
[
"oram-dynamo-nb-stride-1-not-durable/results.dat",
"oram-dynamo-nb-stride-2-not-durable/results.dat",
"oram-dynamo-nb-stride-4-not-durable/results.dat",
"oram-dynamo-nb-stride-6-not-durable/results.dat",
"oram-dynamo-nb-stride-8-not-durable/results.dat",
],
[
"oram-dynamo-nb-stride-1-durable/results.dat",
"oram-dynamo-nb-stride-2-durable/results.dat",
"oram-dynamo-nb-stride-4-durable/results.dat",
"oram-dynamo-nb-stride-6-durable/results.dat",
"oram-dynamo-nb-stride-8-durable/results.dat",
],
]
aggregateDataThroughput(folder,data, outputThroughput, outputThroughputStd)
aggregateDataLatency(folder,data, outputLatency, outputLatencyStd)
dat = [(outputThroughput,0), (outputThroughput,1), (outputThroughput,2), (outputThroughput,3)]
datStd = [(outputThroughputStd,0), (outputThroughputStd,1), (outputThroughputStd,2), (outputThroughputStd,3)]
plotBars("Durability (# of Batches)", barNames, datasetNames,
"Throughput (ops/s)", dat, True, folder + "/stride-durable-bars", datStd,
black=False, ylim = 12000, xAxis='Number of Read Batches')
dat = [(outputLatency,0), (outputLatency,1), (outputLatency,2), (outputLatency,3)]
datStd = [(outputLatencyStd,0), (outputLatencyStd,1), (outputLatencyStd,2), (outputLatencyStd,3)]
plotBars("Durability (# of Batches)", barNames, datasetNames,
"Latency (ms)", dat, True, folder + "/stride-durable-latency-bars", datStd, black=False,
xAxis='Number of Read Batches')
print "Durability: Checkpoint Frequency"
if (plotCheckpointFrequency):
folder = expData + "/" + "oram/durability/checkpoint-freq"
outputThroughput = folder + "/aggT.dat"
outputLatency = folder + "/aggL.dat"
outputThroughputStd = folder + "/aggTStd.dat"
outputLatencyStd = folder + "/aggLStd.dat"
barNames = ["1", "4", "16", "64", "256"]
datasetNames = ["Server", "Server WAN", "Dynamo"]
data = [
[
"oram-server-freq-1-durable/results.dat",
"oram-server-freq-4-durable/results.dat",
"oram-server-freq-16-durable/results.dat",
"oram-server-freq-64-durable/results.dat",
"oram-server-freq-256-durable/results.dat",
],
[
"oram-geoserver-freq-1-durable/results.dat",
"oram-geoserver-freq-4-durable/results.dat",
"oram-geoserver-freq-16-durable/results.dat",
"oram-geoserver-freq-64-durable/results.dat",
"oram-geoserver-freq-256-durable/results.dat",
],
[
"oram-dynamo-freq-1-durable/results.dat",
"oram-dynamo-freq-4-durable/results.dat",
"oram-dynamo-freq-16-durable/results.dat",
"oram-dynamo-freq-64-durable/results.dat",
"oram-dynamo-freq-256-durable/results.dat",
],
]
aggregateDataThroughput(folder,data, outputThroughput, outputThroughputStd)
aggregateDataLatency(folder,data, outputLatency, outputLatencyStd)
dat = [(outputThroughput,0), (outputThroughput, 1), (outputThroughput,2)]
datStd = [(outputThroughputStd,0), (outputThroughputStd,1), (outputThroughputStd,2)]
plotBars("Durability (Checkpoint Frequency)", barNames, datasetNames,
"Throughput (ops/s)", dat, True, folder + "/checkpoint-freq-throughput-bars", datStd,
black=False, ylim=8000)
dat = [(outputLatency,0),(outputLatency,1), (outputLatency,2)]
datStd = [(outputLatencyStd,0), (outputLatencyStd,1), (outputLatencyStd,2)]
plotBars("Durability (Checkpoint Frequency)", barNames, datasetNames,
"Latency (ms)", dat, True, folder + "/checkpoint-latency-bars", datStd, black=False,
xAxis='Checkpoint Frequency')
##################### Parallelisation of the ORAM ###################################
print "Parallel ORAM Results"
if (plotParallelOram):
folder = expData + "/" + "parallel-oram"
outputThroughput = folder + "/aggT.dat"
outputThroughputStd = folder + "aggTStd.dat"
barNames = ["Dummy", "Server", "Server WAN", "Dynamo"]
dataSetNames = ["Sequential", "Parallel", "ParallelCrypto"]
data = [ # Sequential
[
"base-oram-seq-dummy-nocrypto/results.dat",
"base-oram-seq-server-hashmap-nocrypto/results.dat",
"base-oram-seq-geoserver-hashmap-nocrypto/results.dat",
"base-oram-seq-dynamo-nocrypto/results.dat"],
# Parallel No Crypto
[
"base-oram-par-dummy-nocrypto/results.dat",
"base-oram-par-server-hashmap-nocrypto/results.dat",
"base-oram-par-geoserver-hashmap-nocrypto/results.dat",
"base-oram-par-dynamo-nocrypto/results.dat"
],
# Parallel
[
"base-oram-par-dummy/results.dat",
"base-oram-par-server-hashmap/results.dat",
"base-oram-par-geoserver-hashmap/results.dat",
"base-oram-par-dynamo/results.dat"
],
]
aggregateDataThroughput(folder,data, outputThroughput, outputThroughputStd)
dat = [(outputThroughput,0), (outputThroughput,1), (outputThroughput,2)]
datStd = [(outputThroughputStd,0), (outputThroughputStd,1), (outputThroughputStd,2)]
plotBars("Parallelisation - Throughput Impact", barNames, dataSetNames,
"Throughput (ops/s)", dat, True, folder + "/parallel-oram-throughput", datStd, True, black= False, logY=True)
##################### Impact of write optimisation #####################
print "Write Optimisations"
if (plotWriteOpt):
folder = expData + "/" + "write-opt"
outputThroughput = folder + "/aggT.dat"
outputLatency = folder + "/aggL.dat"
outputThroughputStd = folder + "/aggTStd.dat"
outputLatencyStd = folder + "/aggLStd.dat"
barNames = ["RO",
"RH", "WH", "WO"]
datasetNames = ["Dummy", "Dummy-Opt", "Server", "Server-Opt", "Dynamo", "Dynamo-Opt"]
data = [ # Dummy
[
"base-oram-par-dummy-rh/results.dat",
"base-oram-par-dummy-rh/results.dat",
"base-oram-par-dummy-wh/results.dat",
"base-oram-par-dummy-wo/results.dat",
],
# Dummy Opt
[
"base-oram-par-dummy-rh/results.dat",
"base-oram-par-dummy-rh-opt/results.dat",
"base-oram-par-dummy-wh-opt/results.dat",
"base-oram-par-dummy-wo-opt/results.dat",
],# Server
[
"base-oram-par-server-rh/results.dat",
"base-oram-par-server-rh/results.dat",
"base-oram-par-server-wh/results.dat",
"base-oram-par-server-wo/results.dat",
],
# Server Opt
[
"base-oram-par-server-rh/results.dat",
"base-oram-par-server-rh-opt/results.dat",
"base-oram-par-server-wh-opt/results.dat",
"base-oram-par-server-wo-opt/results.dat",
],
[
"base-oram-par-dynamo-rh/results.dat",
"base-oram-par-dynamo-rh/results.dat",
"base-oram-par-dynamo-wh/results.dat",
"base-oram-par-dynamo-wo/results.dat",
],
# Server Opt
[
"base-oram-par-dynamo-rh/results.dat",
"base-oram-par-dynamo-rh-opt/results.dat",
"base-oram-par-dynamo-wh-opt/results.dat",
"base-oram-par-dynamo-wo-opt/results.dat",
]
]
aggregateDataThroughput(folder,data, outputThroughput, outputThroughputStd)
aggregateDataLatency(folder,data, outputLatency,outputLatencyStd)
dat = [(outputThroughput,0), (outputThroughput,1), (outputThroughput,2), (outputThroughput,3), (outputThroughput,4), (outputThroughput,5)]
datStd = [(outputThroughputStd,0), (outputThroughputStd,1), (outputThroughputStd,2), (outputThroughputStd,3), (outputThroughputStd,4), (outputThroughputStd,5)]
plotBars("Write Optimisation", barNames, datasetNames,
"Throughput (ops/s)", dat, False, folder + "/writes-throughput-bars", datStd, black=False)
##################### Impact of write back #####################
print "Write Back"
if (plotWriteBack):
folder = expData + "/" + "writeback"
outputThroughput = folder + "/aggT.dat"
outputLatency = folder + "/aggL.dat"
outputThroughputStd = folder + "/aggTStd.dat"
outputLatencyStd = folder + "/aggLStd.dat"
barNames = ["Dummy", "Server", "Server WAN", "Dynamo"]
datasetNames = ["Normal", "Write Back"]
data = [
[
"base-oram-par-dummy/results.dat",
"base-oram-par-server-hashmap/results.dat",
"base-oram-par-geoserver-hashmap/results.dat",
"base-oram-par-dynamo/results.dat"
],
[
"base-oram-par-dummy-writeback/results.dat",
"base-oram-par-server-hashmap-writeback/results.dat",
"base-oram-par-geoserver-hashmap-writeback/results.dat",
"base-oram-par-dynamo-writeback/results.dat",
]
]
aggregateDataThroughput(folder,data, outputThroughput,outputThroughputStd)
aggregateDataLatency(folder,data, outputLatency,outputLatencyStd)
dat = [(outputThroughput,0), (outputThroughput,1)]
datStd = [(outputThroughputStd,0), (outputThroughputStd,1)]
plotBars("Delayed Write", barNames, datasetNames,
"Throughput (ops/s)", dat, False, folder + "/writes-throughput-bars", datStd, black=False)
dat = [(outputLatency,0), (outputLatency,1)]
datStd = [(outputLatencyStd,0), (outputLatencyStd,1)]
plotBars("Delayed Write", barNames, datasetNames,
"Latency (ms)", dat, False, folder + "/writes-latency-bars", datStd, black=False)
##################### Impact of stride size #####################
print "Batch Size Results"
if (plotStrideSize):
folder = expData + "/" + "strides"
outputThroughput = folder + "/aggT.dat"
outputLatency = folder + "/aggL.dat"
outputThroughputStd = folder + "/aggTStd.dat"
outputLatencyStd = folder + "/aggLStd.dat"
xAxis = [1,
10,100,
500,1000,2000,5000,
10000]
barNames = [
"1",
"10","100",
"500",
"1000",
"2000",
"5000",
"10000"]
datasetNames = ["Dummy", "Server", "Server WAN", "Dynamo"]
data = [ # Dummy
[
"base-oram-par-dummy-1/results.dat",
"base-oram-par-dummy-10/results.dat",
"base-oram-par-dummy-100/results.dat",
"base-oram-par-dummy-500/results.dat",
"base-oram-par-dummy-1000/results.dat",
"base-oram-par-dummy-2000/results.dat",
"base-oram-par-dummy-5000/results.dat",
"base-oram-par-dummy-10000/results.dat"],
# Server
[
"base-oram-par-server-hashmap-1/results.dat",
"base-oram-par-server-hashmap-10/results.dat",
"base-oram-par-server-hashmap-100/results.dat",
"base-oram-par-server-hashmap-500/results.dat",
"base-oram-par-server-hashmap-1000/results.dat",
"base-oram-par-server-hashmap-2000/results.dat",
"base-oram-par-server-hashmap-5000/results.dat",
"base-oram-par-server-hashmap-10000/results.dat",
],
# Server WAN
[
"base-oram-par-geoserver-hashmap-1/results.dat",
"base-oram-par-geoserver-hashmap-10/results.dat",
"base-oram-par-geoserver-hashmap-100/results.dat",
"base-oram-par-geoserver-hashmap-500/results.dat",
"base-oram-par-geoserver-hashmap-1000/results.dat",
"base-oram-par-geoserver-hashmap-2000/results.dat",
"base-oram-par-geoserver-hashmap-5000/results.dat",
"base-oram-par-geoserver-hashmap-10000/results.dat",
],
# Dynamo
[
"base-oram-par-dynamo-1/results.dat",
"base-oram-par-dynamo-10/results.dat",
"base-oram-par-dynamo-100/results.dat",
"base-oram-par-dynamo-500/results.dat",
"base-oram-par-dynamo-1000/results.dat",
"base-oram-par-dynamo-2000/results.dat",
"base-oram-par-dynamo-5000/results.dat",
"base-oram-par-dynamo-10000/results.dat",
],
]
aggregateDataThroughput(folder,data, outputThroughput, outputThroughputStd, xAxis)
aggregateDataLatency(folder,data, outputLatency, outputLatencyStd,xAxis)
dat = [(outputThroughput,1), (outputThroughput,2), (outputThroughput,3), (outputThroughput,4)]
datStd = [(outputThroughputStd,1), (outputThroughputStd,2), (outputThroughputStd,3), (outputThroughputStd,4)]
plotBars("Batch Size - Throughput Impact", barNames, datasetNames,
"Throughput (ops/s)", dat, True, folder + "/strides-throughput-bars", datStd, ylim = 14000, black=False)
dat = [(outputLatency,1), (outputLatency,2), (outputLatency,3), (outputLatency,4)]
datStd = [(outputLatencyStd,1), (outputLatencyStd,2), (outputLatencyStd,3), (outputLatencyStd,4)]
plotBars("Batch Size - Latency Impact", barNames, datasetNames,
"Latency (ms)", dat, True, folder + "/strides-latency-bars", datStd, black=False)
print "WriteBack Batch Size Results"
if (plotWriteBackStride):
folder = expData + "/" + "writebackstrides"
outputThroughput = folder + "/aggT.dat"
outputLatency = folder + "/aggL.dat"
outputThroughputStd = folder + "/aggTStd.dat"
outputLatencyStd = folder + "/aggLStd.dat"
xAxis = [1,
2,4,
6,8,
#10,
16, 32, 64,128,256,512]
barNames = ["1",
"2","4",
"6",
"8",
# "10",
"16", "32", "64", "128", "256", "512"]
datasetNames = ["Dummy", "Server", "Server WAN", "Dynamo"]
data = [ # Dummy
[
"base-oram-par-dummy-1/results.dat",
"base-oram-par-dummy-2/results.dat",
"base-oram-par-dummy-4/results.dat",
"base-oram-par-dummy-6/results.dat",
"base-oram-par-dummy-8/results.dat",
#"base-oram-par-dummy-10/results.dat",
"base-oram-par-dummy-16/results.dat",
"base-oram-par-dummy-32/results.dat",
"base-oram-par-dummy-64/results.dat",
"base-oram-par-dummy-128/results.dat",
"base-oram-par-dummy-256/results.dat",
"base-oram-par-dummy-512/results.dat",
],
# Server
[
"base-oram-par-server-hashmap-1/results.dat",
"base-oram-par-server-hashmap-2/results.dat",
"base-oram-par-server-hashmap-4/results.dat",
"base-oram-par-server-hashmap-6/results.dat",
"base-oram-par-server-hashmap-8/results.dat",
#"base-oram-par-server-hashmap-10/results.dat",
"base-oram-par-server-hashmap-16/results.dat",
"base-oram-par-server-hashmap-32/results.dat",
"base-oram-par-server-hashmap-64/results.dat",
"base-oram-par-server-hashmap-128/results.dat",
"base-oram-par-server-hashmap-256/results.dat",
"base-oram-par-server-hashmap-512/results.dat",
],
# Server WAN
[
"base-oram-par-geoserver-hashmap-1/results.dat",
"base-oram-par-geoserver-hashmap-2/results.dat",
"base-oram-par-geoserver-hashmap-4/results.dat",
"base-oram-par-geoserver-hashmap-6/results.dat",
"base-oram-par-geoserver-hashmap-8/results.dat",
#"base-oram-par-geoserver-hashmap-10/results.dat",
"base-oram-par-geoserver-hashmap-16/results.dat",
"base-oram-par-geoserver-hashmap-32/results.dat",
"base-oram-par-geoserver-hashmap-64/results.dat",
"base-oram-par-geoserver-hashmap-128/results.dat",
"base-oram-par-geoserver-hashmap-256/results.dat",
"base-oram-par-geoserver-hashmap-512/results.dat",
],
# Dynamo
[
"base-oram-par-dynamo-1/results.dat",
"base-oram-par-dynamo-2/results.dat",
"base-oram-par-dynamo-4/results.dat",
"base-oram-par-dynamo-6/results.dat",
"base-oram-par-dynamo-8/results.dat",
#"base-oram-par-dynamo-10/results.dat",
"base-oram-par-dynamo-16/results.dat",
"base-oram-par-dynamo-32/results.dat",
"base-oram-par-dynamo-64/results.dat",
"base-oram-par-dynamo-128/results.dat",
"base-oram-par-dynamo-256/results.dat",
"base-oram-par-dynamo-512/results.dat",
],
]
aggregateDataThroughput(folder,data, outputThroughput, outputThroughputStd, xAxis)
aggregateDataLatency(folder,data, outputLatency, outputLatencyStd,xAxis)
dat = [(outputThroughput,1), (outputThroughput,2), (outputThroughput,3), (outputThroughput,4)]
datStd = [(outputThroughputStd,1), (outputThroughputStd,2), (outputThroughputStd,3), (outputThroughputStd,4)]
plotBars("Number of Batches- Throughput Impact", barNames, datasetNames,
"Throughput (ops/s)", dat, False, folder + "/writebackstrides-throughput-bars", datStd, ylim=15000,black=False)
dat = [(outputLatency,1), (outputLatency,2), (outputLatency,3), (outputLatency,4)]
datStd = [(outputLatencyStd,1), (outputLatencyStd,2), (outputLatencyStd,3), (outputLatencyStd,4)]
plotBars("Number of Batches - Latency Impact", barNames, datasetNames,
"Latency (ms)", dat, False, folder + "/writebackstrides-latency-bars", datStd, black=False)
print "Write Back Batch Size Results Increase"
if (plotWriteBackStride):
folder = expData + "/" + "writebackstrides"
outputIncrease = folder + "/increase.dat"
xAxis = [1,
2,4,
# 6,
8,
#10,
16,32,64,128,256,512
]
barNames = ["1",
"2","4",
# "6",
"8",
# "10",
"16","32","64","128",#"256","512"
]
datasetNames = ["Dummy", "Server", "Server WAN", "Dynamo"]
data = [ # Dummy
[
"base-oram-par-dummy-1/results.dat",
"base-oram-par-dummy-2/results.dat",
"base-oram-par-dummy-4/results.dat",
"base-oram-par-dummy-8/results.dat",
"base-oram-par-dummy-16/results.dat",
"base-oram-par-dummy-32/results.dat",
"base-oram-par-dummy-64/results.dat",
"base-oram-par-dummy-128/results.dat",
# "base-oram-par-dummy-256/results.dat",
# "base-oram-par-dummy-512/results.dat",
],
# Server
[
"base-oram-par-server-hashmap-1/results.dat",
"base-oram-par-server-hashmap-2/results.dat",
"base-oram-par-server-hashmap-4/results.dat",
"base-oram-par-server-hashmap-8/results.dat",
"base-oram-par-server-hashmap-16/results.dat",
"base-oram-par-server-hashmap-32/results.dat",
"base-oram-par-server-hashmap-64/results.dat",
"base-oram-par-server-hashmap-128/results.dat",
# "base-oram-par-server-hashmap-256/results.dat",
# "base-oram-par-server-hashmap-512/results.dat",
],
# Geo Server
[
"base-oram-par-geoserver-hashmap-1/results.dat",
"base-oram-par-geoserver-hashmap-2/results.dat",
"base-oram-par-geoserver-hashmap-4/results.dat",
"base-oram-par-geoserver-hashmap-8/results.dat",
"base-oram-par-geoserver-hashmap-16/results.dat",
"base-oram-par-geoserver-hashmap-32/results.dat",
"base-oram-par-geoserver-hashmap-64/results.dat",
"base-oram-par-geoserver-hashmap-128/results.dat",
# "base-oram-par-geoserver-hashmap-256/results.dat",
# "base-oram-par-geoserver-hashmap-512/results.dat",
],
# Dynamo
[
"base-oram-par-dynamo-1/results.dat",
"base-oram-par-dynamo-2/results.dat",
"base-oram-par-dynamo-4/results.dat",
# "base-oram-par-dynamo-6/results.dat",
"base-oram-par-dynamo-8/results.dat",
# "base-oram-par-dynamo-10/results.dat",
"base-oram-par-dynamo-16/results.dat",
"base-oram-par-dynamo-32/results.dat",
"base-oram-par-dynamo-64/results.dat",
"base-oram-par-dynamo-128/results.dat",
# "base-oram-par-dynamo-256/results.dat",
# "base-oram-par-dynamo-512/results.dat",
],
]
aggregateDataThroughput(folder, data, outputThroughput, outputThroughputStd, xAxis)
measureIncrease(outputThroughput, outputIncrease, xAxis)
outputIncrease = folder + "/increase.dat"
dat = [(outputIncrease,"Dummy",0,1), (outputIncrease,"Server",0,2), (outputIncrease, "Server WAN", 0,3),(outputIncrease,"Dynamo",0,4)]
print "Folder " + folder
plotLine("Number of Batches - Throughput Increase",
"Batch Size", "Relative Increase", folder + "/writebackstrides-increase-line", dat, True, log2X=True)
# dat = [(outputIncrease,1), (outputIncrease,2), (outputIncrease,3)]
# plotBars("Number of Batches - Throughput Increase", barNames, datasetNames,
# "Ratio", dat, True, folder + "/writebackstrides-increase-bars", black=True, paper=True,xHor=True)
if (plotApplications):
folder = expData + "/" + "applications"
outputThroughput = folder + "/aggT.dat"
outputLatency = folder + "/aggL.dat"
outputThroughputStd = folder + "/aggTStd.dat"
outputLatencyStd = folder + "/aggLStd.dat"
barNames = ["TPC-C","FreeHealth", "Smallbank"]
datasetNames = ["Obladi", "NoPriv", "MySQL", "ObladiW", "NoPrivW"]
data = [ # Obladi
[
"tpcc/tpc10oramserverhashmap/results.dat",
"freehealth/freehealthoramserverhashmap/results.dat",
"smallbank/smallbankoramserverhashmap/results.dat"
],
# NoPriv
[
"tpcc/tpc10noramserverhashmap/results.dat",
"freehealth/freehealthnoramserverhashmap/results.dat",
"smallbank/smallbanknoramserverhashmap/results.dat"
],
# Mysql
[
"tpcc/tpc10norammysql/results.dat",
"freehealth/freehealthsql/results.dat",
"smallbank/smallbanknoramsql/results.dat"
],
# Obladi WAN
[
"tpcc/tpc10oramgeoserverhashmap/results.dat",
"freehealth/freehealthoramgeoserverhashmap/results.dat",
"smallbank/smallbankoramgeoserverhashmap/results.dat"
],
# NoPriv WAN
[
"tpcc/tpc10noramgeoserverhashmap/results.dat",
"freehealth/freehealthnoramgeoserverhashmap/results.dat",
"smallbank/smallbanknoramgeoserverhashmap/results.dat"
]
]
aggregateDataThroughput(folder, data, outputThroughput, outputThroughputStd)
aggregateDataLatency(folder, data, outputLatency, outputLatencyStd)
dat = [(outputThroughput,0), (outputThroughput,1), (outputThroughput, 2), (outputThroughput,3), (outputThroughput,4)]
datStd = [(outputThroughputStd,0), (outputThroughputStd,1), (outputThroughputStd,2), (outputThroughputStd,3), (outputThroughputStd,4)]
plotBars("Plot", barNames, datasetNames,
"Throughput (Trx/s)", dat, False, folder + "/application-throughput-bars", datStd, logY=True, black=False)
dat = [(outputLatency,0), (outputLatency,1), (outputLatency,2), (outputLatency,3), (outputLatency,4)]
datStd = [(outputLatencyStd,0), (outputLatencyStd,1), (outputLatencyStd,2), (outputLatencyStd,3), (outputLatencyStd,4)]
plotBars("Plot", barNames, datasetNames,
"Latency (ms)", dat, False, folder + "/application-latency-bars", datStd, logY=True,black=False)
if (plotApplicationsSlides):
folder = expData + "/" + "applications"
outputThroughput = folder + "/aggTSlides.dat"
outputLatency = folder + "/aggLSlides.dat"
outputThroughputStd = folder + "/aggTSlidesStd.dat"
outputLatencyStd = folder + "/aggLSlidesStd.dat"
barNames = ["TPC-C","FreeHealth", "Smallbank"]
datasetNames = ["Obladi", "NoPriv", "MySQL"]
data = [
# Obladi WAN
[
"tpcc/tpc10oramgeoserverhashmap/results.dat",
"freehealth/freehealthoramgeoserverhashmap/results.dat",
"smallbank/smallbankoramgeoserverhashmap/results.dat"
],
# NoPriv WAN
[
"tpcc/tpc10noramgeoserverhashmap/results.dat",
"freehealth/freehealthnoramgeoserverhashmap/results.dat",
"smallbank/smallbanknoramgeoserverhashmap/results.dat"],
# Mysql
[
"tpcc/tpc10norammysql/results.dat",
"freehealth/freehealthsql/results.dat",
"smallbank/smallbanknoramsql/results.dat"
],
]
aggregateDataThroughput(folder, data, outputThroughput, outputThroughputStd)
aggregateDataLatency(folder, data, outputLatency, outputLatencyStd)
dat = [(outputThroughput,0), (outputThroughput,1), (outputThroughput, 2)]
datStd = [(outputThroughputStd,0), (outputThroughputStd,1), (outputThroughputStd,2)]
plotBars("Plot", barNames, datasetNames,
"Throughput (Trx/s)", dat, False, folder + "/application-throughput-bars-slides", datStd, logY=True, black=False)
dat = [(outputLatency,0), (outputLatency,1), (outputLatency,2)]
datStd = [(outputLatencyStd,0), (outputLatencyStd,1), (outputLatencyStd,2)]
plotBars("Plot", barNames, datasetNames,
"Latency (ms)", dat, False, folder + "/application-latency-bars-slides", datStd, logY=True,black=False)
if (plotBatchSucks):
folder = expData + "/applications/" + "smallbank"
smallbankstride = folder + "/strides.dat"
folder = expData + "/applications/" + "freehealth"
freehealthstride = folder + "/strides.dat"
folder = expData + "/applications/" + "tpcc"
tpcstride = folder + "/strides.dat"
dat = [(smallbankstride,"SmallBank",0,1), (freehealthstride,"FreeHealth",0,1), (tpcstride, "TPC-C",0,1)]
plotLine("", "Epoch Size (ms)",
"Throughput (trx/s)", folder + "/smallbank", dat, True)
if __name__ == "__main__": main()
| 46.693042 | 161 | 0.552095 | 3,363 | 34,226 | 5.61344 | 0.079096 | 0.119186 | 0.094396 | 0.138256 | 0.814016 | 0.776142 | 0.711728 | 0.630999 | 0.574849 | 0.550747 | 0 | 0.028932 | 0.305207 | 34,226 | 732 | 162 | 46.756831 | 0.764929 | 0.05043 | 0 | 0.476038 | 0 | 0 | 0.349148 | 0.281875 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.019169 | null | null | 0.025559 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
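`measureIncrease` is defined earlier in this script and not shown here; assuming it reports each throughput value as a ratio against the first batch size, its core arithmetic can be sketched as follows (the function name `relative_increase` and the sample numbers are hypothetical, not taken from the script):

```python
def relative_increase(throughputs):
    # Ratio of every measurement to the first one (assumed semantics
    # of measureIncrease; not the actual implementation).
    base = float(throughputs[0])
    return [t / base for t in throughputs]

print(relative_increase([100.0, 180.0, 320.0]))  # [1.0, 1.8, 3.2]
```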
64014b4b902f64a3923abf68ad3421b79552e05f | 5,163 | py | Python | gyoithon/forms.py | gyoisamurai/GyoiBoard | aebd29820a39d1d88b9e5874b56f923b1c88d170 | [
"MIT"
] | 3 | 2021-07-03T12:41:39.000Z | 2021-11-18T17:23:08.000Z | gyoithon/forms.py | gyoisamurai/GyoiBoard | aebd29820a39d1d88b9e5874b56f923b1c88d170 | [
"MIT"
] | null | null | null | gyoithon/forms.py | gyoisamurai/GyoiBoard | aebd29820a39d1d88b9e5874b56f923b1c88d170 | [
"MIT"
] | 3 | 2021-06-12T13:25:48.000Z | 2021-09-30T19:06:49.000Z | import os
from django import forms
from gyoithon.models import Organization, Domain, Subdomain
# Organization Registration Form.
class RegistrationOrganizationForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
super(RegistrationOrganizationForm, self).__init__(*args, **kwargs)
for field in self.fields.values():
field.widget.attrs["class"] = "form-control"
class Meta:
model = Organization
fields = ('name', 'region', 'industry', 'overview')
# Organization Update Form.
class UpdateOrganizationForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
for field in self.fields.values():
field.widget.attrs["class"] = "form-control"
self.fields['name'].widget.attrs['readonly'] = 'readonly'
self.fields['domain'].widget.attrs['readonly'] = 'readonly'
self.fields['subdomain'].widget.attrs['readonly'] = 'readonly'
self.fields['rank'].widget.attrs['readonly'] = 'readonly'
self.fields['status'].widget.attrs['readonly'] = 'readonly'
self.fields['registration_date'].widget.attrs['readonly'] = 'readonly'
class Meta:
model = Organization
fields = ('name', 'region', 'industry', 'overview', 'domain', 'subdomain', 'rank', 'status',
'registration_date')
# Domain Registration Form.
class RegistrationDomainForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
for field in self.fields.values():
field.widget.attrs["class"] = "form-control"
class Meta:
model = Domain
fields = ('name', 'registrar', 'administrative_contact', 'registrant_name', 'registrant_organization',
'registrant_email', 'admin_name', 'admin_organization', 'admin_email', 'tech_name',
'tech_organization', 'tech_email', 'name_server', 'overview')
# Domain Update Form.
class UpdateDomainForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
for field in self.fields.values():
field.widget.attrs["class"] = "form-control"
self.fields['related_organization_id'].widget.attrs['readonly'] = 'readonly'
self.fields['name'].widget.attrs['readonly'] = 'readonly'
self.fields['subdomain'].widget.attrs['readonly'] = 'readonly'
self.fields['rank'].widget.attrs['readonly'] = 'readonly'
self.fields['status'].widget.attrs['readonly'] = 'readonly'
self.fields['registration_date'].widget.attrs['readonly'] = 'readonly'
class Meta:
model = Domain
fields = ('related_organization_id', 'name', 'registrar', 'administrative_contact', 'registrant_name',
'registrant_organization', 'registrant_email', 'admin_name', 'admin_organization', 'admin_email',
'tech_name', 'tech_organization', 'tech_email', 'name_server', 'subdomain', 'rank', 'status',
'overview', 'registration_date')
# Subdomain Registration Form.
class RegistrationSubdomainForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
for field in self.fields.values():
field.widget.attrs["class"] = "form-control"
class Meta:
model = Subdomain
fields = ('name', 'ip_address', 'production', 'cloud_type', 'http_accessible', 'http_location',
'http_page_title', 'http_screenshot_url', 'http_screenshot_path', 'https_accessible',
'https_location', 'https_page_title', 'https_screenshot_url', 'https_screenshot_path',
'dns_a_record', 'dns_cname_record', 'dns_ns_record', 'dns_mx_record', 'dns_soa_record',
'dns_txt_record', 'overview')
# Subdomain Update Form.
class UpdateSubdomainForm(forms.ModelForm):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
for field in self.fields.values():
field.widget.attrs["class"] = "form-control"
self.fields['name'].widget.attrs['readonly'] = 'readonly'
self.fields['ip_address'].widget.attrs['readonly'] = 'readonly'
self.fields['http_accessible'].widget.attrs['readonly'] = 'readonly'
self.fields['http_location'].widget.attrs['readonly'] = 'readonly'
self.fields['https_accessible'].widget.attrs['readonly'] = 'readonly'
self.fields['https_location'].widget.attrs['readonly'] = 'readonly'
self.fields['rank'].widget.attrs['readonly'] = 'readonly'
self.fields['status'].widget.attrs['readonly'] = 'readonly'
self.fields['registration_date'].widget.attrs['readonly'] = 'readonly'
class Meta:
model = Subdomain
fields = ('name', 'ip_address', 'production', 'cloud_type',
'http_accessible', 'http_location', 'https_accessible', 'https_location',
'dns_a_record', 'dns_cname_record', 'dns_ns_record', 'dns_mx_record',
'dns_soa_record', 'dns_txt_record', 'rank', 'status', 'overview', 'registration_date')
| 45.690265 | 115 | 0.640325 | 530 | 5,163 | 5.986792 | 0.145283 | 0.085093 | 0.125749 | 0.178695 | 0.784431 | 0.755437 | 0.746927 | 0.683265 | 0.670659 | 0.638512 | 0 | 0 | 0.203758 | 5,163 | 112 | 116 | 46.098214 | 0.771832 | 0.030021 | 0 | 0.576471 | 0 | 0 | 0.333333 | 0.031394 | 0 | 0 | 0 | 0 | 0 | 1 | 0.070588 | false | 0 | 0.035294 | 0 | 0.247059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
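Every form above repeats the same widget-styling loop in `__init__`. A hedged sketch of hoisting that loop into a mixin, using stand-in field and widget classes so it runs without Django (all names here are hypothetical, not part of the original file):

```python
class StubWidget:
    def __init__(self):
        self.attrs = {}

class StubField:
    def __init__(self):
        self.widget = StubWidget()

class BootstrapFormMixin:
    # Apply Bootstrap's "form-control" class to every field widget,
    # mirroring the loop duplicated in each form's __init__ above.
    def apply_bootstrap_classes(self):
        for field in self.fields.values():
            field.widget.attrs["class"] = "form-control"

class DemoForm(BootstrapFormMixin):
    def __init__(self):
        self.fields = {"name": StubField(), "region": StubField()}
        self.apply_bootstrap_classes()

form = DemoForm()
print(form.fields["name"].widget.attrs["class"])  # form-control
```

With real Django forms the mixin would be listed before `forms.ModelForm` in the bases and the loop invoked from `__init__` after `super().__init__(*args, **kwargs)`.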
6407a1085f143b7473a3623d99715f674bdba2d6 | 70 | py | Python | src/socket_plus/__init__.py | coco875/socket-plus | 5362be450e94854e6b3614bca5073bc1e3bfd7d6 | [
"MIT"
] | 1 | 2022-01-15T12:00:39.000Z | 2022-01-15T12:00:39.000Z | src/socket_plus/__init__.py | coco875/socket-plus | 5362be450e94854e6b3614bca5073bc1e3bfd7d6 | [
"MIT"
] | null | null | null | src/socket_plus/__init__.py | coco875/socket-plus | 5362be450e94854e6b3614bca5073bc1e3bfd7d6 | [
"MIT"
] | null | null | null | from .main import Client_connection, Server_connection, Client_Thread
| 35 | 69 | 0.871429 | 9 | 70 | 6.444444 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.085714 | 70 | 1 | 70 | 70 | 0.90625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
6418e92fadd97ed501a662c9990b0310a322625f | 64 | py | Python | {{cookiecutter.project_name}}/{{cookiecutter.project_name}}/extensions.py | jacebrowning/template-flask | a5f6b6aeb61610ff62170efd747085bfc6893805 | [
"Unlicense"
] | 7 | 2018-03-23T10:15:28.000Z | 2021-03-23T21:03:49.000Z | {{cookiecutter.project_name}}/{{cookiecutter.project_name}}/extensions.py | jacebrowning/template-flask | a5f6b6aeb61610ff62170efd747085bfc6893805 | [
"Unlicense"
] | 5 | 2017-02-16T03:51:59.000Z | 2019-10-18T16:55:43.000Z | {{cookiecutter.project_name}}/{{cookiecutter.project_name}}/extensions.py | jacebrowning/template-flask | a5f6b6aeb61610ff62170efd747085bfc6893805 | [
"Unlicense"
] | 1 | 2021-03-23T18:41:22.000Z | 2021-03-23T18:41:22.000Z | from flask_bootstrap import Bootstrap
bootstrap = Bootstrap()
| 12.8 | 37 | 0.8125 | 7 | 64 | 7.285714 | 0.571429 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140625 | 64 | 4 | 38 | 16 | 0.927273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
6423c6b70fb127ceae17461d768098731f96b6e6 | 328 | py | Python | pages/factories.py | andywar65/starter-fullstack | 683d6282eb02a9b967d15cd254976e67549672e9 | [
"BSD-2-Clause"
] | null | null | null | pages/factories.py | andywar65/starter-fullstack | 683d6282eb02a9b967d15cd254976e67549672e9 | [
"BSD-2-Clause"
] | null | null | null | pages/factories.py | andywar65/starter-fullstack | 683d6282eb02a9b967d15cd254976e67549672e9 | [
"BSD-2-Clause"
] | null | null | null | from factory import Faker # , SubFactory
from factory.django import DjangoModelFactory
from .models import Article
class ArticleFactory(DjangoModelFactory):
class Meta:
model = Article
title = Faker("sentence", nb_words=3)
intro = Faker("sentence", nb_words=5)
body = Faker("sentence", nb_words=20)
| 23.428571 | 45 | 0.716463 | 39 | 328 | 5.948718 | 0.564103 | 0.168103 | 0.193966 | 0.258621 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015094 | 0.192073 | 328 | 13 | 46 | 25.230769 | 0.860377 | 0.036585 | 0 | 0 | 0 | 0 | 0.076433 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.888889 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
643f2e3b7d88d9c9f9e2572ad0286c010fabd919 | 6,102 | py | Python | tests/test_reboot.py | syseleven/rebootmgr | 681a98efbc2472ef6587cacf8eb813a711b750e2 | [
"MIT"
] | 11 | 2018-11-14T11:14:59.000Z | 2021-08-03T07:54:24.000Z | tests/test_reboot.py | syseleven/rebootmgr | 681a98efbc2472ef6587cacf8eb813a711b750e2 | [
"MIT"
] | 9 | 2018-11-14T10:20:57.000Z | 2020-03-17T12:38:25.000Z | tests/test_reboot.py | syseleven/rebootmgr | 681a98efbc2472ef6587cacf8eb813a711b750e2 | [
"MIT"
] | 3 | 2019-02-07T11:34:38.000Z | 2019-02-08T13:02:00.000Z | import socket
from rebootmgr.main import cli as rebootmgr
from rebootmgr.main import EXIT_CONSUL_LOCK_FAILED, \
EXIT_CONSUL_CHECKS_FAILED, EXIT_CONFIGURATION_IS_MISSING
def test_reboot_fails_without_config(run_cli, forward_consul_port):
result = run_cli(rebootmgr, ["-v"], catch_exceptions=True)
assert "Configuration data missing" in result.output
assert "Executing pre reboot tasks" not in result.output
assert result.exit_code == EXIT_CONFIGURATION_IS_MISSING
def test_reboot_fails_without_tasks(run_cli, forward_consul_port, default_config):
result = run_cli(rebootmgr, ["-v"], catch_exceptions=True)
assert "Executing pre reboot tasks" in result.output
assert result.exit_code == 1
assert isinstance(result.exception, FileNotFoundError)
def test_reboot_succeeds_with_tasks(run_cli, forward_consul_port, consul_cluster,
default_config, reboot_task,
mock_subprocess_run, mocker):
mocked_sleep = mocker.patch("time.sleep")
reboot_task("pre_boot", "00_some_task.sh")
mocked_run = mock_subprocess_run(["shutdown", "-r", "+1"])
result = run_cli(rebootmgr, ["-v"])
assert "00_some_task.sh" in result.output
assert result.exit_code == 0
mocked_run.assert_any_call(["shutdown", "-r", "+1"], check=True)
# We want rebootmgr to sleep for 2 minutes after running the pre boot tasks,
# so that we can notice when the tasks broke some consul checks.
mocked_sleep.assert_any_call(130)
# Check that it sets the reboot_in_progress flag
_, data = consul_cluster[0].kv.get("service/rebootmgr/reboot_in_progress")
assert data["Value"].decode() == socket.gethostname()
def test_dryrun_reboot_succeeds_with_tasks(run_cli, forward_consul_port,
consul_cluster, default_config,
reboot_task, mock_subprocess_run,
mocker):
mocked_sleep = mocker.patch("time.sleep")
reboot_task("pre_boot", "00_some_task.sh")
mocked_run = mock_subprocess_run(["shutdown", "-r", "+1"])
result = run_cli(rebootmgr, ["-vv", "--dryrun"])
assert "00_some_task.sh" in result.output
assert "in key service/rebootmgr/reboot_in_progress" in result.output
assert result.exit_code == 0
assert mocked_run.call_count == 1
args, kwargs = mocked_run.call_args
assert args[0] == "/etc/rebootmgr/pre_boot_tasks/00_some_task.sh"
assert 'env' in kwargs
assert 'REBOOTMGR_DRY_RUN' in kwargs['env']
assert kwargs['env']['REBOOTMGR_DRY_RUN'] == "1"
# In particular, 'shutdown' is not called
# We want rebootmgr to sleep for 2 minutes after running the pre boot tasks,
# so that we can notice when the tasks broke some consul checks.
mocked_sleep.assert_any_call(130)
# Check that it does not set the reboot_in_progress flag
_, data = consul_cluster[0].kv.get("service/rebootmgr/reboot_in_progress")
assert not data
def test_reboot_fail(
run_cli, forward_consul_port, default_config, reboot_task,
mock_subprocess_run, mocker):
mocked_sleep = mocker.patch("time.sleep")
mocked_run = mock_subprocess_run(
["shutdown", "-r", "+1"],
side_effect=Exception("Failed to run reboot command"))
result = run_cli(rebootmgr, ["-v"], catch_exceptions=True)
assert result.exit_code == 1
mocked_run.assert_any_call(["shutdown", "-r", "+1"], check=True)
# We want rebootmgr to sleep for 2 minutes after running the pre boot tasks,
# so that we can notice when the tasks broke some consul checks.
mocked_sleep.assert_any_call(130)
def test_reboot_fails_if_another_reboot_is_in_progress(
run_cli, forward_consul_port, default_config, consul_cluster):
consul_cluster[0].kv.put("service/rebootmgr/reboot_in_progress", "some_hostname")
result = run_cli(rebootmgr, ["-v"])
assert "some_hostname" in result.output
assert result.exit_code == EXIT_CONSUL_LOCK_FAILED
def test_reboot_succeeds_if_this_node_is_in_maintenance(
run_cli, forward_consul_port, default_config, consul_cluster,
reboot_task, mock_subprocess_run, mocker):
consul_cluster[0].agent.service.register("A", tags=["rebootmgr"])
consul_cluster[0].agent.maintenance(True)
mocker.patch("time.sleep")
mocked_run = mock_subprocess_run(["shutdown", "-r", "+1"])
result = run_cli(rebootmgr, ["-v"])
mocked_run.assert_any_call(["shutdown", "-r", "+1"], check=True)
assert result.exit_code == 0
def test_reboot_fails_if_another_node_is_in_maintenance(
run_cli, forward_consul_port, default_config, consul_cluster,
reboot_task, mock_subprocess_run, mocker):
consul_cluster[0].agent.service.register("A", tags=["rebootmgr"])
consul_cluster[1].agent.service.register("A", tags=["rebootmgr"])
consul_cluster[2].agent.service.register("A", tags=["rebootmgr"])
consul_cluster[1].agent.maintenance(True)
mocker.patch("time.sleep")
mocked_run = mock_subprocess_run(["shutdown", "-r", "+1"])
result = run_cli(rebootmgr, ["-v"])
mocked_run.assert_not_called()
assert 'There were failed consul checks' in result.output
assert '_node_maintenance on consul2' in result.output
assert result.exit_code == EXIT_CONSUL_CHECKS_FAILED
def test_reboot_succeeds_if_another_node_is_in_maintenance_but_ignoring(
run_cli, forward_consul_port, default_config, consul_cluster,
reboot_task, mock_subprocess_run, mocker):
consul_cluster[0].agent.service.register("A", tags=["rebootmgr"])
consul_cluster[1].agent.service.register("A", tags=["rebootmgr", "ignore_maintenance"])
consul_cluster[2].agent.service.register("A", tags=["rebootmgr"])
consul_cluster[1].agent.maintenance(True)
mocker.patch("time.sleep")
mocked_run = mock_subprocess_run(["shutdown", "-r", "+1"])
result = run_cli(rebootmgr, ["-v"])
mocked_run.assert_any_call(["shutdown", "-r", "+1"], check=True)
assert result.exit_code == 0
| 38.620253 | 91 | 0.705015 | 830 | 6,102 | 4.878313 | 0.150602 | 0.061003 | 0.050383 | 0.042233 | 0.813534 | 0.775253 | 0.733021 | 0.710052 | 0.689306 | 0.576686 | 0 | 0.011022 | 0.182235 | 6,102 | 157 | 92 | 38.866242 | 0.800401 | 0.090954 | 0 | 0.53 | 0 | 0 | 0.145747 | 0.034134 | 0 | 0 | 0 | 0 | 0.34 | 1 | 0.09 | false | 0 | 0.03 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
ff22824c2b93f6cfd764b36717e40cdcdf0b328b | 118 | py | Python | django_ulogin/forms.py | rmehtije/django-ulogin-1 | 0a5ab84e9b32427be222f611fd3e7cbf49851655 | [
"MIT"
] | 9 | 2015-04-21T16:03:27.000Z | 2021-12-25T01:04:44.000Z | django_ulogin/forms.py | rmehtije/django-ulogin-1 | 0a5ab84e9b32427be222f611fd3e7cbf49851655 | [
"MIT"
] | 20 | 2015-06-07T13:53:56.000Z | 2021-04-15T06:41:10.000Z | django_ulogin/forms.py | rmehtije/django-ulogin-1 | 0a5ab84e9b32427be222f611fd3e7cbf49851655 | [
"MIT"
] | 19 | 2015-07-03T06:24:53.000Z | 2021-04-19T11:01:07.000Z | from django import forms
class PostBackForm(forms.Form):
token = forms.CharField(max_length=255, required=True)
| 19.666667 | 58 | 0.771186 | 16 | 118 | 5.625 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029412 | 0.135593 | 118 | 5 | 59 | 23.6 | 0.852941 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
ff321758b5b0b6e6c1c25619f628e5a55112dc95 | 978 | py | Python | test/svm_test/svm_linear_test2.py | Edelweiss35/deep-machine-learning | b1e4b133609f303be77de824601925f448a94764 | [
"WTFPL"
] | 708 | 2015-01-07T20:17:58.000Z | 2022-03-07T02:30:42.000Z | test/svm_test/svm_linear_test2.py | Edelweiss35/deep-machine-learning | b1e4b133609f303be77de824601925f448a94764 | [
"WTFPL"
] | 2 | 2016-12-15T03:15:57.000Z | 2021-06-16T01:25:13.000Z | test/svm_test/svm_linear_test2.py | Edelweiss35/deep-machine-learning | b1e4b133609f303be77de824601925f448a94764 | [
"WTFPL"
] | 333 | 2015-01-09T06:51:46.000Z | 2022-01-16T08:49:58.000Z | import numpy as np
import scipy as sp
from dml.SVM import SVMC
X=[
[7.15,14.8],
[8.85,13],
[11.45,12.9],
[19.6,14.1],
[19.25,16.2],
[11.65,15.75],
[8.9,15.85],
[10.85,14.6],
[14.3,15.55],
[16.25,14.95],
[13.6,14.05],
[15.5,14.05],
[16.85,15],
[7.25,10.25],
[8,9.2],
[13.7,8.1],
[17.65,7.8],
[17.8,9.3],
[14.75,9.85],
[10.35,10],
[9.1,8.4],
[10.8,7.95],
[11.15,8.35],
[13.45,9.35],
[16.25,8.6],
[19,9.05],
[16.8,9.7],
[15.45,9.25],
[11.65,8.45],
[8.45,10.25],
[8.45,10.3],
[10.1,9.65],
[12.9,9.1],
[13.75,9.5],
[16.25,9.05],
[12.35,15.05]]
y=[
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[0],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1],
[1]]
X = np.array(X).transpose()
print X.shape
y = np.array(y).flatten(1)
y[y == 0] = -1
print y.shape
svms = SVMC(X, y)
svms.train()
print len(svms.supportVec)
for i in range(len(svms.supportVec)):
    t = svms.supportVec[i]
    print svms.X[:, t]
svms.prints_test_linear()
| 10.294737 | 37 | 0.503067 | 240 | 978 | 2.041667 | 0.2375 | 0.089796 | 0.128571 | 0.163265 | 0.073469 | 0.073469 | 0.073469 | 0.073469 | 0.073469 | 0.073469 | 0 | 0.30485 | 0.114519 | 978 | 94 | 38 | 10.404255 | 0.26097 | 0 | 0 | 0.393258 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.033708 | null | null | 0.05618 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
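The script remaps labels with `y[y==0]=-1`, the usual {0, 1} to {-1, +1} SVM label convention; a pure-Python equivalent of that single step, for reference in environments without NumPy:

```python
labels = [0, 0, 0, 1, 1]
remapped = [-1 if label == 0 else label for label in labels]
print(remapped)  # [-1, -1, -1, 1, 1]
```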
ff6ffd628b16b0619ce49789808321f29a6fc682 | 163 | py | Python | test/tutorial1.py | ryancor/zorrom | 00519be5908b404c542bb208000d9da969daa4d4 | [
"BSD-2-Clause"
] | 21 | 2020-10-04T07:29:22.000Z | 2022-02-08T01:39:45.000Z | test/tutorial1.py | ryancor/zorrom | 00519be5908b404c542bb208000d9da969daa4d4 | [
"BSD-2-Clause"
] | 12 | 2020-10-04T07:44:54.000Z | 2021-05-26T22:54:49.000Z | test/tutorial1.py | ryancor/zorrom | 00519be5908b404c542bb208000d9da969daa4d4 | [
"BSD-2-Clause"
] | 6 | 2020-09-30T06:34:44.000Z | 2022-01-20T06:01:05.000Z | # roughly
# [print(bin(ord(x)).replace("0b", "")) for x in "Hello, world!\0\0\0"]
# python3 txtmunge.py --rotate 90 --flipy --invert hello1.txt test/tutorial1.txt
| 40.75 | 80 | 0.662577 | 27 | 163 | 4 | 0.851852 | 0.037037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.116564 | 163 | 3 | 81 | 54.333333 | 0.6875 | 0.957055 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
ff94a1b84a5c5f045dfacc44c4b81df676b0b543 | 495 | py | Python | tests/end_to_end/scenarios/list_adhoc_irc.py | norayr/biboumi | 805671032d25ee6ce09ed75e8a385c04e9563cdd | [
"Zlib"
] | 68 | 2015-01-29T21:07:37.000Z | 2022-03-20T14:48:07.000Z | tests/end_to_end/scenarios/list_adhoc_irc.py | norayr/biboumi | 805671032d25ee6ce09ed75e8a385c04e9563cdd | [
"Zlib"
] | 5 | 2016-10-24T18:34:30.000Z | 2021-08-31T13:30:37.000Z | tests/end_to_end/scenarios/list_adhoc_irc.py | norayr/biboumi | 805671032d25ee6ce09ed75e8a385c04e9563cdd | [
"Zlib"
] | 13 | 2015-12-11T15:19:05.000Z | 2021-08-31T13:24:35.000Z | from scenarios import *
scenario = (
send_stanza("<iq type='get' id='idwhatever' from='{jid_one}/{resource_one}' to='{irc_host_one}@{biboumi_host}'><query xmlns='http://jabber.org/protocol/disco#items' node='http://jabber.org/protocol/commands' /></iq>"),
expect_stanza("/iq[@type='result']/disco_items:query[@node='http://jabber.org/protocol/commands']",
"/iq/disco_items:query/disco_items:item[2]",
"!/iq/disco_items:query/disco_items:item[3]"),
)
| 55 | 222 | 0.658586 | 67 | 495 | 4.686567 | 0.492537 | 0.191083 | 0.124204 | 0.200637 | 0.414013 | 0.414013 | 0.414013 | 0 | 0 | 0 | 0 | 0.00464 | 0.129293 | 495 | 8 | 223 | 61.875 | 0.723898 | 0 | 0 | 0 | 0 | 0.285714 | 0.741414 | 0.313131 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
440d88717a66bb31db5431c4fad2ae29bb9d8dc4 | 110 | py | Python | tests/unit_tests/reliability/pfn1.py | SURGroup/UncertaintyQuantification | a94c8db47d07134ea2b3b0a3ca53ca818532c3e6 | [
"MIT"
] | null | null | null | tests/unit_tests/reliability/pfn1.py | SURGroup/UncertaintyQuantification | a94c8db47d07134ea2b3b0a3ca53ca818532c3e6 | [
"MIT"
] | null | null | null | tests/unit_tests/reliability/pfn1.py | SURGroup/UncertaintyQuantification | a94c8db47d07134ea2b3b0a3ca53ca818532c3e6 | [
"MIT"
] | null | null | null | def model_i(samples):
resistance = samples[0, 0]
stress = samples[0, 1]
return resistance - stress | 27.5 | 30 | 0.663636 | 15 | 110 | 4.8 | 0.6 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047059 | 0.227273 | 110 | 4 | 31 | 27.5 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
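`model_i` indexes `samples[0, 0]`, so it expects a 2-D NumPy array with one sample per row (resistance, then stress). A self-contained check using a tiny stand-in that supports tuple indexing (the `Array2D` class and the sample values are hypothetical):

```python
def model_i(samples):
    resistance = samples[0, 0]
    stress = samples[0, 1]
    return resistance - stress

class Array2D:
    # Minimal 2-D container supporting a[row, col] indexing,
    # standing in for a NumPy array.
    def __init__(self, rows):
        self.rows = rows

    def __getitem__(self, index):
        row, col = index
        return self.rows[row][col]

samples = Array2D([[5.0, 3.0]])
print(model_i(samples))  # 2.0 (resistance 5.0 minus stress 3.0)
```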
443a902df7768bc824142540cce406c336ec8940 | 2,550 | py | Python | dynamicserialize/dstypes/com/raytheon/uf/common/dataplugin/gfe/config/ProjectionData.py | mjames-upc/python-awips | e2b05f5587b02761df3b6dd5c6ee1f196bd5f11c | [
"BSD-3-Clause"
] | null | null | null | dynamicserialize/dstypes/com/raytheon/uf/common/dataplugin/gfe/config/ProjectionData.py | mjames-upc/python-awips | e2b05f5587b02761df3b6dd5c6ee1f196bd5f11c | [
"BSD-3-Clause"
] | null | null | null | dynamicserialize/dstypes/com/raytheon/uf/common/dataplugin/gfe/config/ProjectionData.py | mjames-upc/python-awips | e2b05f5587b02761df3b6dd5c6ee1f196bd5f11c | [
"BSD-3-Clause"
] | null | null | null | ##
##
# File auto-generated against equivalent DynamicSerialize Java class
class ProjectionData(object):
    def __init__(self):
        self.projectionID = None
        self.projectionType = None
        self.latLonLL = None
        self.latLonUR = None
        self.latLonOrigin = None
        self.stdParallelOne = None
        self.stdParallelTwo = None
        self.gridPointLL = None
        self.gridPointUR = None
        self.latIntersect = None
        self.lonCenter = None
        self.lonOrigin = None

    def getProjectionID(self):
        return self.projectionID

    def setProjectionID(self, projectionID):
        self.projectionID = projectionID

    def getProjectionType(self):
        return self.projectionType

    def setProjectionType(self, projectionType):
        self.projectionType = projectionType

    def getLatLonLL(self):
        return self.latLonLL

    def setLatLonLL(self, latLonLL):
        self.latLonLL = latLonLL

    def getLatLonUR(self):
        return self.latLonUR

    def setLatLonUR(self, latLonUR):
        self.latLonUR = latLonUR

    def getLatLonOrigin(self):
        return self.latLonOrigin

    def setLatLonOrigin(self, latLonOrigin):
        self.latLonOrigin = latLonOrigin

    def getStdParallelOne(self):
        return self.stdParallelOne

    def setStdParallelOne(self, stdParallelOne):
        self.stdParallelOne = stdParallelOne

    def getStdParallelTwo(self):
        return self.stdParallelTwo

    def setStdParallelTwo(self, stdParallelTwo):
        self.stdParallelTwo = stdParallelTwo

    def getGridPointLL(self):
        return self.gridPointLL

    def setGridPointLL(self, gridPointLL):
        self.gridPointLL = gridPointLL

    def getGridPointUR(self):
        return self.gridPointUR

    def setGridPointUR(self, gridPointUR):
        self.gridPointUR = gridPointUR

    def getLatIntersect(self):
        return self.latIntersect

    def setLatIntersect(self, latIntersect):
        self.latIntersect = latIntersect

    def getLonCenter(self):
        return self.lonCenter

    def setLonCenter(self, lonCenter):
        self.lonCenter = lonCenter

    def getLonOrigin(self):
        return self.lonOrigin

    def setLonOrigin(self, lonOrigin):
        self.lonOrigin = lonOrigin

    def keys(self):
        return ['projectionID', 'projectionType', 'latLonLL', 'latLonUR',
                'latLonOrigin', 'stdParallelOne', 'stdParallelTwo',
                'gridPointLL', 'gridPointUR', 'latIntersect', 'lonCenter',
                'lonOrigin']
| 25.5 | 74 | 0.663529 | 222 | 2,550 | 7.603604 | 0.234234 | 0.077014 | 0.099526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.259608 | 2,550 | 99 | 75 | 25.757576 | 0.894068 | 0.025882 | 0 | 0 | 1 | 0 | 0.054098 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.38806 | false | 0 | 0 | 0.19403 | 0.597015 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
444fa63dcb01d513f9a16950e8d45642e3cfdf76 | 270 | py | Python | api/views.py | phucdzvcll/MuaHo-Api-AdminWeb-Django | 663ad4e01d225d9db88e46f30a724d1b28b1b891 | [
"Apache-2.0"
] | null | null | null | api/views.py | phucdzvcll/MuaHo-Api-AdminWeb-Django | 663ad4e01d225d9db88e46f30a724d1b28b1b891 | [
"Apache-2.0"
] | null | null | null | api/views.py | phucdzvcll/MuaHo-Api-AdminWeb-Django | 663ad4e01d225d9db88e46f30a724d1b28b1b891 | [
"Apache-2.0"
] | 1 | 2021-11-13T02:59:30.000Z | 2021-11-13T02:59:30.000Z | from api.controllers.home_controller import get_categories
from django.http import HttpRequest, HttpResponse
from api.util import responseJson
from api.controllers import *


def categories(request: HttpRequest) -> HttpResponse:
    return responseJson(get_categories())
| 33.75 | 58 | 0.82963 | 32 | 270 | 6.90625 | 0.53125 | 0.095023 | 0.162896 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107407 | 270 | 7 | 59 | 38.571429 | 0.917012 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.666667 | 0.166667 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 4 |
44879e0f0c2e7585e375c4ccbeca409d3e1bf28e | 372 | py | Python | IFR/mmcv/runner/optimizer/__init__.py | jfzhuang/IFR | d6ffdd0c0810d7bb244f102ba8cc19c12f61e102 | [
"MIT"
] | 3 | 2022-03-09T13:15:15.000Z | 2022-03-21T06:59:10.000Z | IFR/mmcv/runner/optimizer/__init__.py | jfzhuang/IFR | d6ffdd0c0810d7bb244f102ba8cc19c12f61e102 | [
"MIT"
] | null | null | null | IFR/mmcv/runner/optimizer/__init__.py | jfzhuang/IFR | d6ffdd0c0810d7bb244f102ba8cc19c12f61e102 | [
"MIT"
] | null | null | null | from .builder import (OPTIMIZER_BUILDERS, OPTIMIZERS, build_optimizer, build_optimizer_constructor)
from .builder import build_optimizer_fast
from .default_constructor import DefaultOptimizerConstructor, DefaultOptimizerConstructor_Fast
__all__ = [
'OPTIMIZER_BUILDERS', 'OPTIMIZERS', 'DefaultOptimizerConstructor', 'build_optimizer', 'build_optimizer_constructor'
]
| 46.5 | 119 | 0.849462 | 34 | 372 | 8.823529 | 0.352941 | 0.233333 | 0.113333 | 0.186667 | 0.26 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.080645 | 372 | 7 | 120 | 53.142857 | 0.877193 | 0 | 0 | 0 | 0 | 0 | 0.260753 | 0.145161 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
922b4d85f838b5d0f54cc91a8eb5856c036aba66 | 126 | py | Python | poetry/console/logging/formatters/formatter.py | HarryPeach/poetry | 70ac497a81f3ac59ee890c6a7bee0ffc3cae6c6e | [
"MIT"
] | null | null | null | poetry/console/logging/formatters/formatter.py | HarryPeach/poetry | 70ac497a81f3ac59ee890c6a7bee0ffc3cae6c6e | [
"MIT"
] | null | null | null | poetry/console/logging/formatters/formatter.py | HarryPeach/poetry | 70ac497a81f3ac59ee890c6a7bee0ffc3cae6c6e | [
"MIT"
] | null | null | null | import logging


class Formatter:
    def format(self, record: logging.LogRecord) -> str:
        raise NotImplementedError()
| 18 | 55 | 0.706349 | 13 | 126 | 6.846154 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206349 | 126 | 6 | 56 | 21 | 0.89 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 4 |
927a3e0d7a83c27c9a9b00fceacf816b94767752 | 147 | py | Python | tests/conftest.py | gvwilson/nitinat | 48d95e27301af955c09d3757507f8fd66ce204d7 | [
"MIT"
] | null | null | null | tests/conftest.py | gvwilson/nitinat | 48d95e27301af955c09d3757507f8fd66ce204d7 | [
"MIT"
] | 14 | 2022-03-22T22:48:14.000Z | 2022-03-31T11:18:53.000Z | tests/conftest.py | gvwilson/nitinat | 48d95e27301af955c09d3757507f8fd66ce204d7 | [
"MIT"
] | null | null | null | """Shared fixtures."""
from pytest import fixture
from nitinat.parameters import Parameters
@fixture
def clear_cache():
Parameters.clear()
| 13.363636 | 41 | 0.748299 | 17 | 147 | 6.411765 | 0.647059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.14966 | 147 | 10 | 42 | 14.7 | 0.872 | 0.108844 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
9293db39496e63f2b5998e6a6dfae4e7e9da49bd | 97 | py | Python | HW/3/extension/__init__.py | houzeyu2683/IRRHW | c44298ad14c468eff36bc75ebc63abdc9ba24d55 | [
"Apache-2.0"
] | null | null | null | HW/3/extension/__init__.py | houzeyu2683/IRRHW | c44298ad14c468eff36bc75ebc63abdc9ba24d55 | [
"Apache-2.0"
] | null | null | null | HW/3/extension/__init__.py | houzeyu2683/IRRHW | c44298ad14c468eff36bc75ebc63abdc9ba24d55 | [
"Apache-2.0"
] | 1 | 2022-01-16T03:40:34.000Z | 2022-01-16T03:40:34.000Z |
from .tabulation import tabulation
from .machine import machine
from .tokenize import tokenize
| 16.166667 | 34 | 0.824742 | 12 | 97 | 6.666667 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.14433 | 97 | 5 | 35 | 19.4 | 0.963855 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
2ba8b72f40dda19fcf853d6843c6040ca3d1cb86 | 164 | py | Python | src/ansito/__init__.py | pawamoy/ansito | 87e3cdf17afab278dd0cd750bd7faa73803a9e94 | [
"0BSD"
] | 5 | 2019-08-13T18:32:46.000Z | 2021-12-31T20:36:03.000Z | src/ansito/__init__.py | pawamoy/ansito | 87e3cdf17afab278dd0cd750bd7faa73803a9e94 | [
"0BSD"
] | 5 | 2018-11-26T13:58:44.000Z | 2021-06-27T07:56:52.000Z | src/ansito/__init__.py | pawamoy/ansito | 87e3cdf17afab278dd0cd750bd7faa73803a9e94 | [
"0BSD"
] | 1 | 2019-04-11T01:07:03.000Z | 2019-04-11T01:07:03.000Z | """
ansito package.
Translate ANSI codes to any other format.
"""
from typing import List
__all__: List[str] = [] # noqa: WPS410 (the only __variable__ we use)
| 16.4 | 70 | 0.70122 | 23 | 164 | 4.652174 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022556 | 0.189024 | 164 | 9 | 71 | 18.222222 | 0.781955 | 0.628049 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
2bb98eac4728fab7e1217386a0eb866090bd1e98 | 86 | py | Python | rl/__init__.py | AlexandrePoisson/alphazero_singleplayer | db742bcbd61e1d62a6958136ca7bb2ae11053971 | [
"MIT"
] | 30 | 2017-05-03T16:07:15.000Z | 2022-01-27T14:08:22.000Z | rl/__init__.py | AlexandrePoisson/alphazero_singleplayer | db742bcbd61e1d62a6958136ca7bb2ae11053971 | [
"MIT"
] | 1 | 2020-05-02T17:29:04.000Z | 2020-05-02T17:29:04.000Z | src/rl/__init__.py | tmoer/cursus | 62c4aa793205294d5b3a99d192e9b6311f4d34a6 | [
"MIT"
] | 18 | 2019-01-22T15:30:58.000Z | 2022-03-21T13:13:35.000Z | # -*- coding: utf-8 -*-
"""
Created on Thu Mar 2 14:48:24 2017
@author: thomas
"""
| 10.75 | 35 | 0.55814 | 14 | 86 | 3.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179104 | 0.22093 | 86 | 7 | 36 | 12.285714 | 0.537313 | 0.872093 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
2bd155e9a17773b78320b5112f4d1ec7a30fc457 | 113 | py | Python | src/game/components/script.py | ShoaibSyed1/project-pokemon | 6916962cf0be478c2a229b6620e9425d707c2b29 | [
"MIT"
] | null | null | null | src/game/components/script.py | ShoaibSyed1/project-pokemon | 6916962cf0be478c2a229b6620e9425d707c2b29 | [
"MIT"
] | null | null | null | src/game/components/script.py | ShoaibSyed1/project-pokemon | 6916962cf0be478c2a229b6620e9425d707c2b29 | [
"MIT"
] | null | null | null | class ScriptComponent:
def __init__(self, script):
self.script = script
self.started = False
| 22.6 | 31 | 0.646018 | 12 | 113 | 5.75 | 0.666667 | 0.289855 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.274336 | 113 | 4 | 32 | 28.25 | 0.841463 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
2bfe05576e628d317e71707f047901881f8a0cb1 | 234 | py | Python | mmcv/runner/builder.py | HXWAndCL/mmcv | dfb48c87ae3156c4ee5f921edf89efc0fdc0c5bc | [
"Apache-2.0"
] | 1 | 2021-08-23T06:54:42.000Z | 2021-08-23T06:54:42.000Z | mmcv/runner/builder.py | humu789/mmcv | ea3e9789bfbb76e8b3d26b062867164f2dd86bd6 | [
"Apache-2.0"
] | null | null | null | mmcv/runner/builder.py | humu789/mmcv | ea3e9789bfbb76e8b3d26b062867164f2dd86bd6 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) OpenMMLab. All rights reserved.
from ..utils import Registry, build_from_cfg
RUNNERS = Registry('runner')
def build_runner(cfg, default_args=None):
return build_from_cfg(cfg, RUNNERS, default_args=default_args)
| 26 | 66 | 0.777778 | 33 | 234 | 5.272727 | 0.575758 | 0.189655 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.123932 | 234 | 8 | 67 | 29.25 | 0.84878 | 0.192308 | 0 | 0 | 0 | 0 | 0.032086 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
a61c12db432d20c859ca2572ea592f857dda15e0 | 54 | py | Python | library/tests/unit_tests/user_script_1.py | devinbalkind/greppo | 817644fa5328259d6a0141c781db22dbb9f0650f | [
"Apache-2.0"
] | 221 | 2021-11-22T19:51:04.000Z | 2022-03-27T16:38:26.000Z | library/tests/unit_tests/user_script_1.py | devinbalkind/greppo | 817644fa5328259d6a0141c781db22dbb9f0650f | [
"Apache-2.0"
] | 9 | 2021-11-29T13:11:44.000Z | 2022-03-29T15:56:48.000Z | library/tests/unit_tests/user_script_1.py | devinbalkind/greppo | 817644fa5328259d6a0141c781db22dbb9f0650f | [
"Apache-2.0"
] | 18 | 2021-11-27T02:43:48.000Z | 2022-03-29T09:31:44.000Z | # This is a user script that does not compile
print(
| 13.5 | 45 | 0.740741 | 10 | 54 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 54 | 3 | 46 | 18 | 0.952381 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
a62029fa34a272b80342e4fe4f0462bdaaa11a4c | 113 | py | Python | day02/02urlretrieve.py | lixiang30/SpiderProject | 664cb1d89353ad40ff7880ca2785fe8c5f654e75 | [
"Apache-2.0"
] | null | null | null | day02/02urlretrieve.py | lixiang30/SpiderProject | 664cb1d89353ad40ff7880ca2785fe8c5f654e75 | [
"Apache-2.0"
] | null | null | null | day02/02urlretrieve.py | lixiang30/SpiderProject | 664cb1d89353ad40ff7880ca2785fe8c5f654e75 | [
"Apache-2.0"
] | 1 | 2020-12-24T10:16:46.000Z | 2020-12-24T10:16:46.000Z |
from urllib import request
# Save the content of the URL below to index.html
request.urlretrieve("http://39.106.201.37/",'index.html') | 28.25 | 57 | 0.778761 | 15 | 113 | 5.866667 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09434 | 0.061947 | 113 | 4 | 57 | 28.25 | 0.735849 | 0.221239 | 0 | 0 | 0 | 0 | 0.360465 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
a63e0209101da6f2d6f3a2c1e96d70873b5f7a94 | 160 | py | Python | comment/urls.py | herschel-ma/django_blogsite_and_more | 41310aa96f4288e7ac7880ea6cdcd0964e1322fb | [
"MIT"
] | null | null | null | comment/urls.py | herschel-ma/django_blogsite_and_more | 41310aa96f4288e7ac7880ea6cdcd0964e1322fb | [
"MIT"
] | 4 | 2020-03-12T08:45:37.000Z | 2022-03-12T01:02:37.000Z | comment/urls.py | herschel-ma/django_blogsite_and_more | 41310aa96f4288e7ac7880ea6cdcd0964e1322fb | [
"MIT"
] | null | null | null | from django.urls import path
from . import views
app_name = "comment"
urlpatterns = [
path('update_comment/', views.UpdateComment, name='update_comment'),
]
| 20 | 72 | 0.74375 | 20 | 160 | 5.8 | 0.6 | 0.224138 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13125 | 160 | 7 | 73 | 22.857143 | 0.834532 | 0 | 0 | 0 | 0 | 0 | 0.225 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
a676b4ad1ff67a1b7d83a3cee9c1a13ab11fe13e | 212 | py | Python | Concatenate.py | rashidulhasanhridoy/LANGUAGE-PROFICIENCY-Python-HackerRank | 46beecbf3a2468d6c598fe62a3e65c5f0c1395c8 | [
"Apache-2.0"
] | 1 | 2020-07-21T18:01:52.000Z | 2020-07-21T18:01:52.000Z | Concatenate.py | rashidulhasanhridoy/LANGUAGE-PROFICIENCY-Python-HackerRank | 46beecbf3a2468d6c598fe62a3e65c5f0c1395c8 | [
"Apache-2.0"
] | null | null | null | Concatenate.py | rashidulhasanhridoy/LANGUAGE-PROFICIENCY-Python-HackerRank | 46beecbf3a2468d6c598fe62a3e65c5f0c1395c8 | [
"Apache-2.0"
] | null | null | null | import numpy as np
x, y, z = map(int, input().strip().split())
arr = np.array(input().strip().split(), int)
for j in range(1, x + y):
arr = np.vstack((arr, np.array(input().strip().split(), int)))
print(arr)
| 30.285714 | 66 | 0.603774 | 38 | 212 | 3.368421 | 0.552632 | 0.234375 | 0.351563 | 0.234375 | 0.4375 | 0.4375 | 0.4375 | 0 | 0 | 0 | 0 | 0.005525 | 0.146226 | 212 | 6 | 67 | 35.333333 | 0.701657 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.166667 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
a677a88f7f0fa9b74500a1ef9c0c1fd89d0ed610 | 8,796 | py | Python | tetris/pieces.py | chrisvaughn/tetrisAI | 8eedae9e7030d1adf5d46fad894e78ee41ad3a2c | [
"MIT"
] | null | null | null | tetris/pieces.py | chrisvaughn/tetrisAI | 8eedae9e7030d1adf5d46fad894e78ee41ad3a2c | [
"MIT"
] | null | null | null | tetris/pieces.py | chrisvaughn/tetrisAI | 8eedae9e7030d1adf5d46fad894e78ee41ad3a2c | [
"MIT"
] | null | null | null | import copy
from typing import List, Tuple
import numpy as np
from .board import Board
class Piece:
def __init__(self, name: str, shapes: List[np.ndarray], default_shape_idx: int):
self.name = name
self.shapes = shapes
self.default_shape_idx = default_shape_idx
self._x = 0 # the center of the shape matrix
self._y = 0 # the center of the shape matrix
self.current_shape_idx = default_shape_idx
self._detection_shapes = None
def __str__(self) -> str:
return f"Piece<name: {self.name}, x: {self._x}, y: {self._y}>"
@property
def shape(self) -> np.ndarray:
return self.shapes[self.current_shape_idx]
@property
def corner_xy(self) -> Tuple[int, int]:
return self._x - 2, self._y - 2
@property
def zero_based_corner_xy(self) -> Tuple[int, int]:
return self._x - 2 - 1, self._y - 2 - 1
# 0-based, left most x, top most y, number of rotations of detection shape
def set_detected_position(self, x: int, y: int, shape_idx: int):
self.current_shape_idx = shape_idx
where = np.where(self.shape == 1)
self._x = x + 2 - np.amin(where[1]) + 1
self._y = y + 2 - np.amin(where[0]) + 1
# 1-based, center of piece
def set_position(self, x: int, y: int):
if x < 1 or x > Board.columns:
raise ValueError(f"x must be between 1 and {Board.columns}")
if y < 1 or y > Board.rows:
raise ValueError(f"y must be between 1 and {Board.rows}")
self._x = x
self._y = y
def possible_translations(self) -> Tuple[int, int]:
where = np.where(self.shape == 1)
min_x = np.amin(where[1])
max_x = np.amax(where[1])
t_left = self._x - (self.shape.shape[1] // 2 - min_x) - 1
t_right = 10 - self._x + (self.shape.shape[1] // 2 - max_x)
return t_left, t_right
def move_down(self, moves: int = 1):
self._y += moves
def move_left(self, moves: int = 1):
self._x -= moves
def move_right(self, moves: int = 1):
self._x += moves
def rot_ccw(self, rot: int = 1):
self.current_shape_idx = (self.current_shape_idx - rot) % len(self.shapes)
def rot_cw(self, rot: int = 1):
self.current_shape_idx = (self.current_shape_idx + rot) % len(self.shapes)
def clone(self):
return copy.deepcopy(self)
@property
def detection_shapes(self) -> List[np.ndarray]:
if self._detection_shapes:
return self._detection_shapes
ds = []
for s in self.shapes:
where = np.where(s == 1)
pruned = s[
np.amin(where[0]) : np.amax(where[0]) + 1,
np.amin(where[1]) : np.amax(where[1]) + 1,
]
ds.append(pruned)
self._detection_shapes = ds
return self._detection_shapes
Tetrominoes = [
Piece(
"i",
[
np.array(
[
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[1, 1, 1, 1, 0],
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
],
1,
),
Piece(
"l",
[
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 1, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 1, 1, 1, 0],
[0, 1, 0, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 1, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 1, 0],
[0, 1, 1, 1, 0],
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
],
1,
),
Piece(
"j",
[
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 1, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 1, 0, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 1, 1, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 0, 1, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
],
3,
),
Piece(
"o",
[
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 1, 1, 0, 0],
[0, 1, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
)
],
0,
),
Piece(
"s",
[
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 0, 1, 1, 0],
[0, 1, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 1, 0],
[0, 0, 0, 1, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
],
0,
),
Piece(
"t",
[
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 1, 1, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 1, 1, 1, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 1, 0, 0],
[0, 1, 1, 0, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
],
2,
),
Piece(
"z",
[
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 0, 0],
[0, 1, 1, 0, 0],
[0, 0, 1, 1, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
np.array(
[
[0, 0, 0, 0, 0],
[0, 0, 0, 1, 0],
[0, 0, 1, 1, 0],
[0, 0, 1, 0, 0],
[0, 0, 0, 0, 0],
],
dtype=int,
),
],
0,
),
]
| 26.899083 | 84 | 0.286039 | 1,007 | 8,796 | 2.418073 | 0.09434 | 0.273511 | 0.328953 | 0.336756 | 0.549076 | 0.542094 | 0.467351 | 0.450103 | 0.404107 | 0.404107 | 0 | 0.139036 | 0.570714 | 8,796 | 326 | 85 | 26.981595 | 0.505826 | 0.018076 | 0 | 0.669967 | 0 | 0.0033 | 0.015524 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.049505 | false | 0 | 0.013201 | 0.016502 | 0.092409 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
a6af5969e0098a0ed9fd48f78f36dcd6a7620d01 | 23 | py | Python | cutadapt/__init__.py | lparsons/cutadapt | 5125337dc76539e5f451594529c72d99ef9a615b | [
"MIT",
"Unlicense"
] | 1 | 2018-12-12T10:33:18.000Z | 2018-12-12T10:33:18.000Z | cutadapt/__init__.py | lparsons/cutadapt | 5125337dc76539e5f451594529c72d99ef9a615b | [
"MIT",
"Unlicense"
] | null | null | null | cutadapt/__init__.py | lparsons/cutadapt | 5125337dc76539e5f451594529c72d99ef9a615b | [
"MIT",
"Unlicense"
] | null | null | null | __version__ = '1.2rc2'
| 11.5 | 22 | 0.695652 | 3 | 23 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 0.130435 | 23 | 1 | 23 | 23 | 0.45 | 0 | 0 | 0 | 0 | 0 | 0.26087 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
a6e6f08a023fc4d11ab4d85cc218d86062d10797 | 176 | py | Python | src/pdf2png/wsgi.py | msamsu/python-pdf2png-render | 7e491577a80dc5345efb47addf57b205576ea3c8 | [
"MIT"
] | null | null | null | src/pdf2png/wsgi.py | msamsu/python-pdf2png-render | 7e491577a80dc5345efb47addf57b205576ea3c8 | [
"MIT"
] | null | null | null | src/pdf2png/wsgi.py | msamsu/python-pdf2png-render | 7e491577a80dc5345efb47addf57b205576ea3c8 | [
"MIT"
] | null | null | null | import os
from django.core.wsgi import get_wsgi_application
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'pdf2png.settings.default')
application = get_wsgi_application()
| 22 | 75 | 0.829545 | 23 | 176 | 6.086957 | 0.608696 | 0.1 | 0.257143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006173 | 0.079545 | 176 | 7 | 76 | 25.142857 | 0.858025 | 0 | 0 | 0 | 0 | 0 | 0.261364 | 0.261364 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
a6eec9639a10b2a158117b9492c4cf066bde1746 | 109 | py | Python | core/helpers/__init__.py | Magoli1/carla-pre-crash-scenario-generator | 46666309a55e9aab59cebeda33aca7723c1d360c | [
"MIT"
] | 1 | 2022-03-30T08:28:53.000Z | 2022-03-30T08:28:53.000Z | core/helpers/__init__.py | Magoli1/carla-pre-crash-scenario-generator | 46666309a55e9aab59cebeda33aca7723c1d360c | [
"MIT"
] | null | null | null | core/helpers/__init__.py | Magoli1/carla-pre-crash-scenario-generator | 46666309a55e9aab59cebeda33aca7723c1d360c | [
"MIT"
] | null | null | null | """
Helper Functions
================
.. automodule:: core.helpers.utils
:members:
:undoc-members:
"""
| 13.625 | 34 | 0.550459 | 9 | 109 | 6.666667 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146789 | 109 | 7 | 35 | 15.571429 | 0.645161 | 0.917431 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
5b25196e81851597bdce4a8d7739a354e94601f2 | 24 | py | Python | ykdl/version.py | SeaHOH/ykdl | 88bc30d2c5503c138a538a8243d6879c04fa0d33 | [
"MIT"
] | 136 | 2021-05-15T04:09:17.000Z | 2022-03-29T08:27:31.000Z | ykdl/version.py | SeaHOH/ykdl | 88bc30d2c5503c138a538a8243d6879c04fa0d33 | [
"MIT"
] | 41 | 2021-05-14T14:46:32.000Z | 2022-03-28T07:46:32.000Z | ykdl/version.py | SeaHOH/ykdl | 88bc30d2c5503c138a538a8243d6879c04fa0d33 | [
"MIT"
] | 32 | 2021-05-15T00:54:24.000Z | 2022-03-09T14:58:07.000Z | __version__ = '1.8.0b1'
| 12 | 23 | 0.666667 | 4 | 24 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.190476 | 0.125 | 24 | 1 | 24 | 24 | 0.380952 | 0 | 0 | 0 | 0 | 0 | 0.291667 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
5b40b4cbfe8a255066ce51f46047b6437f50cf7c | 297 | py | Python | eLearn/views/about_us.py | Shehab-Magdy/Ayrid_E-Learn-master | 7dc6aac71b2026e8829d1e8b0540f6b1ae09e197 | [
"MIT"
] | null | null | null | eLearn/views/about_us.py | Shehab-Magdy/Ayrid_E-Learn-master | 7dc6aac71b2026e8829d1e8b0540f6b1ae09e197 | [
"MIT"
] | null | null | null | eLearn/views/about_us.py | Shehab-Magdy/Ayrid_E-Learn-master | 7dc6aac71b2026e8829d1e8b0540f6b1ae09e197 | [
"MIT"
] | null | null | null | from eLearn.models import Registration
from django.shortcuts import render, redirect, get_object_or_404
from django.http import HttpResponse
from django.core.exceptions import ObjectDoesNotExist
from django.urls import reverse
def about(request):
return render(request,'about_us/about.html')
| 33 | 64 | 0.835017 | 41 | 297 | 5.95122 | 0.634146 | 0.163934 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011278 | 0.104377 | 297 | 8 | 65 | 37.125 | 0.906015 | 0 | 0 | 0 | 0 | 0 | 0.063973 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.714286 | 0.142857 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 4 |
5b50bace3b88058afe2b07123cabe431e85abbd4 | 190 | py | Python | meshrcnn/data/datasets/__init__.py | ishanic/MeshRCNN-keypoints | fdc2c81ce57313207478ab9ff1699614addc5993 | [
"BSD-3-Clause"
] | 1 | 2021-06-25T17:23:02.000Z | 2021-06-25T17:23:02.000Z | meshrcnn/data/datasets/__init__.py | ishanic/MeshRCNN-keypoints | fdc2c81ce57313207478ab9ff1699614addc5993 | [
"BSD-3-Clause"
] | null | null | null | meshrcnn/data/datasets/__init__.py | ishanic/MeshRCNN-keypoints | fdc2c81ce57313207478ab9ff1699614addc5993 | [
"BSD-3-Clause"
] | null | null | null | # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
#from .pix3d import load_pix3d_json # isort:skip
#from . import builtin # ensure the builtin datasets are registered
| 47.5 | 70 | 0.773684 | 27 | 190 | 5.37037 | 0.851852 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0125 | 0.157895 | 190 | 3 | 71 | 63.333333 | 0.89375 | 0.952632 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
5b8971973fcf7b230948474fc5142115cc5a75b0 | 147 | py | Python | allbymyself/path_utils.py | greydevv/django-allbymyself | 43d6f7cc3586d34765fe3338753f98fb421caeb5 | [
"BSD-3-Clause"
] | null | null | null | allbymyself/path_utils.py | greydevv/django-allbymyself | 43d6f7cc3586d34765fe3338753f98fb421caeb5 | [
"BSD-3-Clause"
] | null | null | null | allbymyself/path_utils.py | greydevv/django-allbymyself | 43d6f7cc3586d34765fe3338753f98fb421caeb5 | [
"BSD-3-Clause"
] | null | null | null | def get_path_name(model, page_name):
opts = model._meta
path_name = f'{opts.app_label}_{opts.model_name}_{page_name}'
return path_name
| 29.4 | 65 | 0.727891 | 24 | 147 | 4 | 0.5 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156463 | 147 | 4 | 66 | 36.75 | 0.774194 | 0 | 0 | 0 | 0 | 0 | 0.312925 | 0.312925 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
5bc5277e86ebb339e9fdc7f81d9081d3b11e82c6 | 253 | py | Python | pycfmodel/model/utils.py | donatoaz/pycfmodel | 1586e290b67d2347493dd4a77d2b0c8ee6c0936b | [
"Apache-2.0"
] | 23 | 2018-06-28T10:45:01.000Z | 2021-05-07T11:12:39.000Z | pycfmodel/model/utils.py | donatoaz/pycfmodel | 1586e290b67d2347493dd4a77d2b0c8ee6c0936b | [
"Apache-2.0"
] | 27 | 2019-03-09T08:33:22.000Z | 2022-03-03T14:59:11.000Z | pycfmodel/model/utils.py | donatoaz/pycfmodel | 1586e290b67d2347493dd4a77d2b0c8ee6c0936b | [
"Apache-2.0"
] | 7 | 2019-03-09T02:18:18.000Z | 2021-07-22T20:33:09.000Z | from dataclasses import dataclass
from typing import Optional
from pycfmodel.model.resources.properties.policy_document import PolicyDocument
@dataclass
class OptionallyNamedPolicyDocument:
name: Optional[str]
policy_document: PolicyDocument
| 23 | 79 | 0.841897 | 26 | 253 | 8.115385 | 0.653846 | 0.132701 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.118577 | 253 | 10 | 80 | 25.3 | 0.946188 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.428571 | 0 | 0.857143 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
7535c62ae272a60a721659d144191704f84ecb1f | 676 | py | Python | clients/python-experimental/generated/openapi_client/api/nature_api.py | cliffano/pokeapi-clients | 92af296c68c3e94afac52642ae22057faaf071ee | [
"MIT"
] | null | null | null | clients/python-experimental/generated/openapi_client/api/nature_api.py | cliffano/pokeapi-clients | 92af296c68c3e94afac52642ae22057faaf071ee | [
"MIT"
] | null | null | null | clients/python-experimental/generated/openapi_client/api/nature_api.py | cliffano/pokeapi-clients | 92af296c68c3e94afac52642ae22057faaf071ee | [
"MIT"
] | null | null | null | # coding: utf-8
"""
No description provided (generated by Openapi Generator https://github.com/openapitools/openapi-generator) # noqa: E501
The version of the OpenAPI document: 20220523
Generated by: https://openapi-generator.tech
"""
from openapi_client.api_client import ApiClient
from openapi_client.api.nature_api_endpoints.nature_list import NatureList
from openapi_client.api.nature_api_endpoints.nature_read import NatureRead
class NatureApi(
NatureList,
NatureRead,
ApiClient,
):
"""NOTE: This class is auto generated by OpenAPI Generator
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
pass
| 25.037037 | 124 | 0.75 | 86 | 676 | 5.77907 | 0.523256 | 0.160966 | 0.102616 | 0.120724 | 0.177062 | 0.177062 | 0.177062 | 0.177062 | 0 | 0 | 0 | 0.02139 | 0.170118 | 676 | 26 | 125 | 26 | 0.864528 | 0.52071 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.111111 | 0.333333 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 4 |
753699aa1e0aadfcd0cf4175562208281d1ea165 | 17,958 | py | Python | data-access/tests/test_elasticsearch_proxy.py | ngachung/incubator-sdap-nexus | 38e768694fcc142e2d88283cb1e44e05f88da847 | [
"Apache-2.0"
] | 17 | 2017-11-16T07:36:33.000Z | 2021-11-07T00:02:20.000Z | data-access/tests/test_elasticsearch_proxy.py | ngachung/incubator-sdap-nexus | 38e768694fcc142e2d88283cb1e44e05f88da847 | [
"Apache-2.0"
] | 35 | 2018-01-11T00:50:20.000Z | 2022-03-17T23:08:07.000Z | data-access/tests/test_elasticsearch_proxy.py | ngachung/incubator-sdap-nexus | 38e768694fcc142e2d88283cb1e44e05f88da847 | [
"Apache-2.0"
] | 25 | 2017-11-16T07:36:38.000Z | 2022-02-03T20:48:46.000Z | # Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the 'License'); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an 'AS IS' BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import unittest
import configparser
import datetime
import logging
from nexustiles.dao.ElasticsearchProxy import ElasticsearchProxy
from shapely.geometry import box
class TestQuery(unittest.TestCase):
def setUp(self):
config = configparser.ConfigParser()
with open("config/datastores.ini", encoding='utf-8') as datastores_file:
config.read_file(datastores_file)
self.proxy = ElasticsearchProxy(config)
logging.basicConfig(level=logging.INFO)
self.query_data = configparser.ConfigParser()
with open("config/elasticsearch_query_data.ini", encoding='utf-8') as query_data_file:
self.query_data.read_file(query_data_file)
def test_get_tile_count(self):
bounding_polygon = box(self.query_data.getfloat('get_tile_count', 'min_lon'),
self.query_data.getfloat('get_tile_count', 'min_lat'),
self.query_data.getfloat('get_tile_count', 'max_lon'),
self.query_data.getfloat('get_tile_count', 'max_lat'))
metadata_keys = self.query_data.get('get_tile_count', 'metadata_keys').split(',')
metadata_values = self.query_data.get('get_tile_count', 'metadata_values').split(',')
metadata = []
for index, key in enumerate(metadata_keys):
metadata.append(key + ':' + metadata_values[index])
result = self.proxy.get_tile_count(self.query_data.get('get_tile_count', 'dataset_name'),
bounding_polygon,
self.query_data.getint('get_tile_count', 'start_time'),
self.query_data.getint('get_tile_count', 'end_time'),
metadata)
self.assertIsInstance(result, int)
self.assertNotEqual(result, 0)
# print('RESULT FROM get_tile_count = ' + str(result))
def test_find_all_tiles_by_metadata(self):
metadata_keys = self.query_data.get('find_all_tiles_by_metadata', 'metadata_keys').split(',')
metadata_values = self.query_data.get('find_all_tiles_by_metadata', 'metadata_values').split(',')
metadata = []
for index, key in enumerate(metadata_keys):
metadata.append(key + ':' + metadata_values[index])
result = self.proxy.find_all_tiles_by_metadata(metadata,
self.query_data.get('find_all_tiles_by_metadata', 'dataset_name'),
self.query_data.getint('find_all_tiles_by_metadata', 'start_time'),
self.query_data.getint('find_all_tiles_by_metadata', 'end_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_all_tiles_by_metadata (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_distinct_bounding_boxes_in_polygon(self):
bounding_polygon = box(self.query_data.getfloat('find_distinct_bounding_boxes_in_polygon', 'min_lon'),
self.query_data.getfloat('find_distinct_bounding_boxes_in_polygon', 'min_lat'),
self.query_data.getfloat('find_distinct_bounding_boxes_in_polygon', 'max_lon'),
self.query_data.getfloat('find_distinct_bounding_boxes_in_polygon', 'max_lat'))
result = self.proxy.find_distinct_bounding_boxes_in_polygon(
bounding_polygon,
self.query_data.get('find_distinct_bounding_boxes_in_polygon', 'dataset_name'),
self.query_data.getint('find_distinct_bounding_boxes_in_polygon', 'start_time'),
self.query_data.getint('find_distinct_bounding_boxes_in_polygon', 'end_time'))
self.assertNotEqual(len(result), 0)
self.assertIsInstance(result, list)
# print('RESULT FROM find_distinct_bounding_boxes_in_polygon (LENGTH = ' + str(len(result)) + ') -> ' + str(result))
def test_find_tile_by_id(self):
result = self.proxy.find_tile_by_id(self.query_data.get('find_tile_by_id', 'tile_id'))
self.assertEqual(len(result), 1)
# print('RESULT FROM find_tile_by_id = ' + str(result))
def test_find_tiles_by_id(self):
tile_ids = self.query_data.get('find_tiles_by_id', 'tile_ids').split(',')
result = self.proxy.find_tiles_by_id(tile_ids,
self.query_data.get('find_tiles_by_id', 'dataset_name'))
self.assertEqual(len(result), len(tile_ids))
# print('RESULT FROM find_tiles_by_id = ' + str(result[:10]))
def test_find_min_date_from_tiles(self):
tile_ids = self.query_data.get('find_min_date_from_tiles', 'tile_ids').split(',')
result = self.proxy.find_min_date_from_tiles(tile_ids, self.query_data.get('find_min_date_from_tiles', 'dataset_name'))
self.assertRegex(str(result), r'\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\+\d{2}:\d{2}')
# print('RESULT FROM find_min_date_from_tiles = ' + str(result))
def test_find_max_date_from_tiles(self):
tile_ids = self.query_data.get('find_max_date_from_tiles', 'tile_ids').split(',')
result = self.proxy.find_max_date_from_tiles(tile_ids, self.query_data.get('find_max_date_from_tiles', 'dataset_name'))
self.assertRegex(str(result), r'\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\+\d{2}:\d{2}')
# print('RESULT FROM find_max_date_from_tiles = ' + str(result))
def test_find_min_max_date_from_granule(self):
result = self.proxy.find_min_max_date_from_granule(self.query_data.get('find_min_max_date_from_granule', 'dataset_name'),
self.query_data.get('find_min_max_date_from_granule', 'granule_name'))
self.assertEqual(len(result), 2)
self.assertIsInstance(result[0], datetime.datetime)
self.assertIsInstance(result[1], datetime.datetime)
# print('RESULT FROM find_min_max_date_from_granule = ' + str(result))
def test_get_data_series_list(self):
result = self.proxy.get_data_series_list()
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM get_data_series_list = ' + str(result[:10]))
def test_get_data_series_list_simple(self):
result = self.proxy.get_data_series_list_simple()
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM get_data_series_list_simple = ' + str(result[:10]))
def test_get_data_series_stats(self):
result = self.proxy.get_data_series_stats(self.query_data.get('get_data_series_stats', 'dataset_name'))
self.assertIsInstance(result, dict)
self.assertEqual(len(result), 5)
result['available_dates'] = len(result['available_dates'])
# print('RESULT FROM get_data_series_stats (length of available dates) = ' + str(result))
def test_find_days_in_range_asc(self):
result = self.proxy.find_days_in_range_asc(self.query_data.getfloat('find_days_in_range_asc', 'min_lat'),
self.query_data.getfloat('find_days_in_range_asc', 'max_lat'),
self.query_data.getfloat('find_days_in_range_asc', 'min_lon'),
self.query_data.getfloat('find_days_in_range_asc', 'max_lon'),
self.query_data.get('find_days_in_range_asc', 'dataset_name'),
self.query_data.getint('find_days_in_range_asc', 'start_time'),
self.query_data.getint('find_days_in_range_asc', 'end_time'))
self.assertNotEqual(len(result), 0)
self.assertIsInstance(result[0], float)
# print('RESULT FROM find_days_in_range_asc = ' + str(result[:10]))
def test_find_all_tiles_in_box_sorttimeasc(self):
result = self.proxy.find_all_tiles_in_box_sorttimeasc(
self.query_data.getfloat('find_all_tiles_in_box_sorttimeasc', 'min_lat'),
self.query_data.getfloat('find_all_tiles_in_box_sorttimeasc', 'max_lat'),
self.query_data.getfloat('find_all_tiles_in_box_sorttimeasc', 'min_lon'),
self.query_data.getfloat('find_all_tiles_in_box_sorttimeasc', 'max_lon'),
self.query_data.get('find_all_tiles_in_box_sorttimeasc', 'dataset_name'),
self.query_data.getint('find_all_tiles_in_box_sorttimeasc', 'start_time'),
self.query_data.getint('find_all_tiles_in_box_sorttimeasc', 'end_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_all_tiles_in_box_sorttimeasc (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:20]))
def test_find_all_tiles_in_polygon_sorttimeasc(self):
bounding_polygon = box(self.query_data.getfloat('find_all_tiles_in_polygon_sorttimeasc', 'min_lon'),
self.query_data.getfloat('find_all_tiles_in_polygon_sorttimeasc', 'min_lat'),
self.query_data.getfloat('find_all_tiles_in_polygon_sorttimeasc', 'max_lon'),
self.query_data.getfloat('find_all_tiles_in_polygon_sorttimeasc', 'max_lat'))
result = self.proxy.find_all_tiles_in_polygon_sorttimeasc(
bounding_polygon,
self.query_data.get('find_all_tiles_in_polygon_sorttimeasc', 'dataset_name'),
self.query_data.getint('find_all_tiles_in_polygon_sorttimeasc', 'start_time'),
self.query_data.getint('find_all_tiles_in_polygon_sorttimeasc', 'end_time'))
self.assertIsNotNone(result)
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('FULL RESULT FROM find_all_tiles_in_polygon_sorttimeasc = ' + str(result))
# print('RESULT FROM find_all_tiles_in_polygon_sorttimeasc (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_all_tiles_in_polygon(self):
bounding_polygon = box(self.query_data.getfloat('find_all_tiles_in_polygon', 'min_lon'),
self.query_data.getfloat('find_all_tiles_in_polygon', 'min_lat'),
self.query_data.getfloat('find_all_tiles_in_polygon', 'max_lon'),
self.query_data.getfloat('find_all_tiles_in_polygon', 'max_lat'))
result = self.proxy.find_all_tiles_in_polygon(bounding_polygon,
self.query_data.get('find_all_tiles_in_polygon', 'dataset_name'),
self.query_data.getint('find_all_tiles_in_polygon', 'start_time'),
self.query_data.getint('find_all_tiles_in_polygon', 'end_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_all_tiles_in_polygon (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_tiles_by_exact_bounds(self):
result = self.proxy.find_tiles_by_exact_bounds(
self.query_data.getfloat('find_tiles_by_exact_bounds', 'min_lon'),
self.query_data.getfloat('find_tiles_by_exact_bounds', 'min_lat'),
self.query_data.getfloat('find_tiles_by_exact_bounds', 'max_lon'),
self.query_data.getfloat('find_tiles_by_exact_bounds', 'max_lat'),
self.query_data.get('find_tiles_by_exact_bounds', 'dataset_name'),
self.query_data.getint('find_tiles_by_exact_bounds', 'start_time'),
self.query_data.getint('find_tiles_by_exact_bounds', 'end_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_tiles_by_exact_bounds (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_all_tiles_in_box_at_time(self):
result = self.proxy.find_all_tiles_in_box_at_time(
self.query_data.getfloat('find_all_tiles_in_box_at_time', 'min_lat'),
self.query_data.getfloat('find_all_tiles_in_box_at_time', 'max_lat'),
self.query_data.getfloat('find_all_tiles_in_box_at_time', 'min_lon'),
self.query_data.getfloat('find_all_tiles_in_box_at_time', 'max_lon'),
self.query_data.get('find_all_tiles_in_box_at_time', 'dataset_name'),
self.query_data.getint('find_all_tiles_in_box_at_time', 'search_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_all_tiles_in_box_at_time (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_all_tiles_in_polygon_at_time(self):
bounding_polygon = box(self.query_data.getfloat('find_all_tiles_in_polygon_at_time', 'min_lon'),
self.query_data.getfloat('find_all_tiles_in_polygon_at_time', 'min_lat'),
self.query_data.getfloat('find_all_tiles_in_polygon_at_time', 'max_lon'),
self.query_data.getfloat('find_all_tiles_in_polygon_at_time', 'max_lat'))
result = self.proxy.find_all_tiles_in_polygon_at_time(
bounding_polygon,
self.query_data.get('find_all_tiles_in_polygon_at_time', 'dataset_name'),
self.query_data.getint('find_all_tiles_in_polygon_at_time', 'search_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_all_tiles_in_polygon_at_time (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_all_tiles_within_box_at_time(self):
result = self.proxy.find_all_tiles_within_box_at_time(
self.query_data.getfloat('find_all_tiles_within_box_at_time', 'min_lat'),
self.query_data.getfloat('find_all_tiles_within_box_at_time', 'max_lat'),
self.query_data.getfloat('find_all_tiles_within_box_at_time', 'min_lon'),
self.query_data.getfloat('find_all_tiles_within_box_at_time', 'max_lon'),
self.query_data.get('find_all_tiles_within_box_at_time', 'dataset_name'),
self.query_data.getint('find_all_tiles_within_box_at_time', 'search_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_all_tiles_within_box_at_time (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_all_boundary_tiles_at_time(self):
result = self.proxy.find_all_boundary_tiles_at_time(
self.query_data.getfloat('find_all_boundary_tiles_at_time', 'min_lat'),
self.query_data.getfloat('find_all_boundary_tiles_at_time', 'max_lat'),
self.query_data.getfloat('find_all_boundary_tiles_at_time', 'min_lon'),
self.query_data.getfloat('find_all_boundary_tiles_at_time', 'max_lon'),
self.query_data.get('find_all_boundary_tiles_at_time', 'dataset_name'),
self.query_data.getint('find_all_boundary_tiles_at_time', 'search_time'))
try:
self.assertIsInstance(result[0], dict)
except IndexError:
raise AssertionError
# print('RESULT FROM find_all_boundary_tiles_at_time (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
def test_find_tile_by_polygon_and_most_recent_day_of_year(self):
bounding_polygon = box(self.query_data.getfloat('find_tile_by_polygon_and_most_recent_day_of_year', 'min_lon'),
self.query_data.getfloat('find_tile_by_polygon_and_most_recent_day_of_year', 'min_lat'),
self.query_data.getfloat('find_tile_by_polygon_and_most_recent_day_of_year', 'max_lon'),
self.query_data.getfloat('find_tile_by_polygon_and_most_recent_day_of_year', 'max_lat'))
result = self.proxy.find_tile_by_polygon_and_most_recent_day_of_year(
bounding_polygon,
self.query_data.get('find_tile_by_polygon_and_most_recent_day_of_year', 'dataset_name'),
self.query_data.getint('find_tile_by_polygon_and_most_recent_day_of_year', 'day_of_year'))
self.assertEqual(len(result), 1)
self.assertIsInstance(result, list)
# print('RESULT FROM find_tile_by_polygon_and_most_recent_day_of_year (LENGTH = ' + str(len(result)) + ') -> ' + str(result[:10]))
| 56.11875 | 138 | 0.649683 | 2,309 | 17,958 | 4.619749 | 0.079255 | 0.083529 | 0.119434 | 0.06431 | 0.854598 | 0.817381 | 0.768351 | 0.714446 | 0.647417 | 0.599231 | 0 | 0.00507 | 0.242121 | 17,958 | 319 | 139 | 56.294671 | 0.778692 | 0.152745 | 0 | 0.283784 | 0 | 0.009009 | 0.25308 | 0.183106 | 0 | 0 | 0 | 0 | 0.18018 | 1 | 0.099099 | false | 0 | 0.036036 | 0 | 0.13964 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
755cce908be56feee02b1f92d2fe57e1d339e32f | 118 | py | Python | katas/kyu_5/simple_pig_latin.py | the-zebulan/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 40 | 2016-03-09T12:26:20.000Z | 2022-03-23T08:44:51.000Z | katas/kyu_5/simple_pig_latin.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | null | null | null | katas/kyu_5/simple_pig_latin.py | akalynych/CodeWars | 1eafd1247d60955a5dfb63e4882e8ce86019f43a | [
"MIT"
] | 36 | 2016-11-07T19:59:58.000Z | 2022-03-31T11:18:27.000Z | PIG = '{}{}ay'.format
def pig_it(s):
return ' '.join(PIG(a[1:], a[0]) if a.isalpha() else a for a in s.split())
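A quick sanity check of the kata solution, repeated here so the snippet runs standalone. Each alphabetic word has its first letter moved to the end with 'ay' appended, while non-alphabetic tokens pass through unchanged; the sample inputs follow the standard "Simple Pig Latin" kata description:

```python
PIG = '{}{}ay'.format


def pig_it(s):
    # For each word: if purely alphabetic, emit rest-of-word + first letter + 'ay';
    # otherwise (e.g. punctuation tokens like '!') emit the token unchanged.
    return ' '.join(PIG(a[1:], a[0]) if a.isalpha() else a for a in s.split())


print(pig_it('Pig latin is cool'))  # igPay atinlay siay oolcay
print(pig_it('Hello world !'))      # elloHay orldway !
```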
| 19.666667 | 78 | 0.550847 | 24 | 118 | 2.666667 | 0.708333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021053 | 0.194915 | 118 | 5 | 79 | 23.6 | 0.652632 | 0 | 0 | 0 | 0 | 0 | 0.059322 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 4 |
f37df2b823ff7ff17a17c327a4637a4624c4a628 | 52 | py | Python | tutorials/eboutique/microservices/shipping/src/queries/__init__.py | bhardwajRahul/minos-python | bad7a280ad92680abdeab01d1214688279cf6316 | [
"MIT"
] | 247 | 2022-01-24T14:55:30.000Z | 2022-03-25T12:06:17.000Z | tutorials/eboutique/microservices/shipping/src/queries/__init__.py | bhardwajRahul/minos-python | bad7a280ad92680abdeab01d1214688279cf6316 | [
"MIT"
] | 168 | 2022-01-24T14:54:31.000Z | 2022-03-31T09:31:09.000Z | tutorials/eboutique/microservices/shipping/src/queries/__init__.py | bhardwajRahul/minos-python | bad7a280ad92680abdeab01d1214688279cf6316 | [
"MIT"
] | 21 | 2022-02-06T17:25:58.000Z | 2022-03-27T04:50:29.000Z | from .services import (
ShippingQueryService,
)
| 13 | 25 | 0.730769 | 4 | 52 | 9.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192308 | 52 | 3 | 26 | 17.333333 | 0.904762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
f3812b63bb615a1af8bc6e604390eb332c308449 | 34 | py | Python | ServerMain.py | Shiruyaka/MeowMeow | 370684a167c39fe6090cb06e5acb61e363c48e49 | [
"MIT"
] | null | null | null | ServerMain.py | Shiruyaka/MeowMeow | 370684a167c39fe6090cb06e5acb61e363c48e49 | [
"MIT"
] | null | null | null | ServerMain.py | Shiruyaka/MeowMeow | 370684a167c39fe6090cb06e5acb61e363c48e49 | [
"MIT"
] | null | null | null | import Server
d = Server.Server() | 11.333333 | 19 | 0.735294 | 5 | 34 | 5 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147059 | 34 | 3 | 19 | 11.333333 | 0.862069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
f39af3dc940a823b034c85f6548c0f14ea4cd635 | 16,434 | py | Python | src/tests/test_request.py | craigahobbs/chisel | 19c106695af4c02b86ab24c8f19a69cde6dcf75a | [
"MIT"
] | 1 | 2021-04-03T15:26:49.000Z | 2021-04-03T15:26:49.000Z | src/tests/test_request.py | craigahobbs/chisel | 19c106695af4c02b86ab24c8f19a69cde6dcf75a | [
"MIT"
] | 3 | 2016-02-28T17:26:51.000Z | 2021-04-06T19:00:52.000Z | src/tests/test_request.py | craigahobbs/chisel | 19c106695af4c02b86ab24c8f19a69cde6dcf75a | [
"MIT"
] | 2 | 2015-03-08T22:32:09.000Z | 2016-10-21T00:10:07.000Z | # Licensed under the MIT License
# https://github.com/craigahobbs/chisel/blob/main/LICENSE
# pylint: disable=missing-class-docstring, missing-function-docstring, missing-module-docstring
from contextlib import contextmanager
from http import HTTPStatus
import os
import sys
from tempfile import TemporaryDirectory
from unittest import TestCase
import unittest.mock
from chisel import request, Application, Request, RedirectRequest, StaticRequest
# Helper context manager to create a list of files in a temporary directory
@contextmanager
def create_test_files(file_defs):
tempdir = TemporaryDirectory() # pylint: disable=consider-using-with
try:
for path_parts, content in file_defs:
if isinstance(path_parts, str):
path_parts = [path_parts]
path = os.path.join(tempdir.name, *path_parts)
os.makedirs(os.path.dirname(path), exist_ok=True)
with open(path, 'w', encoding='utf-8') as file_:
file_.write(content)
yield tempdir.name
finally:
tempdir.cleanup()
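For reference, the helper above can be exercised on its own. This sketch duplicates the context manager so the snippet is runnable in isolation, and shows how a string path creates a top-level file while a tuple of path parts creates nested directories (the file names here are illustrative, not from the test suite):

```python
import os
from contextlib import contextmanager
from tempfile import TemporaryDirectory


@contextmanager
def create_test_files(file_defs):
    # Creates the given (path, content) pairs under a fresh temporary
    # directory, yields its path, and cleans everything up on exit.
    tempdir = TemporaryDirectory()
    try:
        for path_parts, content in file_defs:
            if isinstance(path_parts, str):
                path_parts = [path_parts]
            path = os.path.join(tempdir.name, *path_parts)
            os.makedirs(os.path.dirname(path), exist_ok=True)
            with open(path, 'w', encoding='utf-8') as file_:
                file_.write(content)
        yield tempdir.name
    finally:
        tempdir.cleanup()


with create_test_files([('top.txt', 'hello'), (('pkg', '__init__.py'), '')]) as tmpdir:
    with open(os.path.join(tmpdir, 'top.txt'), encoding='utf-8') as file_:
        assert file_.read() == 'hello'
    # Tuple path parts become nested directories created on demand.
    assert os.path.isfile(os.path.join(tmpdir, 'pkg', '__init__.py'))
```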
class TestRequest(TestCase):
def test_reqeust(self):
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
req = Request(my_request)
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, ((None, '/my_request'),))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
def test_request_name(self):
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
req = Request(my_request, name='foo')
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'foo')
self.assertEqual(req.urls, ((None, '/foo'),))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
def test_request_urls(self):
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
req = Request(my_request, urls=[('GET', '/bar'), ('post', '/thud'), (None, '/bonk'), None])
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, (('GET', '/bar'), ('POST', '/thud'), (None, '/bonk'), (None, '/my_request')))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
def test_request_method_urls(self):
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
req = Request(my_request, urls=[('GET', '/bar'), ('post', '/thud'), (None, '/bonk'), None])
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, (('GET', '/bar'), ('POST', '/thud'), (None, '/bonk'), (None, '/my_request')))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
def test_request_urls_str(self):
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
req = Request(my_request, urls=[(None, '/bar')])
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, ((None, '/bar'),))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
req = Request(my_request, urls=[('get', '/bar')])
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, (('GET', '/bar'),))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
req = Request(my_request, urls=[('get', '/bar'), ('POST', '/bar')])
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, (('GET', '/bar'), ('POST', '/bar')))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
def test_request_name_and_urls(self):
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
req = Request(my_request, name='foo', urls=[('GET', '/bar'), ('post', '/thud'), (None, '/bonk'), None])
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'foo')
self.assertEqual(req.urls, (('GET', '/bar'), ('POST', '/thud'), (None, '/bonk'), (None, '/foo')))
self.assertEqual(req.doc, None)
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
def test_request_doc(self):
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
req = Request(my_request, doc=('doc line 1', 'doc line 2'))
self.assertEqual(req.wsgi_callback, my_request)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, ((None, '/my_request'),))
self.assertEqual(req.doc, ('doc line 1', 'doc line 2'))
self.assertEqual(req({}, lambda status, headers: None), [b'ok'])
def test_request_none(self):
with self.assertRaises(AssertionError) as cm_exc:
Request()
self.assertEqual(str(cm_exc.exception), 'must specify either wsgi_callback and/or name')
req = Request(name='my_request')
self.assertEqual(req.wsgi_callback, None)
self.assertEqual(req.name, 'my_request')
self.assertEqual(req.urls, ((None, '/my_request'),))
self.assertEqual(req.doc, None)
with self.assertRaises(AssertionError) as cm_exc:
req({}, None)
self.assertEqual(str(cm_exc.exception), 'wsgi_callback required when using Request directly')
def test_decorator(self):
@request
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
self.assertTrue(isinstance(my_request, Request))
self.assertEqual(my_request({}, lambda status, headers: None), [b'ok'])
self.assertEqual(my_request.name, 'my_request')
self.assertEqual(my_request.urls, ((None, '/my_request'),))
self.assertEqual(my_request.doc, None)
def test_decorator_complete(self):
@request(name='foo', urls=[('GET', '/bar'), ('post', '/thud'), (None, '/bonk'), None], doc=('doc line 1', 'doc line 2'))
def my_request(environ, start_response):
assert isinstance(environ, dict)
assert callable(start_response)
start_response('OK', [])
return [b'ok']
self.assertTrue(isinstance(my_request, Request))
self.assertEqual(my_request.name, 'foo')
self.assertEqual(my_request.urls, (('GET', '/bar'), ('POST', '/thud'), (None, '/bonk'), (None, '/foo')))
self.assertEqual(my_request.doc, ('doc line 1', 'doc line 2'))
self.assertEqual(my_request({}, lambda status, headers: None), [b'ok'])
def test_import_requests(self):
test_files = (
(
'__init__.py',
''
),
(
('test_package', '__init__.py',),
''
),
(
('test_package', 'module.py',),
'''\
from chisel import request
@request
def request1(environ, start_response):
return [b'request1']
@request
def request2(environ, start_response):
return [b'request2']
'''
),
(
('test_package', 'module2.py',),
''
),
(
('test_package', 'sub', '__init__.py'),
''
),
(
('test_package', 'sub', 'subsub', '__init__.py'),
''
),
(
('test_package', 'sub', 'subsub', 'submodule.py'),
'''\
from chisel import request
@request
def request3(environ, start_response):
return [b'request3']
'''
)
)
with create_test_files(test_files) as requests_dir:
with unittest.mock.patch('sys.path', [requests_dir] + sys.path):
self.assertListEqual(
sorted(request.name for request in Request.import_requests('test_package')),
[
'request1',
'request2',
'request3'
]
)
def test_request_subclass(self):
class MyRequest(Request):
__slots__ = ('index',)
def __init__(self, index):
super().__init__(name=f'MyRequest{index}',
urls=[('GET', f'/my-request-{index}')],
doc=[f'My request number {index}.'])
self.index = index
def __call__(self, environ, start_response):
# Note: Do NOT call Request __call__ method in a subclass
start_response(HTTPStatus.OK, [('Content-Type', 'text/plain')])
return [f'This is request # {self.index}'.encode('utf-8')]
req = MyRequest(1)
self.assertTrue(isinstance(req, Request))
self.assertEqual(req.name, 'MyRequest1')
self.assertEqual(req.urls, (('GET', '/my-request-1'),))
self.assertEqual(req.doc, ['My request number 1.'])
self.assertEqual(req({}, lambda status, headers: None), [b'This is request # 1'])
class TestRedirect(TestCase):
def test_default(self):
redirect = RedirectRequest((('GET', '/old'),), '/new')
app = Application()
app.add_request(redirect)
self.assertEqual(redirect.name, 'redirect_new')
self.assertEqual(redirect.doc, 'Redirect to /new.')
status, headers, response = app.request('GET', '/old')
self.assertEqual(status, '301 Moved Permanently')
self.assertListEqual(headers, [
('Content-Type', 'text/plain'),
('Location', '/new')
])
self.assertEqual(response, b'/new')
def test_name(self):
redirect = RedirectRequest((('GET', '/old'),), '/new', name='redirect_old_to_new')
app = Application()
app.add_request(redirect)
self.assertEqual(redirect.name, 'redirect_old_to_new')
self.assertEqual(redirect.doc, 'Redirect to /new.')
status, headers, response = app.request('GET', '/old')
self.assertEqual(status, '301 Moved Permanently')
self.assertListEqual(headers, [
('Content-Type', 'text/plain'),
('Location', '/new')
])
self.assertEqual(response, b'/new')
def test_doc(self):
redirect = RedirectRequest((('GET', '/old'),), '/new', doc=('Redirect old to new',))
app = Application()
app.add_request(redirect)
self.assertEqual(redirect.name, 'redirect_new')
self.assertEqual(redirect.doc, ('Redirect old to new',))
status, headers, response = app.request('GET', '/old')
self.assertEqual(status, '301 Moved Permanently')
self.assertListEqual(headers, [
('Content-Type', 'text/plain'),
('Location', '/new')
])
self.assertEqual(response, b'/new')
def test_not_permanent(self):
redirect = RedirectRequest((('GET', '/old'),), '/new', permanent=False)
app = Application()
app.add_request(redirect)
self.assertEqual(redirect.name, 'redirect_new')
self.assertEqual(redirect.doc, 'Redirect to /new.')
status, headers, response = app.request('GET', '/old')
self.assertEqual(status, '302 Found')
self.assertListEqual(headers, [
('Content-Type', 'text/plain'),
('Location', '/new')
])
self.assertEqual(response, b'/new')
class TestStatic(TestCase):
def test_init(self):
static = StaticRequest('index.html', b'<!DOCTYPE html>')
self.assertEqual(static.name, 'index.html')
self.assertEqual(static.urls, (('GET', '/index.html'),))
self.assertEqual(static.doc, ('The static resource "index.html"',))
self.assertEqual(static.headers, (('Content-Type', 'text/html'), ('ETag', 'fe364450e1391215f596d043488f989f')))
self.assertEqual(static.content, b'<!DOCTYPE html>')
self.assertEqual(static.etag, 'fe364450e1391215f596d043488f989f')
def test_init_doc(self):
static = StaticRequest('index.html', b'<!DOCTYPE html>', doc=('This is the doc!',))
self.assertEqual(static.name, 'index.html')
self.assertEqual(static.urls, (('GET', '/index.html'),))
self.assertEqual(static.doc, ('This is the doc!',))
self.assertEqual(static.headers, (('Content-Type', 'text/html'), ('ETag', 'fe364450e1391215f596d043488f989f')))
self.assertEqual(static.content, b'<!DOCTYPE html>')
        self.assertEqual(static.etag, 'fe364450e1391215f596d043488f989f')

    def test_init_urls(self):
        static = StaticRequest('index', b'<!DOCTYPE html>', urls=(('GET', '/index.html'),))
        self.assertEqual(static.name, 'index')
        self.assertEqual(static.urls, (('GET', '/index.html'),))
        self.assertEqual(static.doc, ('The static resource "index"',))
        self.assertEqual(static.headers, (('Content-Type', 'text/html'), ('ETag', 'fe364450e1391215f596d043488f989f')))
        self.assertEqual(static.content, b'<!DOCTYPE html>')
        self.assertEqual(static.etag, 'fe364450e1391215f596d043488f989f')

    def test_content_type(self):
        static = StaticRequest('index', b'<!DOCTYPE html>', content_type='text/html')
        self.assertEqual(static.name, 'index')
        self.assertEqual(static.urls, (('GET', '/index'),))
        self.assertEqual(static.doc, ('The static resource "index"',))
        self.assertEqual(static.headers, (('Content-Type', 'text/html'), ('ETag', 'fe364450e1391215f596d043488f989f')))
        self.assertEqual(static.content, b'<!DOCTYPE html>')
        self.assertEqual(static.etag, 'fe364450e1391215f596d043488f989f')

    def test_content_type_unknown(self):
        with self.assertRaises(AssertionError) as cm_exc:
            StaticRequest('index.unknown', b'<!DOCTYPE html>')
        self.assertEqual(str(cm_exc.exception), 'Unknown content type for static resource "index.unknown"')

    def test_request(self):
        app = Application()
        static = StaticRequest('chisel-doc', b'<!DOCTYPE html>', urls=(('GET', '/doc/index.html'),))
        app.add_request(static)
        status, headers, response = app.request('GET', '/doc/index.html')
        self.assertEqual(status, '200 OK')
        self.assertListEqual(headers, [('Content-Type', 'text/html'), ('ETag', 'fe364450e1391215f596d043488f989f')])
        self.assertEqual(response, b'<!DOCTYPE html>')

    def test_request_not_modified(self):
        app = Application()
        static = StaticRequest('chisel-doc', b'<!DOCTYPE html>', urls=(('GET', '/doc/index.html'),))
        app.add_request(static)
        status, headers, response = app.request(
            'GET',
            '/doc/index.html',
            environ={'HTTP_IF_NONE_MATCH': 'fe364450e1391215f596d043488f989f'}
        )
        self.assertEqual(status, '304 Not Modified')
        self.assertListEqual(headers, [])
        self.assertEqual(response, b'')
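The 32-character tags asserted in these tests look like MD5 hex digests of the static content. A minimal standalone sketch of computing such an ETag (assuming MD5 is the digest chisel uses, which the tests themselves don't state):

```python
import hashlib


def make_etag(content: bytes) -> str:
    # ETag as the MD5 hex digest of the raw content bytes
    # (assumption: this mirrors how StaticRequest derives its tag).
    return hashlib.md5(content).hexdigest()


tag = make_etag(b'<!DOCTYPE html>')
print(tag)  # a 32-character lowercase hex string
```

The If-None-Match flow in `test_request_not_modified` then reduces to a string comparison between this digest and the client-supplied header value.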

# --- data/studio21_generated/introductory/4888/starter_code.py (repo: vijaykumawat256/Prompt-Summarization, license: Apache-2.0) ---
def recaman(n):
    # Starter stub completed so the file parses; presumably Recamán's
    # sequence (an assumption from the name): a(0) = 0 and
    # a(k) = a(k-1) - k if that value is positive and unseen,
    # otherwise a(k-1) + k.
    seen = {0}
    a = 0
    for k in range(1, n + 1):
        candidate = a - k
        a = candidate if candidate > 0 and candidate not in seen else a + k
        seen.add(a)
    return a

# --- application/settings/dev.py (repo: KraftSoft/together, license: BSD-3-Clause) ---
from application.settings.base import *  # NOQA
DEBUG = True
TEMPLATE_DEBUG = True

# --- lesson_tasks/lesson7/num5.py (repo: NikaEgorova/goiteens-python3-egorova, license: MIT) ---
pharaons = ("Клеопатра", "Рамзес", "Тутанхамон")
pharaons_list = list(pharaons)
for i in pharaons_list:
    num_of_letters = len(i)
    if num_of_letters > 8:
        print(i)

# --- earthvision/datasets/skyscapes.py (repo: otivedani/earth-vision, license: MIT) ---
class SkyScapes:
    # A bare `raise` in the class body would fire at import time;
    # defer the NotImplementedError to instantiation instead.
    def __init__(self):
        raise NotImplementedError

# --- keys.py (repo: LeGoldFish/DTR2Sync, license: MIT) ---
from msvcrt import getch  # Windows-only console keypress reader
while True:
    print(ord(getch()))

# --- day19/p1.py (repo: Seralpa/AdventOfCode2018, license: MIT) ---
def operate(instruction, registers):
    if instruction[0] == 'addr':
        registers[instruction[3]] = registers[instruction[1]] + registers[instruction[2]]
    elif instruction[0] == 'addi':
        registers[instruction[3]] = registers[instruction[1]] + instruction[2]
    elif instruction[0] == 'mulr':
        registers[instruction[3]] = registers[instruction[1]] * registers[instruction[2]]
    elif instruction[0] == 'muli':
        registers[instruction[3]] = registers[instruction[1]] * instruction[2]
    elif instruction[0] == 'banr':
        registers[instruction[3]] = registers[instruction[1]] & registers[instruction[2]]
    elif instruction[0] == 'bani':
        registers[instruction[3]] = registers[instruction[1]] & instruction[2]
    elif instruction[0] == 'borr':
        registers[instruction[3]] = registers[instruction[1]] | registers[instruction[2]]
    elif instruction[0] == 'bori':
        registers[instruction[3]] = registers[instruction[1]] | instruction[2]
    elif instruction[0] == 'setr':
        registers[instruction[3]] = registers[instruction[1]]
    elif instruction[0] == 'seti':
        registers[instruction[3]] = instruction[1]
    elif instruction[0] == 'gtir':
        registers[instruction[3]] = 1 if instruction[1] > registers[instruction[2]] else 0
    elif instruction[0] == 'gtri':
        registers[instruction[3]] = 1 if registers[instruction[1]] > instruction[2] else 0
    elif instruction[0] == 'gtrr':
        registers[instruction[3]] = 1 if registers[instruction[1]] > registers[instruction[2]] else 0
    elif instruction[0] == 'eqir':
        registers[instruction[3]] = 1 if instruction[1] == registers[instruction[2]] else 0
    elif instruction[0] == 'eqri':
        registers[instruction[3]] = 1 if registers[instruction[1]] == instruction[2] else 0
    elif instruction[0] == 'eqrr':
        registers[instruction[3]] = 1 if registers[instruction[1]] == registers[instruction[2]] else 0
ip = 0
instructions = list()
with open("input.txt") as f:
    ip = int(f.readline()[4])
    instructions = [l.strip().split() for l in f.readlines()]
for i in instructions:
    for j in range(1, len(i)):
        i[j] = int(i[j])
registers = [0, 0, 0, 0, 0, 0]
while registers[ip] < len(instructions) and registers[ip] >= 0:
    operate(instructions[registers[ip]], registers)
    registers[ip] += 1
print(registers[0])
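The long if/elif dispatch in `operate` above can also be expressed as a lookup table of opcode functions. A self-contained, behavior-equivalent sketch (the names `OPS` and `operate_table` are mine):

```python
# Table-driven equivalent of the if/elif chain in operate():
# each opcode maps to a function of (registers, a, b) returning
# the value stored into registers[c].
OPS = {
    'addr': lambda r, a, b: r[a] + r[b],
    'addi': lambda r, a, b: r[a] + b,
    'mulr': lambda r, a, b: r[a] * r[b],
    'muli': lambda r, a, b: r[a] * b,
    'banr': lambda r, a, b: r[a] & r[b],
    'bani': lambda r, a, b: r[a] & b,
    'borr': lambda r, a, b: r[a] | r[b],
    'bori': lambda r, a, b: r[a] | b,
    'setr': lambda r, a, b: r[a],
    'seti': lambda r, a, b: a,
    'gtir': lambda r, a, b: int(a > r[b]),
    'gtri': lambda r, a, b: int(r[a] > b),
    'gtrr': lambda r, a, b: int(r[a] > r[b]),
    'eqir': lambda r, a, b: int(a == r[b]),
    'eqri': lambda r, a, b: int(r[a] == b),
    'eqrr': lambda r, a, b: int(r[a] == r[b]),
}


def operate_table(instruction, registers):
    op, a, b, c = instruction
    registers[c] = OPS[op](registers, a, b)


regs = [0, 0, 0, 0, 0, 0]
operate_table(['seti', 5, 0, 1], regs)   # regs[1] = 5
operate_table(['addi', 1, 2, 2], regs)   # regs[2] = regs[1] + 2 = 7
print(regs)  # [0, 5, 7, 0, 0, 0]
```

Adding a new opcode is then a one-line dictionary entry instead of another elif branch.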

# --- watson/framework/support/__init__.py (repo: watsonpy/watson-framework, license: BSD-3-Clause) ---
# Any code relating to integration with 3rd party libraries

# --- app/http/controllers/PdfController.py (repo: stahiralijan/pdfminer, license: MIT) ---
"""A PdfController Module"""
from masonite.request import Request
from masonite.view import View
from masonite.controllers import Controller
from app.Pdf import Pdf
from masonite.response import Response
from masonite import Upload
from app.jobs.FileStorage import FileStorage


class PdfController(Controller):
    """Class Docstring Description"""

    def show(self, filename, response: Response, view: View):
        """Show a single resource listing
        ex. Model.find('id')
        Get().route("/show", PdfController)
        """
        import os
        pdf_file_dir = os.path.abspath('storage/uploads/pdf_files')
        return view.render(pdf_file_dir + "/" + filename)

    def index(self, view: View):
        """Show several resource listings
        ex. Model.all()
        Get().route("/index", PdfController)
        """
        pdfs = Pdf.all()
        return view.render("pdfs/index.html", {'pdfs': pdfs})

    def create(self, view: View):
        """Show form to create new resource listings
        ex. Get().route("/create", PdfController)
        """
        return view.render("pdfs/create.html")

    def store(self, response: Response, upload: Upload):
        """Create a new resource listing
        ex. Post target to create new Model
        Post().route("/store", PdfController)
        """
        # queue.push(FileStorage(request()))
        file = upload.accept("pdf").store(request().input('pdf_file'), location='storage/uploads/pdf_files')
        # dd(request().input('pdf_file').filename)
        from app.PdfConverter import PdfConverter
        import os
        import hashlib
        txtfilename = hashlib.md5(file.encode('utf-8')).hexdigest()
        pdf_file_dir = os.path.abspath('storage/uploads/pdf_files')
        converter = PdfConverter(pdf_file_dir + "/" + file, pdf_file_dir + "/" + txtfilename + ".txt", "")
        converter.toTxtfile()
        Pdf.create(name=request().input('pdf_file').filename, pdf_path=file, txt_path=txtfilename + ".txt", html_path='')
        return response.redirect('/pdfs/index')

    def edit(self):
        """Show form to edit an existing resource listing
        ex. Get().route("/edit", PdfController)
        """
        pass

    def update(self):
        """Edit an existing resource listing
        ex. Post target to update new Model
        Post().route("/update", PdfController)
        """
        pass

    def destroy(self):
        """Delete an existing resource listing
        ex. Delete().route("/destroy", PdfController)
        """
        pass
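The `store()` action above derives the converted text file's name from the stored PDF's file name. A standalone sketch of that naming scheme (the helper name `derived_txt_name` is mine, not part of the controller):

```python
import hashlib


def derived_txt_name(stored_pdf_name: str) -> str:
    # Mirrors the naming in PdfController.store(): the text file is
    # the MD5 hex digest of the stored PDF file name plus ".txt".
    return hashlib.md5(stored_pdf_name.encode('utf-8')).hexdigest() + ".txt"


print(derived_txt_name("report.pdf"))  # 32 hex chars + ".txt"
```

Hashing the file name (rather than the content) gives a fixed-length, filesystem-safe name, but note that two uploads stored under the same name would collide on the same text file.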

# --- gpio/ch3listing1.py (repo: mslinklater/pi_stuff, license: MIT) ---
from gpiozero import Button
button = Button(21)
while True:
    if button.is_pressed:
        print("Button is pressed")
    else:
        print("Button is not pressed")
| 17.1 | 38 | 0.654971 | 23 | 171 | 4.826087 | 0.565217 | 0.216216 | 0.27027 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015748 | 0.25731 | 171 | 9 | 39 | 19 | 0.858268 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0.285714 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |

# --- router/qrcode.py (repo: Sayaji911/url-shortner, license: MIT) ---
import qrcode as QR
from router.redirect import router as RedirectRouter

# --- aforoInfo/apps.py (repo: joelsegoviacrespo/control_aforo_migrado, license: MIT) ---
from django.apps import AppConfig


class AforoinfoConfig(AppConfig):
    name = 'aforoInfo'

# --- StoveOpt/create_blockmeshfile.py (repo: Liam-Cassidy/StoveOpt, license: MIT) ---
# -*- coding: utf-8 -*-
"""
Created on Thu Apr 25 11:26:04 2019
@author: Lee
"""
""" Goal is to take inputs from the import_geometry module and edit a blockmesh file template"""
import shutil
from shutil import copyfile
from shutil import copy
import os
def locate_blockmesh_template():
"""the function uses the StoveOpt path and blockmesh template name to open the
template version of the blockMeshDict file for editing in the system folder
Args:
None
Returns:
blockmesh_template (str): full file path where blockmesh template lives
"""
# Current working dir for stove opt master
path_StoveOpt_master = os.getcwd()
# Steps to system folder
dir_steps = "//blockMeshDict_foamfile//template//blockMeshDict_template"
blockmesh_template = path_StoveOpt_master + dir_steps # location and path name of blockmesh template
print("blockmesh template located at:")
print(blockmesh_template)
return blockmesh_template
#blockmesh_template = locate_blockmesh_template()
def compute_num_cells(max_delta_x, pt_0_y, pt_1_y):
    """compute the number of cells associated with the max_delta_x for the largest spatial step (combustion chamber)

    Args:
        max_delta_x (double): User defined maximum grid spacing.
        pt_0_y (double): y-coordinate of bottom LHS cookstove combustion chamber
        pt_1_y (double): y-coordinate of bottom RHS cookstove combustion chamber

    Returns:
        num_cells_int (int): number of cells to be written to the openfoam blockmesh file for entire domain
        num_cells_double (double): number of cells to be written to the openfoam blockmesh file for entire domain BEFORE INT ROUNDING
        num_cells_int_str (str): number of cells to be written to the openfoam blockmesh file for entire domain, converted to string type for f.write
        num_cells_int_str_concat (str): cells formatted for OF
    """
    max_space = abs(pt_1_y - pt_0_y)  # maximum spatial step in domain defined by coordinates
    num_cells_double = max_space / max_delta_x  # unrounded number of cells per block
    num_cells_int = int(round(num_cells_double))  # round to integer value
    num_cells_int_str = str(num_cells_int)
    num_cells_int_str_concat = "(" + num_cells_int_str + " " + num_cells_int_str + " " + num_cells_int_str + ")"
    print("num cells int")
    print(num_cells_int)
    print("num cells int converted to str")
    print(num_cells_int_str)
    print("num cells int str concat")
    print(num_cells_int_str_concat)
    return num_cells_int, num_cells_double, num_cells_int_str, num_cells_int_str_concat


#num_cells_int, num_cells_double, num_cells_int_str, num_cells_int_str_concat = compute_num_cells(max_delta_x, pt0x, pt1x)
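A quick standalone check of the cell-count arithmetic in `compute_num_cells` (the 0.10 m span and 0.004 m step are hypothetical values, not taken from the StoveOpt geometry):

```python
# Repeats compute_num_cells' arithmetic on made-up inputs: a 0.10 m
# span with a 0.004 m maximum grid step rounds to 25 cells per edge.
max_delta_x = 0.004
pt_0_y, pt_1_y = 0.0, 0.10
num_cells_int = int(round(abs(pt_1_y - pt_0_y) / max_delta_x))
num_cells_int_str = str(num_cells_int)
num_cells_int_str_concat = "(" + num_cells_int_str + " " + num_cells_int_str + " " + num_cells_int_str + ")"
print(num_cells_int, num_cells_int_str_concat)  # 25 (25 25 25)
```

The `(n n n)` string is the per-block cell count triple that blockMesh expects in each `hex` entry.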
def write_blockmesh(num_cells_int_str_concat, blockmesh_template, num_cells_int, pt_0_str, pt_1_str, pt_2_str, pt_3_str, pt_4_str, pt_5_str, pt_6_str, pt_7_str, pt_8_str, pt_9_str, pt_10_str, pt_11_str, pt_12_str, pt_13_str, pt_14_str, pt_15_str, pt_16_str, pt_17_str, pt_18_str, pt_19_str, pt_20_str, pt_21_str, pt_22_str, pt_23_str, pt_24_str, pt_25_str, pt_26_str, pt_27_str, pt_28_str, pt_29_str, pt_30_str, pt_31_str, pt_32_str, pt_33_str, pt_34_str, pt_35_str, pt_36_str, pt_37_str, pt_38_str, pt_39_str, pt_40_str, pt_41_str, pt_42_str, pt_43_str, pt_44_str, pt_45_str, pt_46_str, pt_47_str, pt_48_str, pt_49_str, pt_50_str, pt_51_str, pt_52_str, pt_53_str, pt_54_str, pt_55_str, pt_56_str, pt_57_str, pt_58_str, pt_59_str, pt_60_str, pt_61_str, pt_62_str, pt_63_str, pt_64_str, pt_65_str, pt_66_str, pt_67_str, pt_68_str, pt_69_str, pt_70_str, pt_71_str, pt_72_str, pt_73_str, pt_74_str, pt_75_str, pt_76_str, pt_77_str, pt_78_str, pt_79_str, pt_80_str, pt_81_str, pt_82_str, pt_83_str, pt_84_str, pt_85_str, pt_86_str, pt_87_str, pt_88_str, pt_89_str, pt_90_str, pt_91_str, pt_92_str, pt_93_str, pt_94_str, pt_95_str, pt_96_str, pt_97_str, pt_98_str, pt_99_str, pt_100_str, pt_101_str, pt_102_str, pt_103_str, pt_104_str, pt_105_str, pt_106_str, pt_107_str, pt_108_str, pt_109_str, pt_110_str, pt_111_str, pt_112_str, pt_113_str, pt_114_str, pt_115_str, pt_116_str, pt_117_str, pt_118_str, pt_119_str, pt_120_str, pt_121_str, pt_122_str, pt_123_str, pt_124_str, pt_125_str, pt_126_str, pt_127_str, pt_128_str, pt_129_str, pt_130_str, pt_131_str, pt_132_str, pt_133_str, pt_134_str, pt_135_str, pt_136_str, pt_137_str, pt_138_str, pt_139_str, pt_140_str, pt_141_str, pt_142_str, pt_143_str, pt_144_str, pt_145_str, pt_146_str, pt_147_str, pt_148_str, pt_149_str, pt_150_str, pt_151_str, pt_152_str, pt_153_str, pt_154_str, pt_155_str, pt_156_str, pt_157_str, pt_158_str, pt_159_str):
"""Open the blockmesh template, edit and move to the Run folder
Args:
vertices defined as strings
blockmesh_template (str): full file path where blockmesh template lives
Returns:
block_mesh_run_path (str): file path with the blockmesh file saved for running cases
"""
with open(blockmesh_template,'r+') as f:
f.seek(602) #where writing begins
f.write("convertToMeters 1;"+"\n")
f.write("\n")
f.write("vertices"+"\n")
f.write("("+"\n")
# Write vertice strings
        for pt_str in (
            pt_0_str, pt_1_str, pt_2_str, pt_3_str, pt_4_str, pt_5_str, pt_6_str, pt_7_str,
            pt_8_str, pt_9_str, pt_10_str, pt_11_str, pt_12_str, pt_13_str, pt_14_str, pt_15_str,
            pt_16_str, pt_17_str, pt_18_str, pt_19_str, pt_20_str, pt_21_str, pt_22_str, pt_23_str,
            pt_24_str, pt_25_str, pt_26_str, pt_27_str, pt_28_str, pt_29_str, pt_30_str, pt_31_str,
            pt_32_str, pt_33_str, pt_34_str, pt_35_str, pt_36_str, pt_37_str, pt_38_str, pt_39_str,
            pt_40_str, pt_41_str, pt_42_str, pt_43_str, pt_44_str, pt_45_str, pt_46_str, pt_47_str,
            pt_48_str, pt_49_str, pt_50_str, pt_51_str, pt_52_str, pt_53_str, pt_54_str, pt_55_str,
            pt_56_str, pt_57_str, pt_58_str, pt_59_str, pt_60_str, pt_61_str, pt_62_str, pt_63_str,
            pt_64_str, pt_65_str, pt_66_str, pt_67_str, pt_68_str, pt_69_str, pt_70_str, pt_71_str,
            pt_72_str, pt_73_str, pt_74_str, pt_75_str, pt_76_str, pt_77_str, pt_78_str, pt_79_str,
            pt_80_str, pt_81_str, pt_82_str, pt_83_str, pt_84_str, pt_85_str, pt_86_str, pt_87_str,
            pt_88_str, pt_89_str, pt_90_str, pt_91_str, pt_92_str, pt_93_str, pt_94_str, pt_95_str,
            pt_96_str, pt_97_str, pt_98_str, pt_99_str, pt_100_str, pt_101_str, pt_102_str, pt_103_str,
            pt_104_str, pt_105_str, pt_106_str, pt_107_str, pt_108_str, pt_109_str, pt_110_str, pt_111_str,
            pt_112_str, pt_113_str, pt_114_str, pt_115_str, pt_116_str, pt_117_str, pt_118_str, pt_119_str,
            pt_120_str, pt_121_str, pt_122_str, pt_123_str, pt_124_str, pt_125_str, pt_126_str, pt_127_str,
            pt_128_str, pt_129_str, pt_130_str, pt_131_str, pt_132_str, pt_133_str, pt_134_str, pt_135_str,
            pt_136_str, pt_137_str, pt_138_str, pt_139_str, pt_140_str, pt_141_str, pt_142_str, pt_143_str,
            pt_144_str, pt_145_str, pt_146_str, pt_147_str, pt_148_str, pt_149_str, pt_150_str, pt_151_str,
            pt_152_str, pt_153_str, pt_154_str, pt_155_str, pt_156_str, pt_157_str, pt_158_str, pt_159_str,
        ):
            f.write(pt_str + '\n')
f.write("\n")
f.write(");"+"\n")
f.write("\n")
f.write("blocks"+"\n")
f.write("(")
f.write("\n")
# write the blocks, with
#f.write("hex (16 17 39 38 19 18 40 41) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (0 26 32 22 44 70 76 66) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (26 27 142 140 70 71 143 141) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (27 28 34 33 71 72 78 77) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (28 29 150 148 72 73 151 149) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (29 30 36 35 73 74 80 79) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (30 31 158 156 74 75 159 157) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (31 1 23 37 75 45 67 81) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (22 140 136 24 66 141 137 68) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (32 33 39 38 76 77 83 82) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (142 148 144 138 143 149 145 139) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (34 35 41 40 78 79 85 84) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (150 156 152 146 151 157 153 147) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (36 37 43 42 80 81 87 86) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (158 23 25 154 159 67 69 155) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (24 38 88 2 68 82 112 46) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (136 138 89 88 137 139 113 112) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (39 40 90 89 83 84 114 113) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (144 146 91 90 145 147 115 114) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (41 42 92 91 85 86 116 115) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (152 154 93 92 153 155 117 116) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (43 25 3 93 87 69 47 117) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (2 88 94 4 46 112 118 48) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (88 89 95 94 112 113 119 118) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (89 90 96 95 113 114 120 119) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (90 91 97 96 114 115 121 120) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (91 92 98 97 115 116 122 121) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (92 93 99 98 116 117 123 122) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (93 3 5 99 117 47 49 123) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (4 94 100 6 48 118 124 50) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (94 95 101 100 118 119 125 124) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (95 96 102 101 119 120 126 125) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (96 97 103 102 120 121 127 126) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (97 98 104 103 121 122 128 127) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (98 99 105 104 122 123 129 128) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (99 5 7 105 123 49 51 129) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (16 14 18 20 60 58 62 64) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (10 8 14 16 54 52 58 60) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (8 6 12 14 52 50 56 58) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (6 100 106 12 50 124 130 56) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (100 101 107 106 124 125 131 130) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (101 102 108 107 125 126 132 131) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (102 103 109 108 126 127 133 132) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (103 104 110 109 127 128 134 133) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (104 105 111 110 128 129 135 134) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (105 7 13 111 129 51 57 135) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (7 9 15 13 51 53 59 57) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (9 11 17 15 53 55 61 59) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (15 17 21 19 59 61 65 63) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write(");" + "\n")
f.write("\n")
# edges
f.write("edges" + "\n")
f.write("(" + "\n")
f.write(");" + "\n")
f.write("\n")
#Boundaries
# Fuel
f.write("boundary" + "\n")
f.write("(" + "\n")
# LHS outlet
f.write("outlet_LHS" + "\n")
f.write("{" + "\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(20 18 62 64)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
# RHS outlet
f.write("outlet_RHS" + "\n")
f.write("{" + "\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(19 21 65 63)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Secondary air RHS
f.write("secondary_air_RHS"+ "\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(3 47 49 5)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Secondary air LHS
f.write("secondary_air_LHS" + "\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(2 46 48 4)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
# Top faces of fuel
f.write("top_fuel" + "\n")
f.write("{"+ "\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(38 39 83 82)" + "\n")
f.write("(40 41 85 84)" + "\n")
f.write("(42 43 87 86)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
# left faces of fuel
f.write("L_fuel" + "\n")
f.write("{"+ "\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(32 76 82 38)" + "\n")
f.write("(34 78 84 40)" + "\n")
f.write("(36 80 86 42)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
# Right faces of fuel
f.write("R_fuel" + "\n")
f.write("{"+ "\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(33 77 83 39)" + "\n")
f.write("(35 79 85 41)" + "\n")
f.write("(37 81 87 43)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
# bottom faces of fuel
f.write("bottom_fuel" + "\n")
f.write("{"+ "\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(32 33 77 76)" + "\n")
f.write("(34 35 79 78)" + "\n")
f.write("(36 37 81 80)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
# Primary air inlets
f.write("primary_inlets"+"\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(0 26 70 44)" + "\n")
f.write("(26 27 71 70)" + "\n")
f.write("(27 28 72 71)" + "\n")
f.write("(28 29 73 72)" + "\n")
f.write("(29 30 74 73)" + "\n")
f.write("(30 31 75 74)" + "\n")
f.write("(31 1 45 75)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Stove Body
f.write("stove_body" + "\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(0 44 66 22)" + "\n")
f.write("(22 66 68 24)" + "\n")
f.write("(24 68 46 2)" + "\n")
f.write("(4 48 50 6)" + "\n")
f.write("(8 6 50 52)" + "\n")
f.write("(10 8 52 54)" + "\n")
f.write("(10 54 60 16)" + "\n")
f.write("(16 60 64 20)" + "\n")
f.write("(14 58 62 18)" + "\n")
f.write("(14 12 56 58)" + "\n")
f.write("(12 106 130 56)" + "\n")
f.write("(106 107 131 130)" + "\n")
f.write("(107 108 132 131)" + "\n")
f.write("(108 109 133 132)" + "\n")
f.write("(109 110 134 133)" + "\n")
f.write("(110 111 135 134)" + "\n")
f.write("(111 13 57 135)" + "\n")
f.write("(13 15 59 57)" + "\n")
f.write("(15 59 63 19)" + "\n")
f.write("(17 61 65 21)" + "\n")
f.write("(11 55 61 17)" + "\n")
f.write("(9 11 55 53)" + "\n")
f.write("(7 9 53 51)" + "\n")
f.write("(5 49 51 7)" + "\n")
f.write("(25 69 47 3)" + "\n")
f.write("(23 67 69 25)" + "\n")
f.write("(1 45 67 23)" +"\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Empty front and back faces
f.write("\n")
f.write("frontAndBack" + "\n")
f.write("{" + "\n")
f.write("type empty;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(0 26 32 22)" + "\n")
f.write("(44 70 76 66)" + "\n")
f.write("(26 27 142 140)" + "\n")
f.write("(70 71 143 141)" + "\n")
f.write("(27 28 34 33)" + "\n")
f.write("(71 72 78 77)" + "\n")
f.write("(28 29 150 148)" + "\n")
f.write("(72 73 151 149)" + "\n")
f.write("(29 30 36 35)" + "\n")
f.write("(73 74 80 79)" + "\n")
f.write("(30 31 158 156)" + "\n")
f.write("(74 75 159 157)" + "\n")
f.write("(31 1 23 37)" + "\n")
f.write("(75 45 67 81)" + "\n")
f.write("(22 140 136 24)" + "\n")
f.write("(66 141 137 68)" + "\n")
f.write("(32 33 39 38)" + "\n")
f.write("(76 77 83 82)" + "\n")
f.write("(142 148 144 138)" + "\n")
f.write("(143 149 145 139)" + "\n")
f.write("(34 35 41 40)" + "\n")
f.write("(78 79 85 84)" + "\n")
f.write("(150 156 152 146)" + "\n")
f.write("(151 157 153 147)" + "\n")
f.write("(36 37 43 42)" + "\n")
f.write("(80 81 87 86)" + "\n")
f.write("(158 23 25 154)" + "\n")
f.write("(159 67 69 155)" + "\n")
f.write("(24 38 88 2)" + "\n")
f.write("(68 82 112 46)" + "\n")
f.write("(136 138 89 88)" + "\n")
f.write("(137 139 113 112)" + "\n")
f.write("(39 40 90 89)" + "\n")
f.write("(83 84 114 113)" + "\n")
f.write("(144 146 91 90)" + "\n")
f.write("(145 147 115 114)" + "\n")
f.write("(41 42 92 91)" + "\n")
f.write("(85 86 116 115)" + "\n")
f.write("(152 154 93 92)" + "\n")
f.write("(153 155 117 116)" + "\n")
f.write("(43 25 3 93)" + "\n")
f.write("(87 69 47 117)" + "\n")
f.write("(2 88 94 4)" + "\n")
f.write("(46 112 118 48)" + "\n")
f.write("(88 89 95 94)" + "\n")
f.write("(112 113 119 118)" + "\n")
f.write("(89 90 96 95)" + "\n")
f.write("(113 114 120 119)" + "\n")
f.write("(90 91 97 96)" + "\n")
f.write("(114 115 121 120)" + "\n")
f.write("(91 92 98 97)" + "\n")
f.write("(115 116 122 121)" + "\n")
f.write("(92 93 99 98)" + "\n")
f.write("(116 117 123 122)" + "\n")
f.write("(93 3 5 99)" + "\n")
f.write("(117 47 49 123)" + "\n")
f.write("(4 94 100 6)" + "\n")
f.write("(48 118 124 50)" + "\n")
f.write("(94 95 101 100)" + "\n")
f.write("(118 119 125 124)" + "\n")
f.write("(95 96 102 101)" + "\n")
f.write("(119 120 126 125)" + "\n")
f.write("(96 97 103 102)" + "\n")
f.write("(120 121 127 126)" + "\n")
f.write("(97 98 104 103)" + "\n")
f.write("(121 122 128 127)" + "\n")
f.write("(98 99 105 104)" + "\n")
f.write("(122 123 129 128)" + "\n")
f.write("(99 5 7 105)" + "\n")
f.write("(123 49 51 129)" + "\n")
f.write("(16 14 18 20)" + "\n")
f.write("(60 58 62 64)" + "\n")
f.write("(10 8 14 16)" + "\n")
f.write("(54 52 58 60)" + "\n")
f.write("(8 6 12 14)" + "\n")
f.write("(52 50 56 58)" + "\n")
f.write("(6 100 106 12)" + "\n")
f.write("(50 124 130 56)" + "\n")
f.write("(100 101 107 106)" + "\n")
f.write("(124 125 131 130)" + "\n")
f.write("(101 102 108 107)" + "\n")
f.write("(125 126 132 131)" + "\n")
f.write("(102 103 109 108)" + "\n")
f.write("(126 127 133 132)" + "\n")
f.write("(103 104 110 109)" + "\n")
f.write("(127 128 134 133)" + "\n")
f.write("(104 105 111 110)" + "\n")
f.write("(128 129 135 134)" + "\n")
f.write("(105 7 13 111)" + "\n")
f.write("(129 51 57 135)" + "\n")
f.write("(7 9 15 13)" + "\n")
f.write("(51 53 59 57)" + "\n")
f.write("(9 11 17 15)" + "\n")
f.write("(53 55 61 59)" + "\n")
f.write("(15 17 21 19)" + "\n")
f.write("(59 61 65 63)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
f.write(");" + "\n")
f.write("\n")
f.write("mergePatchPairs" + "\n")
f.write("(" + "\n")
f.write(");" + "\n")
f.write("// ************************************************************************* //")
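Every block entry written above has the same blockMeshDict shape; the helper below is a hypothetical refactoring sketch (not part of the original script) that makes the emitted pattern explicit:

```python
# Hypothetical helper (not in the original script): each block entry is
# "hex (<8 vertex ids>) <cell counts> simpleGrading (1 1 1)".
def hex_block_line(vertex_ids, num_cells_str, grading="simpleGrading (1 1 1)"):
    ids = " ".join(str(v) for v in vertex_ids)
    return "hex (%s) %s %s\n" % (ids, num_cells_str, grading)

# One of the block lines from above, assuming a "(10 10 1)" cell-count string:
line = hex_block_line((6, 100, 106, 12, 50, 124, 130, 56), "(10 10 1)")
```

With such a helper, the long runs of f.write calls collapse to a loop over vertex-id tuples.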
def write_mesh_file(num_cells_int_str_concat, blockmesh_template, num_cells_int, pt0str, pt1str, pt2str, pt3str, pt4str, pt5str, pt6str, pt7str, pt8str, pt9str, pt10str, pt11str, pt12str, pt13str, pt14str, pt15str, pt16str, pt17str, pt18str, pt19str, pt20str, pt21str, pt22str, pt23str, pt24str, pt25str, pt26str, pt27str, pt28str, pt29str, pt30str, pt31str, pt32str, pt33str, pt34str, pt35str, pt36str, pt37str, pt38str, pt39str, pt40str, pt41str, pt42str, pt43str, pt44str, pt45str, pt46str, pt47str, pt48str, pt49str, pt50str, pt51str, pt52str, pt53str, pt54str, pt55str, pt56str, pt57str, pt58str, pt59str, pt60str, pt61str, pt62str, pt63str, pt64str, pt65str, pt66str, pt67str, pt68str, pt69str, pt70str, pt71str, pt72str, pt73str, pt74str, pt75str, pt76str, pt77str, pt78str, pt79str, pt80str, pt81str, pt82str, pt83str, pt84str, pt85str, pt86str, pt87str, pt88str, pt89str, pt90str, pt91str, pt92str, pt93str, pt94str, pt95str, pt96str, pt97str, pt98str, pt99str, pt100str, pt101str, pt102str, pt103str, pt104str, pt105str, pt106str, pt107str, pt108str, pt109str, pt110str, pt111str, pt112str, pt113str, pt114str, pt115str, pt116str, pt117str, pt118str, pt119str, pt120str, pt121str, pt122str, pt123str, pt124str, pt125str, pt126str, pt127str):
"""Open the blockMesh template and write the edited blockMeshDict contents in place.
Args:
num_cells_int_str_concat (str): cell-count string written after each hex block
blockmesh_template (str): full file path where the blockMesh template lives
pt0str..pt127str (str): vertex definitions as strings
Returns:
None; move_blockmesh() relocates the edited file to the Run folder afterwards.
"""
with open(blockmesh_template,'r+') as f:
f.seek(602) #where writing begins
f.write("convertToMeters 1;"+"\n")
f.write("\n")
f.write("vertices"+"\n")
f.write("("+"\n")
# Write vertex strings
f.write(pt0str +'\n')
f.write(pt1str +'\n')
f.write(pt2str+'\n')
f.write(pt3str+'\n')
f.write(pt4str+'\n')
f.write(pt5str+'\n')
f.write(pt6str+'\n')
f.write(pt7str+'\n')
f.write(pt8str+'\n')
f.write(pt9str+'\n')
f.write(pt10str+'\n')
f.write(pt11str+'\n')
f.write(pt12str+'\n')
f.write(pt13str+'\n')
f.write(pt14str+'\n')
f.write(pt15str+'\n')
f.write(pt16str+'\n')
f.write(pt17str+'\n')
f.write(pt18str+'\n')
f.write(pt19str+'\n')
f.write(pt20str+'\n')
f.write(pt21str+'\n')
f.write(pt22str +'\n')
f.write(pt23str +'\n')
f.write(pt24str+'\n')
f.write(pt25str+'\n')
f.write(pt26str+'\n')
f.write(pt27str+'\n')
f.write(pt28str+'\n')
f.write(pt29str+'\n')
f.write(pt30str+'\n')
f.write(pt31str+'\n')
f.write(pt32str+'\n')
f.write(pt33str+'\n')
f.write(pt34str+'\n')
f.write(pt35str+'\n')
f.write(pt36str+'\n')
f.write(pt37str+'\n')
f.write(pt38str+'\n')
f.write(pt39str+'\n')
f.write(pt40str+'\n')
f.write(pt41str+'\n')
f.write(pt42str+'\n')
f.write(pt43str+'\n')
f.write(pt44str+'\n')
f.write(pt45str+'\n')
f.write(pt46str+'\n')
f.write(pt47str+'\n')
f.write(pt48str+'\n')
f.write(pt49str+'\n')
f.write(pt50str+'\n')
f.write(pt51str+'\n')
# PRIMARY INLET vertices
f.write(pt52str+'\n')
f.write(pt53str+'\n')
f.write(pt54str+'\n')
f.write(pt55str+'\n')
f.write(pt56str+'\n')
f.write(pt57str+'\n')
f.write(pt58str+'\n')
f.write(pt59str+'\n')
f.write(pt60str+'\n')
f.write(pt61str+'\n')
f.write(pt62str+'\n')
f.write(pt63str+'\n')
f.write(pt64str+'\n')
f.write(pt65str+'\n')
f.write(pt66str+'\n')
f.write(pt67str+'\n')
# Shifted Down
f.write(pt68str+'\n')
f.write(pt69str+'\n')
f.write(pt70str+'\n')
f.write(pt71str+'\n')
f.write(pt72str+'\n')
f.write(pt73str+'\n')
f.write(pt74str+'\n')
f.write(pt75str+'\n')
f.write(pt76str+'\n')
f.write(pt77str+'\n')
f.write(pt78str+'\n')
f.write(pt79str+'\n')
f.write(pt80str+'\n')
f.write(pt81str+'\n')
f.write(pt82str+'\n')
f.write(pt83str+'\n')
f.write(pt84str+'\n')
f.write(pt85str+'\n')
f.write(pt86str+'\n')
f.write(pt87str+'\n')
f.write(pt88str+'\n')
f.write(pt89str+'\n')
f.write(pt90str+'\n')
f.write(pt91str+'\n')
f.write(pt92str+'\n')
f.write(pt93str+'\n')
f.write(pt94str+'\n')
f.write(pt95str+'\n')
f.write(pt96str+'\n')
f.write(pt97str+'\n')
f.write(pt98str+'\n')
f.write(pt99str+'\n')
f.write(pt100str+'\n')
f.write(pt101str+'\n')
f.write(pt102str+'\n')
f.write(pt103str+'\n')
f.write(pt104str+'\n')
f.write(pt105str+'\n')
f.write(pt106str+'\n')
f.write(pt107str+'\n')
f.write(pt108str+'\n')
f.write(pt109str+'\n')
f.write(pt110str+'\n')
f.write(pt111str+'\n')
f.write(pt112str+'\n')
f.write(pt113str+'\n')
f.write(pt114str+'\n')
f.write(pt115str+'\n')
f.write(pt116str+'\n')
f.write(pt117str+'\n')
f.write(pt118str+'\n')
f.write(pt119str+'\n')
f.write(pt120str+'\n')
f.write(pt121str+'\n')
f.write(pt122str+'\n')
f.write(pt123str+'\n')
f.write(pt124str+'\n')
f.write(pt125str+'\n')
f.write(pt126str+'\n')
f.write(pt127str+'\n')
f.write("\n")
f.write(");"+"\n")
f.write("\n")
f.write("blocks"+"\n")
f.write("(")
f.write("\n")
# Write the blocks, each with its cell counts and uniform simple grading
#f.write("hex (16 17 39 38 19 18 40 41) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (0 1 23 22 2 3 25 24) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (5 4 26 27 6 7 29 28) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (2 3 25 24 5 4 26 27) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (6 7 29 28 20 21 43 42) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (7 46 47 29 21 15 37 43) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (44 6 28 45 14 20 42 36) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (9 44 45 31 48 14 36 49) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (48 14 36 49 10 11 33 32) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (46 8 30 47 15 50 51 37) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (15 50 51 37 12 13 35 34) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (68 69 77 76 52 53 61 60) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (70 71 79 78 54 55 63 62) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (72 73 81 80 56 57 65 64) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (74 75 83 82 58 59 67 66) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (84 88 89 85 0 52 60 22) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (90 86 87 91 59 1 23 67) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (92 93 94 95 53 54 62 61) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (96 97 98 99 55 56 64 63) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (100 101 102 103 57 58 66 65) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (104 105 106 107 108 109 110 111) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (112 113 114 115 116 117 118 119) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write("hex (120 121 122 123 124 125 126 127) " + num_cells_int_str_concat + " " + "simpleGrading (1 1 1)" + "\n")
f.write(");" + "\n")
f.write("\n")
# edges
f.write("edges" + "\n")
f.write("(" + "\n")
f.write(");" + "\n")
f.write("\n")
#Boundaries
# Fuel
f.write("boundary" + "\n")
f.write("(" + "\n")
f.write("fuel" + "\n")
f.write("{"+ "\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(108 109 110 111)" + "\n")
f.write("(116 117 118 119)" + "\n")
f.write("(124 125 126 127)" + "\n")
# "(16 17 39 38)" + "\n" + "(38 16 19 41)" + "\n" + "(17 39 40 18)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
# Primary air -- will need edits for air tray
f.write("primary_air"+"\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(68 69 77 76)" + "\n")
f.write("(70 71 79 78)" + "\n")
f.write("(72 73 81 80)" + "\n")
f.write("(74 75 83 82)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Secondary air RHS
f.write("Secondary_air_RHS"+ "\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(3 25 26 4)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Secondary air LHS
f.write("Secondary_air_LHS" + "\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(2 24 27 5)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Outlet
f.write("outlet"+"\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(12 13 35 34)" + "\n" + "(10 11 33 32)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Stove Body
f.write("stove_body" + "\n")
f.write("{"+"\n")
f.write("type patch;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(22 0 2 24)" + "\n" + "(1 23 25 3)" + "\n" + "(5 27 28 6)" + "\n" + "(4 26 29 7)" + "\n" + "(20 21 43 42)" + "\n" + "(14 20 42 36)" + "\n" + "(44 6 28 45)" + "\n" + "(21 15 37 43)" + "\n" + "(7 46 47 29)" + "\n" + "(14 36 33 11)" + "\n" + "(48 49 32 10)" + "\n" + "(9 44 45 31)" + "\n" + "(15 37 34 12)" + "\n" + "(50 51 35 13)" + "\n" + "(46 8 30 47)" + "\n" + "(8 30 51 50)" + "\n" + "(9 31 49 48)" + "\n")
f.write("(68 76 60 52)" + "\n")
f.write("(69 77 61 53)" + "\n")
f.write("(70 78 62 54)" + "\n")
f.write("(71 79 63 55)" + "\n")
f.write("(72 80 64 56)" + "\n")
f.write("(73 81 65 57)" + "\n")
f.write("(74 82 66 58)" + "\n")
f.write("(75 83 67 59)" + "\n")
#f.write("(16 17 39 38)" + "\n")
#f.write("(38 16 19 41)" + "\n")
#f.write("(17 39 40 18)" + "\n")
f.write("(0 52 60 22)" + "\n")
f.write("(1 23 67 59)" +"\n")
f.write("(84 85 22 0)" + "\n")
f.write("(88 89 60 52)" + "\n")
f.write("(90 91 67 59)" + "\n")
f.write("(86 87 23 1)" + "\n")
f.write("(84 88 89 85)" + "\n")
f.write("(90 86 87 91)" + "\n")
# May need to add the top faces of the outer boxes at the bottom of the stove here
# Space 1
f.write("(92 95 61 53)" + "\n")
f.write("(93 94 62 54)" + "\n")
f.write("(92 93 94 95)" + "\n")
f.write("(53 54 62 61)" + "\n")
# Space 2
f.write("(96 99 63 55)" + "\n")
f.write("(97 98 64 56)" + "\n")
f.write("(96 97 98 99)" + "\n")
f.write("(55 56 64 63)" + "\n")
# Space 3
f.write("(100 103 65 57)" + "\n")
f.write("(101 102 66 58)" + "\n")
f.write("(100 101 102 103)" + "\n")
f.write("(57 58 66 65)" + "\n")
# New fuel blocks bottom and sides:
f.write("(104 105 106 107)" + "\n")
f.write("(105 106 110 109)" + "\n")
f.write("(104 107 111 108)" + "\n")
f.write("(112 113 114 115)" + "\n")
f.write("(113 114 118 117)" + "\n")
f.write("(112 115 119 116)" + "\n")
f.write("(120 121 122 123)" + "\n")
f.write("(121 122 126 125)" + "\n")
f.write("(120 123 127 124)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
#Empty front and back faces
f.write("\n")
f.write("frontAndBack" + "\n")
f.write("{" + "\n")
f.write("type empty;" + "\n")
f.write("faces" + "\n")
f.write("(" + "\n")
f.write("(0 1 3 2)" + "\n" + "(22 23 25 24)" + "\n" + "(2 3 4 5)" + "\n" + "(24 25 26 27)" + "\n" + "(5 4 7 6)" + "\n" + "(27 28 29 26)" + "\n" + "(6 7 21 20)" + "\n" + "(28 29 43 42)" + "\n" + "(44 6 20 14)" + "\n" + "(45 28 42 36)" + "\n" + "(29 47 37 43)" + "\n" + "(7 46 15 21)" + "\n" + "(9 44 14 48)" + "\n" + "(48 14 11 10)" + "\n" + "(31 45 36 49)" + "\n" + "(49 36 33 32)" + "\n" + "(46 8 50 15)" + "\n" + "(15 50 13 12)" + "\n" + "(47 30 51 37)" + "\n" + "(37 51 35 34)" + "\n" + "(52 53 69 68)" + "\n" + "(60 61 77 76)" + "\n" + "(54 55 71 70)" + "\n" + "(62 63 79 78)" + "\n" + "(56 57 73 72)" + "\n" + "(64 65 81 80)" + "\n" + "(58 59 75 74)" + "\n" + "(66 67 83 82)" + "\n")
f.write("(90 86 1 59)" + "\n")
f.write("(91 87 23 67)" + "\n")
f.write("(84 88 52 0)" + "\n")
f.write("(85 89 60 22)" + "\n")
# Space 1
f.write("(92 93 54 53)" + "\n")
f.write("(95 94 62 61)" + "\n")
# Space 2
f.write("(96 97 56 55)" + "\n")
f.write("(99 98 64 63)" + "\n")
# Space 3
f.write("(100 101 58 57)" + "\n")
f.write("(103 102 66 65)" + "\n")
# New fuel front and backs
f.write("(104 105 109 108)" + "\n")
f.write("(107 106 110 111)" + "\n")
f.write("(112 113 116 117)" + "\n")
f.write("(115 114 118 119)" + "\n")
f.write("(120 121 125 124)" + "\n")
f.write("(123 122 126 127)" + "\n")
f.write(");" + "\n")
f.write("}" + "\n")
f.write(");" + "\n")
f.write("\n")
f.write("mergePatchPairs" + "\n")
f.write("(" + "\n")
f.write(");" + "\n")
f.write("// ************************************************************************* //")
#write_blockmesh(blockmesh_template, num_cells_int, pt0str, pt1str, pt2str, pt3str, pt4str, pt5str, pt6str, pt7str, pt8str, pt9str, pt10str, pt11str, pt12str, pt13str, pt14str, pt15str, pt16str, pt17str, pt18str, pt19str, pt20str, pt21str, pt22str, pt23str, pt24str, pt25str, pt26str, pt27str, pt28str, pt29str, pt30str, pt31str, pt32str, pt33str, pt34str, pt35str, pt36str, pt37str, pt38str, pt39str, pt40str, pt41str, pt42str, pt43str, pt44str, pt45str, pt46str, pt47str, pt48str, pt49str, pt50str, pt51str)
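write_mesh_file takes 128 positional ptNstr arguments; the sketch below is a hypothetical refactoring (not the original API) showing how a single list of vertex strings could be written in a loop instead:

```python
# Hypothetical refactoring sketch: carry the vertex strings as one list
# instead of 128 positional parameters, and write them in a loop.
def write_vertices(write, vertex_strings):
    for pt in vertex_strings:
        write(pt + "\n")

# Collect the output in a list instead of a file handle, for illustration.
written = []
write_vertices(written.append, ["(0 0 0)", "(1 0 0)", "(1 1 0)"])
```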
def move_blockmesh(blockmesh_template):
"""Move the edited blockMesh template to the Run folder prior to relocating it to the case files, renaming it blockMeshDict per OpenFOAM convention. Additionally, restore the empty template from the Backup folder so the original template is reproduced.
Args:
blockmesh_template (str): path where the now-edited blockMeshDict file lives
Returns:
blockmesh_for_run (str): path of the blockMeshDict file prior to case runs.
"""
# Running directory
current_dir = os.getcwd()
filename_for_run = "blockMeshDict" # renamed per OpenFOAM convention
blockmesh_for_run = os.path.join(current_dir, "blockMeshDict_foamfile", "Run", filename_for_run) # blockMeshDict to be relocated to the case files
# Backup directory
filename_for_backup = "blockMeshDict_template_backup"
blockmesh_for_backup = os.path.join(current_dir, "blockMeshDict_foamfile", "Backup", filename_for_backup) # pristine copy used to restore the template
copy(blockmesh_template, blockmesh_for_run) # move to running directory
copy(blockmesh_for_backup, blockmesh_template) # Move the backup blockmeshdict file to the template folder to re-produce original file convention
return blockmesh_for_run

from .base import BaseResult
class GroupResult(BaseResult):
    @property
    def group_id(self):
        return self._result['groupID']

    @property
    def groupID(self):
        """
        synonym
        """
        return self.group_id
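A minimal sketch of how GroupResult surfaces the parsed API payload, assuming BaseResult simply stores the response dict in self._result (the stand-in BaseResult below is illustrative; the real one lives in kii/results/base.py):

```python
# Illustrative stand-in for kii.results.base.BaseResult (assumption: it keeps
# the parsed response in self._result, as group_id's implementation implies).
class BaseResult:
    def __init__(self, result):
        self._result = result

class GroupResult(BaseResult):
    @property
    def group_id(self):
        return self._result['groupID']

    @property
    def groupID(self):
        """synonym"""
        return self.group_id

# A GroupResult built from a (hypothetical) parsed API response:
result = GroupResult({'groupID': 'abc123'})
```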

from flask_pymongo import PyMongo
from flask_caching import Cache
mongo = PyMongo()
cache = Cache()
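extensions.py instantiates the extensions unbound so an application factory can bind them later with init_app(); the sketch below mimics that pattern with stand-in classes (Flask, PyMongo and Cache are not re-implemented here — the stubs only model the init_app contract):

```python
# Stand-in classes mimicking the Flask "unbound extension" pattern; the real
# PyMongo/Cache objects behave analogously via their init_app() methods.
class StubExtension:
    def __init__(self):
        self.app = None

    def init_app(self, app):
        # Bind this extension to the given application instance.
        self.app = app
        app.extensions.append(self)

class StubApp:
    def __init__(self):
        self.extensions = []

# Created unbound at import time, exactly like mongo/cache above.
mongo = StubExtension()
cache = StubExtension()

def create_app():
    # A (hypothetical) application factory binding the shared extensions.
    app = StubApp()
    mongo.init_app(app)
    cache.init_app(app)
    return app
```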

# -*- coding: utf-8 -*-
"""QGIS Unit tests for the DBManager SPATIALITE plugin
.. note:: This program is free software; you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
(at your option) any later version.
"""
__author__ = 'Even Rouault'
__date__ = '2016-10-17'
__copyright__ = 'Copyright 2016, Even Rouault'
# This will get replaced with a git SHA1 when you do a git archive
__revision__ = '176c06ceefb5f555205e72b20c962740cc0ec183'
import qgis # NOQA
import os
import tempfile
import shutil
from osgeo import gdal, ogr, osr
from qgis.core import QgsDataSourceUri, QgsSettings
from qgis.PyQt.QtCore import QCoreApplication
from qgis.testing import start_app, unittest
from plugins.db_manager.db_plugins import supportedDbTypes, createDbPlugin
from plugins.db_manager.db_plugins.plugin import TableField
def GDAL_COMPUTE_VERSION(maj, min, rev):
return ((maj) * 1000000 + (min) * 10000 + (rev) * 100)
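GDAL_COMPUTE_VERSION packs a version triple into the integer format returned by gdal.VersionInfo('VERSION_NUM'), so tests can gate behaviour on the GDAL build; for example (the 3040200 literal below stands in for a live query):

```python
# Pack (major, minor, revision) the same way GDAL's own C macro does.
def GDAL_COMPUTE_VERSION(maj, min, rev):
    return ((maj) * 1000000 + (min) * 10000 + (rev) * 100)

# int(gdal.VersionInfo('VERSION_NUM')) returns e.g. 3040200 for GDAL 3.4.2.
version_num = 3040200  # stand-in for a live gdal.VersionInfo call
recent_enough = version_num >= GDAL_COMPUTE_VERSION(2, 2, 0)
```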
class TestPyQgsDBManagerSpatialite(unittest.TestCase):
@classmethod
def setUpClass(cls):
"""Run before all tests"""
QCoreApplication.setOrganizationName("QGIS_Test")
QCoreApplication.setOrganizationDomain("TestPyQgsDBManagerSpatialite.com")
QCoreApplication.setApplicationName("TestPyQgsDBManagerSpatialite")
QgsSettings().clear()
start_app()
cls.basetestpath = tempfile.mkdtemp()
cls.test_spatialite = os.path.join(cls.basetestpath, 'TestPyQgsDBManagerSpatialite.spatialite')
ds = ogr.GetDriverByName('SQLite').CreateDataSource(cls.test_spatialite)
lyr = ds.CreateLayer('testlayer', geom_type=ogr.wkbLineString, options=['SPATIAL_INDEX=NO'])
cls.supportsAlterFieldDefn = lyr.TestCapability(ogr.OLCAlterFieldDefn) == 1
lyr.CreateField(ogr.FieldDefn('text_field', ogr.OFTString))
f = ogr.Feature(lyr.GetLayerDefn())
f['text_field'] = 'foo'
f.SetGeometry(ogr.CreateGeometryFromWkt('LINESTRING(1 2,3 4)'))
lyr.CreateFeature(f)
f = None
ds = None
@classmethod
def tearDownClass(cls):
"""Run after all tests"""
QgsSettings().clear()
shutil.rmtree(cls.basetestpath, True)
def testSupportedDbTypes(self):
self.assertIn('spatialite', supportedDbTypes())
def testCreateDbPlugin(self):
plugin = createDbPlugin('spatialite')
self.assertIsNotNone(plugin)
def testConnect(self):
connection_name = 'testConnect'
plugin = createDbPlugin('spatialite')
uri = QgsDataSourceUri()
uri.setDatabase(self.test_spatialite)
self.assertTrue(plugin.addConnection(connection_name, uri))
connections = plugin.connections()
self.assertEqual(len(connections), 1)
connection = createDbPlugin('spatialite', connection_name + '_does_not_exist')
connection_succeeded = False
try:
connection.connect()
connection_succeeded = True
except Exception:
pass
self.assertFalse(connection_succeeded, 'exception should have been raised')
connection = connections[0]
connection.connect()
connection.reconnect()
connection.remove()
self.assertEqual(len(plugin.connections()), 0)
connection = createDbPlugin('spatialite', connection_name)
connection_succeeded = False
try:
connection.connect()
connection_succeeded = True
except Exception:
pass
self.assertFalse(connection_succeeded, 'exception should have been raised')
def testListLayer(self):
connection_name = 'testListLayer'
plugin = createDbPlugin('spatialite')
uri = QgsDataSourceUri()
uri.setDatabase(self.test_spatialite)
self.assertTrue(plugin.addConnection(connection_name, uri))
connection = createDbPlugin('spatialite', connection_name)
connection.connect()
db = connection.database()
self.assertIsNotNone(db)
tables = db.tables()
self.assertEqual(len(tables), 1)
table = tables[0]
self.assertEqual(table.name, 'testlayer')
info = table.info()
# expected_html = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>1 </td></tr></table></div></div><div class="section"><h2>GeoPackage</h2><div><table><tr><td>Column: </td><td>geom </td></tr><tr><td>Geometry: </td><td>LINESTRING </td></tr><tr><td>Dimension: </td><td>XY </td></tr><tr><td>Spatial ref: </td><td>Undefined (-1) </td></tr><tr><td>Extent: </td><td>1.00000, 2.00000 - 3.00000, 4.00000 </td></tr></table><p><warning> No spatial index defined (<a href="action:spatialindex/create">create it</a>)</p></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">fid </td><td>INTEGER </td><td>Y </td><td> </td></tr><tr><td>1 </td><td>geom </td><td>LINESTRING </td><td>Y </td><td> </td></tr><tr><td>2 </td><td>text_field </td><td>TEXT </td><td>Y </td><td> </td></tr></table></div></div>"""
# # GDAL 2.2.0
# expected_html_2 = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>1 </td></tr></table></div></div><div class="section"><h2>GeoPackage</h2><div><table><tr><td>Column: </td><td>geom </td></tr><tr><td>Geometry: </td><td>LINESTRING </td></tr><tr><td>Dimension: </td><td>XY </td></tr><tr><td>Spatial ref: </td><td>Undefined (-1) </td></tr><tr><td>Extent: </td><td>1.00000, 2.00000 - 3.00000, 4.00000 </td></tr></table><p><warning> No spatial index defined (<a href="action:spatialindex/create">create it</a>)</p></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">fid </td><td>INTEGER </td><td>N </td><td> </td></tr><tr><td>1 </td><td>geom </td><td>LINESTRING </td><td>Y </td><td> </td></tr><tr><td>2 </td><td>text_field </td><td>TEXT </td><td>Y </td><td> </td></tr></table></div></div><div class="section"><h2>Triggers</h2><div><table class="header"><tr><th>Name </th><th>Function </th></tr><tr><td>trigger_insert_feature_count_testlayer (<a href="action:trigger/trigger_insert_feature_count_testlayer/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_insert_feature_count_testlayer" AFTER INSERT ON "testlayer" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count + 1 WHERE table_name = 'testlayer'; END </td></tr><tr><td>trigger_delete_feature_count_testlayer (<a href="action:trigger/trigger_delete_feature_count_testlayer/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_delete_feature_count_testlayer" AFTER DELETE ON "testlayer" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count - 1 WHERE table_name = 'testlayer'; END </td></tr></table></div></div>"""
# # GDAL 2.3.0
# expected_html_3 = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>1 </td></tr></table></div></div><div class="section"><h2>GeoPackage</h2><div><table><tr><td>Column: </td><td>geom </td></tr><tr><td>Geometry: </td><td>LINESTRING </td></tr><tr><td>Dimension: </td><td>XY </td></tr><tr><td>Spatial ref: </td><td>Undefined (-1) </td></tr><tr><td>Extent: </td><td>1.00000, 2.00000 - 3.00000, 4.00000 </td></tr></table><p><warning> No spatial index defined (<a href="action:spatialindex/create">create it</a>)</p></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">fid </td><td>INTEGER </td><td>N </td><td> </td></tr><tr><td>1 </td><td>geom </td><td>LINESTRING </td><td>Y </td><td> </td></tr><tr><td>2 </td><td>text_field </td><td>TEXT </td><td>Y </td><td> </td></tr></table></div></div><div class="section"><h2>Triggers</h2><div><table class="header"><tr><th>Name </th><th>Function </th></tr><tr><td>trigger_insert_feature_count_testlayer (<a href="action:trigger/trigger_insert_feature_count_testlayer/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_insert_feature_count_testlayer" AFTER INSERT ON "testlayer" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count + 1 WHERE lower(table_name) = lower('testlayer'); END </td></tr><tr><td>trigger_delete_feature_count_testlayer (<a href="action:trigger/trigger_delete_feature_count_testlayer/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_delete_feature_count_testlayer" AFTER DELETE ON "testlayer" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count - 1 WHERE lower(table_name) = lower('testlayer'); END </td></tr></table></div></div>"""
# GDAL 2.3.0
expected_html = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>1 </td></tr></table></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">ogc_fid </td><td>INTEGER </td><td>Y </td><td> </td></tr><tr><td>1 </td><td>GEOMETRY </td><td>BLOB </td><td>Y </td><td> </td></tr><tr><td>2 </td><td>text_field </td><td>VARCHAR </td><td>Y </td><td> </td></tr></table></div></div>"""
self.assertIn(info.toHtml(), [expected_html])
connection.remove()
def testCreateRenameDeleteTable(self):
connection_name = 'testCreateRenameDeleteTable'
plugin = createDbPlugin('spatialite')
uri = QgsDataSourceUri()
test_spatialite_new = os.path.join(self.basetestpath, 'testCreateRenameDeleteTable.spatialite')
shutil.copy(self.test_spatialite, test_spatialite_new)
uri.setDatabase(test_spatialite_new)
self.assertTrue(plugin.addConnection(connection_name, uri))
connection = createDbPlugin('spatialite', connection_name)
connection.connect()
db = connection.database()
self.assertIsNotNone(db)
tables = db.tables()
self.assertEqual(len(tables), 1)
table = tables[0]
self.assertTrue(table.rename('newName'))
self.assertEqual(table.name, 'newName')
connection.reconnect()
db = connection.database()
tables = db.tables()
self.assertEqual(len(tables), 1)
table = tables[0]
self.assertEqual(table.name, 'newName')
fields = []
geom = ['geometry', 'POINT', 4326, 3]
field1 = TableField(table)
field1.name = 'fid'
field1.dataType = 'INTEGER'
field1.notNull = True
field1.primaryKey = True
field2 = TableField(table)
field2.name = 'str_field'
field2.dataType = 'TEXT'
field2.modifier = 20
fields = [field1, field2]
self.assertTrue(db.createVectorTable('newName2', fields, geom))
tables = db.tables()
self.assertEqual(len(tables), 2)
new_table = tables[1]
self.assertEqual(new_table.name, 'newName2')
fields = new_table.fields()
self.assertEqual(len(fields), 2)
# self.assertFalse(new_table.hasSpatialIndex())
# self.assertTrue(new_table.createSpatialIndex())
# self.assertTrue(new_table.hasSpatialIndex())
self.assertTrue(new_table.delete())
tables = db.tables()
self.assertEqual(len(tables), 1)
connection.remove()
def testCreateRenameDeleteFields(self):
if not self.supportsAlterFieldDefn:
return
connection_name = 'testCreateRenameDeleteFields'
plugin = createDbPlugin('spatialite')
uri = QgsDataSourceUri()
test_spatialite_new = os.path.join(self.basetestpath, 'testCreateRenameDeleteFields.spatialite')
shutil.copy(self.test_spatialite, test_spatialite_new)
uri.setDatabase(test_spatialite_new)
self.assertTrue(plugin.addConnection(connection_name, uri))
connection = createDbPlugin('spatialite', connection_name)
connection.connect()
db = connection.database()
self.assertIsNotNone(db)
tables = db.tables()
self.assertEqual(len(tables), 1)
table = tables[0]
field_before_count = len(table.fields())
field = TableField(table)
field.name = 'real_field'
field.dataType = 'DOUBLE'
self.assertTrue(table.addField(field))
self.assertEqual(len(table.fields()), field_before_count + 1)
# not supported in spatialite
# self.assertTrue(field.update('real_field2', new_type_str='TEXT (30)', new_not_null=True, new_default_str='foo'))
field = table.fields()[field_before_count]
self.assertEqual(field.name, 'real_field')
self.assertEqual(field.dataType, 'DOUBLE')
# self.assertEqual(field.notNull, 1)
# self.assertEqual(field.default, "'foo'")
# self.assertTrue(table.deleteField(field))
# self.assertEqual(len(table.fields()), field_before_count)
connection.remove()
def testTableDataModel(self):
connection_name = 'testTableDataModel'
plugin = createDbPlugin('spatialite')
uri = QgsDataSourceUri()
uri.setDatabase(self.test_spatialite)
self.assertTrue(plugin.addConnection(connection_name, uri))
connection = createDbPlugin('spatialite', connection_name)
connection.connect()
db = connection.database()
self.assertIsNotNone(db)
tables = db.tables()
self.assertEqual(len(tables), 1)
table = tables[0]
self.assertEqual(table.name, 'testlayer')
model = table.tableDataModel(None)
self.assertEqual(model.rowCount(), 1)
self.assertEqual(model.getData(0, 0), 1) # fid
wkb = model.getData(0, 1)
geometry = ogr.CreateGeometryFromWkb(wkb)
self.assertEqual(geometry.ExportToWkt(), 'LINESTRING (1 2,3 4)')
self.assertEqual(model.getData(0, 2), 'foo')
connection.remove()
# def testRaster(self):
# if int(gdal.VersionInfo('VERSION_NUM')) < GDAL_COMPUTE_VERSION(2, 0, 2):
# return
# connection_name = 'testRaster'
# plugin = createDbPlugin('spatialite')
# uri = QgsDataSourceUri()
# test_spatialite_new = os.path.join(self.basetestpath, 'testRaster.spatialite')
# shutil.copy(self.test_spatialite, test_spatialite_new)
# mem_ds = gdal.GetDriverByName('MEM').Create('', 20, 20)
# mem_ds.SetGeoTransform([2, 0.01, 0, 49, 0, -0.01])
# sr = osr.SpatialReference()
# sr.ImportFromEPSG(4326)
# mem_ds.SetProjection(sr.ExportToWkt())
# mem_ds.GetRasterBand(1).Fill(255)
# gdal.GetDriverByName('SQLite').CreateCopy(test_spatialite_new, mem_ds, options=['APPEND_SUBDATASET=YES', 'RASTER_TABLE=raster_table'])
# mem_ds = None
# uri.setDatabase(test_spatialite_new)
# self.assertTrue(plugin.addConnection(connection_name, uri))
# connection = createDbPlugin('spatialite', connection_name)
# connection.connect()
# db = connection.database()
# self.assertIsNotNone(db)
# tables = db.tables()
# self.assertEqual(len(tables), 2)
# table = None
# for i in range(2):
# if tables[i].name == 'raster_table':
# table = tables[i]
# break
# self.assertIsNotNone(table)
# info = table.info()
# expected_html = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>Unknown (<a href="action:rows/count">find out</a>) </td></tr></table></div></div><div class="section"><h2>GeoPackage</h2><div><table><tr><td>Column: </td><td> </td></tr><tr><td>Geometry: </td><td>RASTER </td></tr><tr><td>Spatial ref: </td><td>WGS 84 geodetic (4326) </td></tr><tr><td>Extent: </td><td>2.00000, 48.80000 - 2.20000, 49.00000 </td></tr></table></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">id </td><td>INTEGER </td><td>Y </td><td> </td></tr><tr><td>1 </td><td>zoom_level </td><td>INTEGER </td><td>N </td><td> </td></tr><tr><td>2 </td><td>tile_column </td><td>INTEGER </td><td>N </td><td> </td></tr><tr><td>3 </td><td>tile_row </td><td>INTEGER </td><td>N </td><td> </td></tr><tr><td>4 </td><td>tile_data </td><td>BLOB </td><td>N </td><td> </td></tr></table></div></div><div class="section"><h2>Indexes</h2><div><table class="header"><tr><th>Name </th><th>Column(s) </th></tr><tr><td>sqlite_autoindex_raster_table_1 </td><td>zoom_level<br>tile_column<br>tile_row </td></tr></table></div></div>"""
# self.assertEqual(info.toHtml(), expected_html)
# connection.remove()
# def testTwoRaster(self):
# if int(gdal.VersionInfo('VERSION_NUM')) < GDAL_COMPUTE_VERSION(2, 0, 2):
# return
# connection_name = 'testTwoRaster'
# plugin = createDbPlugin('spatialite')
# uri = QgsDataSourceUri()
# test_spatialite_new = os.path.join(self.basetestpath, 'testTwoRaster.spatialite')
# shutil.copy(self.test_spatialite, test_spatialite_new)
# mem_ds = gdal.GetDriverByName('MEM').Create('', 20, 20)
# mem_ds.SetGeoTransform([2, 0.01, 0, 49, 0, -0.01])
# sr = osr.SpatialReference()
# sr.ImportFromEPSG(4326)
# mem_ds.SetProjection(sr.ExportToWkt())
# mem_ds.GetRasterBand(1).Fill(255)
# for i in range(2):
# gdal.GetDriverByName('SQLite').CreateCopy(test_spatialite_new, mem_ds, options=['APPEND_SUBDATASET=YES', 'RASTER_TABLE=raster_table%d' % (i + 1)])
# mem_ds = None
# uri.setDatabase(test_spatialite_new)
# self.assertTrue(plugin.addConnection(connection_name, uri))
# connection = createDbPlugin('spatialite', connection_name)
# connection.connect()
# db = connection.database()
# self.assertIsNotNone(db)
# tables = db.tables()
# self.assertEqual(len(tables), 3)
# table = None
# for i in range(2):
# if tables[i].name.startswith('raster_table'):
# table = tables[i]
# info = table.info()
# info.toHtml()
# connection.remove()
def testNonSpatial(self):
connection_name = 'testnonspatial'
plugin = createDbPlugin('spatialite')
uri = QgsDataSourceUri()
test_spatialite = os.path.join(self.basetestpath, 'testnonspatial.spatialite')
ds = ogr.GetDriverByName('SQLite').CreateDataSource(test_spatialite)
lyr = ds.CreateLayer('testnonspatial', geom_type=ogr.wkbNone)
lyr.CreateField(ogr.FieldDefn('text_field', ogr.OFTString))
f = ogr.Feature(lyr.GetLayerDefn())
f['text_field'] = 'foo'
lyr.CreateFeature(f)
f = None
ds = None
uri.setDatabase(test_spatialite)
self.assertTrue(plugin.addConnection(connection_name, uri))
connection = createDbPlugin('spatialite', connection_name)
connection.connect()
db = connection.database()
self.assertIsNotNone(db)
tables = db.tables()
self.assertEqual(len(tables), 1)
table = tables[0]
self.assertEqual(table.name, 'testnonspatial')
info = table.info()
# expected_html = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>1 </td></tr></table></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">fid </td><td>INTEGER </td><td>Y </td><td> </td></tr><tr><td>1 </td><td>text_field </td><td>TEXT </td><td>Y </td><td> </td></tr></table></div></div>"""
# # GDAL 2.2.0
# expected_html_2 = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>1 </td></tr></table></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">fid </td><td>INTEGER </td><td>N </td><td> </td></tr><tr><td>1 </td><td>text_field </td><td>TEXT </td><td>Y </td><td> </td></tr></table></div></div><div class="section"><h2>Triggers</h2><div><table class="header"><tr><th>Name </th><th>Function </th></tr><tr><td>trigger_insert_feature_count_testnonspatial (<a href="action:trigger/trigger_insert_feature_count_testnonspatial/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_insert_feature_count_testnonspatial" AFTER INSERT ON "testnonspatial" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count + 1 WHERE table_name = 'testnonspatial'; END </td></tr><tr><td>trigger_delete_feature_count_testnonspatial (<a href="action:trigger/trigger_delete_feature_count_testnonspatial/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_delete_feature_count_testnonspatial" AFTER DELETE ON "testnonspatial" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count - 1 WHERE table_name = 'testnonspatial'; END </td></tr></table></div></div>"""
# # GDAL 2.3.0
# expected_html_3 = """<div class="section"><h2>General info</h2><div><table><tr><td>Relation type: </td><td>Table </td></tr><tr><td>Rows: </td><td>1 </td></tr></table></div></div><div class="section"><h2>Fields</h2><div><table class="header"><tr><th># </th><th>Name </th><th>Type </th><th>Null </th><th>Default </th></tr><tr><td>0 </td><td class="underline">fid </td><td>INTEGER </td><td>N </td><td> </td></tr><tr><td>1 </td><td>text_field </td><td>TEXT </td><td>Y </td><td> </td></tr></table></div></div><div class="section"><h2>Triggers</h2><div><table class="header"><tr><th>Name </th><th>Function </th></tr><tr><td>trigger_insert_feature_count_testnonspatial (<a href="action:trigger/trigger_insert_feature_count_testnonspatial/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_insert_feature_count_testnonspatial" AFTER INSERT ON "testnonspatial" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count + 1 WHERE lower(table_name) = lower('testnonspatial'); END </td></tr><tr><td>trigger_delete_feature_count_testnonspatial (<a href="action:trigger/trigger_delete_feature_count_testnonspatial/delete">delete</a>) </td><td>CREATE TRIGGER "trigger_delete_feature_count_testnonspatial" AFTER DELETE ON "testnonspatial" BEGIN UPDATE spatialite_ogr_contents SET feature_count = feature_count - 1 WHERE lower(table_name) = lower('testnonspatial'); END </td></tr></table></div></div>"""
# self.assertIn(info.toHtml(), [expected_html, expected_html_2, expected_html_3], info.toHtml())
connection.remove()
def testAllGeometryTypes(self):
connection_name = 'testAllGeometryTypes'
plugin = createDbPlugin('spatialite')
uri = QgsDataSourceUri()
test_spatialite = os.path.join(self.basetestpath, 'testAllGeometryTypes.spatialite')
ds = ogr.GetDriverByName('SQLite').CreateDataSource(test_spatialite)
ds.CreateLayer('testPoint', geom_type=ogr.wkbPoint)
ds.CreateLayer('testLineString', geom_type=ogr.wkbLineString)
ds.CreateLayer('testPolygon', geom_type=ogr.wkbPolygon)
ds.CreateLayer('testMultiPoint', geom_type=ogr.wkbMultiPoint)
ds.CreateLayer('testMultiLineString', geom_type=ogr.wkbMultiLineString)
ds.CreateLayer('testMultiPolygon', geom_type=ogr.wkbMultiPolygon)
ds.CreateLayer('testGeometryCollection', geom_type=ogr.wkbGeometryCollection)
ds.CreateLayer('testCircularString', geom_type=ogr.wkbCircularString)
ds.CreateLayer('testCompoundCurve', geom_type=ogr.wkbCompoundCurve)
ds.CreateLayer('testCurvePolygon', geom_type=ogr.wkbCurvePolygon)
ds.CreateLayer('testMultiCurve', geom_type=ogr.wkbMultiCurve)
ds.CreateLayer('testMultiSurface', geom_type=ogr.wkbMultiSurface)
ds = None
uri.setDatabase(test_spatialite)
self.assertTrue(plugin.addConnection(connection_name, uri))
connection = createDbPlugin('spatialite', connection_name)
connection.connect()
db = connection.database()
self.assertIsNotNone(db)
# tables = db.tables()
# for i in range(len(tables)):
# table = tables[i]
# info = table.info()
connection.remove()
if __name__ == '__main__':
unittest.main()
| 57.335512 | 2,057 | 0.660638 | 3,474 | 26,317 | 4.901842 | 0.102188 | 0.071525 | 0.063891 | 0.024664 | 0.744788 | 0.721534 | 0.707734 | 0.693229 | 0.680839 | 0.671795 | 0 | 0.017556 | 0.151537 | 26,317 | 458 | 2,058 | 57.460699 | 0.745085 | 0.541703 | 0 | 0.529167 | 0 | 0.004167 | 0.156797 | 0.085602 | 0 | 0 | 0 | 0 | 0.191667 | 1 | 0.05 | false | 0.008333 | 0.041667 | 0.004167 | 0.104167 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
943b324fd77f835dd6d587c78c8be83d84bc826f | 250 | py | Python | tests/test_compat.py | tltx/iommi | a0ca5e261040cc0452d7452e9320a88af5222b30 | [
"BSD-3-Clause"
] | 192 | 2020-01-30T14:29:56.000Z | 2022-03-28T19:55:30.000Z | tests/test_compat.py | tltx/iommi | a0ca5e261040cc0452d7452e9320a88af5222b30 | [
"BSD-3-Clause"
] | 105 | 2020-03-29T21:59:01.000Z | 2022-03-24T12:29:09.000Z | tests/test_compat.py | tltx/iommi | a0ca5e261040cc0452d7452e9320a88af5222b30 | [
"BSD-3-Clause"
] | 28 | 2020-02-02T20:51:09.000Z | 2022-03-08T16:23:42.000Z | # This is for testing the compat stuff in tests, not the main iommi compat components
from tests.compat_flask import Jinja2RequestFactory
def test_jinja2_request_factory():
Jinja2RequestFactory().get('/', params={'foo': 'bar'}, HTTP_HOST='7')
| 31.25 | 85 | 0.76 | 34 | 250 | 5.441176 | 0.852941 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018433 | 0.132 | 250 | 7 | 86 | 35.714286 | 0.834101 | 0.332 | 0 | 0 | 0 | 0 | 0.04878 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
944c3f28ac6bf25ebb3fc4834cd468067fed64c4 | 250 | py | Python | moonstone/parsers/counts/taxonomy/base.py | motleystate/moonstone | 37c38fabf361722f7002626ef13c68c443ace4ac | [
"MIT"
] | null | null | null | moonstone/parsers/counts/taxonomy/base.py | motleystate/moonstone | 37c38fabf361722f7002626ef13c68c443ace4ac | [
"MIT"
] | 84 | 2020-07-27T13:01:12.000Z | 2022-03-16T17:10:23.000Z | moonstone/parsers/counts/taxonomy/base.py | motleystate/moonstone | 37c38fabf361722f7002626ef13c68c443ace4ac | [
"MIT"
] | null | null | null | from moonstone.parsers.base import BaseParser
from moonstone.plot import PlotTaxonomyCounts
from moonstone.utils.taxonomy import TaxonomyCountsBase
class BaseTaxonomyCountsParser(TaxonomyCountsBase, BaseParser):
PLOT_CLASS = PlotTaxonomyCounts
| 31.25 | 63 | 0.864 | 24 | 250 | 8.958333 | 0.541667 | 0.181395 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.096 | 250 | 7 | 64 | 35.714286 | 0.951327 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
9461715ce3303414fda02330ac769e79279c1f28 | 2,678 | py | Python | flexx/ui/__init__.py | abhishekgahlot/flexx | 27a995603b5e9d758f9d38eababcc626aa661299 | [
"BSD-2-Clause"
] | 1 | 2015-11-05T19:17:37.000Z | 2015-11-05T19:17:37.000Z | flexx/ui/__init__.py | abhishekgahlot/flexx | 27a995603b5e9d758f9d38eababcc626aa661299 | [
"BSD-2-Clause"
] | null | null | null | flexx/ui/__init__.py | abhishekgahlot/flexx | 27a995603b5e9d758f9d38eababcc626aa661299 | [
"BSD-2-Clause"
] | null | null | null | """
This module consists solely of widget classes. Once you are familiar with the
Widget class, understanding all other widgets should be straightforward.
The Widget class is the base component of all other ui classes. On
itself it does not do or show much, though we can make it visible:
.. UIExample:: 100
from flexx import app, ui
# A red widget
class Example(ui.Widget):
CSS = ".flx-example {background:#f00; min-width: 20px; min-height:20px}"
Widgets are also used as a container class:
.. UIExample:: 100
from flexx import app, ui
class Example(ui.Widget):
def init(self):
ui.Button(text='hello')
ui.Button(text='world')
Such "compound widgets" can be used anywhere in your app. They are
constructed by implementing the ``init()`` method. Inside this method
the widget is the *default parent*.
Any widget class can also be used as a *context manager*. Within the context,
the widget is the default parent; any widgets created in that context
that do not specify a parent, will have the widget as a parent. (The
default-parent-mechanism is thread-safe, since there is a default widget
per thread.)
.. UIExample:: 100
from flexx import app, ui
class Example(ui.Widget):
def init(self):
with ui.HBox():
ui.Button(flex=1, text='hello')
ui.Button(flex=1, text='world')
To create an actual app from a widget, there are three possibilities:
``serve()`` it as a web app, ``launch()`` it as a desktop app or
``export()`` it as a standalone HTML document:
.. code-block:: py
from flexx import app, ui
@app.serve
class Example(ui.Widget):
def init(self):
ui.Label(text='hello world')
example = app.launch(Example)
app.export(Example, 'example.html')
To learn about the individual widgets, check the
:doc:`list of widget classes <api>`.
"""
# We follow the convention of having one module per widget class (or a
# small set of closely related classes). In order not to pollute this
# namespace, we prefix the module names with an underscore.
from ._widget import Widget
from ._layout import Layout
from ._box import Box, HBox, VBox
from ._splitter import Splitter, HSplitter, VSplitter
from ._formlayout import BaseTableLayout, FormLayout, GridLayout
from ._pinboardlayout import PinboardLayout
from ._button import Button
from ._slider import Slider
from ._lineedit import LineEdit
from ._label import Label
from ._panel import Panel
from ._progressbar import ProgressBar
from ._plotwidget import PlotWidget
from ._plotlayout import PlotLayout
| 29.755556 | 80 | 0.699776 | 387 | 2,678 | 4.806202 | 0.395349 | 0.009677 | 0.032258 | 0.03871 | 0.165054 | 0.136022 | 0.136022 | 0.086559 | 0.067742 | 0.067742 | 0 | 0.008153 | 0.221434 | 2,678 | 89 | 81 | 30.089888 | 0.883933 | 0.804705 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
847ca1f21f46f8eeffca45900536f0e7af46cd17 | 378 | py | Python | tests/utils.py | TatuArvela/magic | 71e33d290633fa17e94ca8b1a3d974f72e0e98a3 | [
"MIT"
] | null | null | null | tests/utils.py | TatuArvela/magic | 71e33d290633fa17e94ca8b1a3d974f72e0e98a3 | [
"MIT"
] | null | null | null | tests/utils.py | TatuArvela/magic | 71e33d290633fa17e94ca8b1a3d974f72e0e98a3 | [
"MIT"
] | null | null | null | # Syntactic sugar
class Expect:
def __init__(self, value):
self.value = value
def to_be_true(self):
assert self.value
def to_be_false(self):
assert not self.value
def to_be(self, value):
assert value == self.value
def expect(value):
return Expect(value)
def strip_output(output):
return output.getvalue().strip()
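For context, the fluent helpers above are typically used like this (the `Expect` class is repeated here so the snippet runs standalone; the sample assertions are illustrative, not from the original test suite):

```python
# Minimal copy of the Expect helper from tests/utils.py, repeated so
# this snippet is self-contained.
class Expect:
    def __init__(self, value):
        self.value = value

    def to_be_true(self):
        assert self.value

    def to_be_false(self):
        assert not self.value

    def to_be(self, value):
        assert value == self.value


def expect(value):
    return Expect(value)


# Fluent-style assertions read closer to natural language in tests:
expect(1 + 1 == 2).to_be_true()
expect([]).to_be_false()
expect(2 + 2).to_be(4)
```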
| 15.75 | 36 | 0.637566 | 51 | 378 | 4.529412 | 0.352941 | 0.233766 | 0.12987 | 0.155844 | 0.138528 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.26455 | 378 | 23 | 37 | 16.434783 | 0.830935 | 0.039683 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.230769 | 1 | 0.461538 | false | 0 | 0 | 0.153846 | 0.692308 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 4 |
849447b1f024f22e3c29a89fb3d009bfb3878da5 | 670 | py | Python | tests/import/import05.py | ktok07b6/polyphony | 657c5c7440520db6b4985970bd50547407693ac4 | [
"MIT"
] | 83 | 2015-11-30T09:59:13.000Z | 2021-08-03T09:12:28.000Z | tests/import/import05.py | jesseclin/polyphony | 657c5c7440520db6b4985970bd50547407693ac4 | [
"MIT"
] | 4 | 2017-02-10T01:43:11.000Z | 2020-07-14T03:52:25.000Z | tests/import/import05.py | jesseclin/polyphony | 657c5c7440520db6b4985970bd50547407693ac4 | [
"MIT"
] | 11 | 2016-11-18T14:39:15.000Z | 2021-02-23T10:05:20.000Z | import polyphony
from sub1 import SubC
import sub1
def import05_a1(x):
subc = SubC(x)
return subc.x
def import05_a2(x):
subc = sub1.SubC(x)
return subc.x
def import05_b1():
return SubC.VALUE1
def import05_b2():
return sub1.SubC.VALUE1
def import05_c1(x):
return SubC.VALUE2[x]
def import05_c2(x):
return sub1.SubC.VALUE2[x]
@polyphony.testbench
def test():
assert 100 == import05_a1(10)
assert 100 == import05_a2(10)
assert 1234 == import05_b1()
assert 1234 == import05_b2()
assert 1 == import05_c1(0)
assert 1 == import05_c2(0)
assert 4 == import05_c1(3)
assert 4 == import05_c2(3)
test()
| 14.888889 | 33 | 0.656716 | 102 | 670 | 4.176471 | 0.264706 | 0.15493 | 0.077465 | 0.070423 | 0.126761 | 0.126761 | 0.126761 | 0 | 0 | 0 | 0 | 0.149225 | 0.229851 | 670 | 44 | 34 | 15.227273 | 0.676357 | 0 | 0 | 0.071429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 1 | 0.25 | false | 0 | 0.607143 | 0.142857 | 1.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 4 |
849f14dd88697468f90a0112dab9e2d85ade9953 | 113 | py | Python | tests/test.py | dontbanmeplz/image-to-ascii | d7191b349f68cfce8ff9688847797a3eed79e064 | [
"MIT"
] | 8 | 2021-03-03T15:40:34.000Z | 2021-09-18T22:28:47.000Z | tests/test.py | dontbanmeplz/image-to-ascii | d7191b349f68cfce8ff9688847797a3eed79e064 | [
"MIT"
] | null | null | null | tests/test.py | dontbanmeplz/image-to-ascii | d7191b349f68cfce8ff9688847797a3eed79e064 | [
"MIT"
] | 3 | 2021-03-04T15:06:39.000Z | 2021-08-19T03:19:28.000Z | from image_to_ascii import ImageToAscii
ImageToAscii(imagePath="..images\pickachu.png",outputFile="output.txt") | 28.25 | 71 | 0.823009 | 14 | 113 | 6.5 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053097 | 113 | 4 | 71 | 28.25 | 0.850467 | 0 | 0 | 0 | 0 | 0 | 0.27193 | 0.184211 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
84a82ddbf72d0b7bad924d5a2f82cabb7c417dd0 | 4,721 | py | Python | tests/unit/test_new.py | besbes/formica | 94b43a11ed534ee7afa6a4f45848842bb163bbb6 | [
"MIT"
] | 50 | 2017-02-14T13:26:04.000Z | 2019-02-05T08:02:45.000Z | tests/unit/test_new.py | besbes/formica | 94b43a11ed534ee7afa6a4f45848842bb163bbb6 | [
"MIT"
] | 54 | 2017-02-06T11:06:33.000Z | 2019-02-07T16:55:08.000Z | tests/unit/test_new.py | besbes/formica | 94b43a11ed534ee7afa6a4f45848842bb163bbb6 | [
"MIT"
] | 7 | 2017-03-20T10:29:46.000Z | 2018-08-02T12:41:31.000Z | import pytest
from formica import cli
from tests.unit.constants import REGION, PROFILE, STACK, TEMPLATE
def test_create_changeset_for_new_stack(change_set, client, loader):
loader.return_value.template.return_value = TEMPLATE
cli.main(['new', '--stack', STACK, '--profile', PROFILE, '--region', REGION])
change_set.assert_called_with(stack=STACK, nested_change_sets=False)
change_set.return_value.create.assert_called_once_with(template=TEMPLATE, change_set_type='CREATE',
parameters={}, tags={}, capabilities=None,
resource_types=False, role_arn=None, s3=False)
change_set.return_value.describe.assert_called_once()
def test_new_uses_parameters_for_creation(change_set, client, loader):
loader.return_value.template.return_value = TEMPLATE
cli.main(['new', '--stack', STACK, '--parameters', 'A=B', 'C=D', '--profile', PROFILE, '--region', REGION, ])
change_set.assert_called_with(stack=STACK, nested_change_sets=False)
change_set.return_value.create.assert_called_once_with(template=TEMPLATE, change_set_type='CREATE',
parameters={'A': 'B', 'C': 'D'}, tags={},
capabilities=None, resource_types=False, role_arn=None,
s3=False)
def test_new_uses_tags_for_creation(change_set, client, loader):
loader.return_value.template.return_value = TEMPLATE
cli.main(['new', '--stack', STACK, '--tags', 'A=C', 'C=D', '--profile', PROFILE, '--region', REGION, ])
change_set.assert_called_with(stack=STACK, nested_change_sets=False)
change_set.return_value.create.assert_called_once_with(template=TEMPLATE, change_set_type='CREATE',
parameters={},
tags={'A': 'C', 'C': 'D'}, capabilities=None,
resource_types=False, role_arn=None, s3=False)
def test_new_tests_parameter_format(capsys):
with pytest.raises(SystemExit) as pytest_wrapped_e:
cli.main(
['new', '--stack', STACK, '--parameters', 'A=B', '--profile', PROFILE, '--region', REGION, '--tags', 'CD'])
out, err = capsys.readouterr()
assert "needs to be in format KEY=VALUE" in err
assert pytest_wrapped_e.value.code == 2
def test_new_uses_capabilities_for_creation(change_set, client, loader):
loader.return_value.template.return_value = TEMPLATE
cli.main(['new', '--stack', STACK, '--capabilities', 'A', 'B'])
change_set.assert_called_with(stack=STACK, nested_change_sets=False)
change_set.return_value.create.assert_called_once_with(template=TEMPLATE, change_set_type='CREATE',
parameters={},
tags={}, capabilities=['A', 'B'], resource_types=False,
role_arn=None, s3=False)
def test_upload_artifacts(change_set, aws_client, loader, temp_bucket_cli, mocker):
mocker.patch('formica.cli.collect_vars').return_value = {}
loader.return_value.template.return_value = TEMPLATE
cli.main(['new', '--stack', STACK, '--upload-artifacts', '--artifacts', 'testfile'])
change_set.assert_called_with(stack=STACK, nested_change_sets=False)
change_set.return_value.create.assert_called_once_with(change_set_type='CREATE',
parameters={},
tags={}, capabilities=None, resource_types=False,
role_arn=None, s3=False, template=TEMPLATE)
temp_bucket_cli.add_file.assert_called_once_with('testfile')
temp_bucket_cli.upload.assert_called_once()
def test_nested_change_sets(change_set, aws_client, loader):
loader.return_value.template.return_value = TEMPLATE
cli.main(['new', '--stack', STACK, '--nested-change-sets'])
change_set.assert_called_with(stack=STACK, nested_change_sets=True)
change_set.return_value.create.assert_called_once_with(change_set_type='CREATE',
parameters={},
tags={}, capabilities=None, resource_types=False,
template=TEMPLATE,
role_arn=None, s3=False)
| 58.283951 | 119 | 0.580386 | 506 | 4,721 | 5.108696 | 0.160079 | 0.087041 | 0.088201 | 0.040619 | 0.770213 | 0.705996 | 0.705996 | 0.705996 | 0.688975 | 0.688975 | 0 | 0.002132 | 0.304596 | 4,721 | 80 | 120 | 59.0125 | 0.785257 | 0 | 0 | 0.435484 | 0 | 0 | 0.079009 | 0.005084 | 0 | 0 | 0 | 0 | 0.274194 | 1 | 0.112903 | false | 0 | 0.048387 | 0 | 0.16129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
84bb56ac180f474faf7e0265f8e8d8f28ea5a268 | 81 | py | Python | uas/apps.py | shobron/analytics | d51079f81c876aa4df57b0635fabfe592fd3a9c9 | [
"MIT"
] | 1 | 2018-02-14T20:10:57.000Z | 2018-02-14T20:10:57.000Z | uas/apps.py | shobron/analytics | d51079f81c876aa4df57b0635fabfe592fd3a9c9 | [
"MIT"
] | null | null | null | uas/apps.py | shobron/analytics | d51079f81c876aa4df57b0635fabfe592fd3a9c9 | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class UasConfig(AppConfig):
name = 'uas'
| 13.5 | 33 | 0.728395 | 10 | 81 | 5.9 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185185 | 81 | 5 | 34 | 16.2 | 0.893939 | 0 | 0 | 0 | 0 | 0 | 0.037037 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 4 |
84bedf7f0df6a81a2ecb5e5434ac2d1080c84278 | 86 | py | Python | practice_file01.py | RamdhashDreamGuys/Practice01 | dd2b852a646a334815d475f949fbf98ed3fe250a | [
"Apache-2.0"
] | null | null | null | practice_file01.py | RamdhashDreamGuys/Practice01 | dd2b852a646a334815d475f949fbf98ed3fe250a | [
"Apache-2.0"
] | null | null | null | practice_file01.py | RamdhashDreamGuys/Practice01 | dd2b852a646a334815d475f949fbf98ed3fe250a | [
"Apache-2.0"
] | null | null | null | "for testing purpose i have create this file"
"after commiti have enter this details"
| 28.666667 | 45 | 0.790698 | 14 | 86 | 4.857143 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.162791 | 86 | 2 | 46 | 43 | 0.944444 | 0.5 | 0 | 0 | 0 | 0 | 0.930233 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
84e12d396addcde6767815b707c4d0f59d77d081 | 464 | py | Python | MC-MPC/scripts/tests.py | Anna-Kuosmanen/DAGChainer | 2e095e30dd9b158563c05d088443d6b548eeb870 | [
"MIT"
] | 9 | 2018-04-18T12:48:49.000Z | 2022-03-23T20:53:30.000Z | MC-MPC/scripts/tests.py | Anna-Kuosmanen/DAGChainer | 2e095e30dd9b158563c05d088443d6b548eeb870 | [
"MIT"
] | null | null | null | MC-MPC/scripts/tests.py | Anna-Kuosmanen/DAGChainer | 2e095e30dd9b158563c05d088443d6b548eeb870 | [
"MIT"
] | 3 | 2019-04-10T13:02:47.000Z | 2022-03-23T20:54:02.000Z | import solvers
TEST_GRAPH_NAME = "test_graph"
def test_sum(k, n, m):
solvers.generate_k_path_graph(k, n , m, TEST_GRAPH_NAME)
decomposed_sum = solvers.solve_with_decomposition(TEST_GRAPH_NAME)
normal_sum = solvers.solve_without_decomposition(TEST_GRAPH_NAME)
  print(decomposed_sum)
  print(normal_sum)
  if decomposed_sum != normal_sum:
    print("Test failed")
    return 0
  print("Test passed.")
  return 1
#for i in range(1, 10):
test_sum(2, 3, 10) | 21.090909 | 68 | 0.747845 | 74 | 464 | 4.364865 | 0.432432 | 0.139319 | 0.160991 | 0.160991 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023196 | 0.163793 | 464 | 22 | 69 | 21.090909 | 0.809278 | 0.047414 | 0 | 0 | 1 | 0 | 0.074661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.071429 | 0.071429 | null | null | 0.285714 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 4 |
84e6081b5e696882f2fdd2c05a8f193e8908fe91 | 30 | py | Python | Design_Patterns/Comportamental/Strategy/Strategy.py | 1King-coder/Developing-Some-Python-Skills | 17ee81de0bebee693dd274ca2adbeb0ef249b46a | [
"MIT"
] | 3 | 2021-08-17T10:30:02.000Z | 2021-11-17T10:46:26.000Z | Design_Patterns/Comportamental/Strategy/Strategy.py | 1King-coder/Learning-Python | 17ee81de0bebee693dd274ca2adbeb0ef249b46a | [
"MIT"
] | null | null | null | Design_Patterns/Comportamental/Strategy/Strategy.py | 1King-coder/Learning-Python | 17ee81de0bebee693dd274ca2adbeb0ef249b46a | [
"MIT"
] | null | null | null | """
Open/closed principle
"""
| 7.5 | 21 | 0.633333 | 3 | 30 | 6.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 30 | 3 | 22 | 10 | 0.730769 | 0.7 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
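The Strategy.py record above contains only the docstring "Open/closed principle". A minimal sketch of the Strategy pattern that docstring alludes to might look like the following; all class and method names here are illustrative, not taken from the repo:

```python
from abc import ABC, abstractmethod

class DiscountStrategy(ABC):
    """Concrete strategies are interchangeable at runtime."""
    @abstractmethod
    def apply(self, price: float) -> float: ...

class NoDiscount(DiscountStrategy):
    def apply(self, price: float) -> float:
        return price

class TenPercentOff(DiscountStrategy):
    def apply(self, price: float) -> float:
        return price * 0.9

class Order:
    # New discount rules can be added without modifying Order --
    # the open/closed principle the docstring names.
    def __init__(self, price: float, strategy: DiscountStrategy):
        self.price = price
        self.strategy = strategy

    def total(self) -> float:
        return self.strategy.apply(self.price)
```

Swapping the injected strategy changes behavior without touching `Order`, which is the point of the pattern.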
84fcfb7588af876a49d5dda67cf8acf4d32c69d7 | 296 | py | Python | thau.py | son3ca55/Lista-de-exerc-cios-em-Python | a9e4643fb0de5bbe8c1e2f0402c30815ad379ad2 | [
"Unlicense"
] | null | null | null | thau.py | son3ca55/Lista-de-exerc-cios-em-Python | a9e4643fb0de5bbe8c1e2f0402c30815ad379ad2 | [
"Unlicense"
] | null | null | null | thau.py | son3ca55/Lista-de-exerc-cios-em-Python | a9e4643fb0de5bbe8c1e2f0402c30815ad379ad2 | [
"Unlicense"
] | null | null | null | from random import randint
print ('Bem vindo!')
sorteado = randint(1, 100)
chute = 0
while chute != sorteado:
chute = int(input ('Chute: '))
if chute == sorteado:
print ('Você venceu!')
if chute > sorteado:
print ('Menor')
if chute < sorteado:
print ('Maior')
print ('Fim do jogo!')
| 21.142857 | 31 | 0.64527 | 40 | 296 | 4.775 | 0.575 | 0.272251 | 0.235602 | 0.314136 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021097 | 0.199324 | 296 | 13 | 32 | 22.769231 | 0.78481 | 0 | 0 | 0 | 0 | 0 | 0.180212 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.076923 | null | null | 0.384615 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
ca11a717c0109a100eb6d5e5e52808e119212d1c | 93 | py | Python | config/settings/local.py | almostolmos/Metecho | 7f58eca163faafea1ce07ffb6f4de2449fa0b8df | [
"BSD-3-Clause"
] | 21 | 2020-04-02T21:39:58.000Z | 2022-01-31T19:43:47.000Z | config/settings/local.py | almostolmos/Metecho | 7f58eca163faafea1ce07ffb6f4de2449fa0b8df | [
"BSD-3-Clause"
] | 1,613 | 2020-03-26T16:39:57.000Z | 2022-03-07T14:54:16.000Z | config/settings/local.py | almostolmos/Metecho | 7f58eca163faafea1ce07ffb6f4de2449fa0b8df | [
"BSD-3-Clause"
] | 21 | 2020-07-21T11:58:47.000Z | 2021-11-25T00:48:21.000Z | from .base import * # NOQA
INSTALLED_APPS = INSTALLED_APPS + ["django_extensions"] # NOQA
| 23.25 | 63 | 0.72043 | 11 | 93 | 5.818182 | 0.727273 | 0.40625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172043 | 93 | 3 | 64 | 31 | 0.831169 | 0.096774 | 0 | 0 | 0 | 0 | 0.209877 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
ca2256c4dc4495c53a60297898f24ec3014eeb61 | 188 | py | Python | testing_martina/ex3_martina.py | mjyoo2/2019-1-OSS-L1 | 62b3b0290416f9472cc3108cf21a1afb1a7cc4e6 | [
"MIT"
] | 3 | 2019-05-24T08:13:02.000Z | 2019-06-03T13:35:30.000Z | testing_martina/ex3_martina.py | mjyoo2/2019-1-OSS-L1 | 62b3b0290416f9472cc3108cf21a1afb1a7cc4e6 | [
"MIT"
] | 11 | 2019-05-15T04:02:09.000Z | 2019-06-11T05:37:42.000Z | testing_martina/ex3_martina.py | mjyoo2/2019-1-OSS-L1 | 62b3b0290416f9472cc3108cf21a1afb1a7cc4e6 | [
"MIT"
] | 5 | 2019-05-07T10:07:44.000Z | 2019-05-28T09:55:08.000Z | '''
lee young suk
2014314433
'''
greeting = "\"Hello, World\""
farewell = '\'Bye, World\''
'''
greeting = "\"Hello, World\" \n" + '\'Bye, World\''
'''
print(greeting)
print(farewell)
| 11.058824 | 51 | 0.574468 | 20 | 188 | 5.4 | 0.55 | 0.240741 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.063694 | 0.164894 | 188 | 16 | 52 | 11.75 | 0.624204 | 0.12766 | 0 | 0 | 0 | 0 | 0.020833 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 4 |
ca61711f1f184c6c7d7042ecd53fccf3128c0d25 | 10,814 | py | Python | python/neural_srl/shared/reader.py | wangtong106/deep_srl | 544217ccb68c363abe98a2a5835ca1d215864841 | [
"Apache-2.0"
] | 354 | 2017-06-01T03:35:56.000Z | 2022-03-16T02:50:27.000Z | python/neural_srl/shared/reader.py | wangtong106/deep_srl | 544217ccb68c363abe98a2a5835ca1d215864841 | [
"Apache-2.0"
] | 30 | 2017-10-15T05:48:57.000Z | 2021-12-22T18:50:55.000Z | python/neural_srl/shared/reader.py | wangtong106/deep_srl | 544217ccb68c363abe98a2a5835ca1d215864841 | [
"Apache-2.0"
] | 96 | 2017-06-16T10:05:04.000Z | 2022-03-16T13:02:36.000Z | import random
from constants import *
from dictionary import Dictionary
import features
def get_sentences(filepath, use_se_marker=False):
""" Read tokenized sentences from file """
sentences = []
with open(filepath) as f:
for line in f.readlines():
inputs = line.strip().split('|||')
lefthand_input = inputs[0].strip().split()
# If gold tags are not provided, create a sequence of dummy tags.
righthand_input = inputs[1].strip().split() if len(inputs) > 1 \
else ['O' for _ in lefthand_input]
if use_se_marker:
words = [START_MARKER] + lefthand_input + [END_MARKER]
labels = [None] + righthand_input + [None]
else:
words = lefthand_input
labels = righthand_input
sentences.append((words, labels))
return sentences
#lines = f.readlines()
#sentences = [line.strip().split('|||') for line in lines]
#if use_se_marker:
# return [([START_MARKER] + words.strip().split() + [END_MARKER], [None] + labels.strip().split() + [None])
# for words,labels in sentences]
#else:
# return [(words.strip().split(), labels.strip().split()) for words,labels in sentences]
def get_srl_sentences(filepath, use_se_marker=False):
""" Read tokenized SRL sentences from file.
File format: {predicate_id} [word0, word1 ...] ||| [label0, label1 ...]
Return:
A list of sentences, with structure: [[words], predicate, [labels]]
"""
sentences = []
with open(filepath) as f:
for line in f.readlines():
inputs = line.strip().split('|||')
lefthand_input = inputs[0].strip().split()
# If gold tags are not provided, create a sequence of dummy tags.
righthand_input = inputs[1].strip().split() if len(inputs) > 1 \
else ['O' for _ in lefthand_input[1:]]
predicate = int(lefthand_input[0])
if use_se_marker:
words = [START_MARKER] + lefthand_input[1:] + [END_MARKER]
labels = [None] + righthand_input + [None]
else:
words = lefthand_input[1:]
labels = righthand_input
sentences.append((words, predicate, labels))
return sentences
def get_pretrained_embeddings(filepath):
embeddings = dict()
with open(filepath, 'r') as f:
for line in f:
info = line.strip().split()
#lines = [line.strip().split() for line in f.readlines()]
#embeddings = dict([(line[0], [float(r) for r in line[1:]]) for line in lines])
embeddings[info[0]] = [float(r) for r in info[1:]]
f.close()
embedding_size = len(embeddings.values()[0])
print 'Embedding size={}'.format(embedding_size)
embeddings[START_MARKER] = [random.gauss(0, 0.01) for _ in range(embedding_size)]
embeddings[END_MARKER] = [random.gauss(0, 0.01) for _ in range(embedding_size)]
if not UNKNOWN_TOKEN in embeddings:
embeddings[UNKNOWN_TOKEN] = [random.gauss(0, 0.01) for _ in range(embedding_size)]
return embeddings
def string_sequence_to_ids(str_seq, dictionary, lowercase=False, pretrained_embeddings=None):
""" If pretrained_embeddings is provided, strings not in the embeddings are mapped to UNKNOWN_TOKEN.
pretrained_embeddings is a dictionary from strings to lists of floats.
"""
ids = []
for s in str_seq:
if s is None:
ids.append(-1)
continue
if lowercase:
s = s.lower()
if pretrained_embeddings is not None and s not in pretrained_embeddings:
s = UNKNOWN_TOKEN
ids.append(dictionary.add(s))
return ids
def get_postag_data(config, train_path, dev_path, vocab_path=None, label_path=None):
use_se_marker = config.use_se_marker
raw_train_sents = get_sentences(train_path, use_se_marker)
raw_dev_sents = get_sentences(dev_path, use_se_marker)
word_to_embeddings = get_pretrained_embeddings(WORD_EMBEDDINGS[config.word_embedding])
# Prepare word dictionary.
word_dict = Dictionary(unknown_token=UNKNOWN_TOKEN)
if use_se_marker:
word_dict.add_all([START_MARKER, END_MARKER])
if vocab_path != None:
with open(vocab_path, 'r') as f_vocab:
for line in f_vocab:
word_dict.add(line.strip())
f_vocab.close()
word_dict.accept_new = False
print 'Loaded {} words. Dictionary frozen.'.format(word_dict.size())
# Prepare label dictionary.
label_dict = Dictionary()
if label_path != None:
with open(label_path, 'r') as f_labels:
for line in f_labels:
label_dict.add(line.strip())
f_labels.close()
label_dict.set_unknown_token(UNKNOWN_LABEL)
label_dict.accept_new = False
print 'Loaded {} labels. Dictionary frozen.'.format(label_dict.size())
train_sents = [(string_sequence_to_ids(sent[0], word_dict, True, word_to_embeddings),
string_sequence_to_ids(sent[1], label_dict)) for sent in raw_train_sents]
dev_sents = [(string_sequence_to_ids(sent[0], word_dict, True, word_to_embeddings),
string_sequence_to_ids(sent[1], label_dict)) for sent in raw_dev_sents]
print("Extracted {} words and {} tags".format(word_dict.size(), label_dict.size()))
print("Max training sentence length: {}".format(max([len(s[0]) for s in train_sents])))
print("Max development sentence length: {}".format(max([len(s[0]) for s in dev_sents])))
word_embedding = [word_to_embeddings[w] for w in word_dict.idx2str]
word_embedding_shape = [len(word_embedding), len(word_embedding[0])]
return (train_sents, dev_sents, word_dict, label_dict, [word_embedding], [word_embedding_shape])
def get_srl_data(config, train_data_path, dev_data_path, vocab_path=None, label_path=None):
'''
Load SRL training and development data: build word/label dictionaries,
id sequences, SRL features, and the pretrained word embedding matrix.
'''
use_se_marker = config.use_se_marker
raw_train_sents = get_srl_sentences(train_data_path, use_se_marker)
raw_dev_sents = get_srl_sentences(dev_data_path, use_se_marker)
word_to_embeddings = get_pretrained_embeddings(WORD_EMBEDDINGS[config.word_embedding])
# Prepare word dictionary.
word_dict = Dictionary(unknown_token=UNKNOWN_TOKEN)
if use_se_marker:
word_dict.add_all([START_MARKER, END_MARKER])
if vocab_path != None:
with open(vocab_path, 'r') as f_vocab:
for line in f_vocab:
word_dict.add(line.strip())
f_vocab.close()
word_dict.accept_new = False
print 'Loaded {} words. Dictionary frozen.'.format(word_dict.size())
# Prepare label dictionary.
label_dict = Dictionary()
if label_path != None:
with open(label_path, 'r') as f_labels:
for line in f_labels:
label_dict.add(line.strip())
f_labels.close()
label_dict.set_unknown_token(UNKNOWN_LABEL)
label_dict.accept_new = False
print 'Loaded {} labels. Dictionary frozen.'.format(label_dict.size())
# Get tokens and labels
train_tokens = [string_sequence_to_ids(sent[0], word_dict, True, word_to_embeddings) for sent in raw_train_sents]
train_labels = [string_sequence_to_ids(sent[2], label_dict) for sent in raw_train_sents]
if label_dict.accept_new:
label_dict.set_unknown_token(UNKNOWN_LABEL)
label_dict.accept_new = False
dev_tokens = [string_sequence_to_ids(sent[0], word_dict, True, word_to_embeddings) for sent in raw_dev_sents]
dev_labels = [string_sequence_to_ids(sent[2], label_dict) for sent in raw_dev_sents]
# Get features
print 'Extracting features'
train_features, feature_shapes = features.get_srl_features(raw_train_sents, config)
dev_features, feature_shapes2 = features.get_srl_features(raw_dev_sents, config)
for f1, f2 in zip(feature_shapes, feature_shapes2):
assert f1 == f2
# For additional features. Unused now.
feature_dicts = []
for feature in config.features:
feature_dicts.append(None)
train_sents = []
dev_sents = []
for i in range(len(train_tokens)):
train_sents.append((train_tokens[i],) + tuple(train_features[i]) + (train_labels[i],))
for i in range(len(dev_tokens)):
dev_sents.append((dev_tokens[i],) + tuple(dev_features[i]) + (dev_labels[i],))
print("Extracted {} words and {} tags".format(word_dict.size(), label_dict.size()))
print("Max training sentence length: {}".format(max([len(s[0]) for s in train_sents])))
print("Max development sentence length: {}".format(max([len(s[0]) for s in dev_sents])))
word_embedding = [word_to_embeddings[w] for w in word_dict.idx2str]
word_embedding_shape = [len(word_embedding), len(word_embedding[0])]
return (train_sents, dev_sents, word_dict, label_dict,
[word_embedding, None, None],
[word_embedding_shape] + feature_shapes,
[word_dict, ] + feature_dicts)
def get_postag_test_data(filepath, config, word_dict, label_dict, allow_new_words=True):
# New words are allowed as long as they are covered by pre-trained embeddings.
word_dict.accept_new = allow_new_words
if label_dict.accept_new:
label_dict.set_unknown_token(UNKNOWN_LABEL)
label_dict.accept_new = False
if filepath != None and filepath != '':
samples = get_sentences(filepath, config.use_se_marker)
else:
samples = []
word_to_embeddings = get_pretrained_embeddings(WORD_EMBEDDINGS[config.word_embedding])
if allow_new_words:
tokens = [string_sequence_to_ids(sent[0], word_dict, True, word_to_embeddings) for sent in samples]
else:
tokens = [string_sequence_to_ids(sent[0], word_dict, True) for sent in samples]
labels = [string_sequence_to_ids(sent[1], label_dict) for sent in samples]
sentences = []
for i in range(len(tokens)):
sentences.append((tokens[i],) + (labels[i],))
word_embedding = [word_to_embeddings[w] for w in word_dict.idx2str]
word_embedding_shape = [len(word_embedding), len(word_embedding[0])]
return (sentences, [word_embedding], [word_embedding_shape])
def get_srl_test_data(filepath, config, word_dict, label_dict, allow_new_words=True):
word_dict.accept_new = allow_new_words
if label_dict.accept_new:
label_dict.set_unknown_token(UNKNOWN_LABEL)
label_dict.accept_new = False
if filepath != None and filepath != '':
samples = get_srl_sentences(filepath, config.use_se_marker)
else:
samples = []
word_to_embeddings = get_pretrained_embeddings(WORD_EMBEDDINGS[config.word_embedding])
if allow_new_words:
tokens = [string_sequence_to_ids(sent[0], word_dict, True, word_to_embeddings) for sent in samples]
else:
tokens = [string_sequence_to_ids(sent[0], word_dict, True) for sent in samples]
labels = [string_sequence_to_ids(sent[2], label_dict) for sent in samples]
srl_features, feature_shapes = features.get_srl_features(samples, config)
sentences = []
for i in range(len(tokens)):
sentences.append((tokens[i],) + tuple(srl_features[i]) + (labels[i],))
word_embedding = [word_to_embeddings[w] for w in word_dict.idx2str]
word_embedding_shape = [len(word_embedding), len(word_embedding[0])]
return (sentences, [word_embedding, None, None], [word_embedding_shape,] + feature_shapes)
| 42.077821 | 115 | 0.707601 | 1,553 | 10,814 | 4.644559 | 0.099163 | 0.034382 | 0.025925 | 0.039512 | 0.747955 | 0.730764 | 0.703452 | 0.691529 | 0.658256 | 0.635796 | 0 | 0.007174 | 0.175051 | 10,814 | 256 | 116 | 42.242188 | 0.801368 | 0.078047 | 0 | 0.551546 | 0 | 0 | 0.039841 | 0 | 0 | 0 | 0 | 0 | 0.005155 | 0 | null | null | 0 | 0.020619 | null | null | 0.061856 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 4 |
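The reader.py record above targets Python 2 (bare `print` statements, indexing `dict.values()` directly). A Python 3 sketch of its `get_pretrained_embeddings` logic follows, reworked to accept any iterable of lines so it is easy to test; the marker-token values are assumptions standing in for the repo's `constants` module:

```python
import random

# Marker tokens mirroring the repo's constants module (values assumed).
START_MARKER = "<S>"
END_MARKER = "</S>"
UNKNOWN_TOKEN = "<UNK>"

def get_pretrained_embeddings(lines):
    """Parse 'word v1 v2 ...' lines into a dict of float vectors."""
    embeddings = {}
    for line in lines:
        info = line.strip().split()
        if not info:
            continue
        embeddings[info[0]] = [float(r) for r in info[1:]]
    # dict.values() is a view in Python 3, so take the first vector
    # via next(iter(...)) instead of indexing.
    embedding_size = len(next(iter(embeddings.values())))
    print('Embedding size={}'.format(embedding_size))
    # Small random vectors for the sentence markers and unknown token,
    # as in the original (gauss(0, 0.01) per dimension).
    for token in (START_MARKER, END_MARKER, UNKNOWN_TOKEN):
        embeddings.setdefault(
            token, [random.gauss(0, 0.01) for _ in range(embedding_size)])
    return embeddings
```

`setdefault` keeps a pre-existing unknown-token vector if the embedding file already supplies one, matching the `if not UNKNOWN_TOKEN in embeddings` guard in the original.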
04868977e62fbeb9c5e9b662cea5710a1f36668b | 58 | py | Python | ml_project/utils/__init__.py | made-ml-in-prod-2021/marina-zav | 7b4b6e5f333707001e36dfb014dcd36bf975d969 | [
"FTL"
] | null | null | null | ml_project/utils/__init__.py | made-ml-in-prod-2021/marina-zav | 7b4b6e5f333707001e36dfb014dcd36bf975d969 | [
"FTL"
] | null | null | null | ml_project/utils/__init__.py | made-ml-in-prod-2021/marina-zav | 7b4b6e5f333707001e36dfb014dcd36bf975d969 | [
"FTL"
] | null | null | null | from .utils import init_logger
__all__ = ["init_logger"]
| 14.5 | 30 | 0.758621 | 8 | 58 | 4.75 | 0.75 | 0.526316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 58 | 3 | 31 | 19.333333 | 0.76 | 0 | 0 | 0 | 0 | 0 | 0.189655 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |
048c25eb7d6da951f4dfcbb7327628482f4fd27b | 142 | py | Python | Bird.py | Hejinzefinance/Learn_git | 83903bd1da8493d6855c959dddec6793ff29d89e | [
"MIT"
] | null | null | null | Bird.py | Hejinzefinance/Learn_git | 83903bd1da8493d6855c959dddec6793ff29d89e | [
"MIT"
] | null | null | null | Bird.py | Hejinzefinance/Learn_git | 83903bd1da8493d6855c959dddec6793ff29d89e | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Mon Mar 15 22:40:29 2021
@author: lenovo
"""
class Bird(object):
def __init__(self):
pass | 14.2 | 35 | 0.584507 | 21 | 142 | 3.761905 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12037 | 0.239437 | 142 | 10 | 36 | 14.2 | 0.611111 | 0.528169 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.333333 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 4 |
04a609227851621f21a44e150cae1897457d7ebf | 264 | py | Python | video_games/urls.py | LeeThomas13/django-video-games | a9ee999e9f4a01ffe0248daeec59eb54288264ba | [
"MIT"
] | null | null | null | video_games/urls.py | LeeThomas13/django-video-games | a9ee999e9f4a01ffe0248daeec59eb54288264ba | [
"MIT"
] | null | null | null | video_games/urls.py | LeeThomas13/django-video-games | a9ee999e9f4a01ffe0248daeec59eb54288264ba | [
"MIT"
] | null | null | null | from django.urls import path
from .views import GameListView, GameDetailView, GameCreateView, GameUpdateView, GameDeleteView
urlpatterns = [
path('', GameListView.as_view(), name='game_list'),
path('detail/<int:pk>/', GameDetailView.as_view(), name='game_detail'),
] | 37.714286 | 95 | 0.757576 | 29 | 264 | 6.758621 | 0.62069 | 0.061224 | 0.102041 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.113636 | 264 | 7 | 96 | 37.714286 | 0.837607 | 0 | 0 | 0 | 0 | 0 | 0.101887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.333333 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 4 |