hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
497f15c429fd532c380ac01a11e7c5c3eef3e73d | 17,326 | py | Python | rocketbox/geometrylib.py | qmichalski/1d-rocket-flow | 7c9b38e42a1cd4b0343fdd36f89c52b9f2337b5b | [
"BSD-3-Clause"
] | null | null | null | rocketbox/geometrylib.py | qmichalski/1d-rocket-flow | 7c9b38e42a1cd4b0343fdd36f89c52b9f2337b5b | [
"BSD-3-Clause"
] | null | null | null | rocketbox/geometrylib.py | qmichalski/1d-rocket-flow | 7c9b38e42a1cd4b0343fdd36f89c52b9f2337b5b | [
"BSD-3-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Sun Dec 26 14:09:49 2021
@author: quent
"""
class OneDGeometry():
def __init__(
self, grid = None, A = None, Ph = None, Dh = None, Pexch = None,
roughness = None):
if grid is not None:
self.grid = grid
if A is not None:
self.crossSection = A
if Ph is not None:
self.hydraulicPerimeter = Ph
if Dh is not None:
self.hydraulicDiameter = Dh
if Pexch is not None:
self.heatExchangePerimeter = Pexch
if roughness is not None:
self.roughness = roughness
self.requiredProperties = ['grid',
'crossSection',
'hydraulicPerimeter',
'hydraulicDiameter',
'heatExchangePerimeter',
'roughness']
def rocketEngineConstructor(self,
combustionChamberLength,
chamberDiameter,
convergentAngle,
throatDiameter,
roughness,
nozzleLength,
nozzleDiameter,
innerCore=False,
innerCoreDict=None,
nbrPoints=500,
plot=False):
from scipy.optimize import root
import numpy as np
import matplotlib.pyplot as plt
innerCoreDiameter = 0
LC = combustionChamberLength
Rt = throatDiameter/2
RC = chamberDiameter/2
Rn = nozzleDiameter/2
Rconge = 0
convergentAngleMax = np.arctan(((RC-2.5*Rt)**2/((1.5*Rt)**2-(RC-2.5*Rt)**2))**(1/2)) # keep in radians for the comparison below
convergentAngleRad = convergentAngle / 180 * np.pi
if convergentAngleRad > convergentAngleMax and convergentAngleMax > 0:
print('convergent angle {} deg over maximum convergent angle {} deg - consider a smaller angle'.format(convergentAngle, convergentAngleMax*180/np.pi))
xK = LC - 1.5*Rt*np.cos(np.pi/2 - convergentAngleRad)
yK = 2.5*Rt - ((1.5*Rt)**2 - (xK-LC)**2)**(1/2)
dxC = (RC-yK) / np.tan(convergentAngleRad)
xC = xK-dxC
yC = RC
yD = RC - Rconge
yB = yD + Rconge*np.sin(np.pi/2-convergentAngleRad)
fun = lambda x: (x-xC)**2 / (np.sin(np.pi/2-convergentAngleRad)**2) - (xC - x + Rconge*np.cos(np.pi/2-convergentAngleRad))**2 - (yC-yD)**2 + Rconge**2
sol = root(fun,xC)
xB = sol['x'][0]
xD = xB - Rconge*np.cos(np.pi/2-convergentAngleRad)
xA = xD
yA = RC
Rthroatnozzle = 1.5*Rt
fun = lambda theta: np.tan(theta)-(Rn-(Rt+Rthroatnozzle*(1-np.cos(theta))))/(nozzleLength-Rthroatnozzle*np.sin(theta))
thetaN = root(fun,0)['x'][0]
# thetaN = 30/180*np.pi
# plt.plot(np.linspace(0,45,100),fun(np.linspace(0,np.pi/4,100)))
xN = Rthroatnozzle*np.sin(thetaN) + LC
yN = Rthroatnozzle*(1-np.cos(thetaN)) + Rt
# print(thetaN*180/np.pi,xN,yN)
xSet = np.linspace(0,combustionChamberLength+nozzleLength,nbrPoints)
ySet = np.zeros(len(xSet))
for ii,x in enumerate(xSet):
if x <= xA:
ySet[ii] = RC
# print(0,x)
if xA < x <= xB:
ySet[ii] = yD + ((Rconge)**2 - (xD-x)**2)**(1/2)
if xB < x <= xK:
ySet[ii] = RC - np.tan(convergentAngleRad)*(x-(xK-dxC))
if (xK) < x <= combustionChamberLength:
ySet[ii] = (1+1.5)*Rt - ((1.5*Rt)**2 - (LC-x)**2)**(1/2)
if LC < x <= xN:
ySet[ii] = Rt+Rthroatnozzle - (Rthroatnozzle**2 - (LC-x)**2)**(1/2)
if xN < x:
ySet[ii] = yN + np.tan(thetaN)*(x-xN)
ySet_inner = np.zeros(len(xSet))
if innerCore:
if innerCoreDict['type'] == 'convergent':
print(innerCoreDict['type'])
xFinal = RC / np.tan(innerCoreDict['innerAngle']) + innerCoreDict['flatLength']
xM = innerCoreDict['flatLength']
if xM > xA:
xM = xA
for ii,x in enumerate(xSet):
if x <= xM:
ySet_inner[ii] = RC - innerCoreDict['channelWidth']
if xM < x <= xFinal:
ySet_inner[ii] = RC - innerCoreDict['channelWidth'] - np.tan(innerCoreDict['innerAngle'])*(x-xM)
S_outer = np.pi*ySet**2
S_tot = np.pi*(ySet**2-ySet_inner**2)
# if xSet[-1] > xFinal:
argXFinal = np.where(xSet>xFinal)[0][0]
dS = S_outer - S_tot
# print(np.argmin(dS[0:argXFinal]))
for ii,x in enumerate(xSet):
if xSet[np.argmin(dS[0:argXFinal])] <= x:
ySet_inner[ii] = 0
if innerCoreDict['type'] == 'copy':
print(innerCoreDict['type'])
ySet_inner = ySet - innerCoreDict['channelWidth']
ySet_inner[xSet>(xK-25e-3)] = 0
# for ii,x in enumerate(xSet):
# if (xK) < x <= combustionChamberLength:
# ySet[ii] = (1+1.5)*Rt - ((1.5*Rt)**2 - (LC-x)**2)**(1/2)
# model properties
self._gridLength = xSet[-1]
self._chamberDiameter = chamberDiameter
self._convergentAngle = convergentAngle
self._throatDiameter = throatDiameter
self._nbrPoints = nbrPoints
# essential properties
self.grid = xSet
self.crossSection = np.pi*(ySet**2 - ySet_inner**2)
self.hydraulicPerimeter = 2*np.pi*(ySet + ySet_inner)
self.hydraulicDiameter = 4*self.crossSection/self.hydraulicPerimeter
self.heatExchangePerimeter = self.hydraulicPerimeter
self.roughness = roughness*np.ones(len(xSet))
if plot:
RC = np.max(ySet)
plt.figure(figsize=(6, 6*5*RC/(LC*1.2)), dpi=400)
plt.plot(self.grid,ySet,'-x',color='k')
plt.plot(self.grid,-ySet,'-x',color='k')
if innerCore:
plt.plot(self.grid,ySet_inner,'-x',color='b')
plt.plot(self.grid,-ySet_inner,'-x',color='b')
plt.xlim([0,self.grid[-1]*1.2])
plt.ylim([-RC*1.2,RC*1.2])
plt.show()
plt.plot(self.crossSection)
plt.show()
def rdeEngineConstructor(self,
outerFlatLength,
chamberDiameter,
convergentAngle,
throatDiameter,
roughness,
nozzleLength,
nozzleDiameter,
innerCore=False,
innerCoreDict=None,
nbrPoints=500,
plot=False):
from scipy.optimize import root
import numpy as np
import matplotlib.pyplot as plt
innerCoreDiameter = 0
Rt = throatDiameter/2
RC = chamberDiameter/2
Rn = nozzleDiameter/2
Rconge = 0
convergentAngleMax = np.arctan(((RC-2.5*Rt)**2/((1.5*Rt)**2-(RC-2.5*Rt)**2))**(1/2)) # keep in radians for the comparison below
convergentAngleRad = convergentAngle / 180 * np.pi
if convergentAngleRad > convergentAngleMax and convergentAngleMax > 0:
print('convergent angle {} deg over maximum convergent angle {} deg - consider a smaller angle'.format(convergentAngle, convergentAngleMax*180/np.pi))
yF = Rt - 1.5*Rt*(1/np.cos(convergentAngleRad)-1)
combustionChamberLength = (RC-yF) / np.tan(convergentAngleRad) + outerFlatLength
LC = combustionChamberLength
# print(yF,RC,Rt,LC)
xK = LC - 1.5*Rt*np.cos(np.pi/2 - convergentAngleRad)
yK = 2.5*Rt - ((1.5*Rt)**2 - (xK-LC)**2)**(1/2)
dxC = (RC-yK) / np.tan(convergentAngleRad)
xC = xK-dxC
yC = RC
yD = RC - Rconge
yB = yD + Rconge*np.sin(np.pi/2-convergentAngleRad)
fun = lambda x: (x-xC)**2 / (np.sin(np.pi/2-convergentAngleRad)**2) - (xC - x + Rconge*np.cos(np.pi/2-convergentAngleRad))**2 - (yC-yD)**2 + Rconge**2
sol = root(fun,xC)
xB = sol['x'][0]
xD = xB - Rconge*np.cos(np.pi/2-convergentAngleRad)
xA = xD
yA = RC
Rthroatnozzle = 1.5*Rt
fun = lambda theta: np.tan(theta)-(Rn-(Rt+Rthroatnozzle*(1-np.cos(theta))))/(nozzleLength-Rthroatnozzle*np.sin(theta))
thetaN = root(fun,0)['x'][0]
# thetaN = 30/180*np.pi
# plt.plot(np.linspace(0,45,100),fun(np.linspace(0,np.pi/4,100)))
xN = Rthroatnozzle*np.sin(thetaN) + LC
yN = Rthroatnozzle*(1-np.cos(thetaN)) + Rt
# print(thetaN*180/np.pi,xN,yN)
xSet = np.linspace(0,combustionChamberLength+nozzleLength,nbrPoints)
ySet = np.zeros(len(xSet))
for ii,x in enumerate(xSet):
if x <= xA:
ySet[ii] = RC
# print(0,x)
if xA < x <= xB:
ySet[ii] = yD + ((Rconge)**2 - (xD-x)**2)**(1/2)
if xB < x <= xK:
ySet[ii] = RC - np.tan(convergentAngleRad)*(x-(xK-dxC))
if (xK) < x <= combustionChamberLength:
ySet[ii] = (1+1.5)*Rt - ((1.5*Rt)**2 - (LC-x)**2)**(1/2)
if LC < x <= xN:
ySet[ii] = Rt+Rthroatnozzle - (Rthroatnozzle**2 - (LC-x)**2)**(1/2)
if xN < x:
ySet[ii] = yN + np.tan(thetaN)*(x-xN)
ySet_inner = np.zeros(len(xSet))
if innerCore:
if innerCoreDict['type'] == 'convergent':
RCi = RC - innerCoreDict['channelWidth']
yDi = RCi - Rconge
print(innerCoreDict['type'])
xFinal = (RC-innerCoreDict['channelWidth']) / np.tan(innerCoreDict['innerAngle']) + innerCoreDict['flatLength']
xM = innerCoreDict['flatLength']
if xM > xA:
xM = xA
for ii,x in enumerate(xSet):
if x <= xM:
ySet_inner[ii] = RC - innerCoreDict['channelWidth']
if xM < x <= xFinal:
ySet_inner[ii] = RC - innerCoreDict['channelWidth'] - np.tan(innerCoreDict['innerAngle'])*(x-xM)
S_outer = np.pi*ySet**2
S_tot = np.pi*(ySet**2-ySet_inner**2)
# if xSet[-1] > xFinal:
argXFinal = np.where(xSet>xFinal)[0][0]
dS = S_outer - S_tot
# print(np.argmin(dS[0:argXFinal]))
for ii,x in enumerate(xSet):
if xSet[np.argmin(dS[0:argXFinal])] <= x:
ySet_inner[ii] = 0
if innerCoreDict['type'] == 'copy':
print(innerCoreDict['type'])
ySet_inner = ySet - innerCoreDict['channelWidth']
ySet_inner[xSet>(xK-25e-3)] = 0
# for ii,x in enumerate(xSet):
# if (xK) < x <= combustionChamberLength:
# ySet[ii] = (1+1.5)*Rt - ((1.5*Rt)**2 - (LC-x)**2)**(1/2)
smoothing = False
if smoothing:
from scipy.signal import savgol_filter
ySet = savgol_filter(ySet, 7, 2)
ySet_inner_final_index = np.where(ySet_inner==0)[0][0]
ySet_inner = savgol_filter(ySet_inner, 7, 2)
ySet_inner[ySet_inner_final_index:] = 0
N = 7
ySet[int((N-1)/2):-int((N-1)/2)] = np.convolve(ySet, np.ones(N)/N, mode='valid')
ySet_inner_final_index = np.where(ySet_inner==0)[0][0]
ySet_inner[int((N-1)/2):-int((N-1)/2)] = np.convolve(ySet_inner, np.ones(N)/N, mode='valid')
ySet_inner[ySet_inner_final_index:] = 0
# model properties
self._gridLength = xSet[-1]
self._chamberDiameter = chamberDiameter
self._convergentAngle = convergentAngle
self._throatDiameter = throatDiameter
self._nbrPoints = nbrPoints
self._innerCore = ySet_inner
self._outerCore = ySet
# essential properties
self.grid = xSet
self.crossSection = np.pi*(ySet**2 - ySet_inner**2)
self.hydraulicPerimeter = 2*np.pi*(ySet + ySet_inner)
self.hydraulicDiameter = 4*self.crossSection/self.hydraulicPerimeter
self.heatExchangePerimeter = self.hydraulicPerimeter
self.roughness = roughness*np.ones(len(xSet))
if plot:
RC = np.max(ySet)
plt.figure(figsize=(6, 6*5*RC/(LC*1.2)), dpi=400)
plt.plot(self.grid,ySet,'-',color='k')
plt.plot(self.grid,-ySet,'-',color='k')
plt.plot(self.grid[ySet_inner>0],ySet_inner[ySet_inner>0],'-',color='b')
plt.plot(self.grid[ySet_inner>0],-ySet_inner[ySet_inner>0],'-',color='b')
plt.xlim([0,self.grid[-1]*1.2])
# plt.xlim([0.025,0.05])
plt.ylim([-RC*1.2,RC*1.2])
# plt.plot(xA,yA,'o',label='A')
# plt.plot(xB,yB,'o',label='B')
# plt.plot(xC,yC,'o',label='C')
# plt.plot(xD,yD,'o',label='D')
# plt.legend()
plt.show()
# plt.plot(self.crossSection)
def andersonConstructor(self, throatDiameter = 1, nbrPoints=51, roughness=10e-6, plot=False):
import numpy as np
import matplotlib.pyplot as plt
Afun = lambda x: throatDiameter*(1 + 2.2*(x-1.5)**2)
length = 3
grid = np.linspace(0,length,nbrPoints)
self.grid = grid
self.crossSection = Afun(grid)
self.hydraulicDiameter = (4*Afun(grid)/np.pi)**(1/2)
self.hydraulicPerimeter = np.pi*self.hydraulicDiameter # circumference of a circle of diameter D is pi*D (= 2*pi*R)
self.heatExchangePerimeter = self.hydraulicPerimeter
self.roughness = roughness*np.ones(nbrPoints)
if plot:
RC = np.max(self.hydraulicDiameter)/2
plt.figure(figsize=(6, 6*5*RC/(length*1.2)), dpi=400)
plt.plot(self.grid,self.hydraulicDiameter/2,color='k')
plt.plot(self.grid,-self.hydraulicDiameter/2,color='k')
plt.xlim([0,length*1.2])
plt.ylim([-RC*2.5,RC*2.5])
def _test_rocketEngineConstructor(output=False):
import numpy as np
geometry = OneDGeometry()
# geometry.rocketEngineConstructor(combustionChamberLength=100e-3,
# chamberDiameter=50e-3,
# convergentAngle=15,
# throatDiameter=25e-3,
# roughness=10e-6,
# nozzleLength=50e-3,
# nozzleDiameter=50e-3,
# plot=True,nbrPoints=50)
innerCoreDict = {
'type':'convergent',
'channelWidth':5e-3,
'flatLength':30e-3,
'innerAngle':21.8/180*np.pi
}
# geometry.rocketEngineConstructor(combustionChamberLength=100e-3,
# chamberDiameter=50e-3,
# convergentAngle=15,
# throatDiameter=25e-3,
# roughness=10e-6,
# nozzleLength=15e-3,
# nozzleDiameter=30e-3,
# innerCore=True,
# innerCoreDict=innerCoreDict,
# plot=True,nbrPoints=100)
geometry.rdeEngineConstructor(outerFlatLength=innerCoreDict['flatLength'],
chamberDiameter=100e-3,
convergentAngle=15,
throatDiameter=30.8e-3,
roughness=10e-6,
nozzleLength=15e-3,
nozzleDiameter=35e-3,
innerCore=True,
innerCoreDict=innerCoreDict,
plot=True,nbrPoints=100)
# geometry.andersonConstructor(plot=True)
if output:
return(geometry)
if __name__ == "__main__":
from scipy.signal import savgol_filter
import numpy as np
import matplotlib.pyplot as plt
geometry = _test_rocketEngineConstructor(output=True)
yp = geometry._outerCore
ym = geometry._innerCore
plt.plot(yp,'x')
# plt.plot(np.diff(yp),'x')
N = 7
yp[int((N-1)/2):-int((N-1)/2)] = np.convolve(yp, np.ones(N)/N, mode='valid')
plt.plot(yp,'x')
ym = np.convolve(ym, np.ones(N)/N, mode='valid')
# plt.plot(ym)
# plt.plot(ymhat)
| 43.206983 | 158 | 0.491227 | 1,897 | 17,326 | 4.439642 | 0.105957 | 0.041677 | 0.008074 | 0.027309 | 0.79328 | 0.780575 | 0.762645 | 0.751128 | 0.728093 | 0.705296 | 0 | 0.041028 | 0.37539 | 17,326 | 400 | 159 | 43.315 | 0.737202 | 0.120628 | 0 | 0.707641 | 0 | 0 | 0.037757 | 0.001384 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016611 | false | 0 | 0.043189 | 0 | 0.063123 | 0.019934 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
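The geometry constructors above guard against an over-steep convergent section; the limiting angle formula is easy to check in isolation. Here is a minimal sketch of that check — the helper name `max_convergent_angle` is mine, and a scalar `math` version stands in for the numpy expression in `rocketEngineConstructor`. The angle is kept in radians so a comparison against `convergentAngleRad` stays unit-consistent:

```python
import math

def max_convergent_angle(chamber_diameter, throat_diameter):
    """Largest convergent half-angle (radians) for which the 1.5*Rt throat
    arc can still meet the straight convergent wall tangentially.
    Mirrors the convergentAngleMax expression in rocketEngineConstructor."""
    Rt = throat_diameter / 2.0
    RC = chamber_diameter / 2.0
    num = (RC - 2.5 * Rt) ** 2
    den = (1.5 * Rt) ** 2 - num
    if den <= 0:
        # no tangent point exists for this chamber/throat ratio
        return float('nan')
    return math.atan(math.sqrt(num / den))

angle = max_convergent_angle(50e-3, 25e-3)  # dimensions from the test routine
print('max convergent angle: %.2f deg' % math.degrees(angle))
```

For the 50 mm chamber / 25 mm throat combination used in `_test_rocketEngineConstructor`, the limit comes out near 19.5 deg, so the 15 deg angle passed there is admissible.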
b8e9823ccb995e3c4bcbd54978cf42af774af9ab | 170 | py | Python | 01/00/2.py | pylangstudy/201707 | c1cc72667f1e0b6e8eef4ee85067d7fa4ca500b6 | [
"CC0-1.0"
] | null | null | null | 01/00/2.py | pylangstudy/201707 | c1cc72667f1e0b6e8eef4ee85067d7fa4ca500b6 | [
"CC0-1.0"
] | 46 | 2017-06-30T22:19:07.000Z | 2017-07-31T22:51:31.000Z | 01/00/2.py | pylangstudy/201707 | c1cc72667f1e0b6e8eef4ee85067d7fa4ca500b6 | [
"CC0-1.0"
] | null | null | null | import decimal
decimal.getcontext().prec = 36
print(decimal.Decimal(1) / decimal.Decimal(7))
decimal.getcontext().prec = 4
print(decimal.Decimal(1) / decimal.Decimal(7))
| 28.333333 | 46 | 0.752941 | 24 | 170 | 5.333333 | 0.375 | 0.546875 | 0.328125 | 0.3125 | 0.546875 | 0.546875 | 0.546875 | 0 | 0 | 0 | 0 | 0.044872 | 0.082353 | 170 | 5 | 47 | 34 | 0.775641 | 0 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.2 | 0.4 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
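The `2.py` snippet above mutates the global decimal context twice to change precision. When a precision change should not leak into later computations, `decimal.localcontext()` scopes it instead — a small sketch (independent of the file above, variable names are mine):

```python
import decimal

# Scope a precision change instead of mutating the global context,
# as 2.py does with decimal.getcontext().prec.
with decimal.localcontext() as ctx:
    ctx.prec = 6
    inside = decimal.Decimal(1) / decimal.Decimal(7)

# Outside the with-block the default 28-digit precision is back.
outside = decimal.Decimal(1) / decimal.Decimal(7)
print(inside)
print(outside)
```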
770c348a246465b127999a62c89543546eef45f1 | 112 | py | Python | alteryx_open_src_update_checker/tests/test_version.py | FeatureLabs/alteryx-open-src-update-checker | 928cce4f80b0d9c5d08291a40d758599a5e524c4 | [
"BSD-3-Clause"
] | 2 | 2019-05-20T21:57:33.000Z | 2019-07-15T13:44:07.000Z | alteryx_open_src_update_checker/tests/test_version.py | alteryx/alteryx-open-src-update-checker | 928cce4f80b0d9c5d08291a40d758599a5e524c4 | [
"BSD-3-Clause"
] | 6 | 2021-06-01T19:47:05.000Z | 2021-09-10T19:32:16.000Z | alteryx_open_src_update_checker/tests/test_version.py | FeatureLabs/alteryx-open-src-update-checker | 928cce4f80b0d9c5d08291a40d758599a5e524c4 | [
"BSD-3-Clause"
] | 2 | 2019-11-23T07:49:42.000Z | 2020-01-14T18:30:12.000Z | from alteryx_open_src_update_checker import __version__
def test_version():
assert __version__ == "2.1.0"
| 18.666667 | 55 | 0.776786 | 16 | 112 | 4.625 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.03125 | 0.142857 | 112 | 5 | 56 | 22.4 | 0.739583 | 0 | 0 | 0 | 0 | 0 | 0.044643 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
91fd39d62130074ff9d63c3c0110bd11e524f919 | 9,694 | py | Python | check_kernel_stats_by_ssh.py | balrawat/check-linux-by-ssh | 7743c2451dc4c1c59470315a85595b5633c13f90 | [
"MIT"
] | 37 | 2015-01-14T00:59:12.000Z | 2022-01-26T12:11:38.000Z | check_kernel_stats_by_ssh.py | balrawat/check-linux-by-ssh | 7743c2451dc4c1c59470315a85595b5633c13f90 | [
"MIT"
] | 38 | 2015-01-30T10:25:23.000Z | 2019-05-15T07:15:20.000Z | check_kernel_stats_by_ssh.py | balrawat/check-linux-by-ssh | 7743c2451dc4c1c59470315a85595b5633c13f90 | [
"MIT"
] | 45 | 2015-01-09T05:18:07.000Z | 2021-08-24T12:11:00.000Z | #!/usr/bin/env python2
# Copyright (C) 2013:
# Gabes Jean, naparuba@gmail.com
# Pasche Sebastien, sebastien.pasche@leshop.ch
#
# Permission is hereby granted, free of charge, to any person obtaining a
# copy of this software and associated documentation files (the "Software"),
# to deal in the Software without restriction, including without limitation
# the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the
# Software is furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
# DEALINGS IN THE SOFTWARE.
#
'''
This script checks kernel statistics over ssh without
having an agent on the other side
'''
import os
import sys
import optparse
# Ok try to load our directory to load the plugin utils.
my_dir = os.path.dirname(__file__)
sys.path.insert(0, my_dir)
try:
import schecks
except ImportError:
print "ERROR : this plugin needs the local schecks.py lib. Please install it"
sys.exit(2)
VERSION = "0.1"
DEFAULT_TEMP_FILE = '/tmp/__check_kernel_stats_by_ssh.tmp'
def get_kernel_stats(client):
# We are looking for such lines for the first run:
#1366283417 <----- current unixtime on distant server
#ls: cannot access /tmp/__disks_stats6: No such file or directory <----- there was not previous check
#cat: /tmp/__disks_stats6: No such file or directory <----- same here
#cpu 840802 25337 307315 6694839 157376 3 16239 0 0 0
#cpu0 212495 5980 75330 1673111 38077 0 4370 0 0 0
#intr 53421146 45 3 0 0 4 0 3 1 1 0 0 0 4 0 16351 0 1672609 0 201933 1177852 2955202 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 69827 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
#ctxt 171219536
#btime 1366876148
#processes 42956
#procs_running 1
#procs_blocked 2
#softirq 24231851 0 4331903 3491 3371608 425905 0 1697846 3278376 9334 11113388
# After the first one we will have
#1366283725 <----- current unixtime on distant server
#1366283423 <----- the modification time of the CHK_FILE, so we know how much time we got between the two checks
#cpu 840802 25337 307315 6694839 157376 3 16239 0 0 0
#cpu0 212495 5980 75330 1673111 38077 0 4370 0 0 0
#intr 53421146 45 3 0 0 4 0 3 1 1 0 0 0 4 0 16351 0 1672609 0 201933 1177852 2955202 0 0 3 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 69827 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
#ctxt 171219536
#btime 1366876148
#processes 42956
#procs_running 1
#procs_blocked 2
#softirq 24231851 0 4331903 3491 3371608 425905 0 1697846 3278376 9334 11113388
# Beware of the export!
raw = 'CHK_FILE=%s;' % DEFAULT_TEMP_FILE
raw += r"""date +%s;ls -l --time-style=+%s $CHK_FILE | awk '{print $6}';cat $CHK_FILE; cat /proc/stat /proc/vmstat| tee $CHK_FILE"""
stdin, stdout, stderr = client.exec_command('export LC_LANG=C && unset LANG && %s' % raw)
errors = [l for l in stderr]
for line in errors:
if line.startswith('ls:') and line.endswith('No such file or directory'):
print "OK: the check is initializing"
sys.exit(3)
stats = {'ctxt':[], 'processes':[], 'pgfault':[], 'pgmajfault':[]}
lines = [line for line in stdout]
if len(lines) < 2:
print "Error: something went wrong while launching the remote command"
sys.exit(2)
# Remember to close the client when finished
client.close()
# We try to get the diff between the file date and now
now = lines.pop(0)
before = lines.pop(0)
try:
diff = int(now) - int(before)
except ValueError:
print "OK: the check is initializing"
sys.exit(3)
# Such a thing should not normally happen, but we don't really know
if diff <= 0:
print "OK: the check is initializing"
diff = 300
# Let's parse all of this
for line in lines:
line = line.strip()
if not line:
continue
tmp = line.split(' ', 1)
if tmp[0] in ['ctxt', 'processes', 'pgfault', 'pgmajfault']:
stats[tmp[0]].append(int(tmp[1]))
return diff, stats
parser = optparse.OptionParser(
"%prog [options]", version="%prog " + VERSION)
parser.add_option('-H', '--hostname',
dest="hostname", help='Hostname to connect to')
parser.add_option('-p', '--port',
dest="port", type="int", default=22,
help='SSH port to connect to. Default : 22')
parser.add_option('-i', '--ssh-key',
dest="ssh_key_file",
help='SSH key file to use. By default will take ~/.ssh/id_rsa.')
parser.add_option('-u', '--user',
dest="user", help='remote use to use. By default shinken.')
parser.add_option('-P', '--passphrase',
dest="passphrase", help='SSH key passphrase. By default will use void')
if __name__ == '__main__':
# Ok first job : parse args
opts, args = parser.parse_args()
if args:
parser.error("Does not accept any argument.")
port = opts.port
hostname = opts.hostname or ''
ssh_key_file = opts.ssh_key_file or os.path.expanduser('~/.ssh/id_rsa')
user = opts.user or 'shinken'
passphrase = opts.passphrase or ''
# Ok now connect, and try to get the kernel stats values
client = schecks.connect(hostname, port, ssh_key_file, passphrase, user)
diff, stats = get_kernel_stats(client)
# Maybe we failed at getting data
if not stats:
print "Error : cannot fetch kernel stats values from host"
sys.exit(2)
# We are putting diff into float so we are sure we will have float everywhere
diff = float(diff)
perfdata = []
for k in ['ctxt', 'processes', 'pgfault', 'pgmajfault']:
v = stats[k]
if len(v) < 2:
# Ok maybe this value just disappears or pops up, not a problem
continue
p_v = v.pop(0)
n_v = v.pop(0)
# We want only positive values; a negative delta means there was a problem
# like a reboot for example, so counters are back to 0
d_v = max(0, n_v - p_v) / diff
# Now dump the kernel stats rates as perfdata
perfdata.append('%s_by_s=%d%s/s' % (k, d_v, k))
print "OK | %s" % (' '.join(perfdata))
sys.exit(0)
| 51.56383 | 1,474 | 0.595936 | 2,399 | 9,694 | 2.379325 | 0.15965 | 0.485634 | 0.721093 | 0.95445 | 0.396286 | 0.370883 | 0.365802 | 0.365802 | 0.353539 | 0.340575 | 0 | 0.294401 | 0.336703 | 9,694 | 187 | 1,475 | 51.839572 | 0.593313 | 0.611203 | 0 | 0.141176 | 0 | 0.011765 | 0.272702 | 0.010028 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.047059 | 0.058824 | null | null | 0.094118 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
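The core of `check_kernel_stats_by_ssh.py` is turning two snapshots of cumulative `/proc` counters into per-second rates, clamping negative deltas that appear after a counter reset. A standalone Python 3 sketch of that step — the helper name `counter_rate` is mine, and the sample values are taken from the `/proc/stat` excerpt in the comments above:

```python
def counter_rate(prev, curr, elapsed_s):
    """Per-second rate between two cumulative counter samples.
    A negative delta (counter reset, e.g. after a reboot) is clamped
    to 0, like the max(0, n_v - p_v) / diff step in the plugin."""
    return max(0, curr - prev) / float(elapsed_s)

# Two samples taken 300 s apart (values shaped like the /proc/stat dump above)
before = {'ctxt': 171219536, 'processes': 42956}
after = {'ctxt': 171220136, 'processes': 42971}
elapsed = 300.0

perfdata = ' '.join(
    '%s_by_s=%d%s/s' % (k, counter_rate(before[k], after[k], elapsed), k)
    for k in sorted(before)
)
print("OK | %s" % perfdata)
```

The clamp matters operationally: without it, a reboot between two checks would report a huge negative rate instead of a harmless zero.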
6232b733b96e9f0ac351939c8721bbd066d2500f | 36 | py | Python | pyball/models/league/__init__.py | SebastianDang/PyBall | d1965aa01477b5ee0db9c0463ec584a7e3997395 | [
"MIT"
] | 74 | 2018-03-04T22:58:46.000Z | 2021-07-06T12:28:50.000Z | pyball/models/league/__init__.py | SebastianDang/PyBall | d1965aa01477b5ee0db9c0463ec584a7e3997395 | [
"MIT"
] | 18 | 2018-03-10T19:17:54.000Z | 2020-01-04T15:42:47.000Z | pyball/models/league/__init__.py | SebastianDang/PyBall | d1965aa01477b5ee0db9c0463ec584a7e3997395 | [
"MIT"
] | 13 | 2018-03-06T02:39:38.000Z | 2020-01-17T04:38:53.000Z | from .season_date import SeasonDate
| 18 | 35 | 0.861111 | 5 | 36 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 36 | 1 | 36 | 36 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
624260b93c1ea4383b96531f9cba7dcc1b6e1bd6 | 562 | py | Python | code/src/whats_your_name/config.py | pimlock/whats-your-name | e991f092750746b90a966d318a3dbe63a6c53527 | [
"MIT"
] | 16 | 2018-03-04T13:49:05.000Z | 2021-07-04T15:22:11.000Z | code/src/whats_your_name/config.py | pimlock/whats-your-name | e991f092750746b90a966d318a3dbe63a6c53527 | [
"MIT"
] | null | null | null | code/src/whats_your_name/config.py | pimlock/whats-your-name | e991f092750746b90a966d318a3dbe63a6c53527 | [
"MIT"
] | 3 | 2018-03-13T01:58:50.000Z | 2020-06-17T07:14:58.000Z | import os
class Config(object):
def __init__(self, face_collection_id, faces_bucket_name):
self._face_collection_id = face_collection_id
self._faces_bucket_name = faces_bucket_name
@property
def face_collection_id(self):
return self._face_collection_id
@property
def face_bucket_name(self):
return self._faces_bucket_name
@staticmethod
def create_from_env():
return Config(
os.environ.get('REKOGNITION_COLLECTION_ID'),
os.environ.get('FACES_BUCKET_NAME')
)
| 24.434783 | 62 | 0.683274 | 69 | 562 | 5.072464 | 0.333333 | 0.205714 | 0.228571 | 0.171429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.243772 | 562 | 22 | 63 | 25.545455 | 0.823529 | 0 | 0 | 0.117647 | 0 | 0 | 0.074733 | 0.044484 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.058824 | 0.176471 | 0.529412 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
625292e524f5fe4dda030d79137286421f86de29 | 126 | py | Python | python/coordConv/__init__.py | r-owen/coordConv | b61692b52fa41bb1a6b242d37d7bcc6d500206ad | [
"BSD-3-Clause"
] | 2 | 2018-07-11T17:19:54.000Z | 2019-04-18T17:29:03.000Z | python/coordConv/__init__.py | r-owen/coordConv | b61692b52fa41bb1a6b242d37d7bcc6d500206ad | [
"BSD-3-Clause"
] | null | null | null | python/coordConv/__init__.py | r-owen/coordConv | b61692b52fa41bb1a6b242d37d7bcc6d500206ad | [
"BSD-3-Clause"
] | 1 | 2019-04-18T17:29:06.000Z | 2019-04-18T17:29:06.000Z | from __future__ import absolute_import, division
from .version import *
from .coordConvLib import *
from .testUtils import *
| 21 | 48 | 0.801587 | 15 | 126 | 6.4 | 0.533333 | 0.208333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 126 | 5 | 49 | 25.2 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
654ef3829ecb01ae873234d7479edbedc4f435f5 | 78 | py | Python | spikeextractors/extractors/klustaextractors/__init__.py | zekearneodo/spikeextractors | d30aa85e69d0331fffdb58a03a2bb628f93b405e | [
"MIT"
] | 145 | 2018-12-06T23:12:54.000Z | 2022-02-10T22:57:35.000Z | spikeextractors/extractors/klustaextractors/__init__.py | zekearneodo/spikeextractors | d30aa85e69d0331fffdb58a03a2bb628f93b405e | [
"MIT"
] | 396 | 2018-11-26T11:46:30.000Z | 2022-01-04T07:27:47.000Z | spikeextractors/extractors/klustaextractors/__init__.py | zekearneodo/spikeextractors | d30aa85e69d0331fffdb58a03a2bb628f93b405e | [
"MIT"
] | 67 | 2018-11-19T12:38:01.000Z | 2021-09-25T03:18:22.000Z | from .klustaextractors import KlustaSortingExtractor, KlustaRecordingExtractor | 78 | 78 | 0.923077 | 5 | 78 | 14.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051282 | 78 | 1 | 78 | 78 | 0.972973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
65604e67bb45d57e1bf54da042a77023838fbac0 | 220 | py | Python | flytekit/types/pickle/__init__.py | ggydush-fn/flytekit | 6530601c2538a5d804127a97f63291730b1ba1d8 | [
"Apache-2.0"
] | 1 | 2021-11-11T10:10:10.000Z | 2021-11-11T10:10:10.000Z | flytekit/types/pickle/__init__.py | ggydush-fn/flytekit | 6530601c2538a5d804127a97f63291730b1ba1d8 | [
"Apache-2.0"
] | null | null | null | flytekit/types/pickle/__init__.py | ggydush-fn/flytekit | 6530601c2538a5d804127a97f63291730b1ba1d8 | [
"Apache-2.0"
] | null | null | null | """
Flytekit Pickle Type
==========================================================
.. currentmodule:: flytekit.types.pickle
.. autosummary::
:toctree: generated/
FlytePickle
"""
from .pickle import FlytePickle
| 16.923077 | 58 | 0.522727 | 15 | 220 | 7.666667 | 0.733333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122727 | 220 | 12 | 59 | 18.333333 | 0.595855 | 0.809091 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
658a1ac4fdb050ba322eee1da453d4e8abe86834 | 248 | py | Python | inferno/io/core/data_utils.py | 0h-n0/inferno | f466c84ed72ff92f9113891a96ce58e19eeeff1e | [
"Apache-2.0"
] | 204 | 2017-10-10T20:58:52.000Z | 2021-12-07T03:01:19.000Z | inferno/io/core/data_utils.py | 0h-n0/inferno | f466c84ed72ff92f9113891a96ce58e19eeeff1e | [
"Apache-2.0"
] | 86 | 2017-10-11T11:32:36.000Z | 2021-11-15T17:47:25.000Z | inferno/io/core/data_utils.py | 0h-n0/inferno | f466c84ed72ff92f9113891a96ce58e19eeeff1e | [
"Apache-2.0"
] | 30 | 2017-11-16T23:21:30.000Z | 2021-11-15T15:11:00.000Z |
def implements_sync_primitives(dataset):
return hasattr(dataset, 'sync_with') and callable(getattr(dataset, 'sync_with'))
def defines_base_sequence(dataset):
return hasattr(dataset, 'base_sequence') and dataset.base_sequence is not None
| 31 | 84 | 0.78629 | 33 | 248 | 5.666667 | 0.515152 | 0.192513 | 0.213904 | 0.28877 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.116935 | 248 | 7 | 85 | 35.428571 | 0.853881 | 0 | 0 | 0 | 0 | 0 | 0.125506 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
65afcd80145dfd9e837166148305d849a5808935 | 195 | py | Python | pybaseball/analysis/projections/__init__.py | reddigari/pybaseball | 2d878cf3505ce0a5e657694ae967d6275dc3c211 | [
"MIT"
] | 650 | 2017-06-29T20:05:19.000Z | 2022-03-31T03:27:25.000Z | pybaseball/analysis/projections/__init__.py | reddigari/pybaseball | 2d878cf3505ce0a5e657694ae967d6275dc3c211 | [
"MIT"
] | 216 | 2017-10-21T05:05:08.000Z | 2022-03-31T04:04:53.000Z | pybaseball/analysis/projections/__init__.py | reddigari/pybaseball | 2d878cf3505ce0a5e657694ae967d6275dc3c211 | [
"MIT"
] | 214 | 2017-07-18T21:40:01.000Z | 2022-03-29T03:19:55.000Z | from .marcels.marcels_batting import MarcelProjectionsBatting
from .marcels.marcels_pitching import MarcelProjectionsPitching
__all__ = ["MarcelProjectionsBatting", "MarcelProjectionsPitching"]
| 39 | 67 | 0.871795 | 15 | 195 | 10.933333 | 0.533333 | 0.134146 | 0.219512 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 195 | 4 | 68 | 48.75 | 0.901099 | 0 | 0 | 0 | 0 | 0 | 0.251282 | 0.251282 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 1 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
65b1755ec4a71b11c4db9692c50f366bfde6879f | 40,401 | py | Python | tripleo_common/tests/utils/test_config.py | d0ugal/tripleo-common | dcf76e1e905613170d2011d0430bed5d35fe1006 | [
"Apache-2.0"
] | 2 | 2016-05-25T14:55:27.000Z | 2020-04-13T09:53:09.000Z | tripleo_common/tests/utils/test_config.py | d0ugal/tripleo-common | dcf76e1e905613170d2011d0430bed5d35fe1006 | [
"Apache-2.0"
] | null | null | null | tripleo_common/tests/utils/test_config.py | d0ugal/tripleo-common | dcf76e1e905613170d2011d0430bed5d35fe1006 | [
"Apache-2.0"
] | null | null | null | # Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import datetime
import fixtures
import mock
import os
import uuid
import warnings
import yaml
from mock import call
from mock import patch
from tripleo_common import constants
from tripleo_common.tests import base
from tripleo_common.tests.fake_config import fakes
from tripleo_common.utils import config as ooo_config
from tripleo_common.utils.safe_import import git
class TestConfig(base.TestCase):
def setUp(self):
super(TestConfig, self).setUp()
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch.object(ooo_config.shutil, 'copyfile')
@patch.object(ooo_config.Config, '_mkdir')
@patch.object(ooo_config.Config, '_open_file')
@patch.object(ooo_config.shutil, 'rmtree')
def test_overcloud_config_generate_config(self,
mock_rmtree,
mock_open,
mock_mkdir,
mock_copyfile,
mock_git_init):
config_type_list = ['config_settings', 'global_config_settings',
'logging_sources', 'monitoring_subscriptions',
'service_config_settings',
'service_metadata_settings',
'service_names',
'upgrade_batch_tasks', 'upgrade_tasks',
'external_deploy_tasks']
fake_role = [role for role in
fakes.FAKE_STACK['outputs'][1]['output_value']]
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.config.download_config('overcloud', '/tmp/tht', config_type_list)
mock_git_init.assert_called_once_with('/tmp/tht')
expected_mkdir_calls = [call('/tmp/tht/%s' % r) for r in fake_role]
mock_mkdir.assert_has_calls(expected_mkdir_calls, any_order=True)
expected_calls = []
for config in config_type_list:
for role in fake_role:
if 'external' in config:
continue
elif config == 'step_config':
expected_calls += [call('/tmp/tht/%s/%s.pp' %
(role, config))]
elif config == 'param_config':
expected_calls += [call('/tmp/tht/%s/%s.json' %
(role, config))]
else:
expected_calls += [call('/tmp/tht/%s/%s.yaml' %
(role, config))]
mock_open.assert_has_calls(expected_calls, any_order=True)
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch.object(ooo_config.shutil, 'copyfile')
@patch.object(ooo_config.Config, '_mkdir')
@patch.object(ooo_config.Config, '_open_file')
@patch.object(ooo_config.shutil, 'rmtree')
def test_overcloud_config_one_config_type(self,
mock_rmtree,
mock_open,
mock_mkdir,
mock_copyfile,
mock_git_init):
expected_config_type = 'config_settings'
fake_role = [role for role in
fakes.FAKE_STACK['outputs'][1]['output_value']]
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.config.download_config('overcloud', '/tmp/tht',
['config_settings'])
expected_mkdir_calls = [call('/tmp/tht/%s' % r) for r in fake_role]
expected_calls = [call('/tmp/tht/%s/%s.yaml'
% (r, expected_config_type))
for r in fake_role]
mock_mkdir.assert_has_calls(expected_mkdir_calls, any_order=True)
mock_open.assert_has_calls(expected_calls, any_order=True)
mock_git_init.assert_called_once_with('/tmp/tht')
@patch.object(ooo_config.git, 'Repo')
@mock.patch('os.mkdir')
@mock.patch('six.moves.builtins.open')
@patch.object(ooo_config.shutil, 'rmtree')
def test_overcloud_config_wrong_config_type(self, mock_rmtree,
mock_open, mock_mkdir,
mock_repo):
args = {'name': 'overcloud', 'config_dir': '/tmp/tht',
'config_type': ['bad_config']}
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.assertRaises(
KeyError,
self.config.download_config, *args)
def test_overcloud_config_upgrade_tasks(self):
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.tmp_dir = self.useFixture(fixtures.TempDir()).path
fake_role = [role for role in
fakes.FAKE_STACK['outputs'][1]['output_value']]
expected_tasks = {'FakeController': {0: [],
1: [{'name': 'Stop fake service',
'service': 'name=fake '
'state=stopped',
'when': 'step|int == 1'}],
2: [],
3: [],
4: [],
5: []},
'FakeCompute': {0: [],
1: [{'name': 'Stop fake service',
'service': 'name=fake '
'state=stopped',
'when': ['nova_api_enabled.rc'
' == 0',
'httpd_enabled.rc'
' != 0',
'step|int == 1']}],
2: [{'name': 'Stop nova-compute '
'service',
'service': 'name=openstack-'
'nova-compute state=stopped',
'when': ['nova_compute_'
'enabled.rc == 0',
'step|int == 2',
'existing',
'list']}],
3: [],
4: [],
5: []}}
for role in fake_role:
filedir = os.path.join(self.tmp_dir, role)
os.makedirs(filedir)
for step in range(constants.UPGRADE_STEPS_MAX):
filepath = os.path.join(filedir, "upgrade_tasks_step%s.yaml"
% step)
playbook_tasks = self.config._write_tasks_per_step(
fakes.FAKE_STACK['outputs'][1]['output_value'][role]
['upgrade_tasks'], role, filepath, step)
self.assertTrue(os.path.isfile(filepath))
self.assertEqual(expected_tasks[role][step], playbook_tasks)
def test_get_server_names(self):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
self.config.stack_outputs = {
'RoleNetHostnameMap': {
'Controller': {
'ctlplane': [
'c0.ctlplane.localdomain',
'c1.ctlplane.localdomain',
'c2.ctlplane.localdomain']}},
'ServerIdData': {
'server_ids': {
'Controller': [
'8269f736',
'2af0a373',
'c8479674']}}}
server_names = self.config.get_server_names()
expected = {'2af0a373': 'c1', '8269f736': 'c0', 'c8479674': 'c2'}
self.assertEqual(expected, server_names)
def test_get_role_config(self):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
self.config.stack_outputs = {'RoleConfig': None}
role_config = self.config.get_role_config()
self.assertEqual({}, role_config)
def test_get_deployment_data(self):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
stack = 'overcloud'
first = mock.MagicMock()
first.creation_time = datetime.datetime.now() - datetime.timedelta(2)
second = mock.MagicMock()
second.creation_time = datetime.datetime.now() - datetime.timedelta(1)
third = mock.MagicMock()
third.creation_time = datetime.datetime.now()
        # Set return_value in unsorted order; the function is expected to
        # sort, and that is the behavior we want to test
heat.resources.list.return_value = [second, third, first]
deployment_data = self.config.get_deployment_data(stack)
self.assertTrue(heat.resources.list.called)
self.assertEqual(
heat.resources.list.call_args,
mock.call(stack,
filters=dict(name=constants.TRIPLEO_DEPLOYMENT_RESOURCE),
nested_depth=constants.NESTED_DEPTH,
with_detail=True))
self.assertEqual(deployment_data,
[first, second, third])
def _get_config_data(self, datafile):
config_data_path = os.path.join(
os.path.dirname(os.path.realpath(__file__)),
'data',
datafile)
with open(config_data_path) as fin:
config_data = yaml.safe_load(fin.read())
deployment_data = []
for deployment in config_data['deployments']:
deployment_mock = mock.MagicMock()
deployment_mock.id = deployment['deployment']
deployment_mock.attributes = dict(
value=dict(server=deployment['server'],
deployment=deployment['deployment'],
config=deployment['config'],
name=deployment['name']))
deployment_data.append(deployment_mock)
configs = config_data['configs']
return deployment_data, configs
def _get_deployment_id(self, deployment):
return deployment.attributes['value']['deployment']
def _get_config_dict(self, deployment_id):
deployment = list(filter(
lambda d: d.id == deployment_id, self.deployments))[0]
config = self.configs[deployment.attributes['value']['config']].copy()
config['inputs'] = []
config['inputs'].append(dict(
name='deploy_server_id',
value=deployment.attributes['value']['server']))
return config
def _get_yaml_file(self, file_name):
file_path = os.path.join(
os.path.dirname(os.path.realpath(__file__)),
'data',
file_name)
with open(file_path) as fin:
return yaml.safe_load(fin.read())
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch('tripleo_common.utils.config.Config.get_deployment_resource_id')
@patch('tripleo_common.utils.config.Config.get_config_dict')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
def test_config_download(self, mock_deployment_data, mock_config_dict,
mock_deployment_resource_id,
mock_git_init):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
stack = mock.MagicMock()
heat.stacks.get.return_value = stack
stack.outputs = [
{'output_key': 'RoleNetHostnameMap',
'output_value': {
'Controller': {
'ctlplane': [
'overcloud-controller-0.ctlplane.localdomain']},
'Compute': {
'ctlplane': [
'overcloud-novacompute-0.ctlplane.localdomain',
'overcloud-novacompute-1.ctlplane.localdomain',
'overcloud-novacompute-2.ctlplane.localdomain']}}},
{'output_key': 'ServerIdData',
'output_value': {
'server_ids': {
'Controller': [
'00b3a5e1-5e8e-4b55-878b-2fa2271f15ad'],
'Compute': [
'a7db3010-a51f-4ae0-a791-2364d629d20d',
'8b07cd31-3083-4b88-a433-955f72039e2c',
'169b46f8-1965-4d90-a7de-f36fb4a830fe']}}},
{'output_key': 'HostnameNetworkConfigMap',
'output_value': {}},
{'output_key': 'AnsibleHostVarsMap',
'output_value': {
'Controller': {
'overcloud-controller-0': {
'uuid': 0,
'my_var': 'foo'}},
'Compute': {
'overcloud-novacompute-0': {
'uuid': 1},
'overcloud-novacompute-1': {
'uuid': 2},
'overcloud-novacompute-2': {
'uuid': 3}}}},
{'output_key': 'RoleGroupVars',
'output_value': {
'Controller': {
'any_errors_fatal': True,
'max_fail_percentage': 15},
'Compute': {
'any_errors_fatal': True,
'max_fail_percentage': 15},
}}]
deployment_data, configs = \
self._get_config_data('config_data.yaml')
self.configs = configs
self.deployments = deployment_data
mock_deployment_data.return_value = deployment_data
mock_deployment_resource_id.side_effect = self._get_deployment_id
mock_config_dict.side_effect = self._get_config_dict
self.tmp_dir = self.useFixture(fixtures.TempDir()).path
tmp_path = self.config.download_config(stack, self.tmp_dir)
mock_git_init.assert_called_once_with(self.tmp_dir)
for f in ['Controller',
'Compute', ]:
with open(os.path.join(tmp_path, 'group_vars', f)) as fin:
self.assertEqual(
self._get_yaml_file(f),
yaml.safe_load(fin.read()))
for f in ['overcloud-controller-0',
'overcloud-novacompute-0',
'overcloud-novacompute-1',
'overcloud-novacompute-2']:
with open(os.path.join(tmp_path, 'host_vars', f)) as fin:
self.assertEqual(
self._get_yaml_file(os.path.join('host_vars', f)),
yaml.safe_load(fin.read()))
for d in ['ControllerHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost',
'MyPostConfig']:
with open(os.path.join(tmp_path, 'Controller',
'overcloud-controller-0',
d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-controller-0',
d)))
for d in ['ComputeHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost']:
with open(os.path.join(tmp_path, 'Compute',
'overcloud-novacompute-0',
d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-novacompute-0',
d)))
for d in ['ComputeHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost']:
with open(os.path.join(tmp_path, 'Compute',
'overcloud-novacompute-1',
d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-novacompute-1',
d)))
for d in ['ComputeHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost',
'AnsibleDeployment']:
with open(os.path.join(tmp_path, 'Compute',
'overcloud-novacompute-2',
d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-novacompute-2',
d)))
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch('tripleo_common.utils.config.Config.get_deployment_resource_id')
@patch('tripleo_common.utils.config.Config.get_config_dict')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
def test_config_download_os_apply_config(
self, mock_deployment_data, mock_config_dict,
mock_deployment_resource_id, mock_git_init):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
stack = mock.MagicMock()
heat.stacks.get.return_value = stack
heat.resources.get.return_value = mock.MagicMock()
stack.outputs = [
{'output_key': 'RoleNetHostnameMap',
'output_value': {
'Controller': {
'ctlplane': [
'overcloud-controller-0.ctlplane.localdomain']},
'Compute': {
'ctlplane': [
'overcloud-novacompute-0.ctlplane.localdomain',
'overcloud-novacompute-1.ctlplane.localdomain',
'overcloud-novacompute-2.ctlplane.localdomain']}}},
{'output_key': 'ServerIdData',
'output_value': {
'server_ids': {
'Controller': [
'00b3a5e1-5e8e-4b55-878b-2fa2271f15ad'],
'Compute': [
'a7db3010-a51f-4ae0-a791-2364d629d20d',
'8b07cd31-3083-4b88-a433-955f72039e2c',
'169b46f8-1965-4d90-a7de-f36fb4a830fe']}}},
{'output_key': 'HostnameNetworkConfigMap',
'output_value': {}},
{'output_key': 'RoleGroupVars',
'output_value': {
'Controller': {
'any_errors_fatal': 'yes',
'max_fail_percentage': 15},
'Compute': {
'any_errors_fatal': 'yes',
'max_fail_percentage': 15},
}}]
deployment_data, configs = \
self._get_config_data('config_data.yaml')
# Add a group:os-apply-config config and deployment
config_uuid = str(uuid.uuid4())
configs[config_uuid] = dict(
id=config_uuid,
config=dict(a='a'),
group='os-apply-config',
outputs=[])
deployment_uuid = str(uuid.uuid4())
deployment_mock = mock.MagicMock()
deployment_mock.id = deployment_uuid
deployment_mock.attributes = dict(
value=dict(server='00b3a5e1-5e8e-4b55-878b-2fa2271f15ad',
deployment=deployment_uuid,
config=config_uuid,
name='OsApplyConfigDeployment'))
deployment_data.append(deployment_mock)
self.configs = configs
self.deployments = deployment_data
mock_deployment_data.return_value = deployment_data
mock_config_dict.side_effect = self._get_config_dict
mock_deployment_resource_id.side_effect = self._get_deployment_id
self.tmp_dir = self.useFixture(fixtures.TempDir()).path
with warnings.catch_warnings(record=True) as w:
self.config.download_config(stack, self.tmp_dir)
mock_git_init.assert_called_once_with(self.tmp_dir)
# check that we got at least one of the warnings that we expected
# to throw
self.assertGreaterEqual(len(w), 1)
self.assertGreaterEqual(len([x for x in w
if issubclass(x.category,
DeprecationWarning)]),
1)
self.assertGreaterEqual(len([x for x in w
if "group:os-apply-config"
in str(x.message)]),
1)
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch('tripleo_common.utils.config.Config.get_deployment_resource_id')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
def test_config_download_no_deployment_name(
self, mock_deployment_data, mock_deployment_resource_id,
mock_git_init):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
stack = mock.MagicMock()
heat.stacks.get.return_value = stack
heat.resources.get.return_value = mock.MagicMock()
deployment_data, _ = self._get_config_data('config_data.yaml')
        # Delete the name of the first deployment and its parent.
del deployment_data[0].attributes['value']['name']
deployment_data[0].parent_resource = None
self.deployments = deployment_data
mock_deployment_data.return_value = deployment_data
mock_deployment_resource_id.side_effect = self._get_deployment_id
self.tmp_dir = self.useFixture(fixtures.TempDir()).path
self.assertRaises(ValueError,
self.config.download_config, stack, self.tmp_dir)
mock_git_init.assert_called_once_with(self.tmp_dir)
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch('tripleo_common.utils.config.Config.get_deployment_resource_id')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
def test_config_download_warn_grandparent_resource_name(
self, mock_deployment_data, mock_deployment_resource_id,
mock_git_init):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
stack = mock.MagicMock()
heat.stacks.get.return_value = stack
heat.resources.get.return_value = mock.MagicMock()
deployment_data, _ = self._get_config_data('config_data.yaml')
# Set the name of the deployment to an integer to trigger looking up
# the grandparent resource name
deployment_data[0].attributes['value']['name'] = 1
self.deployments = deployment_data
mock_deployment_data.return_value = deployment_data
mock_deployment_resource_id.side_effect = self._get_deployment_id
self.tmp_dir = self.useFixture(fixtures.TempDir()).path
with warnings.catch_warnings(record=True) as w:
self.assertRaises(ValueError,
self.config.download_config, stack, self.tmp_dir)
self.assertGreaterEqual(len(w), 1)
self.assertGreaterEqual(len([x for x in w
if "grandparent"
in str(x.message)]),
1)
mock_git_init.assert_called_once_with(self.tmp_dir)
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch('tripleo_common.utils.config.Config.get_deployment_resource_id')
@patch('tripleo_common.utils.config.Config.get_config_dict')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
def test_config_download_no_deployment_uuid(self, mock_deployment_data,
mock_config_dict,
mock_deployment_resource_id,
mock_git_init):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
stack = mock.MagicMock()
heat.stacks.get.return_value = stack
heat.resources.get.return_value = mock.MagicMock()
stack.outputs = [
{'output_key': 'RoleNetHostnameMap',
'output_value': {
'Controller': {
'ctlplane': [
'overcloud-controller-0.ctlplane.localdomain']},
'Compute': {
'ctlplane': [
'overcloud-novacompute-0.ctlplane.localdomain',
'overcloud-novacompute-1.ctlplane.localdomain',
'overcloud-novacompute-2.ctlplane.localdomain']}}},
{'output_key': 'ServerIdData',
'output_value': {
'server_ids': {
'Controller': [
'00b3a5e1-5e8e-4b55-878b-2fa2271f15ad'],
'Compute': [
'a7db3010-a51f-4ae0-a791-2364d629d20d',
'8b07cd31-3083-4b88-a433-955f72039e2c',
'169b46f8-1965-4d90-a7de-f36fb4a830fe']}}},
{'output_key': 'HostnameNetworkConfigMap',
'output_value': {}},
{'output_key': 'RoleGroupVars',
'output_value': {
'Controller': {
'any_errors_fatal': 'yes',
'max_fail_percentage': 15},
'Compute': {
'any_errors_fatal': 'yes',
'max_fail_percentage': 15},
}}]
deployment_data, configs = self._get_config_data('config_data.yaml')
# Set the deployment to TripleOSoftwareDeployment for the first
# deployment
deployment_data[0].attributes['value']['deployment'] = \
'TripleOSoftwareDeployment'
# Set the physical_resource_id as '' for the second deployment
deployment_data[1].attributes['value']['deployment'] = ''
self.configs = configs
self.deployments = deployment_data
mock_deployment_data.return_value = deployment_data
mock_config_dict.side_effect = self._get_config_dict
mock_deployment_resource_id.side_effect = self._get_deployment_id
self.tmp_dir = self.useFixture(fixtures.TempDir()).path
with warnings.catch_warnings(record=True) as w:
self.config.download_config(stack, self.tmp_dir)
assert "Skipping deployment" in str(w[-1].message)
assert "Skipping deployment" in str(w[-2].message)
@patch.object(ooo_config.Config, 'initialize_git_repo')
@patch.object(ooo_config.git, 'Repo')
@patch.object(ooo_config.shutil, 'copyfile')
@patch.object(ooo_config.Config, '_mkdir')
@patch.object(ooo_config.Config, '_open_file')
@patch.object(ooo_config.shutil, 'rmtree')
@patch.object(ooo_config.os.path, 'exists')
def test_overcloud_config_dont_preserve_config(self,
mock_os_path_exists,
mock_rmtree,
mock_open,
mock_mkdir,
mock_copyfile,
mock_repo,
mock_git_init):
config_type_list = ['config_settings', 'global_config_settings',
'logging_sources', 'monitoring_subscriptions',
'service_config_settings',
'service_metadata_settings',
'service_names',
'upgrade_batch_tasks', 'upgrade_tasks',
'external_deploy_tasks']
fake_role = [role for role in
fakes.FAKE_STACK['outputs'][1]['output_value']]
mock_os_path_exists.get.return_value = True
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.config.download_config('overcloud', '/tmp/tht', config_type_list,
False)
mock_git_init.assert_called_once_with('/tmp/tht')
expected_rmtree_calls = [call('/tmp/tht')]
mock_rmtree.assert_has_calls(expected_rmtree_calls)
expected_mkdir_calls = [call('/tmp/tht/%s' % r) for r in fake_role]
mock_mkdir.assert_has_calls(expected_mkdir_calls, any_order=True)
expected_calls = []
for config in config_type_list:
for role in fake_role:
if 'external' in config:
continue
elif config == 'step_config':
expected_calls += [call('/tmp/tht/%s/%s.pp' %
(role, config))]
elif config == 'param_config':
expected_calls += [call('/tmp/tht/%s/%s.json' %
(role, config))]
else:
expected_calls += [call('/tmp/tht/%s/%s.yaml' %
(role, config))]
mock_open.assert_has_calls(expected_calls, any_order=True)
@patch.object(ooo_config.shutil, 'rmtree')
@patch.object(ooo_config.os.path, 'exists')
def test_create_config_dir(self, mock_os_path_exists, mock_rmtree):
mock_os_path_exists.get.return_value = True
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.config.create_config_dir('/tmp/tht', False)
expected_rmtree_calls = [call('/tmp/tht')]
mock_rmtree.assert_has_calls(expected_rmtree_calls)
def test_initialize_git_repo(self):
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.tmp_dir = self.useFixture(fixtures.TempDir()).path
repo = self.config.initialize_git_repo(self.tmp_dir)
self.assertIsInstance(repo, git.Repo)
@patch('tripleo_common.utils.config.Config.get_config_dict')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
def test_write_config(self, mock_deployment_data, mock_config_dict):
heat = mock.MagicMock()
self.config = ooo_config.Config(heat)
stack = mock.MagicMock()
heat.stacks.get.return_value = stack
stack.outputs = [
{'output_key': 'RoleNetHostnameMap',
'output_value': {
'Controller': {
'ctlplane': [
'overcloud-controller-0.ctlplane.localdomain']},
'Compute': {
'ctlplane': [
'overcloud-novacompute-0.ctlplane.localdomain',
'overcloud-novacompute-1.ctlplane.localdomain',
'overcloud-novacompute-2.ctlplane.localdomain']}}},
{'output_key': 'ServerIdData',
'output_value': {
'server_ids': {
'Controller': [
'00b3a5e1-5e8e-4b55-878b-2fa2271f15ad'],
'Compute': [
'a7db3010-a51f-4ae0-a791-2364d629d20d',
'8b07cd31-3083-4b88-a433-955f72039e2c',
'169b46f8-1965-4d90-a7de-f36fb4a830fe']}}},
{'output_key': 'RoleGroupVars',
'output_value': {
'Controller': {
'any_errors_fatal': True,
'max_fail_percentage': 15},
'Compute': {
'any_errors_fatal': True,
'max_fail_percentage': 15}}},
{'output_key': 'HostnameNetworkConfigMap',
'output_value': {}}
]
deployment_data, configs = \
self._get_config_data('config_data.yaml')
self.configs = configs
self.deployments = deployment_data
stack_data = self.config.fetch_config('overcloud')
mock_deployment_data.return_value = deployment_data
mock_config_dict.side_effect = self._get_config_dict
config_dir = self.useFixture(fixtures.TempDir()).path
self.config.write_config(stack_data, 'overcloud', config_dir)
for f in ['Controller',
'Compute', ]:
with open(os.path.join(config_dir, 'group_vars', f)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(f))
for d in ['ControllerHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost',
'MyPostConfig']:
with open(os.path.join(config_dir, 'Controller',
'overcloud-controller-0', d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-controller-0',
d)))
for d in ['ComputeHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost']:
with open(os.path.join(config_dir, 'Compute',
'overcloud-novacompute-0',
d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-novacompute-0',
d)))
for d in ['ComputeHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost']:
with open(os.path.join(config_dir, 'Compute',
'overcloud-novacompute-1',
d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-novacompute-1',
d)))
for d in ['ComputeHostEntryDeployment',
'NetworkDeployment',
'MyExtraConfigPost',
'AnsibleDeployment']:
with open(os.path.join(config_dir, 'Compute',
'overcloud-novacompute-2', d)) as fin:
self.assertEqual(
yaml.safe_load(fin.read()),
self._get_yaml_file(os.path.join(
'overcloud-novacompute-2',
d)))
@patch('tripleo_common.utils.config.Config.get_config_dict')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
@patch.object(ooo_config.yaml, 'safe_load')
def test_validate_config(self, mock_yaml, mock_deployment_data,
mock_config_dict):
stack_config = """
Controller:
ctlplane:
overcloud-controller-0.ctlplane.localdomain
Compute:
ctlplane:
overcloud-novacompute-0.ctlplane.localdomain
overcloud-novacompute-1.ctlplane.localdomain
overcloud-novacompute-2.ctlplane.localdomain
"""
yaml_file = '/tmp/testfile.yaml'
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.config.validate_config(stack_config, yaml_file)
expected_yaml_safe_load_calls = [call(stack_config)]
mock_yaml.assert_has_calls(expected_yaml_safe_load_calls)
@patch('tripleo_common.utils.config.Config.get_config_dict')
@patch('tripleo_common.utils.config.Config.get_deployment_data')
def test_validate_config_invalid_yaml(self, mock_deployment_data,
mock_config_dict):
# Use invalid YAML to assert that we properly handle the exception
stack_config = """
Controller:
ctlplane:
overcloud-controller-0.ctlplane.localdomain
Compute:
ctlplane:
overcloud-novacompute-0.ctlplane.localdomain
overcloud-novacompute-1.ctlplane.localdomain
overcloud-novacompute-2.ctlplane.localdomain
"""
yaml_file = '/tmp/testfile.yaml'
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
self.config = ooo_config.Config(heat)
self.assertRaises(yaml.scanner.ScannerError,
self.config.validate_config, stack_config, yaml_file)
@patch('tripleo_common.utils.config.Config.get_network_config_data')
def test_render_network_config_empty_dict(self,
mock_get_network_config_data):
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
config_mock = mock.MagicMock()
config_mock.config = {}
heat.software_configs.get.return_value = config_mock
self.config = ooo_config.Config(heat)
stack = mock.Mock()
server_roles = dict(Controller='controller')
mock_get_network_config_data.return_value = dict(Controller='config')
config_dir = '/tmp/tht'
self.config.render_network_config(stack, config_dir, server_roles)
@patch.object(ooo_config.Config, '_open_file')
@patch('tripleo_common.utils.config.Config.get_network_config_data')
def test_render_network_config(self,
mock_get_network_config_data,
mock_open):
heat = mock.MagicMock()
heat.stacks.get.return_value = fakes.create_tht_stack()
config_mock = mock.MagicMock()
config_mock.config = 'some config'
heat.software_configs.get.return_value = config_mock
self.config = ooo_config.Config(heat)
stack = mock.Mock()
server_roles = dict(Controller='controller')
mock_get_network_config_data.return_value = dict(Controller='config')
config_dir = '/tmp/tht'
self.config.render_network_config(stack, config_dir, server_roles)
self.assertEqual(2, mock_open.call_count)
self.assertEqual('/tmp/tht/controller/Controller/NetworkConfig',
mock_open.call_args_list[0][0][0])
self.assertEqual('/tmp/tht/controller/NetworkConfig',
mock_open.call_args_list[1][0][0])
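The `mock_open.call_args_list[1][0][0]` indexing used in the assertions above is terse; a minimal, framework-free illustration of what it reads, using only `unittest.mock` (the paths mirror the test's expected values):

```python
from unittest import mock

# Each invocation of a MagicMock is recorded in call_args_list as a `call`
# object; [i][0] is the positional-args tuple of call i, so [i][0][0] is
# that call's first positional argument.
opener = mock.MagicMock()
opener('/tmp/tht/controller/Controller/NetworkConfig')
opener('/tmp/tht/controller/NetworkConfig')

first_path = opener.call_args_list[0][0][0]
second_path = opener.call_args_list[1][0][0]
```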
# remina-resizer.py (trondhindenes/remmina-resizer, MIT licence)
import remmina_resizer
remmina_resizer.main()
# metashare/accounts/admin.py (MiltosD/CEFELRC, BSD-3-Clause licence)
from django import forms
from django.contrib import admin, messages
from django.contrib.admin.options import csrf_protect_m
from django.contrib.auth.models import Permission, Group, User
from django.db import transaction
from django.shortcuts import render_to_response
from django.template.context import RequestContext
from django.http import HttpResponseRedirect
from django.utils.translation import ugettext as _
from django.core.mail import send_mail
from django.template.loader import render_to_string
from metashare.accounts.forms import EditorGroupForm, OrganizationForm, \
OrganizationManagersForm, EditorGroupManagersForm
from metashare.accounts.models import RegistrationRequest, ResetRequest, \
UserProfile, EditorGroup, EditorGroupApplication, EditorGroupManagers, \
Organization, OrganizationApplication, OrganizationManagers
from metashare.utils import create_breadcrumb_template_params
class RegistrationRequestAdmin(admin.ModelAdmin):
"""
Administration interface for user registration requests.
"""
list_display = ('user',)
search_fields = ('user__username', 'user__first_name', 'user__last_name',
'user__email')
class ResetRequestAdmin(admin.ModelAdmin):
"""
Administration interface for user reset requests.
"""
list_display = ('user', 'uuid', 'created')
search_fields = ('user',)
class UserProfileAdmin(admin.ModelAdmin):
"""
Administration interface for user profiles.
"""
    list_display = ('user', 'modified', 'birthdate', 'phone_number', 'country', 'affiliation', 'position', 'homepage', '_editor_group_display',
'_organization_display')
search_fields = ('user__username', 'user__first_name', 'user__last_name',
'birthdate', 'phone_number', 'country', 'affiliation', 'position', 'homepage')
def _editor_group_display(self, obj):
"""
        Returns a string representing a list of the editor groups of a user
profile.
"""
return ', '.join([editor_group.name for editor_group
in EditorGroup.objects.filter(name__in=
obj.user.groups.values_list('name', flat=True))])
_editor_group_display.short_description = _('Editor groups')
def _managed_editor_groups_display(self, obj):
"""
Returns a string representing a list of the editor groups that the user
manages.
"""
return ', '.join([mgr_group.managed_group.name for mgr_group
in EditorGroupManagers.objects.filter(name__in=
obj.user.groups.values_list('name', flat=True))])
_managed_editor_groups_display.short_description = \
_('Managed Editor Groups')
def _organization_display(self, obj):
"""
        Returns a string representing a list of the organizations of a user
profile.
"""
return ', '.join([organization.name for organization
in Organization.objects.filter(name__in=
obj.user.groups.values_list('name', flat=True))])
_organization_display.short_description = _('Organizations')
def _managed_organizations_display(self, obj):
"""
        Returns a string representing a list of the organizations that the user
manages.
"""
return ', '.join([org_mgr_group.managed_organization.name
for org_mgr_group in OrganizationManagers.objects.filter(
name__in=obj.user.groups.values_list('name', flat=True))])
_managed_organizations_display.short_description = \
_('Managed organizations')
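Each `_*_display` helper above follows the same pattern: intersect one model's names with the user's group names, then join with `', '`. A framework-free sketch of that pattern (the group names here are invented for illustration):

```python
# Filter the editor-group names by the names of the groups the user belongs
# to, then join the matches into the comma-separated display string.
user_group_names = ['ELRC', 'reviewers', 'MyOrganization']
editor_group_names = ['ELRC', 'LegalEditors']

editor_group_display = ', '.join(
    name for name in editor_group_names if name in user_group_names)
```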
class EditorGroupAdmin(admin.ModelAdmin):
"""
Administration interface for `EditorGroup`s.
"""
list_display = ('name', '_members_display', '_managing_group_display',
'_managers_display')
search_fields = ('name',)
actions = ('add_user_to_editor_group', 'remove_user_from_editor_group', )
form = EditorGroupForm
def _members_display(self, obj):
"""
Returns a string representing a list of the members of the given
`EditorGroup` object.
"""
return ', '.join(member.username for member in obj.get_members())
_members_display.short_description = _('Members')
def _managing_group_display(self, obj):
"""
Returns a string representing a list of the managing groups of the
given `EditorGroup` object.
"""
return ', '.join(mgr_group.name for mgr_group
in EditorGroupManagers.objects.filter(managed_group=obj))
_managing_group_display.short_description = _('Managing groups')
def _managers_display(self, obj):
"""
Returns a string representing a list of the managers of the given
`EditorGroup`.
"""
return ', '.join(usr.username
for mgr_group in EditorGroupManagers.objects.filter(managed_group=obj)
for usr in User.objects.filter(groups__name=mgr_group.name))
_managers_display.short_description = _('Managers')
class UserProfileinEditorGroupForm(forms.Form):
_selected_action = forms.CharField(widget=forms.MultipleHiddenInput)
        def __init__(self, choices=None, *args, **kwargs):
super(EditorGroupAdmin.UserProfileinEditorGroupForm, self).__init__(*args, **kwargs)
if choices is not None:
self.choices = choices
self.fields['users'] = forms.ModelMultipleChoiceField(self.choices)
@csrf_protect_m
@transaction.commit_on_success
def add_view(self, request, form_url='', extra_context=None):
"""
The 'add' admin view for this model.
"""
# when showing a certain add view for the first time, prepopulate the
# permissions field: we suggest that the new group has all permissions
# that global editors have
if request.method == 'GET':
# request `QueryDict`s are immutable; create a copy before updating
request.GET = request.GET.copy()
from metashare.repository.management import GROUP_GLOBAL_EDITORS
_perms = ','.join([str(pk) for pk in Group.objects.get(name=
GROUP_GLOBAL_EDITORS).permissions.values_list('pk', flat=True)])
request.GET.update({'permissions': _perms})
return super(EditorGroupAdmin, self).add_view(request,
form_url=form_url, extra_context=extra_context)
def queryset(self, request):
queryset = super(EditorGroupAdmin, self).queryset(request)
if request.user.is_superuser:
return queryset
return queryset.filter(editorgroupmanagers__in=EditorGroupManagers.objects.filter(
name__in=request.user.groups.values_list('name', flat=True)))
def add_user_to_editor_group(self, request, queryset):
form = None
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled adding users to the editor group.'))
return
elif 'add_user_profile_to_editor_group' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
for obj in queryset:
if UserProfile.objects.filter(user=request.user)[0].has_manager_permission(obj):
userprofile.user.groups.add(obj)
else:
                            self.message_user(request,
                                _('You need to be a group manager to add a user to this editor group.'))
return HttpResponseRedirect(request.get_full_path())
self.message_user(request, _('Successfully added users to editor group.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title': _('Add Users to Editor Group'),
'selected_editorgroups': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Add user')))
return render_to_response('accounts/add_user_profile_to_editor_group.html',
dictionary,
context_instance=RequestContext(request))
add_user_to_editor_group.short_description = _("Add users to selected editor groups")
def remove_user_from_editor_group(self, request, queryset):
form = None
if request.user.is_superuser:
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled removing users from the editor group.'))
return
elif 'remove_user_profile_from_editor_group' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
for obj in queryset:
userprofile.user.groups.remove(obj)
self.message_user(request, _('Successfully removed users from editor group.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title': _('Remove Users from Editor Group'),
'selected_editorgroups': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Remove user')))
return render_to_response('accounts/remove_user_profile_from_editor_group.html',
dictionary,
context_instance=RequestContext(request))
remove_user_from_editor_group.short_description = _("Remove users from selected editor groups")
class EditorGroupApplicationAdmin(admin.ModelAdmin):
"""
Administration interface for user editor group application.
"""
list_display = ('user', 'editor_group', 'created')
actions = ('accept_selected', 'delete_selected')
def accept_selected(self, request, queryset):
"""
The action to accept editor group applications.
"""
if not request.user.is_superuser and \
not request.user.get_profile().has_manager_permission():
messages.error(request,
_('You must be superuser or group manager to accept applications.'))
return HttpResponseRedirect(request.get_full_path())
if queryset.count() == 0:
return HttpResponseRedirect(request.get_full_path())
_total_groups = 0
_accepted_groups = 0
for req in queryset:
_total_groups += 1
if request.user.get_profile().has_manager_permission(
req.editor_group) or request.user.is_superuser:
req.user.groups.add(req.editor_group)
req.delete()
_accepted_groups += 1
# if the applying user is not a staff user, yet, then we have
# to make sure that she becomes a staff user for being an editor
if not req.user.is_staff:
req.user.is_staff = True
req.user.save()
# Render notification email template with correct values.
data = {'editor_group': req.editor_group,
'shortname': req.user.get_full_name() }
try:
# Send out notification email to the user
send_mail('Application accepted',
render_to_string('accounts/notification_editor_group_application_accepted.email', data),
'no-reply@elrc-share.ilsp.gr', (req.user.email,),
fail_silently=False)
                except Exception:  # SMTPException
# If the email could not be sent successfully, tell the user
# about it.
messages.error(request, _("There was an error sending " \
"out an application acceptance e-mail."))
else:
messages.success(request, _('You have successfully ' \
'accepted "%s" as member of the editor group "%s".')
% (req.user.get_full_name(), req.editor_group,))
if _total_groups != _accepted_groups:
messages.warning(request, _('Successfully accepted %(accepted)d of '
'%(total)d applications. You have no permissions to accept the '
'remaining applications.') % {'accepted': _accepted_groups,
'total': _total_groups})
else:
messages.success(request,
_('Successfully accepted all requests.'))
return HttpResponseRedirect(request.get_full_path())
accept_selected.short_description = \
_("Accept selected editor group applications")
def get_readonly_fields(self, request, obj=None):
"""
Return the list of fields to be in readonly mode.
        Managers cannot modify applications; they can only add or delete them.
"""
if not request.user.is_superuser:
# for non-superusers no part of the group application is editable
return [field.name for field
in EditorGroupApplication._meta.fields]
return super(EditorGroupApplicationAdmin, self) \
.get_readonly_fields(request, obj)
def queryset(self, request):
result = super(EditorGroupApplicationAdmin, self).queryset(request)
if request.user.is_superuser:
return result
# non-superusers may only see the applications that they may also handle
return result.filter(editor_group__name__in=
request.user.groups.values_list('name', flat=True))
# pylint: disable-msg=W0622
def log_deletion(self, request, obj, object_repr):
"""
When an application is turned down by a manager, send an email to the user before
logging the deletion
"""
# Render notification email template with correct values.
data = {'editor_group': obj.editor_group,
'shortname': obj.user.get_full_name() }
try:
# Send out notification email to the user
send_mail('Application turned down', render_to_string('accounts/'
'notification_editor_group_application_turned_down.email', data),
'no-reply@elrc-share.ilsp.gr', (obj.user.email,),
fail_silently=False)
        except Exception:  # SMTPException
# If the email could not be sent successfully, tell the user
# about it.
messages.error(request, _("There was an error sending out an "
"e-mail about turning down the application."))
else:
messages.success(request, _('You have successfully turned down "%s" ' \
'from the editor group "%s".') % (obj.user.get_full_name(),
obj.editor_group,))
super(EditorGroupApplicationAdmin, self).log_deletion(request, obj, object_repr)
def delete_selected(self, request, queryset):
"""
The action to turn down editor group applications.
"""
from django import template
from django.contrib.admin.util import get_deleted_objects, model_ngettext
from django.contrib.admin import helpers
from django.db import router
from django.utils.encoding import force_unicode
from django.core.exceptions import PermissionDenied
opts = self.model._meta
app_label = opts.app_label
# Check that the user has delete permission for the actual model
if not self.has_delete_permission(request):
raise PermissionDenied
using = router.db_for_write(self.model)
# Populate deletable_objects, a data structure of all related objects that
# will also be deleted.
deletable_objects, perms_needed, protected = get_deleted_objects(
queryset, opts, request.user, self.admin_site, using)
# The user has already confirmed the deletion.
# Do the deletion and return a None to display the change list view again.
if request.POST.get('post'):
if perms_needed:
raise PermissionDenied
n_count = queryset.count()
if n_count:
for obj in queryset:
obj_display = force_unicode(obj)
self.log_deletion(request, obj, obj_display)
queryset.delete()
self.message_user(request, _("Successfully turned down %(count)d %(items)s.") % {
"count": n_count, "items": model_ngettext(self.opts, n_count)
})
# Return None to display the change list page again.
return None
if len(queryset) == 1:
objects_name = force_unicode(opts.verbose_name)
else:
objects_name = force_unicode(opts.verbose_name_plural)
if perms_needed or protected:
title = _("Cannot turn down %(name)s") % {"name": objects_name}
else:
title = _("Are you sure?")
context = {
"title": title,
"objects_name": objects_name,
"deletable_objects": [deletable_objects],
'queryset': queryset,
"perms_lacking": perms_needed,
"protected": protected,
"opts": opts,
"root_path": self.admin_site.root_path,
"app_label": app_label,
'action_checkbox_name': helpers.ACTION_CHECKBOX_NAME,
}
# Display the confirmation page
return render_to_response("accounts/delete_editor_group_application_selected_confirmation.html", \
context, context_instance=template.RequestContext(request))
delete_selected.short_description = \
_("Turn down selected editor group applications")
class EditorGroupManagersAdmin(admin.ModelAdmin):
"""
Administration interface for `EditorGroupManagers`s.
"""
list_display = ('name', 'managed_group', '_members_display')
search_fields = ('name', 'managed_group')
actions = ('add_user_to_editor_group_managers', 'remove_user_from_editor_group_managers', )
form = EditorGroupManagersForm
def _members_display(self, obj):
"""
Returns a string representing a list of the members of the given
`EditorGroupManagers` object.
"""
return ', '.join(member.username for member in obj.get_members())
_members_display.short_description = _('Members')
class UserProfileinEditorGroupManagersForm(forms.Form):
_selected_action = forms.CharField(widget=forms.MultipleHiddenInput)
        def __init__(self, choices=None, *args, **kwargs):
super(EditorGroupManagersAdmin.UserProfileinEditorGroupManagersForm, self).__init__(*args, **kwargs)
if choices is not None:
self.choices = choices
self.fields['users'] = forms.ModelMultipleChoiceField(self.choices)
@csrf_protect_m
@transaction.commit_on_success
def add_view(self, request, form_url='', extra_context=None):
"""
The 'add' admin view for this model.
"""
# when showing a certain add view for the first time, prepopulate the
# permissions field: we suggest that the new group has all required
# model permissions for deleting language resources and for changing and
# deleting editor group application requests
if request.method == 'GET':
# request `QueryDict`s are immutable; create a copy before updating
request.GET = request.GET.copy()
request.GET.update({'permissions': ','.join(str(_perm.pk) for _perm
in EditorGroupManagersAdmin.get_suggested_manager_permissions())})
return super(EditorGroupManagersAdmin, self).add_view(request,
form_url=form_url, extra_context=extra_context)
@staticmethod
def get_suggested_manager_permissions():
"""
Returns a list of `Permission`s that all `EditorGroupManagers`s should have.
"""
result = []
# add language resource delete permission
from metashare.repository.models import resourceInfoType_model
opts = resourceInfoType_model._meta
result.append(Permission.objects.filter(
content_type__app_label=opts.app_label,
codename=opts.get_delete_permission())[0])
# add editor group application request change/delete permission
opts = EditorGroupApplication._meta
result.append(Permission.objects.filter(
content_type__app_label=opts.app_label,
codename=opts.get_change_permission())[0])
result.append(Permission.objects.filter(
content_type__app_label=opts.app_label,
codename=opts.get_delete_permission())[0])
return result
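The `opts.get_change_permission()` / `opts.get_delete_permission()` calls above return permission codenames. A sketch of the assumed behaviour (Django builds them as `'<action>_<lowercased model name>'`):

```python
# Assumed-behaviour sketch: reproduce the codename scheme that
# Options.get_change_permission() / get_delete_permission() follow.
def permission_codename(action, model_name):
    return '%s_%s' % (action, model_name.lower())

change_codename = permission_codename('change', 'EditorGroupApplication')
delete_codename = permission_codename('delete', 'EditorGroupApplication')
```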
def add_user_to_editor_group_managers(self, request, queryset):
form = None
if request.user.is_superuser:
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled adding users to the editor group managers.'))
return
elif 'add_user_profile_to_editor_group_managers' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupManagersForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
for obj in queryset:
userprofile.user.groups.add(obj)
self.message_user(request, _('Successfully added users to editor group managers.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupManagersForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title': _('Add Users to Editor Group Managers'),
'selected_editorgroupmanagers': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Add user')))
return render_to_response('accounts/add_user_profile_to_editor_group_managers.html',
dictionary,
context_instance=RequestContext(request))
else:
self.message_user(request, _('You need to be a superuser to add ' \
'a user to these editor group managers.'))
return HttpResponseRedirect(request.get_full_path())
add_user_to_editor_group_managers.short_description = _("Add users to selected editor group managers")
def remove_user_from_editor_group_managers(self, request, queryset):
form = None
if request.user.is_superuser:
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled removing users from ' \
'editor group managers.'))
return
elif 'remove_user_profile_from_editor_group_managers' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupManagersForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
for obj in queryset:
userprofile.user.groups.remove(obj)
self.message_user(request, _('Successfully removed users ' \
'from editor group managers.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinEditorGroupManagersForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title': _('Remove Users from Editor Group Managers'),
'selected_editorgroupmanagers': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Remove user')))
return render_to_response('accounts/remove_user_profile_from_editor_group_managers.html',
dictionary,
context_instance=RequestContext(request))
else:
self.message_user(request, _('You need to be a superuser to ' \
'remove a user from these editor group managers.'))
return HttpResponseRedirect(request.get_full_path())
remove_user_from_editor_group_managers.short_description = _("Remove users from selected editor group managers")
class OrganizationAdmin(admin.ModelAdmin):
"""
Administration interface for `Organization`s.
"""
list_display = ('name', '_members_display', '_organization_managing_group_display',
'_organization_managers_display')
search_fields = ('name',)
actions = ('add_user_to_organization', 'remove_user_from_organization', )
form = OrganizationForm
def _members_display(self, obj):
"""
Returns a string representing a list of the members of the given
`Organization` object.
"""
return ', '.join(member.username for member in obj.get_members())
_members_display.short_description = _('Members')
def _organization_managing_group_display(self, obj):
"""
Returns a string representing a list of the organization managing groups of the
given `Organization` object.
"""
return ', '.join(org_mgr_group.name for org_mgr_group
in OrganizationManagers.objects.filter(managed_organization=obj))
_organization_managing_group_display.short_description = _('Managing groups')
def _organization_managers_display(self, obj):
"""
Returns a string representing a list of the managers of the given
`Organization`.
"""
return ', '.join(usr.username
for org_mgr_group in OrganizationManagers.objects.filter(managed_organization=obj)
for usr in User.objects.filter(groups__name=org_mgr_group.name))
_organization_managers_display.short_description = _('Managers')
class UserProfileinOrganizationForm(forms.Form):
_selected_action = forms.CharField(widget=forms.MultipleHiddenInput)
        def __init__(self, choices=None, *args, **kwargs):
super(OrganizationAdmin.UserProfileinOrganizationForm, self).__init__(*args, **kwargs)
if choices is not None:
self.choices = choices
self.fields['users'] = forms.ModelMultipleChoiceField(self.choices)
def queryset(self, request):
queryset = super(OrganizationAdmin, self).queryset(request)
if request.user.is_superuser:
return queryset
return queryset.filter(organizationmanagers__in=OrganizationManagers.objects.filter(
name__in=request.user.groups.values_list('name', flat=True)))
def add_user_to_organization(self, request, queryset):
form = None
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled adding users to the organization.'))
return
elif 'add_user_profile_to_organization' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
for obj in queryset:
if UserProfile.objects.filter(user=request.user)[0].has_organization_manager_permission(obj):
userprofile.user.groups.add(obj)
else:
                            self.message_user(request,
                                _('You need to be an organization manager to add a user to this organization.'))
return HttpResponseRedirect(request.get_full_path())
self.message_user(request, _('Successfully added users to organization.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title': _('Add Users to Organization'),
'selected_organizations': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Add user')))
return render_to_response('accounts/add_user_profile_to_organization.html',
dictionary,
context_instance=RequestContext(request))
add_user_to_organization.short_description = _("Add users to selected organizations")
def remove_user_from_organization(self, request, queryset):
form = None
if request.user.is_superuser:
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled removing users from the organization.'))
return
elif 'remove_user_profile_from_organization' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
for obj in queryset:
userprofile.user.groups.remove(obj)
self.message_user(request, _('Successfully removed users from organization.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title': _('Remove Users from Organization'),
'selected_organizations': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Remove user')))
return render_to_response('accounts/remove_user_profile_from_organization.html',
dictionary,
context_instance=RequestContext(request))
remove_user_from_organization.short_description = _("Remove users from selected organizations")
class OrganizationApplicationAdmin(admin.ModelAdmin):
"""
Administration interface for user organization application.
"""
list_display = ('user', 'organization', 'created')
actions = ('accept_selected', 'delete_selected')
def accept_selected(self, request, queryset):
"""
The action to accept organization applications.
"""
if not request.user.is_superuser and \
not request.user.get_profile().has_organization_manager_permission():
messages.error(request,
_('You must be superuser or organization manager to accept applications.'))
return HttpResponseRedirect(request.get_full_path())
if queryset.count() == 0:
return HttpResponseRedirect(request.get_full_path())
_total_groups = 0
_accepted_groups = 0
for req in queryset:
_total_groups += 1
if request.user.get_profile().has_organization_manager_permission(
req.organization) or request.user.is_superuser:
req.user.groups.add(req.organization)
req.delete()
_accepted_groups += 1
# Render notification email template with correct values.
data = {'organization': req.organization,
'shortname': req.user.get_full_name() }
try:
# Send out notification email to the user
send_mail('Application accepted',
render_to_string('accounts/notification_organization_application_accepted.email', data),
'no-reply@elrc-share.ilsp.gr', (req.user.email,),
fail_silently=False)
                except Exception:  # SMTPException
# If the email could not be sent successfully, tell the user
# about it.
messages.error(request, _("There was an error sending " \
"out an application acceptance e-mail."))
else:
messages.success(request, _('You have successfully ' \
'accepted "%s" as member of the organization "%s".')
% (req.user.get_full_name(), req.organization,))
if _total_groups != _accepted_groups:
messages.warning(request, _('Successfully accepted %(accepted)d of '
'%(total)d applications. You have no permissions to accept the '
'remaining applications.') % {'accepted': _accepted_groups,
'total': _total_groups})
else:
messages.success(request,
_('Successfully accepted all requests.'))
return HttpResponseRedirect(request.get_full_path())
accept_selected.short_description = \
_("Accept selected organization applications")
def get_readonly_fields(self, request, obj=None):
"""
Return the list of fields to be in readonly mode.
        Organization managers cannot modify applications; they can only add or delete them.
"""
if not request.user.is_superuser:
# for non-superusers no part of the organization application is editable
return [field.name for field
in OrganizationApplication._meta.fields]
return super(OrganizationApplicationAdmin, self) \
.get_readonly_fields(request, obj)
# pylint: disable-msg=W0622
def log_deletion(self, request, obj, object_repr):
"""
When an application is turned down by an organization manager, send an email to the user before
logging the deletion
"""
# Render notification email template with correct values.
data = {'organization': obj.organization,
'shortname': obj.user.get_full_name() }
try:
# Send out notification email to the user
send_mail('Application turned down', render_to_string('accounts/'
'notification_organization_application_turned_down.email', data),
'no-reply@elrc-share.ilsp.gr', (obj.user.email,),
fail_silently=False)
        except Exception:  # e.g. SMTPException from send_mail
# If the email could not be sent successfully, tell the user
# about it.
messages.error(request, _("There was an error sending out an "
"e-mail about turning down the application."))
else:
messages.success(request, _('You have turned down the application' \
' of "%s" for membership in the organization "%s".')
% (obj.user.get_full_name(), obj.organization,))
super(OrganizationApplicationAdmin, self).log_deletion(request, obj, object_repr)
def delete_selected(self, request, queryset):
"""
The action to turn down organization applications.
"""
from django import template
from django.contrib.admin.util import get_deleted_objects, model_ngettext
from django.contrib.admin import helpers
from django.db import router
from django.utils.encoding import force_unicode
from django.core.exceptions import PermissionDenied
opts = self.model._meta
app_label = opts.app_label
# Check that the user has delete permission for the actual model
if not self.has_delete_permission(request):
raise PermissionDenied
using = router.db_for_write(self.model)
# Populate deletable_objects, a data structure of all related objects that
# will also be deleted.
deletable_objects, perms_needed, protected = get_deleted_objects(
queryset, opts, request.user, self.admin_site, using)
# The user has already confirmed the deletion.
# Do the deletion and return a None to display the change list view again.
if request.POST.get('post'):
if perms_needed:
raise PermissionDenied
n_count = queryset.count()
if n_count:
for obj in queryset:
obj_display = force_unicode(obj)
self.log_deletion(request, obj, obj_display)
queryset.delete()
self.message_user(request, _("Successfully turned down %(count)d %(items)s.") % {
"count": n_count, "items": model_ngettext(self.opts, n_count)
})
# Return None to display the change list page again.
return None
if len(queryset) == 1:
objects_name = force_unicode(opts.verbose_name)
else:
objects_name = force_unicode(opts.verbose_name_plural)
if perms_needed or protected:
title = _("Cannot turn down %(name)s") % {"name": objects_name}
else:
title = _("Are you sure?")
context = {
"title": title,
"objects_name": objects_name,
"deletable_objects": [deletable_objects],
'queryset': queryset,
"perms_lacking": perms_needed,
"protected": protected,
"opts": opts,
"root_path": self.admin_site.root_path,
"app_label": app_label,
'action_checkbox_name': helpers.ACTION_CHECKBOX_NAME,
}
# Display the confirmation page
return render_to_response("accounts/delete_organization_application_selected_confirmation.html", \
context, context_instance=template.RequestContext(request))
delete_selected.short_description = \
_("Turn down selected organization applications")
class OrganizationManagersAdmin(admin.ModelAdmin):
"""
Administration interface for `OrganizationManagers`s.
"""
list_display = ('name', 'managed_organization', '_members_display')
search_fields = ('name', 'managed_organization')
actions = ('add_user_to_organization_managers', 'remove_user_from_organization_managers', )
form = OrganizationManagersForm
def _members_display(self, obj):
"""
Returns a string representing a list of the members of the given
`OrganizationManagers` object.
"""
return ', '.join(member.username for member in obj.get_members())
_members_display.short_description = _('Members')
class UserProfileinOrganizationManagersForm(forms.Form):
_selected_action = forms.CharField(widget=forms.MultipleHiddenInput)
        def __init__(self, choices=None, *args, **kwargs):
super(OrganizationManagersAdmin.UserProfileinOrganizationManagersForm, self).__init__(*args, **kwargs)
if choices is not None:
self.choices = choices
self.fields['users'] = forms.ModelMultipleChoiceField(self.choices)
@csrf_protect_m
@transaction.commit_on_success
def add_view(self, request, form_url='', extra_context=None):
"""
The 'add' admin view for this model.
"""
# when showing a certain add view for the first time, prepopulate the
# permissions field: we suggest that the new group has all required
# model permissions for deleting language resources and for changing and
# deleting organization application requests
if request.method == 'GET':
            # request `QueryDict`s are immutable; create a copy before updating
request.GET = request.GET.copy()
request.GET.update({'permissions': ','.join(str(_perm.pk) for _perm
in OrganizationManagersAdmin.get_suggested_organization_manager_permissions())})
return super(OrganizationManagersAdmin, self).add_view(request,
form_url=form_url, extra_context=extra_context)
@staticmethod
def get_suggested_organization_manager_permissions():
"""
Returns a list of `Permission`s that all `OrganizationManagers`s should
have.
"""
result = []
# add organization application request change/delete permission
opts = OrganizationApplication._meta
result.append(Permission.objects.filter(
content_type__app_label=opts.app_label,
codename=opts.get_change_permission())[0])
result.append(Permission.objects.filter(
content_type__app_label=opts.app_label,
codename=opts.get_delete_permission())[0])
return result
def add_user_to_organization_managers(self, request, queryset):
form = None
if request.user.is_superuser:
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled adding users to the organization managers.'))
return
elif 'add_user_profile_to_organization_managers' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationManagersForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
user = userprofile.user
for obj in queryset:
user.groups.add(obj)
user.groups.add(obj.managed_organization)
self.message_user(request, _('Successfully added users to organization managers.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationManagersForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title': _('Add Users to Organization Manager Group'),
'selected_organizationmanagers': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Add user')))
return render_to_response('accounts/add_user_profile_to_organization_managers.html',
dictionary,
context_instance=RequestContext(request))
else:
self.message_user(request, _('You need to be a superuser to add ' \
'a user to these organization managers.'))
return HttpResponseRedirect(request.get_full_path())
add_user_to_organization_managers.short_description = _("Add users to selected organization managers")
def remove_user_from_organization_managers(self, request, queryset):
form = None
if request.user.is_superuser:
if 'cancel' in request.POST:
self.message_user(request, _('Cancelled removing users from the organization managers.'))
return
elif 'remove_user_profile_from_organization_managers' in request.POST:
objs_up = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationManagersForm(objs_up, request.POST)
if form.is_valid():
userprofiles = form.cleaned_data['users']
for userprofile in userprofiles:
for obj in queryset:
userprofile.user.groups.remove(obj)
self.message_user(request, _('Successfully removed users from organization managers.'))
return HttpResponseRedirect(request.get_full_path())
if not form:
userprofiles = UserProfile.objects.filter(user__is_active=True)
form = self.UserProfileinOrganizationManagersForm(choices=userprofiles,
initial={'_selected_action': request.POST.getlist(admin.ACTION_CHECKBOX_NAME)})
dictionary = {'title':
_('Remove Users from Organization Manager Group'),
'selected_organizationmanagers': queryset,
'form': form,
'path': request.get_full_path()
}
dictionary.update(create_breadcrumb_template_params(self.model, _('Remove user')))
return render_to_response('accounts/remove_user_profile_from_organization_managers.html',
dictionary,
context_instance=RequestContext(request))
else:
self.message_user(request, _('You need to be a superuser to ' \
'remove a user from these organization managers.'))
return HttpResponseRedirect(request.get_full_path())
remove_user_from_organization_managers.short_description = _("Remove users from selected organization managers")
admin.site.register(RegistrationRequest, RegistrationRequestAdmin)
admin.site.register(ResetRequest, ResetRequestAdmin)
admin.site.register(UserProfile, UserProfileAdmin)
admin.site.register(EditorGroup, EditorGroupAdmin)
admin.site.register(EditorGroupApplication, EditorGroupApplicationAdmin)
admin.site.register(EditorGroupManagers, EditorGroupManagersAdmin)
admin.site.register(Organization, OrganizationAdmin)
admin.site.register(OrganizationApplication, OrganizationApplicationAdmin)
admin.site.register(OrganizationManagers, OrganizationManagersAdmin)
# === tests/integration/subsys/object_store/test_object_store.py (dspalmer99/anchore-engine, Apache-2.0) ===
"""
Tests for the archive subsystem, run against each configured storage driver.
"""
import os
import pytest
from anchore_engine.subsys import object_store
from anchore_engine.subsys.object_store.config import DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY
from anchore_engine.subsys.object_store import get_manager
from anchore_engine.subsys.object_store.exc import DriverConfigurationError, BadCredentialsError
from tests.fixtures import anchore_db
from .conftest import test_s3_secret_key, test_s3_key, test_s3_bucket, test_s3_url, test_s3_region, test_swift_auth_url, test_swift_container, test_swift_key, test_swift_user
from anchore_engine.subsys import logger
logger.enable_test_logging()
document_1 = b'{"document": {"user_id": "admin", "final_action_reason": "policy_evaluation", "matched_whitelisted_images_rule": false, "matched_blacklisted_images_rule": false}}'
document_json = {"user_id": "admin", "final_action_reason": "policy_evaluation",
"matched_whitelisted_images_rule": False, "created_at": 1522454550, "evaluation_problems": [],
"last_modified": 1522454550, "final_action": "stop",
"matched_mapping_rule": {"name": "default", "repository": "*",
"image": {"type": "tag", "value": "*"},
"whitelist_ids": ["37fd763e-1765-11e8-add4-3b16c029ac5c"],
"registry": "*", "id": "c4f9bf74-dc38-4ddf-b5cf-00e9c0074611",
"policy_id": "48e6f7d6-1765-11e8-b5f9-8b6f228548b6"},
"matched_blacklisted_images_rule": False}
test_user_id = 'testuser1'
test_bucket_id = 'testbucket1'
disable_tests = False
def run_test():
"""
Common test path for all configs to test against
:return:
"""
mgr = get_manager()
logger.info('Basic string operations using get/put/delete')
resp = mgr.put(userId=test_user_id, bucket=test_bucket_id, archiveid='document_1', data=document_1)
logger.info('Document 1 PUT: {}'.format(resp))
resp = mgr.get(userId=test_user_id, bucket=test_bucket_id, archiveid='document_1')
assert document_1 == resp
assert mgr.exists(test_user_id, test_bucket_id, 'document_1')
assert not mgr.exists(test_user_id, test_bucket_id, 'document_10')
logger.info('Document operations')
resp = mgr.put_document(userId=test_user_id, bucket=test_bucket_id, archiveId='document_json', data=document_json)
logger.info('Document JSON PUT Doc: {}'.format(resp))
resp = mgr.get_document(userId=test_user_id, bucket=test_bucket_id, archiveId='document_json')
    logger.info('Document JSON GET Doc: {}'.format(resp))
assert document_json == resp
    logger.info('Document string operations')
resp = mgr.put_document(userId=test_user_id, bucket=test_bucket_id, archiveId='document_json', data=document_1.decode('utf-8'))
logger.info('Document string PUT Doc: {}'.format(resp))
resp = mgr.get_document(userId=test_user_id, bucket=test_bucket_id, archiveId='document_json')
    logger.info('Document string GET Doc: {}'.format(resp))
assert document_1.decode('utf-8') == resp
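# The get/put/exists/delete contract that run_test() exercises against each
# real driver can be sketched with a tiny in-memory stand-in. This class is
# hypothetical and exists only for illustration -- it is NOT part of the
# anchore_engine.subsys.object_store API; it merely mirrors the call shapes
# (userId/bucket/archiveid keys, raw bytes values) used above.
class _InMemoryObjectStore(object):
    def __init__(self):
        # Map (userId, bucket, archiveid) -> raw bytes
        self._objects = {}

    def put(self, userId, bucket, archiveid, data):
        self._objects[(userId, bucket, archiveid)] = data

    def get(self, userId, bucket, archiveid):
        return self._objects.get((userId, bucket, archiveid))

    def exists(self, userId, bucket, archiveid):
        return (userId, bucket, archiveid) in self._objects

    def delete(self, userId, bucket, archiveid):
        self._objects.pop((userId, bucket, archiveid), None)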
def test_noop(anchore_db):
pass
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_fs(anchore_db):
config = {
'archive': {
'compression': {
'enabled': True
},
'storage_driver': {
'name': 'localfs',
'config': {
'archive_data_dir': '/tmp/archive_test/fs_driver'
}
}
}
}
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
run_test()
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_swift(swift_container, anchore_db):
config = {
'archive': {
'compression': {
'enabled': True
},
'storage_driver': {
'name': 'swift',
'config': {
'user': test_swift_user,
'key': test_swift_key,
'auth': test_swift_auth_url,
'container': test_swift_container
}
}
}
}
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
run_test()
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_swift_create_container(swift_container, anchore_db):
config = {
'archive':{
'compression': {
'enabled': True
},
'storage_driver': {
'name': 'swift',
'config': {
'user': test_swift_user,
'key': test_swift_key,
'auth': test_swift_auth_url,
'container': 'testarchive2',
'create_container': True
}
}
}
}
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
run_test()
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_swift_bad_creds(swift_container, anchore_db):
config = {
'archive': {
'compression': {
'enabled': True
},
'storage_driver': {
'name': 'swift',
'config': {
'user': test_swift_user,
'key': 'badkey',
'auth': test_swift_auth_url,
'container': test_swift_container
}
}
}
}
with pytest.raises(BadCredentialsError) as err:
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
pytest.fail('Should have raised bad creds exception on init')
logger.info('Got expected error: {}'.format(err.type))
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_swift_bad_container(swift_container, anchore_db):
config = {
'archive': {
'compression': {
'enabled': True
},
'storage_driver': {
'name': 'swift',
'config': {
'user': test_swift_user,
'key': test_swift_key,
'auth': test_swift_auth_url,
'container': 'testarchive_does_not_exist'
}
}
}
}
with pytest.raises(DriverConfigurationError) as err:
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
logger.info('Got expected error: {}'.format(err.type))
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_db(anchore_db):
config = {
'archive': {
'compression': {
'enabled': True
},
'storage_driver': {
'name': 'db2',
'config': {}
}
}
}
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
run_test()
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_legacy_db(anchore_db):
# NOTE: legacy db driver does not support compression since it uses string type instead of binary for content storage
config = {
'archive': {
'compression': {
'enabled': False
},
'storage_driver': {
'name': 'db',
'config': {}
}
}
}
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
run_test()
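# The driver configs in these tests all share one nested shape. A small
# builder (hypothetical, for illustration only -- the tests keep their
# explicit literals) makes that structure explicit:
def _make_archive_config(driver_name, driver_config, compression=True):
    return {
        'archive': {
            'compression': {
                'enabled': compression
            },
            'storage_driver': {
                'name': driver_name,
                'config': driver_config
            }
        }
    }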
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_s3(s3_bucket, anchore_db):
logger.info('Creds: {} / {}'.format(test_s3_key, test_s3_secret_key))
config = {
'archive': {
'compression': {
'enabled': False
},
'storage_driver': {
'name': 's3',
'config': {
'access_key': test_s3_key,
'secret_key': test_s3_secret_key,
'url': test_s3_url,
'region': test_s3_region,
'bucket': test_s3_bucket
}
}
}
}
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
run_test()
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_s3_create_bucket(s3_bucket, anchore_db):
config = {
'archive': {
'compression': {
'enabled': False
},
'storage_driver': {
'name': 's3',
'config': {
'create_bucket': True,
'access_key': test_s3_key,
'secret_key': test_s3_secret_key,
'url': test_s3_url,
'region': test_s3_region,
'bucket': 'testarchivebucket2'
}
}
}
}
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
run_test()
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_s3_bad_creds(s3_bucket, anchore_db):
config = {
'archive': {
'compression': {
'enabled': False
},
'storage_driver': {
'name': 's3',
'config': {
'access_key': test_s3_key,
'secret_key': 'notrealkey',
'url': test_s3_url,
'region': test_s3_region,
'bucket': test_s3_bucket
}
}
}
}
with pytest.raises(BadCredentialsError) as err:
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
pytest.fail('Should have gotten a bad creds error')
logger.info('Got expected error: {}'.format(err.type))
config = {
'archive': {
'compression': {
'enabled': False
},
'storage_driver': {
'name': 's3',
'config': {
'access_key': test_s3_key,
'secret_key': 'notrealkey',
'url': test_s3_url,
'region': test_s3_region,
'bucket': test_s3_bucket
}
}
}
}
with pytest.raises(BadCredentialsError) as err:
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
pytest.fail('Should have gotten a bad creds error')
logger.info('Got expected error: {}'.format(err.type))
@pytest.mark.skipif(disable_tests, reason='skipped by config')
def test_s3_bad_bucket(s3_bucket, anchore_db):
config = {
'archive': {
'compression': {
'enabled': False
},
'storage_driver': {
'name': 's3',
'config': {
'access_key': test_s3_key,
'secret_key': test_s3_secret_key,
'url': test_s3_url,
'region': None,
'bucket': 'testarchivebucket_does_not_exist'
}
}
}
}
with pytest.raises(DriverConfigurationError) as err:
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
logger.info('Got expected error: {}'.format(err.type))
@pytest.mark.skip  # was: skipif(disable_tests, reason='skipped by config')
def test_s3_auto(s3_bucket, anchore_db):
os.environ['AWS_ACCESS_KEY'] = test_s3_key
os.environ['AWS_SECRET_ACCESS_KEY'] = test_s3_secret_key
config = {
'archive': {
'compression': {
'enabled': False
},
'storage_driver': {
'name': 's3',
'config': {
'iamauto': True,
'bucket': 'testarchivebucket_does_not_exist'
}
}
}
}
with pytest.raises(DriverConfigurationError) as err:
object_store.initialize(config, check_db=False, manager_id=DEFAULT_OBJECT_STORE_MANAGER_ID, config_keys=[DEFAULT_OBJECT_STORE_MANAGER_ID, ALT_OBJECT_STORE_CONFIG_KEY], allow_legacy_fallback=False, force=True)
    logger.info('Got expected error: {}'.format(err.type))
# === src/test/py/test_upgrade.py (iamfork/pipelinedb, Apache-2.0) ===
from base import pipeline, clean_db, PipelineDB
import pytest
import os
import shutil
import subprocess
import time
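# After each initdb, the test below appends the same three settings to the
# new cluster's postgresql.conf. A helper with that logic (hypothetical;
# the test body keeps its explicit writes) would look like:
def _append_pipelinedb_settings(conf_path):
    # Append the settings PipelineDB needs before the upgraded cluster starts
    with open(conf_path, 'a') as f:
        f.write('shared_preload_libraries=pipelinedb\n')
        f.write('max_worker_processes=128\n')
        f.write('pipelinedb.stream_insert_level=sync_commit\n')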
def test_binary_upgrade(pipeline, clean_db):
"""
Verify that binary upgrades properly transfer all objects and data
into the new installation
"""
if pipeline.version_num == 110000:
pytest.skip('skipping until PG11 supports dump/restore WITH OIDS')
# Create some regular tables with data, and create an index on half of them
for n in range(16):
name = 't_%d' % n
pipeline.create_table(name, x='integer', y='text', z='text')
rows = [(x, name, name) for x in range(1000)]
pipeline.insert(name, ('x', 'y', 'z'), rows)
if n >= 8:
pipeline.execute('CREATE INDEX idx_%s ON %s(y)' % (name, name))
# Create some streams
for n in range(8):
name = 's_%d' % n
pipeline.create_stream(name, x='integer', y='text')
# Now create some CVs with data, some with indices
for n in range(32):
name = 'cv_%d' % n
pipeline.create_stream('stream_%d' % n, x='int', y='text', z='text')
pipeline.create_cv(name, 'SELECT z::text, COUNT(DISTINCT z) AS distinct_count, COUNT(*) FROM stream_%d GROUP BY z' % n)
if n >= 16:
pipeline.execute('CREATE INDEX idx_%s ON %s(z)' % (name, name))
# Create some STJs
for n in range(8):
pipeline.create_cv('stj_%d' % n,
'SELECT t.x, count(*) FROM stream_%d s JOIN t_%d t ON s.x = t.x GROUP BY t.x' % (n, n))
# Create some SW CVs
for n in range(8):
pipeline.create_cv('sw_%d' % n, 'SELECT count(*) FROM stream_%d' % n, sw='%d days' % (n + 1), step_factor=n + 1)
# Create some CVs/CTs/streams that we'll rename
for n in range(4):
pipeline.create_stream('to_rename_s_%d' % n, x='int')
pipeline.create_cv('to_rename_cv_%d' % n, 'SELECT x, count(*) FROM to_rename_s_%d GROUP BY x' % n)
pipeline.create_ct('to_rename_ct_%d' % n, 'SELECT x FROM to_rename_s_%d' % n)
pipeline.create_cv('to_rename_ct_reader_%d' % n, "SELECT count(*) FROM output_of('to_rename_ct_%d')" % n)
rows = [(x,) for x in range(1000)]
pipeline.insert('to_rename_s_%d' % n, ('x',), rows)
# Now rename them
for n in range(4):
pipeline.execute('ALTER FOREIGN TABLE to_rename_s_%d RENAME TO renamed_s_%d' % (n, n))
pipeline.execute('ALTER VIEW to_rename_cv_%d RENAME TO renamed_cv_%d' % (n, n))
pipeline.execute('ALTER VIEW to_rename_ct_%d RENAME TO renamed_ct_%d' % (n, n))
pipeline.execute('ALTER VIEW to_rename_ct_reader_%d RENAME TO renamed_ct_reader_%d' % (n, n))
# And write some data using the new stream names
rows = [(x,) for x in range(1000)]
pipeline.insert('renamed_s_%d' % n, ('x',), rows)
# Create a CV chain that combines output streams
q = """
SELECT (new).z, combine((delta).count) AS count, combine((delta).distinct_count) AS distinct_count FROM output_of('cv_0') GROUP BY (new).z
"""
pipeline.create_cv('combine_cv_0', q)
q = """
SELECT combine((delta).count) AS count, combine((delta).distinct_count) AS distinct_count FROM output_of('combine_cv_0')
"""
pipeline.create_cv('combine_cv_1', q)
for n in range(32):
name = 'cv_%d' % n
rows = [(x, name, name) for x in range(1000)]
pipeline.insert('stream_%d' % n, ('x', 'y', 'z'), rows)
# Create a CV with a TTL to verify TTL info is restored properly
pipeline.create_cv('ttlcv', 'SELECT second(arrival_timestamp), count(*) FROM stream_0 GROUP BY second', ttl='1 hour', ttl_column='second')
# Now create some in another namespace
pipeline.execute('CREATE SCHEMA namespace')
for n in range(8):
name = 'namespace.cv_%d' % n
pipeline.create_stream('namespace.stream_%d' % n, x='int', y='text', z='text')
pipeline.create_cv(name, 'SELECT z::text, COUNT(DISTINCT z) AS distinct_count, COUNT(*) FROM namespace.stream_%d GROUP BY z' % n)
rows = [(x, name, name) for x in range(1000)]
pipeline.insert('namespace.stream_%d' % n, ('x', 'y', 'z'), rows)
if n >= 4:
pipeline.execute('CREATE INDEX namespace_idx_%d ON %s(z)' % (n, name))
create_fn = """
CREATE OR REPLACE FUNCTION tg_fn()
RETURNS trigger AS
$$
BEGIN
RETURN NEW;
END;
$$
LANGUAGE plpgsql;
"""
pipeline.execute(create_fn)
pipeline.create_stream('stream0', z='text')
# Create some transforms with trigger functions
for n in range(8):
name = 'ct_%d' % n
pipeline.create_ct(name, 'SELECT z::text FROM stream0', 'tg_fn')
# Create some transforms without trigger functions
for n in range(8):
name = 'ct_no_trig_%d' % n
pipeline.create_ct(name, 'SELECT z::text FROM stream0')
time.sleep(10)
old_bin_dir = new_bin_dir = pipeline.bin_dir
old_data_dir = pipeline.data_dir
new_data_dir0 = os.path.abspath('test_binary_upgrade_data_dir0')
if os.path.exists(new_data_dir0):
shutil.rmtree(new_data_dir0)
pipeline.stop()
p = subprocess.Popen([
os.path.join(pipeline.bin_dir, 'initdb'), '-D', new_data_dir0])
stdout, stderr = p.communicate()
with open(os.path.join(new_data_dir0, 'postgresql.conf'), 'a') as f:
f.write('shared_preload_libraries=pipelinedb\n')
f.write('max_worker_processes=128\n')
f.write('pipelinedb.stream_insert_level=sync_commit\n')
result = subprocess.check_call([
os.path.join(pipeline.bin_dir, 'pg_upgrade'),
'-b', old_bin_dir, '-B', new_bin_dir,
'-d', old_data_dir, '-D', new_data_dir0])
assert result == 0
# The cleanup path expects this to be running, but we're done with it
pipeline.run()
# pg_upgrade returned successfully and has already done sanity checks
# but let's manually verify that all objects were migrated to the new data directory
upgraded = PipelineDB(data_dir=new_data_dir0)
upgraded.run()
# Tables
for n in range(16):
name = 't_%d' % n
q = 'SELECT x, y, z FROM %s ORDER BY x' % name
rows = upgraded.execute(q)
for i, row in enumerate(rows):
assert row['x'] == i
assert row['y'] == name
assert row['z'] == name
# Streams
for n in range(8):
name = 's_%d' % n
rows = list(upgraded.execute("SELECT oid FROM pg_class WHERE relkind = 'f' AND relname = '%s'" % name))
assert len(rows) == 1
# CVs
for n in range(32):
name = 'cv_%d' % n
rows = list(upgraded.execute('SELECT z, distinct_count, count FROM %s' % name))
assert len(rows) == 1
assert rows[0][0] == name
assert rows[0][1] == 1
assert rows[0][2] == 1000
# CV with TTL
row = list(upgraded.execute("SELECT ttl, ttl_attno FROM pg_class c JOIN pipelinedb.cont_query pq on c.oid = pq.relid WHERE c.relname = 'ttlcv'"))[0]
assert row[0] == 3600
assert row[1] == 1
# CVs in separate schema
for n in range(8):
name = 'namespace.cv_%d' % n
        rows = list(upgraded.execute('SELECT z, distinct_count, count FROM %s' % name))
        assert len(rows) == 1
        assert rows[0][0] == name
        assert rows[0][1] == 1
        assert rows[0][2] == 1000

    # Transforms with trigger functions
    for n in range(8):
        name = 'ct_%d' % n
        q = """
        SELECT c.relname FROM pg_class c JOIN pipelinedb.cont_query pq
        ON c.oid = pq.relid WHERE pq.type = 't' AND c.relname = '%s'
        """ % name
        rows = list(upgraded.execute(q))
        assert len(rows) == 1

    # Transforms without trigger functions
    for n in range(8):
        name = 'ct_no_trig_%d' % n
        q = """
        SELECT c.relname FROM pg_class c JOIN pipelinedb.cont_query pq
        ON c.oid = pq.relid WHERE pq.type = 't' AND c.relname = '%s'
        """ % name
        rows = list(upgraded.execute(q))
        assert len(rows) == 1

    # Verify SW CVs
    for n in range(8):
        name = 'sw_%d' % n
        row = upgraded.execute("SELECT ttl, step_factor FROM pipelinedb.cont_query cq JOIN pg_class c ON cq.relid = c.oid WHERE relname = '%s'" % name)[0]
        assert row['ttl'] == (n + 1) * 3600 * 24
        assert row['step_factor'] == n + 1
        row = upgraded.execute('SELECT count FROM %s' % name)[0]
        assert row['count'] == 1000

    # Verify renamed CVs/CTs/streams
    for n in range(4):
        row = upgraded.execute('SELECT combine(count) FROM renamed_cv_%d' % n)[0]
        assert row['combine'] == 2000
        row = upgraded.execute('SELECT combine(count) FROM renamed_ct_reader_%d' % n)[0]
        assert row['combine'] == 2000

    # Verify chained CVs
    row = upgraded.execute('SELECT z, count, distinct_count FROM combine_cv_0')[0]
    assert row['z'] == 'cv_0'
    assert row['count'] == 1000
    assert row['distinct_count'] == 1

    row = upgraded.execute('SELECT count, distinct_count FROM combine_cv_1')[0]
    assert row['count'] == 1000
    assert row['distinct_count'] == 1

    # Now insert some new data and verify CVs are still updating properly
    for n in range(32):
        name = 'cv_%d' % n
        rows = [(x, name, name) for x in range(1000)]
        upgraded.insert('stream_%d' % n, ('x', 'y', 'z'), rows)

    for n in range(32):
        name = 'cv_%d' % n
        rows = list(upgraded.execute('SELECT z, distinct_count, count FROM %s' % name))
        assert len(rows) == 1
        assert rows[0][0] == name
        assert rows[0][1] == 1
        assert rows[0][2] == 2000

    row = upgraded.execute('SELECT z, count, distinct_count FROM combine_cv_0')[0]
    assert row['z'] == 'cv_0'
    assert row['count'] == 2000
    assert row['distinct_count'] == 1

    row = upgraded.execute('SELECT count, distinct_count FROM combine_cv_1')[0]
    assert row['count'] == 2000
    assert row['distinct_count'] == 1

    # Verify STJs
    for n in range(8):
        cv = 'stj_%d' % n
        row = upgraded.execute('SELECT sum(count) FROM %s' % cv)[0]
        assert row['sum'] == 2000

    # Rename objects again before the second upgrade
    for n in range(4):
        upgraded.execute('ALTER FOREIGN TABLE renamed_s_%d RENAME TO renamed_again_s_%d' % (n, n))
        upgraded.execute('ALTER VIEW renamed_cv_%d RENAME TO renamed_again_cv_%d' % (n, n))
        upgraded.execute('ALTER VIEW renamed_ct_%d RENAME TO renamed_again_ct_%d' % (n, n))
        upgraded.execute('ALTER VIEW renamed_ct_reader_%d RENAME TO renamed_again_ct_reader_%d' % (n, n))

        # And write some data using the new stream names
        rows = [(x,) for x in range(1000)]
        upgraded.insert('renamed_again_s_%d' % n, ('x',), rows)

    upgraded.stop()

    new_data_dir1 = os.path.abspath('test_binary_upgrade_data_dir1')
    if os.path.exists(new_data_dir1):
        shutil.rmtree(new_data_dir1)

    p = subprocess.Popen([
        os.path.join(pipeline.bin_dir, 'initdb'), '-D', new_data_dir1])
    stdout, stderr = p.communicate()

    with open(os.path.join(new_data_dir1, 'postgresql.conf'), 'a') as f:
        f.write('shared_preload_libraries=pipelinedb\n')
        f.write('max_worker_processes=128\n')
        f.write('pipelinedb.stream_insert_level=sync_commit\n')

    # Now upgrade the upgraded DB to verify that restored DBs can be upgraded properly
    result = subprocess.check_call([
        os.path.join(pipeline.bin_dir, 'pg_upgrade'),
        '-b', old_bin_dir, '-B', new_bin_dir,
        '-d', new_data_dir0, '-D', new_data_dir1])
    assert result == 0

    # But let's manually verify that all objects were migrated to the new data directory
    upgraded = PipelineDB(data_dir=new_data_dir1)
    upgraded.run()

    # Tables
    for n in range(16):
        name = 't_%d' % n
        q = 'SELECT x, y, z FROM %s ORDER BY x' % name
        rows = upgraded.execute(q)
        for i, row in enumerate(rows):
            assert row['x'] == i
            assert row['y'] == name
            assert row['z'] == name

    # Streams
    for n in range(8):
        name = 's_%d' % n
        rows = list(upgraded.execute("SELECT oid FROM pg_class WHERE relkind = 'f' AND relname = '%s'" % name))
        assert len(rows) == 1

    # CVs
    for n in range(32):
        name = 'cv_%d' % n
        rows = list(upgraded.execute('SELECT z, distinct_count, count FROM %s' % name))
        assert len(rows) == 1
        assert rows[0][0] == name
        assert rows[0][1] == 1
        assert rows[0][2] == 2000

    # CV with TTL
    row = list(upgraded.execute("SELECT ttl, ttl_attno FROM pg_class c JOIN pipelinedb.cont_query pq on c.oid = pq.relid WHERE c.relname = 'ttlcv'"))[0]
    assert row[0] == 3600
    assert row[1] == 1

    # CVs in separate schema
    for n in range(8):
        name = 'namespace.cv_%d' % n
        rows = list(upgraded.execute('SELECT z, distinct_count, count FROM %s' % name))
        assert len(rows) == 1
        assert rows[0][0] == name
        assert rows[0][1] == 1
        assert rows[0][2] == 1000

    # Transforms with trigger functions
    for n in range(8):
        name = 'ct_%d' % n
        q = """
        SELECT c.relname FROM pg_class c JOIN pipelinedb.cont_query pq
        ON c.oid = pq.relid WHERE pq.type = 't' AND c.relname = '%s'
        """ % name
        rows = list(upgraded.execute(q))
        assert len(rows) == 1

    # Transforms without trigger functions
    for n in range(8):
        name = 'ct_no_trig_%d' % n
        q = """
        SELECT c.relname FROM pg_class c JOIN pipelinedb.cont_query pq
        ON c.oid = pq.relid WHERE pq.type = 't' AND c.relname = '%s'
        """ % name
        rows = list(upgraded.execute(q))
        assert len(rows) == 1

    # Verify SW CVs
    for n in range(8):
        name = 'sw_%d' % n
        step_factor = n + 1
        row = upgraded.execute("SELECT ttl, step_factor FROM pipelinedb.cont_query cq JOIN pg_class c ON cq.relid = c.oid WHERE relname = '%s'" % name)[0]
        assert row['ttl'] == (n + 1) * 3600 * 24
        assert row['step_factor'] == step_factor
        row = upgraded.execute('SELECT count FROM %s' % name)[0]
        assert row['count'] == 2000

    # Verify renamed CVs/CTs/streams
    for n in range(4):
        row = upgraded.execute('SELECT combine(count) FROM renamed_again_cv_%d' % n)[0]
        assert row['combine'] == 3000
        row = upgraded.execute('SELECT combine(count) FROM renamed_again_ct_reader_%d' % n)[0]
        assert row['combine'] == 3000

    # Verify chained CVs
    row = upgraded.execute('SELECT z, count, distinct_count FROM combine_cv_0')[0]
    assert row['z'] == 'cv_0'
    assert row['count'] == 2000
    assert row['distinct_count'] == 1

    row = upgraded.execute('SELECT count, distinct_count FROM combine_cv_1')[0]
    assert row['count'] == 2000
    assert row['distinct_count'] == 1

    # Now insert some new data and verify CVs are still updating properly
    for n in range(32):
        name = 'cv_%d' % n
        rows = [(x, name, name) for x in range(1000)]
        upgraded.insert('stream_%d' % n, ('x', 'y', 'z'), rows)

    for n in range(32):
        name = 'cv_%d' % n
        rows = list(upgraded.execute('SELECT z, distinct_count, count FROM %s' % name))
        assert len(rows) == 1
        assert rows[0][0] == name
        assert rows[0][1] == 1
        assert rows[0][2] == 3000

    row = upgraded.execute('SELECT z, count, distinct_count FROM combine_cv_0')[0]
    assert row['z'] == 'cv_0'
    assert row['count'] == 3000
    assert row['distinct_count'] == 1

    row = upgraded.execute('SELECT count, distinct_count FROM combine_cv_1')[0]
    assert row['count'] == 3000
    assert row['distinct_count'] == 1

    # Verify STJs
    for n in range(8):
        cv = 'stj_%d' % n
        row = upgraded.execute('SELECT sum(count) FROM %s' % cv)[0]
        assert row['sum'] == 3000

    upgraded.stop()

    pipeline.execute('DROP VIEW combine_cv_0 CASCADE')

    shutil.rmtree(new_data_dir0)
    shutil.rmtree(new_data_dir1)
| 34.306636 | 150 | 0.648279 | 2,421 | 14,992 | 3.875258 | 0.101611 | 0.012151 | 0.021744 | 0.039864 | 0.811874 | 0.756342 | 0.728736 | 0.705287 | 0.675016 | 0.648049 | 0 | 0.028821 | 0.206177 | 14,992 | 436 | 151 | 34.385321 | 0.759516 | 0.109125 | 0 | 0.724026 | 0 | 0.042208 | 0.359651 | 0.054215 | 0 | 0 | 0 | 0 | 0.24026 | 1 | 0.003247 | false | 0 | 0.019481 | 0 | 0.022727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b82ef15a0f3f83ccdca11f828f19a2ba3111acf5 | 190 | py | Python | src/cool_compiler/codegen/__init__.py | matcom-school/cool-compiler-2021 | 0a982f1708ed948a45610035a597d6ff12bab22b | [
"MIT"
] | null | null | null | src/cool_compiler/codegen/__init__.py | matcom-school/cool-compiler-2021 | 0a982f1708ed948a45610035a597d6ff12bab22b | [
"MIT"
] | null | null | null | src/cool_compiler/codegen/__init__.py | matcom-school/cool-compiler-2021 | 0a982f1708ed948a45610035a597d6ff12bab22b | [
"MIT"
] | null | null | null | # from .v0_type_data_code.type_data_code_visitor import CILGenerate
from .v0_cool_to_cil.cool_to_cil_visitor import CoolToCIL
from .v1_mips_generate.mips_generate_visitor import MipsGenerate | 63.333333 | 67 | 0.9 | 31 | 190 | 5 | 0.516129 | 0.251613 | 0.154839 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016854 | 0.063158 | 190 | 3 | 68 | 63.333333 | 0.853933 | 0.342105 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b8475b6596d77c1638a9835ca2f78bc596836ce8 | 96 | py | Python | venv/lib/python3.8/site-packages/debugpy/_vendored/__init__.py | GiulianaPola/select_repeats | 17a0d053d4f874e42cf654dd142168c2ec8fbd11 | [
"MIT"
] | 2 | 2022-03-13T01:58:52.000Z | 2022-03-31T06:07:54.000Z | venv/lib/python3.8/site-packages/debugpy/_vendored/__init__.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | 19 | 2021-11-20T04:09:18.000Z | 2022-03-23T15:05:55.000Z | venv/lib/python3.8/site-packages/debugpy/_vendored/__init__.py | DesmoSearch/Desmobot | b70b45df3485351f471080deb5c785c4bc5c4beb | [
"MIT"
] | null | null | null | /home/runner/.cache/pip/pool/0e/68/84/646f8aaef4bf58771309b9f378f43b53d4a89fd87b00f6ec84748b91fb | 96 | 96 | 0.895833 | 9 | 96 | 9.555556 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.427083 | 0 | 96 | 1 | 96 | 96 | 0.46875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b86731e13ff45ccaf333c293999b87cc2d1a018d | 114 | py | Python | src/moz_image/__init__.py | mozkzki/moz-image | d56d36293c3c7b1ae59cb75d1a829122880463c4 | [
"MIT"
] | null | null | null | src/moz_image/__init__.py | mozkzki/moz-image | d56d36293c3c7b1ae59cb75d1a829122880463c4 | [
"MIT"
] | 30 | 2021-09-21T09:12:33.000Z | 2022-03-29T14:14:03.000Z | src/moz_image/__init__.py | yukkun007/mmimage | 819f2f07b43ea79216c803ac9d2450b56035a02e | [
"MIT"
] | null | null | null | from moz_image.main import resize, download, upload_to_gyazo
__all__ = ["resize", "download", "upload_to_gyazo"]
| 28.5 | 60 | 0.77193 | 16 | 114 | 4.9375 | 0.6875 | 0.35443 | 0.506329 | 0.556962 | 0.683544 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.105263 | 114 | 3 | 61 | 38 | 0.77451 | 0 | 0 | 0 | 0 | 0 | 0.254386 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
b89bea4cb94d9408c515ca99fde4f3b395eb7336 | 27 | py | Python | src/euler_python_package/euler_python/medium/p189.py | wilsonify/euler | 5214b776175e6d76a7c6d8915d0e062d189d9b79 | [
"MIT"
] | null | null | null | src/euler_python_package/euler_python/medium/p189.py | wilsonify/euler | 5214b776175e6d76a7c6d8915d0e062d189d9b79 | [
"MIT"
] | null | null | null | src/euler_python_package/euler_python/medium/p189.py | wilsonify/euler | 5214b776175e6d76a7c6d8915d0e062d189d9b79 | [
"MIT"
] | null | null | null | def problem189():
pass
| 9 | 17 | 0.62963 | 3 | 27 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 0.259259 | 27 | 2 | 18 | 13.5 | 0.7 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 6 |
b228a18b1e8ab0d474fe0b6663e070d7dee58282 | 9,474 | py | Python | request_signer/tests/test_rest_client.py | imtapps/django-request-signer | b059d021b6e068245030ab682c2cff4318c83ca6 | [
"BSD-2-Clause"
] | 1 | 2017-01-23T19:21:23.000Z | 2017-01-23T19:21:23.000Z | request_signer/tests/test_rest_client.py | imtapps/django-request-signer | b059d021b6e068245030ab682c2cff4318c83ca6 | [
"BSD-2-Clause"
] | 14 | 2016-01-21T17:18:21.000Z | 2022-02-09T19:21:59.000Z | request_signer/tests/test_rest_client.py | imtapps/django-request-signer | b059d021b6e068245030ab682c2cff4318c83ca6 | [
"BSD-2-Clause"
] | 3 | 2016-01-25T19:32:21.000Z | 2016-08-23T15:37:38.000Z | import six
if six.PY3:
    from unittest import mock  # stdlib mock on Python 3 (django.test does not export mock)
else:
    import mock
from django import test

from request_signer.client.generic import Response, WebException
from django.test.utils import override_settings
from request_signer.client.generic.rest import BaseDjangoRestClient


class BaseDjangoRestClientTests(test.TestCase):

    def setUp(self):
        self.sut = BaseDjangoRestClient()
        self.sut.BASE_API_ENDPOINT = "/api/"

    def get_mock_response(self, **kwargs):
        return mock.MagicMock(Response, autospec=True, **kwargs)

    def test_build_endpoint_returns_endpoint_when_only_group_provided(self):
        endpoint = self.sut.build_endpoint("1234")
        self.assertEqual("/api/1234/", endpoint)

    def test_build_endpoint_returns_endpoint_when_group_and_item_provided(self):
        endpoint = self.sut.build_endpoint("1234", "item-detail")
        self.assertEqual("/api/1234/item-detail/", endpoint)

    def test_get_json_response_gets_response_with_accept_header(self):
        endpoint = self.sut.build_endpoint("1234", "item-detail")
        data = {"some_data": "lives here"}
        with mock.patch.object(self.sut, "_get_response") as get_response:
            self.sut._get_json_response("GET", endpoint, data)
            get_response.assert_called_once_with("GET", endpoint, data, headers={"Accept": "application/json"})

    def test_get_list_issues_get_json_response_for_endpoint(self):
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            self.sut.get_list("1234")
            get_response.assert_called_once_with("GET", "/api/1234/")

    def test_get_list_returns_json_response_when_successful(self):
        response = self.get_mock_response()
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.get_list("1234")
            self.assertEqual(response.json, result)

    def test_get_list_returns_json_response_when_404(self):
        response = self.get_mock_response(status_code=404, is_successful=False)
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.get_list("1234")
            self.assertEqual(response.json, result)

    def test_get_list_raises_web_exception_when_not_successful(self):
        response = self.get_mock_response(status_code=500, is_successful=False)
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            with self.assertRaises(WebException) as e:
                self.sut.get_list("1234")
            self.assertEqual(str(response.read.return_value), str(e.exception.message))

    def test_get_item_issues_get_json_response_for_endpoint(self):
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            self.sut.get_item("1234", "pk-3")
            get_response.assert_called_once_with("GET", "/api/1234/pk-3/")

    def test_get_item_returns_json_response_when_successful(self):
        response = self.get_mock_response()
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.get_item("1234", "pk-3")
            self.assertEqual(response.json, result)

    def test_get_item_returns_json_response_when_404(self):
        response = self.get_mock_response(status_code=404, is_successful=False)
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.get_item("1234", "pk-3")
            self.assertEqual(response.json, result)

    def test_get_item_raises_web_exception_when_not_successful(self):
        response = self.get_mock_response(status_code=500, is_successful=False)
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            with self.assertRaises(WebException) as e:
                self.sut.get_item("1234", "pk-3")
            self.assertEqual(str(response.read.return_value), str(e.exception.message))

    def test_create_issues_get_json_response_for_endpoint(self):
        data = {'some_data': 'to send'}
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            self.sut.create("1234", **data)
            get_response.assert_called_once_with("POST", "/api/1234/", data=data)

    def test_create_returns_json_response_when_successful(self):
        response = self.get_mock_response()
        data = {'some_data': 'to send'}
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.create("1234", **data)
            self.assertEqual(response.json, result)

    def test_create_raises_web_exception_when_not_successful(self):
        response = self.get_mock_response(status_code=400, is_successful=False)
        data = {'some_data': 'to send'}
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            with self.assertRaises(WebException) as e:
                self.sut.create("1234", **data)
            self.assertEqual(str(response.read.return_value), str(e.exception.message))

    def test_update_issues_get_json_response_for_endpoint(self):
        data = {'some_data': 'to send'}
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            self.sut.update("1234", "pk-3", **data)
            expected_data = dict(data, _method="PUT")
            get_response.assert_called_once_with("POST", "/api/1234/pk-3/", data=expected_data)

    def test_update_returns_json_response_when_successful(self):
        response = self.get_mock_response()
        data = {'some_data': 'to send'}
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.update("1234", "pk-3", **data)
            self.assertEqual(response.json, result)

    def test_update_raises_web_exception_when_not_successful(self):
        response = self.get_mock_response(status_code=404, is_successful=False)
        data = {'some_data': 'to send'}
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            with self.assertRaises(WebException) as e:
                self.sut.update("1234", "pk-3", **data)
            self.assertEqual(str(response.read.return_value), str(e.exception.message))

    def test_delete_issues_get_json_response_for_endpoint(self):
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            self.sut.delete("1234", "pk-3")
            get_response.assert_called_once_with("POST", "/api/1234/pk-3/", data={"_method": "DELETE"})

    def test_delete_returns_json_response_when_successful(self):
        response = self.get_mock_response()
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.delete("1234", "pk-3")
            self.assertEqual(response.json, result)

    def test_delete_returns_json_response_when_404(self):
        response = self.get_mock_response(status_code=404, is_successful=False)
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            result = self.sut.delete("1234", "pk-3")
            self.assertEqual(response.json, result)

    def test_delete_raises_web_exception_when_not_successful(self):
        response = self.get_mock_response(status_code=500, is_successful=False)
        with mock.patch.object(self.sut, "_get_json_response") as get_response:
            get_response.return_value = response
            with self.assertRaises(WebException) as e:
                self.sut.delete("1234", "pk-3")
            self.assertEqual(str(response.read.return_value), str(e.exception.message))


class BaseDjangoRestClientInitTests(test.TestCase):

    @override_settings(TEST_DOMAIN='my_domain')
    @override_settings(TEST_CLIENT_ID='my_client_id')
    @override_settings(TEST_PRIVATE_KEY='my_private_key')
    def test_uses_django_settings_by_default_for_api_credentials(self):

        class SomeClient(BaseDjangoRestClient):
            domain_settings_name = 'TEST_DOMAIN'
            client_id_settings_name = 'TEST_CLIENT_ID'
            private_key_settings_name = 'TEST_PRIVATE_KEY'

        rest_client = SomeClient()
        self.assertEqual('my_domain', rest_client._base_url)
        self.assertEqual('my_client_id', rest_client._client_id)
        self.assertEqual('my_private_key', rest_client._private_key)

    def test_will_use_provided_settings_when_available(self):

        class SomeProvider(object):
            base_url = "my_domain"
            client_id = "my_client_id"
            private_key = "my_private_key"

        class SomeClient(BaseDjangoRestClient):
            pass

        provider = SomeProvider()
        rest_client = SomeClient(provider)
        self.assertEqual('my_domain', rest_client._base_url)
        self.assertEqual('my_client_id', rest_client._client_id)
        self.assertEqual('my_private_key', rest_client._private_key)
| 43.658986 | 107 | 0.699704 | 1,234 | 9,474 | 5.02107 | 0.094814 | 0.04858 | 0.04519 | 0.058263 | 0.809232 | 0.781795 | 0.776953 | 0.737895 | 0.712879 | 0.66785 | 0 | 0.020986 | 0.195271 | 9,474 | 216 | 108 | 43.861111 | 0.79171 | 0 | 0 | 0.574074 | 0 | 0 | 0.099958 | 0.002322 | 0 | 0 | 0 | 0 | 0.197531 | 1 | 0.154321 | false | 0.006173 | 0.04321 | 0.006173 | 0.234568 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b23b2549addeacb69d3a26905d7b415a79c66cf8 | 153 | py | Python | grand/grand/doctype/order_tracking_item/test_order_tracking_item.py | exvas/grand | 0f090de3dc3b3a11921f06e597fb357bb75e2631 | [
"MIT"
] | null | null | null | grand/grand/doctype/order_tracking_item/test_order_tracking_item.py | exvas/grand | 0f090de3dc3b3a11921f06e597fb357bb75e2631 | [
"MIT"
] | null | null | null | grand/grand/doctype/order_tracking_item/test_order_tracking_item.py | exvas/grand | 0f090de3dc3b3a11921f06e597fb357bb75e2631 | [
"MIT"
] | null | null | null | # Copyright (c) 2021, sammish and Contributors
# See license.txt
# import frappe
import unittest
class TestOrderTrackingItem(unittest.TestCase):
    pass
| 17 | 47 | 0.79085 | 18 | 153 | 6.722222 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030303 | 0.137255 | 153 | 8 | 48 | 19.125 | 0.886364 | 0.48366 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.333333 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
b25a2d58ca8f67e9097fc2a143e56cfd43015ff0 | 9,697 | py | Python | polls/tests/test_polls_index.py | parsampsh/besanj-django | 0ebc3e4294c9c004ffc882353f4fbba724f07b1a | [
"MIT"
] | 2 | 2022-03-26T07:48:57.000Z | 2022-03-28T12:46:15.000Z | polls/tests/test_polls_index.py | parsampsh/besanj-django | 0ebc3e4294c9c004ffc882353f4fbba724f07b1a | [
"MIT"
] | null | null | null | polls/tests/test_polls_index.py | parsampsh/besanj-django | 0ebc3e4294c9c004ffc882353f4fbba724f07b1a | [
"MIT"
] | null | null | null | from django.test import TestCase, Client
from polls.models import *
from account.models import Profile
class TestPollsIndex(TestCase):
def setUp(self):
self.client = Client()
self.user1 = User(username="a", password='123')
self.user1.save()
self.user1.profile = Profile(api_token='1')
self.user1.profile.save()
self.user2 = User(username="b", password='123')
self.user2.save()
self.user2.profile = Profile(api_token='2')
self.user2.profile.save()
for i in range(0, 350):
if i < 150:
u = self.user1
else:
u = self.user2
poll = Poll(title='poll ' + str(i), user=u, description='the description')
if i % 2 == 0:
poll.is_published = True
poll.save()
for j in range(0, 4):
choice = Choice(title='choice ' + str(j), poll=poll, sort=j)
choice.save()
def test_polls_will_be_shown_for_a_specific_user(self):
res = self.client.get('/polls/?user_id=123456')
self.assertEqual(res.status_code, 404)
res = self.client.get('/polls/?user_id=' + str(self.user1.id))
res_json = res.json()
self.assertEqual(res_json['all_count'], 75)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 50)
res = self.client.get('/polls/?page=2&user_id=' + str(self.user1.id))
res_json = res.json()
self.assertEqual(res_json['all_count'], 75)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 2)
self.assertEqual(len(res_json['polls']), 25)
res = self.client.get('/polls/?user_id=' + str(self.user2.id))
res_json = res.json()
self.assertEqual(res_json['all_count'], 100)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 50)
res = self.client.get('/polls/?page=2&user_id=' + str(self.user2.id))
res_json = res.json()
self.assertEqual(res_json['all_count'], 100)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 2)
self.assertEqual(len(res_json['polls']), 50)
res = self.client.get('/polls/?user_id=' + str(self.user2.id), HTTP_TOKEN='invalid')
res_json = res.json()
self.assertEqual(res_json['all_count'], 100)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 50)
self.assertFalse(res_json['polls'][0]['belongs_to_you'])
res = self.client.get('/polls/?user_id=' + str(self.user2.id), HTTP_TOKEN=self.user2.profile.api_token)
res_json = res.json()
self.assertEqual(res_json['all_count'], 200)
self.assertEqual(res_json['pages_count'], 4)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 50)
self.assertTrue(res_json['polls'][0]['belongs_to_you'])
def test_single_poll_can_be_shown(self):
poll1 = self.user1.poll_set.all()[0]
poll1.is_published = False
poll1.save()
poll2 = self.user1.poll_set.all()[1]
poll2.is_published = True
poll2.save()
res = self.client.get('/polls/?single_poll_id=' + str(poll2.id))
self.assertEqual(res.json()['polls'][0]['title'], poll2.title)
res = self.client.get('/polls/?single_poll_id=123456')
self.assertEqual(res.status_code, 404)
res = self.client.get('/polls/?single_poll_id=' + str(poll1.id))
self.assertEqual(res.status_code, 404)
def test_polls_index_works_correctly(self):
res = self.client.get('/polls/')
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 175)
self.assertEqual(res_json['pages_count'], 4)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 50)
res = self.client.get('/polls/?page=4')
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 175)
self.assertEqual(res_json['pages_count'], 4)
self.assertEqual(res_json['current_page'], 4)
self.assertEqual(len(res_json['polls']), 25)
def test_single_poll_json_data(self):
poll = self.user1.poll_set.all()[1]
poll.is_published = True
poll.save()
choice = poll.choice_set.all()[0]
choice.users.add(self.user2)
res = self.client.get('/polls/?single_poll_id=' + str(poll.id))
poll_json = res.json()['polls'][0]
self.assertEqual(poll_json['id'], poll.id)
self.assertEqual(poll_json['title'], poll.title)
self.assertEqual(poll_json['description'], poll.description)
self.assertEqual(poll_json['is_published'], True)
self.assertEqual(poll_json['created_at'], str(poll.created_at))
self.assertEqual(poll_json['user']['username'], poll.user.username)
self.assertEqual(poll_json['user']['email'], poll.user.email)
self.assertEqual(poll_json['choices'][0]['id'], choice.id)
self.assertEqual(poll_json['choices'][0]['title'], choice.title)
self.assertEqual(poll_json['choices'][0]['sort'], choice.sort)
self.assertEqual(poll_json['choices'][0]['votes_count'], 1)
self.assertEqual(poll_json['choices'][0]['votes_percent'], 100)
def test_search_works_correctly(self):
res = self.client.get('/polls/?search=poll')
self.assertEqual(res.json()['all_count'], 175)
res = self.client.get('/polls/?search=ol')
self.assertEqual(res.json()['all_count'], 175)
res = self.client.get('/polls/?search=hello')
self.assertEqual(res.json()['all_count'], 0)
res = self.client.get('/polls/?search=the des')
self.assertEqual(res.json()['all_count'], 175)
def test_user_can_see_their_votes(self):
res = self.client.get('/polls/my_votes/')
self.assertEqual(res.status_code, 401)
res = self.client.get('/polls/my_votes/', HTTP_TOKEN=self.user1.profile.api_token)
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 0)
self.assertEqual(res_json['pages_count'], 1)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 0)
polls = self.user2.poll_set.filter(is_published=True).all()[0:80]
for poll in polls:
poll.choice_set.all()[0].users.add(self.user1)
res = self.client.get('/polls/my_votes/', HTTP_TOKEN=self.user1.profile.api_token)
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 80)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 50)
res = self.client.get('/polls/my_votes/?search=320', HTTP_TOKEN=self.user1.profile.api_token)
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 0)
self.assertEqual(res_json['pages_count'], 1)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 0)
res = self.client.get('/polls/my_votes/?search=308', HTTP_TOKEN=self.user1.profile.api_token)
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 1)
self.assertEqual(res_json['pages_count'], 1)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 1)
self.assertEqual(res_json['polls'][0]['title'], 'poll 308')
res = self.client.get('/polls/my_votes/?page=gfdg', HTTP_TOKEN=self.user1.profile.api_token)
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 80)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 1)
self.assertEqual(len(res_json['polls']), 50)
res = self.client.get('/polls/my_votes/?page=2', HTTP_TOKEN=self.user1.profile.api_token)
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 80)
self.assertEqual(res_json['pages_count'], 2)
self.assertEqual(res_json['current_page'], 2)
self.assertEqual(len(res_json['polls']), 30)
self.assertTrue('selected_choice' in res_json['polls'][0])
self.user1.choice_set.clear()
polls = Poll.objects.order_by('-created_at').filter(is_published=True).all()[0:80]
for poll in polls:
poll.choice_set.all()[0].users.add(self.user1)
res = self.client.get('/polls/?page=2', HTTP_TOKEN=self.user1.profile.api_token)
self.assertEqual(res.status_code, 200)
res_json = res.json()
self.assertEqual(res_json['all_count'], 175)
self.assertEqual(res_json['pages_count'], 4)
self.assertEqual(res_json['current_page'], 2)
self.assertEqual(len(res_json['polls']), 50)
self.assertTrue('selected_choice' in res_json['polls'][0])
self.assertFalse('selected_choice' in res_json['polls'][40])
| 42.90708 | 111 | 0.630814 | 1,315 | 9,697 | 4.458555 | 0.091255 | 0.121781 | 0.196486 | 0.19137 | 0.794474 | 0.761044 | 0.711922 | 0.654102 | 0.62323 | 0.606686 | 0 | 0.033399 | 0.209549 | 9,697 | 225 | 112 | 43.097778 | 0.731507 | 0 | 0 | 0.518717 | 0 | 0 | 0.144581 | 0.027741 | 0 | 0 | 0 | 0 | 0.513369 | 1 | 0.037433 | false | 0.010695 | 0.016043 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b262f39f26e462802383dd812734c5912e4df320 | 20 | py | Python | elliot/evaluation/metrics/accuracy/mrr/__init__.py | gategill/elliot | 113763ba6d595976e14ead2e3d460d9705cd882e | [
"Apache-2.0"
] | 175 | 2021-03-04T15:46:25.000Z | 2022-03-31T05:56:58.000Z | elliot/evaluation/metrics/accuracy/mrr/__init__.py | gategill/elliot | 113763ba6d595976e14ead2e3d460d9705cd882e | [
"Apache-2.0"
] | 15 | 2021-03-06T17:53:56.000Z | 2022-03-24T17:02:07.000Z | elliot/evaluation/metrics/accuracy/mrr/__init__.py | gategill/elliot | 113763ba6d595976e14ead2e3d460d9705cd882e | [
"Apache-2.0"
] | 39 | 2021-03-04T15:46:26.000Z | 2022-03-09T15:37:12.000Z | from .mrr import MRR | 20 | 20 | 0.8 | 4 | 20 | 4 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 20 | 1 | 20 | 20 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b293438f965c4269873f50ee367ce00a1a8b28f3 | 65 | py | Python | kloppy/domain/services/matchers/pattern/regexp/__init__.py | cedrickrause/kloppy | 29a997e144c8abb5c18816e2d3e04288b818931d | [
"BSD-3-Clause"
] | 176 | 2020-04-24T09:12:05.000Z | 2022-03-27T07:03:44.000Z | kloppy/domain/services/matchers/pattern/regexp/__init__.py | cedrickrause/kloppy | 29a997e144c8abb5c18816e2d3e04288b818931d | [
"BSD-3-Clause"
] | 95 | 2020-04-24T18:37:36.000Z | 2022-03-23T21:59:10.000Z | kloppy/domain/services/matchers/pattern/regexp/__init__.py | cedrickrause/kloppy | 29a997e144c8abb5c18816e2d3e04288b818931d | [
"BSD-3-Clause"
] | 39 | 2020-05-08T21:45:26.000Z | 2022-03-19T09:29:41.000Z | from .ast import *
from .matchers import *
from .regexp import *
| 16.25 | 23 | 0.723077 | 9 | 65 | 5.222222 | 0.555556 | 0.425532 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184615 | 65 | 3 | 24 | 21.666667 | 0.886792 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b293d7c5344f388ce307a4b99b5d3ec468a033c4 | 154 | py | Python | src/predictive_model/regression/apps.py | HitLuca/predict-python | 14f2f55cb29f817a5871d4c0b11a3758285301ca | [
"MIT"
] | 12 | 2018-06-27T08:09:18.000Z | 2021-10-10T22:19:04.000Z | src/predictive_model/regression/apps.py | HitLuca/predict-python | 14f2f55cb29f817a5871d4c0b11a3758285301ca | [
"MIT"
] | 17 | 2018-06-12T17:36:11.000Z | 2020-11-16T21:23:22.000Z | src/predictive_model/regression/apps.py | HitLuca/predict-python | 14f2f55cb29f817a5871d4c0b11a3758285301ca | [
"MIT"
] | 16 | 2018-08-02T14:40:17.000Z | 2021-11-12T12:28:46.000Z | from src.predictive_model.apps import PredictiveModelConfig
class RegressionConfig(PredictiveModelConfig):
name = 'src.predictive_model.regression'
| 25.666667 | 59 | 0.837662 | 15 | 154 | 8.466667 | 0.733333 | 0.204724 | 0.283465 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097403 | 154 | 5 | 60 | 30.8 | 0.913669 | 0 | 0 | 0 | 0 | 0 | 0.201299 | 0.201299 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
b2a0d69d1431334bad1a1403cdb4cdfe38bd5a59 | 5,169 | py | Python | phase-4/movements.py | joshdabosh/autopack | e8617e1ddce77bea212fe46443187824d438efb9 | [
"MIT"
] | null | null | null | phase-4/movements.py | joshdabosh/autopack | e8617e1ddce77bea212fe46443187824d438efb9 | [
"MIT"
] | 1 | 2019-02-19T01:17:40.000Z | 2019-07-18T14:19:25.000Z | phase-4/movements.py | joshdabosh/autopack | e8617e1ddce77bea212fe46443187824d438efb9 | [
"MIT"
] | null | null | null | import RPi.GPIO as GPIO
import time
class robot():
def __init__(self):
self.TRIG = 24
self.ECHO = 26
GPIO.setmode(GPIO.BOARD)
GPIO.setup(7, GPIO.OUT)
GPIO.setup(11, GPIO.OUT)
GPIO.setup(13, GPIO.OUT)
GPIO.setup(15, GPIO.OUT)
GPIO.setup(12, GPIO.OUT)
GPIO.setup(16, GPIO.OUT)
GPIO.setup(18, GPIO.OUT)
GPIO.setup(22, GPIO.OUT)
        GPIO.setup(self.TRIG, GPIO.OUT)
        GPIO.setup(self.ECHO, GPIO.IN)
GPIO.output(7, False)
GPIO.output(11, False)
GPIO.output(13, False)
GPIO.output(15, False)
GPIO.output(12, False)
GPIO.output(16, False)
GPIO.output(18, False)
GPIO.output(22, False)
        GPIO.output(self.TRIG, False)
def forward(self):
GPIO.output(11, False)
GPIO.output(15, False)
GPIO.output(16, False)
GPIO.output(22, False)
GPIO.output(7, True)
GPIO.output(13, True)
GPIO.output(12, True)
GPIO.output(18, True)
def backup(self):
GPIO.output(7, False)
GPIO.output(13, False)
GPIO.output(12, False)
GPIO.output(18, False)
GPIO.output(11, True)
GPIO.output(15, True)
GPIO.output(16, True)
GPIO.output(22, True)
def left_forward(self):
GPIO.output(18, False)
GPIO.output(15, False)
GPIO.output(16, False)
GPIO.output(22, False)
GPIO.output(11, False)
GPIO.output(7, True)
GPIO.output(13, True)
GPIO.output(12, True)
def right_forward(self):
GPIO.output(16, False)
GPIO.output(15, False)
GPIO.output(12, False)
GPIO.output(22, False)
GPIO.output(11, False)
GPIO.output(7, True)
GPIO.output(13, True)
GPIO.output(18, True)
def left(self):
GPIO.output(7, False)
GPIO.output(15, False)
GPIO.output(12, False)
GPIO.output(18, False)
GPIO.output(22, False)
GPIO.output(11, True)
GPIO.output(16, True)
GPIO.output(13, True)
def right(self):
GPIO.output(11, False)
GPIO.output(13, False)
GPIO.output(12, False)
GPIO.output(16, False)
GPIO.output(18, False)
GPIO.output(7, True)
GPIO.output(15, True)
GPIO.output(22, True)
def left_backward(self):
GPIO.output(7, False)
GPIO.output(13, False)
GPIO.output(15, False)
GPIO.output(12, False)
GPIO.output(18, False)
GPIO.output(11, True)
GPIO.output(16, True)
GPIO.output(22, True)
def right_backward(self):
GPIO.output(7, False)
GPIO.output(11, False)
GPIO.output(13, False)
GPIO.output(12, False)
GPIO.output(18, False)
GPIO.output(15, True)
GPIO.output(16, True)
GPIO.output(22, True)
def stop(self):
GPIO.output(7, False)
GPIO.output(11, False)
GPIO.output(13, False)
GPIO.output(15, False)
GPIO.output(12, False)
GPIO.output(16, False)
GPIO.output(18, False)
GPIO.output(22, False)
def pause(self):
GPIO.output(7, False)
GPIO.output(11, False)
GPIO.output(13, False)
GPIO.output(15, False)
GPIO.output(12, False)
GPIO.output(16, False)
GPIO.output(18, False)
GPIO.output(22, False)
def scan_for_obstacles(self):
        # tell the sensor to fire a burst of sound
        GPIO.output(self.TRIG, True)
        time.sleep(0.00001)
        GPIO.output(self.TRIG, False)
        # wait for the echo pulse to start, then time until it ends
        while GPIO.input(self.ECHO) == 0:
            pass
        startTime = time.time()
        while GPIO.input(self.ECHO) == 1:
            pass
        stopTime = time.time()
        # speed of sound ~34000 cm/s, halved for the round trip
        distance = (stopTime - startTime) * 17000
        return distance
| 28.092391 | 59 | 0.414007 | 512 | 5,169 | 4.160156 | 0.119141 | 0.42723 | 0.422535 | 0.067606 | 0.72723 | 0.724413 | 0.705634 | 0.661033 | 0.653521 | 0.653521 | 0 | 0.074848 | 0.490811 | 5,169 | 183 | 60 | 28.245902 | 0.734422 | 0.007932 | 0 | 0.71875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.09375 | false | 0.015625 | 0.015625 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
b2af987d52e1faf6a1a24b37561c658aae773827 | 104 | py | Python | pandas-stubs/core/indexes/datetimes.py | commonstock/data-science-types | 58bbe6df01dea593a5564602508e71e03b9fa5b4 | [
"Apache-2.0"
] | null | null | null | pandas-stubs/core/indexes/datetimes.py | commonstock/data-science-types | 58bbe6df01dea593a5564602508e71e03b9fa5b4 | [
"Apache-2.0"
] | null | null | null | pandas-stubs/core/indexes/datetimes.py | commonstock/data-science-types | 58bbe6df01dea593a5564602508e71e03b9fa5b4 | [
"Apache-2.0"
] | null | null | null | from .datetimelike import DatetimeTimedeltaMixin
class DatetimeIndex(DatetimeTimedeltaMixin):
...
| 17.333333 | 48 | 0.807692 | 7 | 104 | 12 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 104 | 5 | 49 | 20.8 | 0.923077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a248ddc41ced6cd3f611cd0d1ecd500963001565 | 179 | py | Python | pynars/NARS/InferenceEngine/GeneralEngine/Rules/__init__.py | AIxer/PyNARS | 443b6a5e1c9779a1b861df1ca51ce5a190998d2e | [
"MIT"
] | null | null | null | pynars/NARS/InferenceEngine/GeneralEngine/Rules/__init__.py | AIxer/PyNARS | 443b6a5e1c9779a1b861df1ca51ce5a190998d2e | [
"MIT"
] | null | null | null | pynars/NARS/InferenceEngine/GeneralEngine/Rules/__init__.py | AIxer/PyNARS | 443b6a5e1c9779a1b861df1ca51ce5a190998d2e | [
"MIT"
] | null | null | null | from .NAL1 import *
from .NAL2 import *
from .NAL3 import *
from .NAL4 import *
from .NAL5 import *
from .NAL6 import *
from .NAL7 import *
from .NAL8 import *
from .NAL9 import * | 19.888889 | 19 | 0.703911 | 27 | 179 | 4.666667 | 0.407407 | 0.634921 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.195531 | 179 | 9 | 20 | 19.888889 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
a2751a7edda0d0dd56f33457cba111857d539a10 | 102 | py | Python | obpy/util.py | OpenBikes/obpy | 2d55d2e991a71950976cac067b4b73224f2458ba | [
"MIT"
] | null | null | null | obpy/util.py | OpenBikes/obpy | 2d55d2e991a71950976cac067b4b73224f2458ba | [
"MIT"
] | null | null | null | obpy/util.py | OpenBikes/obpy | 2d55d2e991a71950976cac067b4b73224f2458ba | [
"MIT"
] | null | null | null | def remove_none_values_from_dict(dict_obj):
return dict((k, v) for k, v in dict_obj.items() if v)
| 34 | 57 | 0.72549 | 21 | 102 | 3.238095 | 0.666667 | 0.205882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.156863 | 102 | 2 | 58 | 51 | 0.790698 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
a2d228647f16fe658bec55e82795298ce783ba20 | 26,608 | py | Python | microraiden/test/test_channel_manager.py | andrevmatos/microraiden | 2d51e78afaf3c0a8ddab87e59a5260c0064cdbdd | [
"MIT"
] | 417 | 2017-09-19T19:06:23.000Z | 2021-11-28T05:39:23.000Z | microraiden/test/test_channel_manager.py | andrevmatos/microraiden | 2d51e78afaf3c0a8ddab87e59a5260c0064cdbdd | [
"MIT"
] | 259 | 2017-09-19T20:42:57.000Z | 2020-11-18T01:31:41.000Z | microraiden/test/test_channel_manager.py | andrevmatos/microraiden | 2d51e78afaf3c0a8ddab87e59a5260c0064cdbdd | [
"MIT"
] | 126 | 2017-09-19T17:11:39.000Z | 2020-12-17T17:05:27.000Z | import logging
from itertools import count
from typing import List
from eth_utils import is_same_address, encode_hex
from web3 import Web3
from web3.contract import Contract
from microraiden import Client
from microraiden.client import Channel
from microraiden.utils import get_logs, sign_balance_proof, privkey_to_addr
from microraiden.exceptions import InvalidBalanceProof, NoOpenChannel, InvalidBalanceAmount
from microraiden.test.fixtures.channel_manager import start_channel_manager
from microraiden.channel_manager import ChannelManager
from microraiden.test.config import (
RECEIVER_ETH_ALLOWANCE,
RECEIVER_TOKEN_ALLOWANCE
)
import gevent
import pytest
log = logging.getLogger(__name__)
@pytest.fixture
def confirmed_open_channel(
channel_manager: ChannelManager,
client: Client,
receiver_address: str,
wait_for_blocks
):
channel = client.open_channel(receiver_address, 10)
wait_for_blocks(channel_manager.n_confirmations + 1)
gevent.sleep(channel_manager.blockchain.poll_interval)
assert (channel.sender, channel.block) in channel_manager.channels
return channel
def test_channel_opening(
client: Client,
web3: Web3,
make_account,
private_keys: List[str],
channel_manager_contract,
token_contract,
mine_sync_event,
wait_for_blocks,
use_tester,
state_db_path
):
receiver1_privkey = make_account(
RECEIVER_ETH_ALLOWANCE,
RECEIVER_TOKEN_ALLOWANCE,
private_keys[2]
)
receiver2_privkey = make_account(
RECEIVER_ETH_ALLOWANCE,
RECEIVER_TOKEN_ALLOWANCE,
private_keys[3]
)
receiver_address = privkey_to_addr(receiver1_privkey)
# make sure channel_manager1 is terminated properly, otherwise Blockchain will be running
# in the background, ruining other tests' results
channel_manager1 = ChannelManager(
web3,
channel_manager_contract,
token_contract,
receiver1_privkey,
n_confirmations=5,
state_filename=state_db_path
)
start_channel_manager(channel_manager1, use_tester, mine_sync_event)
channel_manager2 = ChannelManager(
web3,
channel_manager_contract,
token_contract,
receiver2_privkey,
n_confirmations=5,
state_filename=state_db_path
)
start_channel_manager(channel_manager2, use_tester, mine_sync_event)
channel_manager1.wait_sync()
channel_manager2.wait_sync()
blockchain = channel_manager1.blockchain
channel = client.open_channel(receiver_address, 10)
# should be in unconfirmed channels
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
assert (channel.sender, channel.block) not in channel_manager1.channels
assert (channel.sender, channel.block) in channel_manager1.unconfirmed_channels
channel_rec = channel_manager1.unconfirmed_channels[channel.sender, channel.block]
assert is_same_address(channel_rec.receiver, receiver_address)
assert is_same_address(channel_rec.sender, channel.sender)
assert channel_rec.mtime == channel_rec.ctime
# should be confirmed after n blocks
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert (channel.sender, channel.block) in channel_manager1.channels
channel_rec = channel_manager1.channels[channel.sender, channel.block]
assert is_same_address(channel_rec.receiver, receiver_address)
assert is_same_address(channel_rec.sender, channel.sender)
assert channel_rec.balance == 0
assert channel_rec.last_signature is None
assert channel_rec.is_closed is False
assert channel_rec.settle_timeout == -1
# should not appear in other channel manager
assert (channel.sender, channel.block) not in channel_manager2.channels
assert (channel.sender, channel.block) not in channel_manager2.unconfirmed_channels
channel_manager1.stop()
channel_manager2.stop()
def test_close_unconfirmed_event(
channel_manager: ChannelManager,
client: Client,
receiver_address: str,
wait_for_blocks
):
channel_manager.wait_sync()
blockchain = channel_manager.blockchain
# if unconfirmed channel is closed it should simply be forgotten
channel = client.open_channel(receiver_address, 10)
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
assert (channel.sender, channel.block) in channel_manager.unconfirmed_channels
assert (channel.sender, channel.block) not in channel_manager.channels
channel.close()
wait_for_blocks(channel_manager.blockchain.n_confirmations) # opening confirmed
gevent.sleep(blockchain.poll_interval)
assert (channel.sender, channel.block) not in channel_manager.unconfirmed_channels
assert (channel.sender, channel.block) in channel_manager.channels
wait_for_blocks(1) # closing confirmed
gevent.sleep(blockchain.poll_interval)
assert (channel.sender, channel.block) not in channel_manager.unconfirmed_channels
assert (channel.sender, channel.block) in channel_manager.channels
def test_close_confirmed_event(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
wait_for_blocks
):
blockchain = channel_manager.blockchain
channel_manager.wait_sync()
confirmed_open_channel.close()
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.is_closed is True
settle_block = channel_manager.channel_manager_contract.call().getChannelInfo(
channel_rec.sender,
channel_rec.receiver,
channel_rec.open_block_number
)[2]
assert channel_rec.settle_timeout == settle_block
def test_channel_settled_event(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
wait_for_blocks,
web3: Web3,
use_tester: bool
):
if not use_tester:
pytest.skip('This test takes several hours on real blockchains.')
blockchain = channel_manager.blockchain
channel_manager.wait_sync()
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
confirmed_open_channel.close()
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager.channels[channel_id]
wait_for_blocks(channel_rec.settle_timeout - web3.eth.blockNumber)
gevent.sleep(blockchain.poll_interval)
assert web3.eth.blockNumber >= channel_rec.settle_timeout
assert channel_id in channel_manager.channels
confirmed_open_channel.settle()
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert channel_id not in channel_manager.channels
def test_topup(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
wait_for_blocks
):
blockchain = channel_manager.blockchain
channel_manager.wait_sync()
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
confirmed_open_channel.topup(5)
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager.channels[channel_id]
topup_txs = channel_rec.unconfirmed_topups
assert len(topup_txs) == 1 and list(topup_txs.values())[0] == 5
wait_for_blocks(channel_manager.blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager.channels[channel_id]
topup_txs = channel_rec.unconfirmed_topups
assert len(topup_txs) == 0
assert channel_rec.deposit == 15
def test_unconfirmed_topup(
channel_manager: ChannelManager,
client: Client,
receiver_address: str,
wait_for_blocks
):
blockchain = channel_manager.blockchain
channel_manager.wait_sync()
channel = client.open_channel(receiver_address, 10)
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
assert (channel.sender, channel.block) in channel_manager.unconfirmed_channels
channel.topup(5)
wait_for_blocks(channel_manager.blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert (channel.sender, channel.block) in channel_manager.channels
channel_rec = channel_manager.channels[channel.sender, channel.block]
assert channel_rec.deposit == 15
def test_payment(
channel_manager: ChannelManager,
confirmed_open_channel,
receiver_address: str,
receiver_privkey: str,
sender_privkey: str,
sender_address: str
):
channel_manager.wait_sync()
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.last_signature is None
assert channel_rec.balance == 0
# valid transfer
sig1 = encode_hex(confirmed_open_channel.create_transfer(2))
channel_manager.register_payment(sender_address, channel_rec.open_block_number, 2, sig1)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.balance == 2
assert channel_rec.last_signature == sig1
# transfer signed with wrong private key
invalid_sig = encode_hex(sign_balance_proof(
receiver_privkey, # should be sender's privkey
channel_rec.receiver,
channel_rec.open_block_number,
4,
channel_manager.channel_manager_contract.address
))
with pytest.raises(InvalidBalanceProof):
channel_manager.register_payment(sender_address, channel_rec.open_block_number, 4,
invalid_sig)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.balance == 2
assert channel_rec.last_signature == sig1
# transfer to different receiver
invalid_sig = encode_hex(sign_balance_proof(
sender_privkey,
sender_address, # should be receiver's address
channel_rec.open_block_number,
4,
channel_manager.channel_manager_contract.address
))
with pytest.raises(InvalidBalanceProof):
channel_manager.register_payment(sender_address, channel_rec.open_block_number, 4,
invalid_sig)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.balance == 2
assert channel_rec.last_signature == sig1
# transfer negative amount
invalid_sig = encode_hex(sign_balance_proof(
sender_privkey,
receiver_address,
channel_rec.open_block_number,
1, # should be greater than 2
channel_manager.channel_manager_contract.address
))
with pytest.raises(InvalidBalanceAmount):
channel_manager.register_payment(sender_address, channel_rec.open_block_number, 1,
invalid_sig)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.balance == 2
assert channel_rec.last_signature == sig1
# parameters should match balance proof
sig2 = encode_hex(confirmed_open_channel.create_transfer(2))
with pytest.raises(NoOpenChannel):
channel_manager.register_payment(receiver_address, channel_rec.open_block_number,
4, sig2)
with pytest.raises(NoOpenChannel):
channel_manager.register_payment(sender_address, channel_rec.open_block_number + 1,
4, sig2)
with pytest.raises(InvalidBalanceProof):
channel_manager.register_payment(sender_address, channel_rec.open_block_number,
5, sig2)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.balance == 2
assert channel_rec.last_signature == sig1
channel_manager.register_payment(sender_address, channel_rec.open_block_number, 4, sig2)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.balance == 4
assert channel_rec.last_signature == sig2
# should transfer up to deposit
sig3 = encode_hex(confirmed_open_channel.create_transfer(6))
channel_manager.register_payment(sender_address, channel_rec.open_block_number, 10, sig3)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.balance == 10
assert channel_rec.last_signature == sig3
# transfer too much
invalid_sig = encode_hex(sign_balance_proof(
sender_privkey,
receiver_address,
channel_rec.open_block_number,
12, # should not be greater than 10
channel_manager.channel_manager_contract.address
))
with pytest.raises(InvalidBalanceProof):
channel_manager.register_payment(sender_address, channel_rec.open_block_number, 12,
invalid_sig)
assert channel_rec.balance == 10
assert channel_rec.last_signature == sig3
def test_challenge(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
receiver_address: str,
sender_address: str,
wait_for_blocks,
web3: Web3,
client: Client
):
blockchain = channel_manager.blockchain
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
sig = encode_hex(confirmed_open_channel.create_transfer(5))
channel_manager.register_payment(sender_address, confirmed_open_channel.block, 5, sig)
# hack channel to decrease balance
confirmed_open_channel.update_balance(0)
sig = confirmed_open_channel.create_transfer(3)
block_before = web3.eth.blockNumber
confirmed_open_channel.close()
# should challenge and immediately settle
for waited_blocks in count():
logs = get_logs(client.context.token, 'Transfer', from_block=block_before - 1)
if logs:
break
wait_for_blocks(1)
assert waited_blocks < 10
assert len([l for l in logs
if is_same_address(l['args']['_to'], receiver_address) and
l['args']['_value'] == 5]) == 1
assert len([l for l in logs
if is_same_address(l['args']['_to'], sender_address) and
l['args']['_value'] == 5]) == 1
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert channel_id not in channel_manager.channels
# update channel state so that it will not be closed twice
client.sync_channels()
new_state = None
for channel in client.channels:
        if all((channel.sender == confirmed_open_channel.sender,
                channel.receiver == confirmed_open_channel.receiver,
                channel.block == confirmed_open_channel.block)):
new_state = channel.state
if new_state is None:
confirmed_open_channel.state = confirmed_open_channel.State.closed
else:
confirmed_open_channel.state = new_state
def test_multiple_topups(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
wait_for_blocks
):
blockchain = channel_manager.blockchain
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
channel_rec = channel_manager.channels[channel_id]
# first unconfirmed topup
assert channel_rec.deposit == 10
confirmed_open_channel.topup(5)
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager.channels[channel_id]
assert len(channel_rec.unconfirmed_topups) == 1
assert list(channel_rec.unconfirmed_topups.values()) == [5]
assert channel_rec.deposit == 10
# second unconfirmed_topups
confirmed_open_channel.topup(10)
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager.channels[channel_id]
assert len(channel_rec.unconfirmed_topups) >= 1 # equality if first is confirmed
assert 10 in channel_rec.unconfirmed_topups.values()
assert channel_rec.deposit in [10, 15] # depends if first topup is confirmed or not
# wait for confirmations
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager.channels[channel_id]
assert len(channel_rec.unconfirmed_topups) == 0
assert channel_rec.deposit == 25
def test_settlement(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
receiver_address: str,
wait_for_blocks,
web3: Web3,
token_contract: Contract,
sender_address: str,
use_tester: bool
):
if not use_tester:
pytest.skip('This test takes several hours on real blockchains.')
blockchain = channel_manager.blockchain
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
sig = encode_hex(confirmed_open_channel.create_transfer(2))
channel_manager.register_payment(sender_address, confirmed_open_channel.block, 2, sig)
confirmed_open_channel.close()
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
block_before = web3.eth.blockNumber
channel_rec = channel_manager.channels[channel_id]
wait_for_blocks(channel_rec.settle_timeout - block_before)
confirmed_open_channel.settle()
logs = get_logs(token_contract, 'Transfer', from_block=block_before - 1)
assert len([l for l in logs
if is_same_address(l['args']['_to'], receiver_address) and
l['args']['_value'] == 2]) == 1
assert len([l for l in logs
if is_same_address(l['args']['_to'], sender_address) and
l['args']['_value'] == 8]) == 1
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert channel_id not in channel_manager.channels
def test_cooperative(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
receiver_address: str,
web3: Web3,
token_contract: Contract,
wait_for_blocks,
sender_address: str
):
blockchain = channel_manager.blockchain
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
sig1 = encode_hex(confirmed_open_channel.create_transfer(5))
channel_manager.register_payment(sender_address, confirmed_open_channel.block, 5, sig1)
receiver_sig = channel_manager.sign_close(sender_address, confirmed_open_channel.block, 5)
channel_rec = channel_manager.channels[channel_id]
assert channel_rec.is_closed is True
block_before = web3.eth.blockNumber
confirmed_open_channel.close_cooperatively(receiver_sig)
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
logs = get_logs(token_contract, 'Transfer', from_block=block_before - 1)
assert len([l for l in logs
if is_same_address(l['args']['_to'], receiver_address) and
l['args']['_value'] == 5]) == 1
assert len([l for l in logs
if is_same_address(l['args']['_to'], sender_address) and
l['args']['_value'] == 5]) == 1
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert channel_id not in channel_manager.channels
def test_cooperative_wrong_balance_proof(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
sender_address: str
):
channel_id = (confirmed_open_channel.sender, confirmed_open_channel.block)
channel_rec = channel_manager.channels[channel_id]
sig1 = encode_hex(confirmed_open_channel.create_transfer(5))
channel_manager.register_payment(sender_address, confirmed_open_channel.block, 5, sig1)
sig2 = encode_hex(confirmed_open_channel.create_transfer(1))
with pytest.raises(InvalidBalanceProof):
channel_manager.sign_close(sender_address, confirmed_open_channel.block, sig2)
assert channel_rec.is_closed is False
def test_balances(
channel_manager: ChannelManager,
confirmed_open_channel: Channel,
wait_for_blocks,
sender_address: str,
use_tester: bool
):
blockchain = channel_manager.blockchain
initial_liquid_balance = channel_manager.get_liquid_balance()
initial_locked_balance = channel_manager.get_locked_balance()
if use_tester:
assert initial_liquid_balance == 0
assert initial_locked_balance == 0
sig = encode_hex(confirmed_open_channel.create_transfer(5))
channel_manager.register_payment(sender_address, confirmed_open_channel.block, 5, sig)
assert channel_manager.get_liquid_balance() == initial_liquid_balance
assert channel_manager.get_locked_balance() == 5
receiver_sig = channel_manager.sign_close(sender_address, confirmed_open_channel.block, 5)
confirmed_open_channel.close_cooperatively(receiver_sig)
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert channel_manager.get_liquid_balance() == initial_liquid_balance + 5
assert channel_manager.get_locked_balance() == initial_locked_balance
def test_different_receivers(
web3: Web3,
make_account,
private_keys: List[str],
channel_manager_contract: Contract,
token_contract: Contract,
mine_sync_event,
client: Client,
sender_address: str,
wait_for_blocks,
use_tester: bool,
state_db_path: str
):
if not use_tester:
pytest.skip('This test takes several hours on real blockchains.')
receiver1_privkey = make_account(
RECEIVER_ETH_ALLOWANCE,
RECEIVER_TOKEN_ALLOWANCE,
private_keys[2]
)
receiver2_privkey = make_account(
RECEIVER_ETH_ALLOWANCE,
RECEIVER_TOKEN_ALLOWANCE,
private_keys[3]
)
receiver1_address = privkey_to_addr(receiver1_privkey)
channel_manager1 = ChannelManager(
web3,
channel_manager_contract,
token_contract,
receiver1_privkey,
n_confirmations=5,
state_filename=state_db_path
)
start_channel_manager(channel_manager1, use_tester, mine_sync_event)
channel_manager2 = ChannelManager(
web3,
channel_manager_contract,
token_contract,
receiver2_privkey,
n_confirmations=5,
state_filename=state_db_path
)
start_channel_manager(channel_manager2, use_tester, mine_sync_event)
channel_manager1.wait_sync()
channel_manager2.wait_sync()
blockchain = channel_manager1.blockchain
assert channel_manager2.blockchain.n_confirmations == blockchain.n_confirmations
assert channel_manager2.blockchain.poll_interval == blockchain.poll_interval
# unconfirmed open
channel = client.open_channel(receiver1_address, 10)
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
assert (sender_address, channel.block) in channel_manager1.unconfirmed_channels
assert (sender_address, channel.block) not in channel_manager2.unconfirmed_channels
# confirmed open
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert (sender_address, channel.block) in channel_manager1.channels
assert (sender_address, channel.block) not in channel_manager2.channels
# unconfirmed topup
channel.topup(5)
wait_for_blocks(1)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager1.channels[sender_address, channel.block]
assert len(channel_rec.unconfirmed_topups) == 1
assert channel_rec.deposit == 10
# confirmed topup
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager1.channels[sender_address, channel.block]
assert len(channel_rec.unconfirmed_topups) == 0
assert channel_rec.deposit == 15
# closing
channel.close()
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
channel_rec = channel_manager1.channels[sender_address, channel.block]
assert channel_rec.is_closed is True
# settlement
block_before = web3.eth.blockNumber
wait_for_blocks(channel_rec.settle_timeout - block_before)
channel.settle()
wait_for_blocks(blockchain.n_confirmations)
gevent.sleep(blockchain.poll_interval)
assert (sender_address, channel.block) not in channel_manager1.channels
channel_manager1.stop()
channel_manager2.stop()
def test_reorg(
web3: Web3,
channel_manager: ChannelManager,
client: Client,
receiver_address: str,
wait_for_blocks,
use_tester: bool
):
if not use_tester:
pytest.skip('Chain reorg tests only work in tester chain')
wait_for_blocks(10)
# create unconfirmed channel
channel_manager.wait_sync()
snapshot_id = web3.testing.snapshot()
channel = client.open_channel(receiver_address, 10)
wait_for_blocks(0)
assert (channel.sender, channel.block) in channel_manager.unconfirmed_channels
# remove unconfirmed channel opening with reorg
web3.testing.revert(snapshot_id)
snapshot_id = web3.testing.snapshot()
wait_for_blocks(0)
assert (channel.sender, channel.block) not in channel_manager.unconfirmed_channels
web3.testing.mine(channel_manager.n_confirmations)
assert (channel.sender, channel.block) not in channel_manager.channels
# leave confirmed channel opening
web3.testing.revert(snapshot_id)
channel = client.open_channel(receiver_address, 10)
wait_for_blocks(channel_manager.n_confirmations)
assert (channel.sender, channel.block) in channel_manager.channels
confirmed_snapshot_id = web3.testing.snapshot()
wait_for_blocks(3)
web3.testing.revert(confirmed_snapshot_id)
assert (channel.sender, channel.block) in channel_manager.channels
# remove unconfirmed topup
channel = client.open_channel(receiver_address, 10)
wait_for_blocks(channel_manager.n_confirmations)
assert (channel.sender, channel.block) in channel_manager.channels
topup_snapshot_id = web3.testing.snapshot()
channel.topup(5)
wait_for_blocks(0)
channel_rec = channel_manager.channels[channel.sender, channel.block]
assert len(channel_rec.unconfirmed_topups) == 1
web3.testing.revert(topup_snapshot_id)
wait_for_blocks(0)
assert (channel.sender, channel.block) in channel_manager.channels
channel_rec = channel_manager.channels[channel.sender, channel.block]
assert len(channel_rec.unconfirmed_topups) == 0
# --- server/src/routes/__init__.py (mounirchaabani/centrale, MIT) ---
from .user import USER_BLUEPRINT
from .movie import MOVIE_BLUEPRINT
from .notation import NOTATION_BLUEPRINT
# --- tests/test_import_entry_points.py (inmanta/inmanta-core, Apache-2.0) ---
"""
Copyright 2020 Inmanta
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Contact: code@inmanta.com
"""
"""
These tests make sure that for each module mentioned in the compiler API docs, using it as an entry point for importing
does not result in an import loop (see #2341 and #2342).
"""
import importlib
import multiprocessing
from typing import Callable, Iterator, Optional
import pytest
@pytest.fixture(scope="session")
def import_entry_point() -> Iterator[Callable[[str], Optional[int]]]:
"""
    Yields a function that imports a module in a separate Python process and returns the exit code.
"""
context = multiprocessing.get_context("spawn")
def do_import(module: str) -> Optional[int]:
process = context.Process(target=importlib.import_module, args=(module,))
process.start()
process.join()
return process.exitcode
yield do_import
def test_import_exceptions(import_entry_point) -> None:
assert import_entry_point("inmanta.ast") == 0
assert import_entry_point("inmanta.parser") == 0
def test_import_plugins(import_entry_point) -> None:
assert import_entry_point("inmanta.plugins") == 0
def test_import_resources(import_entry_point) -> None:
assert import_entry_point("inmanta.resources") == 0
assert import_entry_point("inmanta.execute.util") == 0
def test_import_handlers(import_entry_point) -> None:
assert import_entry_point("inmanta.agent.handler") == 0
assert import_entry_point("inmanta.agent.io.local") == 0
def test_import_export(import_entry_point) -> None:
assert import_entry_point("inmanta.export") == 0
def test_import_attributes(import_entry_point) -> None:
assert import_entry_point("inmanta.ast.attribute") == 0
def test_import_typing(import_entry_point) -> None:
assert import_entry_point("inmanta.ast.type") == 0
def test_import_proxy(import_entry_point) -> None:
assert import_entry_point("inmanta.execute.proxy") == 0
def test_import_data(import_entry_point) -> None:
assert import_entry_point("inmanta.data") == 0
assert import_entry_point("inmanta.data.model") == 0
def test_import_compile_data(import_entry_point) -> None:
assert import_entry_point("inmanta.ast.export") == 0
def test_import_module(import_entry_point) -> None:
assert import_entry_point("inmanta.module") == 0
def test_import_protocol(import_entry_point) -> None:
assert import_entry_point("inmanta.protocol") == 0
assert import_entry_point("inmanta.protocol.exceptions") == 0
def test_import_const(import_entry_point) -> None:
assert import_entry_point("inmanta.const") == 0
def test_import_util(import_entry_point: Callable[[str], Optional[int]]) -> None:
assert import_entry_point("inmanta.util") == 0
def test_import_ast(import_entry_point: Callable[[str], Optional[int]]) -> None:
assert import_entry_point("inmanta.ast.constraint.expression") == 0
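The fixture above captures the core technique: importing a module in a freshly spawned interpreter, so import-order bugs cannot be masked by modules already loaded in the test process. A self-contained sketch of the same idea outside pytest — `import_in_subprocess` is my name for it, and `json` stands in for an arbitrary entry point:

```python
import importlib
import multiprocessing
from typing import Optional


def import_in_subprocess(module: str) -> Optional[int]:
    """Import `module` in a spawned Python process; exit code 0 means a clean import."""
    context = multiprocessing.get_context("spawn")
    process = context.Process(target=importlib.import_module, args=(module,))
    process.start()
    process.join()
    return process.exitcode


if __name__ == "__main__":
    # "spawn" re-imports the main module in the child, so guard top-level calls.
    assert import_in_subprocess("json") == 0
```

The "spawn" start method matters: unlike "fork", the child starts with a clean interpreter, so nothing imported by the parent can hide an import loop.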
# --- cottonformation/res/networkfirewall.py (MacHu-GWU/cottonformation-project, BSD-2-Clause) ---
# -*- coding: utf-8 -*-
"""
Property and Resource declarations for the AWS::NetworkFirewall service.
"""
import attr
import typing
from ..core.model import (
Property, Resource, Tag, GetAtt, TypeHint, TypeCheck,
)
from ..core.constant import AttrMeta
#--- Property declaration ---
@attr.s
class PropRuleGroupTCPFlagField(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.TCPFlagField"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-tcpflagfield.html
Property Document:
- ``rp_Flags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-tcpflagfield.html#cfn-networkfirewall-rulegroup-tcpflagfield-flags
- ``p_Masks``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-tcpflagfield.html#cfn-networkfirewall-rulegroup-tcpflagfield-masks
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.TCPFlagField"
rp_Flags: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Flags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-tcpflagfield.html#cfn-networkfirewall-rulegroup-tcpflagfield-flags"""
p_Masks: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Masks"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-tcpflagfield.html#cfn-networkfirewall-rulegroup-tcpflagfield-masks"""
@attr.s
class PropLoggingConfigurationLogDestinationConfig(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::LoggingConfiguration.LogDestinationConfig"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-logdestinationconfig.html
Property Document:
- ``rp_LogDestination``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-logdestinationconfig.html#cfn-networkfirewall-loggingconfiguration-logdestinationconfig-logdestination
- ``rp_LogDestinationType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-logdestinationconfig.html#cfn-networkfirewall-loggingconfiguration-logdestinationconfig-logdestinationtype
- ``rp_LogType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-logdestinationconfig.html#cfn-networkfirewall-loggingconfiguration-logdestinationconfig-logtype
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::LoggingConfiguration.LogDestinationConfig"
rp_LogDestination: typing.Dict[str, TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_mapping(key_validator=attr.validators.instance_of(str), value_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "LogDestination"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-logdestinationconfig.html#cfn-networkfirewall-loggingconfiguration-logdestinationconfig-logdestination"""
rp_LogDestinationType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "LogDestinationType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-logdestinationconfig.html#cfn-networkfirewall-loggingconfiguration-logdestinationconfig-logdestinationtype"""
rp_LogType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "LogType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-logdestinationconfig.html#cfn-networkfirewall-loggingconfiguration-logdestinationconfig-logtype"""
@attr.s
class PropRuleGroupHeader(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.Header"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html
Property Document:
- ``rp_Destination``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-destination
- ``rp_DestinationPort``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-destinationport
- ``rp_Direction``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-direction
- ``rp_Protocol``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-protocol
- ``rp_Source``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-source
- ``rp_SourcePort``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-sourceport
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.Header"
rp_Destination: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Destination"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-destination"""
rp_DestinationPort: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "DestinationPort"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-destinationport"""
rp_Direction: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Direction"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-direction"""
rp_Protocol: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Protocol"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-protocol"""
rp_Source: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Source"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-source"""
rp_SourcePort: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "SourcePort"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-header.html#cfn-networkfirewall-rulegroup-header-sourceport"""
@attr.s
class PropRuleGroupDimension(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.Dimension"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-dimension.html
Property Document:
- ``rp_Value``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-dimension.html#cfn-networkfirewall-rulegroup-dimension-value
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.Dimension"
rp_Value: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Value"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-dimension.html#cfn-networkfirewall-rulegroup-dimension-value"""
@attr.s
class PropFirewallPolicyStatelessRuleGroupReference(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.StatelessRuleGroupReference"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statelessrulegroupreference.html
Property Document:
- ``rp_Priority``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statelessrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statelessrulegroupreference-priority
- ``rp_ResourceArn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statelessrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statelessrulegroupreference-resourcearn
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.StatelessRuleGroupReference"
rp_Priority: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "Priority"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statelessrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statelessrulegroupreference-priority"""
rp_ResourceArn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ResourceArn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statelessrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statelessrulegroupreference-resourcearn"""
@attr.s
class PropFirewallPolicyStatefulRuleGroupReference(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.StatefulRuleGroupReference"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulrulegroupreference.html
Property Document:
- ``rp_ResourceArn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statefulrulegroupreference-resourcearn
- ``p_Priority``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statefulrulegroupreference-priority
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.StatefulRuleGroupReference"
rp_ResourceArn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ResourceArn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statefulrulegroupreference-resourcearn"""
p_Priority: int = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(int)),
metadata={AttrMeta.PROPERTY_NAME: "Priority"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulrulegroupreference.html#cfn-networkfirewall-firewallpolicy-statefulrulegroupreference-priority"""
@attr.s
class PropRuleGroupRuleOption(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.RuleOption"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruleoption.html
Property Document:
- ``rp_Keyword``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruleoption.html#cfn-networkfirewall-rulegroup-ruleoption-keyword
- ``p_Settings``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruleoption.html#cfn-networkfirewall-rulegroup-ruleoption-settings
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.RuleOption"
rp_Keyword: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Keyword"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruleoption.html#cfn-networkfirewall-rulegroup-ruleoption-keyword"""
p_Settings: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Settings"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruleoption.html#cfn-networkfirewall-rulegroup-ruleoption-settings"""
@attr.s
class PropFirewallSubnetMapping(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::Firewall.SubnetMapping"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewall-subnetmapping.html
Property Document:
- ``rp_SubnetId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewall-subnetmapping.html#cfn-networkfirewall-firewall-subnetmapping-subnetid
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::Firewall.SubnetMapping"
rp_SubnetId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "SubnetId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewall-subnetmapping.html#cfn-networkfirewall-firewall-subnetmapping-subnetid"""
@attr.s
class PropRuleGroupRulesSourceList(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.RulesSourceList"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessourcelist.html
Property Document:
- ``rp_GeneratedRulesType``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessourcelist.html#cfn-networkfirewall-rulegroup-rulessourcelist-generatedrulestype
- ``rp_TargetTypes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessourcelist.html#cfn-networkfirewall-rulegroup-rulessourcelist-targettypes
- ``rp_Targets``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessourcelist.html#cfn-networkfirewall-rulegroup-rulessourcelist-targets
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.RulesSourceList"
rp_GeneratedRulesType: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "GeneratedRulesType"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessourcelist.html#cfn-networkfirewall-rulegroup-rulessourcelist-generatedrulestype"""
rp_TargetTypes: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "TargetTypes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessourcelist.html#cfn-networkfirewall-rulegroup-rulessourcelist-targettypes"""
rp_Targets: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Targets"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessourcelist.html#cfn-networkfirewall-rulegroup-rulessourcelist-targets"""
@attr.s
class PropFirewallPolicyDimension(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.Dimension"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-dimension.html
Property Document:
- ``rp_Value``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-dimension.html#cfn-networkfirewall-firewallpolicy-dimension-value
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.Dimension"
rp_Value: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Value"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-dimension.html#cfn-networkfirewall-firewallpolicy-dimension-value"""
@attr.s
class PropRuleGroupPortRange(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.PortRange"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portrange.html
Property Document:
- ``rp_FromPort``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portrange.html#cfn-networkfirewall-rulegroup-portrange-fromport
- ``rp_ToPort``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portrange.html#cfn-networkfirewall-rulegroup-portrange-toport
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.PortRange"
rp_FromPort: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "FromPort"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portrange.html#cfn-networkfirewall-rulegroup-portrange-fromport"""
rp_ToPort: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "ToPort"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portrange.html#cfn-networkfirewall-rulegroup-portrange-toport"""
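Every generated class in this module follows the same attrs pattern: `attr.ib` fields with an `instance_of` (or `optional`/`deep_iterable`) validator plus the CloudFormation property name in `metadata`. A stripped-down sketch of that pattern with plain attrs — the class and field names here are illustrative, not part of cottonformation:

```python
import attr


@attr.s
class PortRange:
    from_port: int = attr.ib(
        default=None,
        validator=attr.validators.instance_of(int),  # required field: rejects None too
        metadata={"PROPERTY_NAME": "FromPort"},
    )
    to_port: int = attr.ib(
        default=None,
        validator=attr.validators.instance_of(int),
        metadata={"PROPERTY_NAME": "ToPort"},
    )
```

`PortRange(from_port=80, to_port=443)` validates both fields at construction time, while `PortRange(from_port="80", to_port=443)` raises `TypeError` — which is how the required `rp_*` properties above fail fast on wrong types.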
@attr.s
class PropRuleGroupIPSet(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.IPSet"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ipset.html
Property Document:
- ``p_Definition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ipset.html#cfn-networkfirewall-rulegroup-ipset-definition
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.IPSet"
p_Definition: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Definition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ipset.html#cfn-networkfirewall-rulegroup-ipset-definition"""
@attr.s
class PropLoggingConfigurationLoggingConfiguration(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::LoggingConfiguration.LoggingConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-loggingconfiguration.html
Property Document:
- ``rp_LogDestinationConfigs``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-loggingconfiguration-logdestinationconfigs
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::LoggingConfiguration.LoggingConfiguration"
rp_LogDestinationConfigs: typing.List[typing.Union['PropLoggingConfigurationLogDestinationConfig', dict]] = attr.ib(
default=None,
converter=PropLoggingConfigurationLogDestinationConfig.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropLoggingConfigurationLogDestinationConfig), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "LogDestinationConfigs"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-loggingconfiguration-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-loggingconfiguration-logdestinationconfigs"""
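The `converter=PropLoggingConfigurationLogDestinationConfig.from_list` above lets callers pass plain dicts in place of already-constructed Property instances. The real `from_list` lives in cottonformation's core model; this is only a hedged sketch of the shape such a converter typically has, with illustrative names:

```python
import attr


@attr.s
class LogDestinationConfig:
    log_type = attr.ib(default=None)

    @classmethod
    def from_dict(cls, data):
        # Accept either an already-constructed instance or a plain dict.
        return cls(**data) if isinstance(data, dict) else data

    @classmethod
    def from_list(cls, values):
        # Used as an attrs `converter`: normalize a list of dicts/instances.
        if values is None:
            return None
        return [cls.from_dict(v) for v in values]
```

After conversion, the `deep_iterable(instance_of(...))` validator sees only proper instances, so dict input and instance input are validated identically.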
@attr.s
class PropFirewallPolicyStatefulEngineOptions(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.StatefulEngineOptions"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulengineoptions.html
Property Document:
- ``p_RuleOrder``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulengineoptions.html#cfn-networkfirewall-firewallpolicy-statefulengineoptions-ruleorder
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.StatefulEngineOptions"
p_RuleOrder: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "RuleOrder"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-statefulengineoptions.html#cfn-networkfirewall-firewallpolicy-statefulengineoptions-ruleorder"""
@attr.s
class PropRuleGroupStatefulRuleOptions(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.StatefulRuleOptions"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulruleoptions.html
Property Document:
- ``p_RuleOrder``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulruleoptions.html#cfn-networkfirewall-rulegroup-statefulruleoptions-ruleorder
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.StatefulRuleOptions"
p_RuleOrder: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "RuleOrder"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulruleoptions.html#cfn-networkfirewall-rulegroup-statefulruleoptions-ruleorder"""
@attr.s
class PropRuleGroupPortSet(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.PortSet"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portset.html
Property Document:
- ``p_Definition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portset.html#cfn-networkfirewall-rulegroup-portset-definition
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.PortSet"
p_Definition: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Definition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-portset.html#cfn-networkfirewall-rulegroup-portset-definition"""
@attr.s
class PropRuleGroupRuleVariables(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.RuleVariables"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulevariables.html
Property Document:
- ``p_IPSets``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulevariables.html#cfn-networkfirewall-rulegroup-rulevariables-ipsets
- ``p_PortSets``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulevariables.html#cfn-networkfirewall-rulegroup-rulevariables-portsets
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.RuleVariables"
p_IPSets: typing.Union['PropRuleGroupIPSet', dict] = attr.ib(
default=None,
converter=PropRuleGroupIPSet.from_list,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupIPSet)),
metadata={AttrMeta.PROPERTY_NAME: "IPSets"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulevariables.html#cfn-networkfirewall-rulegroup-rulevariables-ipsets"""
p_PortSets: typing.Union['PropRuleGroupPortSet', dict] = attr.ib(
default=None,
converter=PropRuleGroupPortSet.from_list,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupPortSet)),
metadata={AttrMeta.PROPERTY_NAME: "PortSets"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulevariables.html#cfn-networkfirewall-rulegroup-rulevariables-portsets"""
@attr.s
class PropRuleGroupPublishMetricAction(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.PublishMetricAction"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-publishmetricaction.html
Property Document:
- ``rp_Dimensions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-publishmetricaction.html#cfn-networkfirewall-rulegroup-publishmetricaction-dimensions
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.PublishMetricAction"
rp_Dimensions: typing.List[typing.Union['PropRuleGroupDimension', dict]] = attr.ib(
default=None,
converter=PropRuleGroupDimension.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupDimension), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Dimensions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-publishmetricaction.html#cfn-networkfirewall-rulegroup-publishmetricaction-dimensions"""
@attr.s
class PropRuleGroupAddress(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.Address"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-address.html
Property Document:
- ``rp_AddressDefinition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-address.html#cfn-networkfirewall-rulegroup-address-addressdefinition
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.Address"
rp_AddressDefinition: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "AddressDefinition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-address.html#cfn-networkfirewall-rulegroup-address-addressdefinition"""
@attr.s
class PropRuleGroupStatefulRule(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.StatefulRule"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulrule.html
Property Document:
- ``rp_Action``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulrule.html#cfn-networkfirewall-rulegroup-statefulrule-action
- ``rp_Header``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulrule.html#cfn-networkfirewall-rulegroup-statefulrule-header
- ``rp_RuleOptions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulrule.html#cfn-networkfirewall-rulegroup-statefulrule-ruleoptions
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.StatefulRule"
rp_Action: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Action"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulrule.html#cfn-networkfirewall-rulegroup-statefulrule-action"""
rp_Header: typing.Union['PropRuleGroupHeader', dict] = attr.ib(
default=None,
converter=PropRuleGroupHeader.from_dict,
validator=attr.validators.instance_of(PropRuleGroupHeader),
metadata={AttrMeta.PROPERTY_NAME: "Header"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulrule.html#cfn-networkfirewall-rulegroup-statefulrule-header"""
rp_RuleOptions: typing.List[typing.Union['PropRuleGroupRuleOption', dict]] = attr.ib(
default=None,
converter=PropRuleGroupRuleOption.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupRuleOption), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "RuleOptions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statefulrule.html#cfn-networkfirewall-rulegroup-statefulrule-ruleoptions"""
@attr.s
class PropRuleGroupActionDefinition(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.ActionDefinition"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-actiondefinition.html
Property Document:
- ``p_PublishMetricAction``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-actiondefinition.html#cfn-networkfirewall-rulegroup-actiondefinition-publishmetricaction
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.ActionDefinition"
p_PublishMetricAction: typing.Union['PropRuleGroupPublishMetricAction', dict] = attr.ib(
default=None,
converter=PropRuleGroupPublishMetricAction.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupPublishMetricAction)),
metadata={AttrMeta.PROPERTY_NAME: "PublishMetricAction"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-actiondefinition.html#cfn-networkfirewall-rulegroup-actiondefinition-publishmetricaction"""
@attr.s
class PropFirewallPolicyPublishMetricAction(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.PublishMetricAction"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-publishmetricaction.html
Property Document:
- ``rp_Dimensions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-publishmetricaction.html#cfn-networkfirewall-firewallpolicy-publishmetricaction-dimensions
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.PublishMetricAction"
rp_Dimensions: typing.List[typing.Union['PropFirewallPolicyDimension', dict]] = attr.ib(
default=None,
converter=PropFirewallPolicyDimension.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropFirewallPolicyDimension), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Dimensions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-publishmetricaction.html#cfn-networkfirewall-firewallpolicy-publishmetricaction-dimensions"""
@attr.s
class PropFirewallPolicyActionDefinition(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.ActionDefinition"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-actiondefinition.html
Property Document:
- ``p_PublishMetricAction``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-actiondefinition.html#cfn-networkfirewall-firewallpolicy-actiondefinition-publishmetricaction
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.ActionDefinition"
p_PublishMetricAction: typing.Union['PropFirewallPolicyPublishMetricAction', dict] = attr.ib(
default=None,
converter=PropFirewallPolicyPublishMetricAction.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropFirewallPolicyPublishMetricAction)),
metadata={AttrMeta.PROPERTY_NAME: "PublishMetricAction"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-actiondefinition.html#cfn-networkfirewall-firewallpolicy-actiondefinition-publishmetricaction"""
@attr.s
class PropRuleGroupCustomAction(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.CustomAction"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-customaction.html
Property Document:
- ``rp_ActionDefinition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-customaction.html#cfn-networkfirewall-rulegroup-customaction-actiondefinition
- ``rp_ActionName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-customaction.html#cfn-networkfirewall-rulegroup-customaction-actionname
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.CustomAction"
rp_ActionDefinition: typing.Union['PropRuleGroupActionDefinition', dict] = attr.ib(
default=None,
converter=PropRuleGroupActionDefinition.from_dict,
validator=attr.validators.instance_of(PropRuleGroupActionDefinition),
metadata={AttrMeta.PROPERTY_NAME: "ActionDefinition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-customaction.html#cfn-networkfirewall-rulegroup-customaction-actiondefinition"""
rp_ActionName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ActionName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-customaction.html#cfn-networkfirewall-rulegroup-customaction-actionname"""
@attr.s
class PropRuleGroupMatchAttributes(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.MatchAttributes"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html
Property Document:
- ``p_DestinationPorts``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-destinationports
- ``p_Destinations``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-destinations
- ``p_Protocols``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-protocols
- ``p_SourcePorts``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-sourceports
- ``p_Sources``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-sources
- ``p_TCPFlags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-tcpflags
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.MatchAttributes"
p_DestinationPorts: typing.List[typing.Union['PropRuleGroupPortRange', dict]] = attr.ib(
default=None,
converter=PropRuleGroupPortRange.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupPortRange), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "DestinationPorts"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-destinationports"""
p_Destinations: typing.List[typing.Union['PropRuleGroupAddress', dict]] = attr.ib(
default=None,
converter=PropRuleGroupAddress.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupAddress), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Destinations"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-destinations"""
p_Protocols: typing.List[int] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(int), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Protocols"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-protocols"""
p_SourcePorts: typing.List[typing.Union['PropRuleGroupPortRange', dict]] = attr.ib(
default=None,
converter=PropRuleGroupPortRange.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupPortRange), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "SourcePorts"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-sourceports"""
p_Sources: typing.List[typing.Union['PropRuleGroupAddress', dict]] = attr.ib(
default=None,
converter=PropRuleGroupAddress.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupAddress), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Sources"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-sources"""
p_TCPFlags: typing.List[typing.Union['PropRuleGroupTCPFlagField', dict]] = attr.ib(
default=None,
converter=PropRuleGroupTCPFlagField.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupTCPFlagField), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "TCPFlags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-matchattributes.html#cfn-networkfirewall-rulegroup-matchattributes-tcpflags"""
@attr.s
class PropFirewallPolicyCustomAction(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.CustomAction"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-customaction.html
Property Document:
- ``rp_ActionDefinition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-customaction.html#cfn-networkfirewall-firewallpolicy-customaction-actiondefinition
- ``rp_ActionName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-customaction.html#cfn-networkfirewall-firewallpolicy-customaction-actionname
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.CustomAction"
rp_ActionDefinition: typing.Union['PropFirewallPolicyActionDefinition', dict] = attr.ib(
default=None,
converter=PropFirewallPolicyActionDefinition.from_dict,
validator=attr.validators.instance_of(PropFirewallPolicyActionDefinition),
metadata={AttrMeta.PROPERTY_NAME: "ActionDefinition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-customaction.html#cfn-networkfirewall-firewallpolicy-customaction-actiondefinition"""
rp_ActionName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "ActionName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-customaction.html#cfn-networkfirewall-firewallpolicy-customaction-actionname"""
@attr.s
class PropRuleGroupRuleDefinition(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.RuleDefinition"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruledefinition.html
Property Document:
- ``rp_Actions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruledefinition.html#cfn-networkfirewall-rulegroup-ruledefinition-actions
- ``rp_MatchAttributes``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruledefinition.html#cfn-networkfirewall-rulegroup-ruledefinition-matchattributes
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.RuleDefinition"
rp_Actions: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "Actions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruledefinition.html#cfn-networkfirewall-rulegroup-ruledefinition-actions"""
rp_MatchAttributes: typing.Union['PropRuleGroupMatchAttributes', dict] = attr.ib(
default=None,
converter=PropRuleGroupMatchAttributes.from_dict,
validator=attr.validators.instance_of(PropRuleGroupMatchAttributes),
metadata={AttrMeta.PROPERTY_NAME: "MatchAttributes"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-ruledefinition.html#cfn-networkfirewall-rulegroup-ruledefinition-matchattributes"""
@attr.s
class PropRuleGroupStatelessRule(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.StatelessRule"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrule.html
Property Document:
- ``rp_Priority``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrule.html#cfn-networkfirewall-rulegroup-statelessrule-priority
- ``rp_RuleDefinition``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrule.html#cfn-networkfirewall-rulegroup-statelessrule-ruledefinition
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.StatelessRule"
rp_Priority: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "Priority"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrule.html#cfn-networkfirewall-rulegroup-statelessrule-priority"""
rp_RuleDefinition: typing.Union['PropRuleGroupRuleDefinition', dict] = attr.ib(
default=None,
converter=PropRuleGroupRuleDefinition.from_dict,
validator=attr.validators.instance_of(PropRuleGroupRuleDefinition),
metadata={AttrMeta.PROPERTY_NAME: "RuleDefinition"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrule.html#cfn-networkfirewall-rulegroup-statelessrule-ruledefinition"""
@attr.s
class PropFirewallPolicyFirewallPolicy(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy.FirewallPolicy"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html
Property Document:
- ``rp_StatelessDefaultActions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelessdefaultactions
- ``rp_StatelessFragmentDefaultActions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelessfragmentdefaultactions
- ``p_StatefulDefaultActions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statefuldefaultactions
- ``p_StatefulEngineOptions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statefulengineoptions
- ``p_StatefulRuleGroupReferences``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statefulrulegroupreferences
- ``p_StatelessCustomActions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelesscustomactions
- ``p_StatelessRuleGroupReferences``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelessrulegroupreferences
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy.FirewallPolicy"
rp_StatelessDefaultActions: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "StatelessDefaultActions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelessdefaultactions"""
rp_StatelessFragmentDefaultActions: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "StatelessFragmentDefaultActions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelessfragmentdefaultactions"""
p_StatefulDefaultActions: typing.List[TypeHint.intrinsic_str] = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "StatefulDefaultActions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statefuldefaultactions"""
p_StatefulEngineOptions: typing.Union['PropFirewallPolicyStatefulEngineOptions', dict] = attr.ib(
default=None,
converter=PropFirewallPolicyStatefulEngineOptions.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropFirewallPolicyStatefulEngineOptions)),
metadata={AttrMeta.PROPERTY_NAME: "StatefulEngineOptions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statefulengineoptions"""
p_StatefulRuleGroupReferences: typing.List[typing.Union['PropFirewallPolicyStatefulRuleGroupReference', dict]] = attr.ib(
default=None,
converter=PropFirewallPolicyStatefulRuleGroupReference.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropFirewallPolicyStatefulRuleGroupReference), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "StatefulRuleGroupReferences"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statefulrulegroupreferences"""
p_StatelessCustomActions: typing.List[typing.Union['PropFirewallPolicyCustomAction', dict]] = attr.ib(
default=None,
converter=PropFirewallPolicyCustomAction.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropFirewallPolicyCustomAction), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "StatelessCustomActions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelesscustomactions"""
p_StatelessRuleGroupReferences: typing.List[typing.Union['PropFirewallPolicyStatelessRuleGroupReference', dict]] = attr.ib(
default=None,
converter=PropFirewallPolicyStatelessRuleGroupReference.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropFirewallPolicyStatelessRuleGroupReference), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "StatelessRuleGroupReferences"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-firewallpolicy-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy-statelessrulegroupreferences"""
@attr.s
class PropRuleGroupStatelessRulesAndCustomActions(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.StatelessRulesAndCustomActions"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrulesandcustomactions.html
Property Document:
- ``rp_StatelessRules``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrulesandcustomactions.html#cfn-networkfirewall-rulegroup-statelessrulesandcustomactions-statelessrules
- ``p_CustomActions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrulesandcustomactions.html#cfn-networkfirewall-rulegroup-statelessrulesandcustomactions-customactions
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.StatelessRulesAndCustomActions"
rp_StatelessRules: typing.List[typing.Union['PropRuleGroupStatelessRule', dict]] = attr.ib(
default=None,
converter=PropRuleGroupStatelessRule.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupStatelessRule), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "StatelessRules"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrulesandcustomactions.html#cfn-networkfirewall-rulegroup-statelessrulesandcustomactions-statelessrules"""
p_CustomActions: typing.List[typing.Union['PropRuleGroupCustomAction', dict]] = attr.ib(
default=None,
converter=PropRuleGroupCustomAction.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupCustomAction), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "CustomActions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-statelessrulesandcustomactions.html#cfn-networkfirewall-rulegroup-statelessrulesandcustomactions-customactions"""
@attr.s
class PropRuleGroupRulesSource(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.RulesSource"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html
Property Document:
- ``p_RulesSourceList``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-rulessourcelist
- ``p_RulesString``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-rulesstring
- ``p_StatefulRules``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-statefulrules
- ``p_StatelessRulesAndCustomActions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-statelessrulesandcustomactions
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.RulesSource"
p_RulesSourceList: typing.Union['PropRuleGroupRulesSourceList', dict] = attr.ib(
default=None,
converter=PropRuleGroupRulesSourceList.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupRulesSourceList)),
metadata={AttrMeta.PROPERTY_NAME: "RulesSourceList"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-rulessourcelist"""
p_RulesString: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "RulesString"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-rulesstring"""
p_StatefulRules: typing.List[typing.Union['PropRuleGroupStatefulRule', dict]] = attr.ib(
default=None,
converter=PropRuleGroupStatefulRule.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropRuleGroupStatefulRule), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "StatefulRules"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-statefulrules"""
p_StatelessRulesAndCustomActions: typing.Union['PropRuleGroupStatelessRulesAndCustomActions', dict] = attr.ib(
default=None,
converter=PropRuleGroupStatelessRulesAndCustomActions.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupStatelessRulesAndCustomActions)),
metadata={AttrMeta.PROPERTY_NAME: "StatelessRulesAndCustomActions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulessource.html#cfn-networkfirewall-rulegroup-rulessource-statelessrulesandcustomactions"""
@attr.s
class PropRuleGroupRuleGroup(Property):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup.RuleGroup"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulegroup.html
Property Document:
- ``rp_RulesSource``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup-rulessource
- ``p_RuleVariables``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup-rulevariables
- ``p_StatefulRuleOptions``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup-statefulruleoptions
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup.RuleGroup"
rp_RulesSource: typing.Union['PropRuleGroupRulesSource', dict] = attr.ib(
default=None,
converter=PropRuleGroupRulesSource.from_dict,
validator=attr.validators.instance_of(PropRuleGroupRulesSource),
metadata={AttrMeta.PROPERTY_NAME: "RulesSource"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup-rulessource"""
p_RuleVariables: typing.Union['PropRuleGroupRuleVariables', dict] = attr.ib(
default=None,
converter=PropRuleGroupRuleVariables.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupRuleVariables)),
metadata={AttrMeta.PROPERTY_NAME: "RuleVariables"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup-rulevariables"""
p_StatefulRuleOptions: typing.Union['PropRuleGroupStatefulRuleOptions', dict] = attr.ib(
default=None,
converter=PropRuleGroupStatefulRuleOptions.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupStatefulRuleOptions)),
metadata={AttrMeta.PROPERTY_NAME: "StatefulRuleOptions"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-properties-networkfirewall-rulegroup-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup-statefulruleoptions"""
# --- Resource declaration ---
@attr.s
class FirewallPolicy(Resource):
"""
AWS Object Type = "AWS::NetworkFirewall::FirewallPolicy"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html
Property Document:
- ``rp_FirewallPolicy``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy
- ``rp_FirewallPolicyName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicyname
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-description
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-tags
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::FirewallPolicy"
rp_FirewallPolicy: typing.Union['PropFirewallPolicyFirewallPolicy', dict] = attr.ib(
default=None,
converter=PropFirewallPolicyFirewallPolicy.from_dict,
validator=attr.validators.instance_of(PropFirewallPolicyFirewallPolicy),
metadata={AttrMeta.PROPERTY_NAME: "FirewallPolicy"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicy"""
rp_FirewallPolicyName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "FirewallPolicyName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-firewallpolicyname"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-description"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#cfn-networkfirewall-firewallpolicy-tags"""
@property
def rv_FirewallPolicyArn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#aws-resource-networkfirewall-firewallpolicy-return-values"""
return GetAtt(resource=self, attr_name="FirewallPolicyArn")
@property
def rv_FirewallPolicyId(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewallpolicy.html#aws-resource-networkfirewall-firewallpolicy-return-values"""
return GetAtt(resource=self, attr_name="FirewallPolicyId")
@attr.s
class Firewall(Resource):
"""
AWS Object Type = "AWS::NetworkFirewall::Firewall"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html
Property Document:
- ``rp_FirewallName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-firewallname
- ``rp_FirewallPolicyArn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-firewallpolicyarn
- ``rp_SubnetMappings``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-subnetmappings
- ``rp_VpcId``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-vpcid
- ``p_DeleteProtection``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-deleteprotection
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-description
- ``p_FirewallPolicyChangeProtection``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-firewallpolicychangeprotection
- ``p_SubnetChangeProtection``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-subnetchangeprotection
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-tags
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::Firewall"
rp_FirewallName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "FirewallName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-firewallname"""
rp_FirewallPolicyArn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "FirewallPolicyArn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-firewallpolicyarn"""
rp_SubnetMappings: typing.List[typing.Union['PropFirewallSubnetMapping', dict]] = attr.ib(
default=None,
converter=PropFirewallSubnetMapping.from_list,
validator=attr.validators.deep_iterable(member_validator=attr.validators.instance_of(PropFirewallSubnetMapping), iterable_validator=attr.validators.instance_of(list)),
metadata={AttrMeta.PROPERTY_NAME: "SubnetMappings"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-subnetmappings"""
rp_VpcId: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "VpcId"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-vpcid"""
p_DeleteProtection: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "DeleteProtection"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-deleteprotection"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-description"""
p_FirewallPolicyChangeProtection: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "FirewallPolicyChangeProtection"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-firewallpolicychangeprotection"""
p_SubnetChangeProtection: bool = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(bool)),
metadata={AttrMeta.PROPERTY_NAME: "SubnetChangeProtection"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-subnetchangeprotection"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#cfn-networkfirewall-firewall-tags"""
@property
def rv_FirewallArn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#aws-resource-networkfirewall-firewall-return-values"""
return GetAtt(resource=self, attr_name="FirewallArn")
@property
def rv_FirewallId(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#aws-resource-networkfirewall-firewall-return-values"""
return GetAtt(resource=self, attr_name="FirewallId")
@property
def rv_EndpointIds(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-firewall.html#aws-resource-networkfirewall-firewall-return-values"""
return GetAtt(resource=self, attr_name="EndpointIds")
@attr.s
class LoggingConfiguration(Resource):
"""
AWS Object Type = "AWS::NetworkFirewall::LoggingConfiguration"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-loggingconfiguration.html
Property Document:
- ``rp_FirewallArn``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-firewallarn
- ``rp_LoggingConfiguration``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-loggingconfiguration
- ``p_FirewallName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-firewallname
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::LoggingConfiguration"
rp_FirewallArn: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "FirewallArn"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-firewallarn"""
rp_LoggingConfiguration: typing.Union['PropLoggingConfigurationLoggingConfiguration', dict] = attr.ib(
default=None,
converter=PropLoggingConfigurationLoggingConfiguration.from_dict,
validator=attr.validators.instance_of(PropLoggingConfigurationLoggingConfiguration),
metadata={AttrMeta.PROPERTY_NAME: "LoggingConfiguration"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-loggingconfiguration"""
p_FirewallName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "FirewallName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-loggingconfiguration.html#cfn-networkfirewall-loggingconfiguration-firewallname"""
@attr.s
class RuleGroup(Resource):
"""
AWS Object Type = "AWS::NetworkFirewall::RuleGroup"
Resource Document: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html
Property Document:
- ``rp_Capacity``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-capacity
- ``rp_RuleGroupName``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroupname
- ``rp_Type``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-type
- ``p_Description``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-description
- ``p_RuleGroup``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup
- ``p_Tags``: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-tags
"""
AWS_OBJECT_TYPE = "AWS::NetworkFirewall::RuleGroup"
rp_Capacity: int = attr.ib(
default=None,
validator=attr.validators.instance_of(int),
metadata={AttrMeta.PROPERTY_NAME: "Capacity"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-capacity"""
rp_RuleGroupName: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "RuleGroupName"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroupname"""
rp_Type: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.instance_of(TypeCheck.intrinsic_str_type),
metadata={AttrMeta.PROPERTY_NAME: "Type"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-type"""
p_Description: TypeHint.intrinsic_str = attr.ib(
default=None,
validator=attr.validators.optional(attr.validators.instance_of(TypeCheck.intrinsic_str_type)),
metadata={AttrMeta.PROPERTY_NAME: "Description"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-description"""
p_RuleGroup: typing.Union['PropRuleGroupRuleGroup', dict] = attr.ib(
default=None,
converter=PropRuleGroupRuleGroup.from_dict,
validator=attr.validators.optional(attr.validators.instance_of(PropRuleGroupRuleGroup)),
metadata={AttrMeta.PROPERTY_NAME: "RuleGroup"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-rulegroup"""
p_Tags: typing.List[typing.Union[Tag, dict]] = attr.ib(
default=None,
converter=Tag.from_list,
validator=attr.validators.optional(attr.validators.deep_iterable(member_validator=attr.validators.instance_of(Tag), iterable_validator=attr.validators.instance_of(list))),
metadata={AttrMeta.PROPERTY_NAME: "Tags"},
)
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#cfn-networkfirewall-rulegroup-tags"""
@property
def rv_RuleGroupArn(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#aws-resource-networkfirewall-rulegroup-return-values"""
return GetAtt(resource=self, attr_name="RuleGroupArn")
@property
def rv_RuleGroupId(self) -> GetAtt:
"""Doc: http://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-networkfirewall-rulegroup.html#aws-resource-networkfirewall-rulegroup-return-values"""
return GetAtt(resource=self, attr_name="RuleGroupId")
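The `rv_*` properties above return `GetAtt` objects, deferred references that render to the CloudFormation `Fn::GetAtt` intrinsic when the template is serialized. A minimal stdlib sketch of that rendering pattern (the class name and `serialize` method here are hypothetical stand-ins, not the library's actual implementation):

```python
from dataclasses import dataclass


@dataclass
class GetAttSketch:
    """Stand-in for the library's GetAtt: a deferred attribute reference."""
    logical_id: str
    attr_name: str

    def serialize(self) -> dict:
        # Render to the CloudFormation "Fn::GetAtt" intrinsic form.
        return {"Fn::GetAtt": [self.logical_id, self.attr_name]}


ref = GetAttSketch("MyRuleGroup", "RuleGroupArn")
print(ref.serialize())  # {'Fn::GetAtt': ['MyRuleGroup', 'RuleGroupArn']}
```

In the generated module the real `GetAtt` additionally holds a reference to the resource object itself, so the logical ID is resolved at template-build time rather than passed in by hand.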
# coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['ServicePerimeterResourceArgs', 'ServicePerimeterResource']
@pulumi.input_type
class ServicePerimeterResourceArgs:
def __init__(__self__, *,
perimeter_name: pulumi.Input[str],
resource: pulumi.Input[str]):
"""
The set of arguments for constructing a ServicePerimeterResource resource.
:param pulumi.Input[str] perimeter_name: The name of the Service Perimeter to add this resource to.
:param pulumi.Input[str] resource: A GCP resource that is inside of the service perimeter.
Currently only projects are allowed.
Format: projects/{project_number}
"""
pulumi.set(__self__, "perimeter_name", perimeter_name)
pulumi.set(__self__, "resource", resource)
@property
@pulumi.getter(name="perimeterName")
def perimeter_name(self) -> pulumi.Input[str]:
"""
The name of the Service Perimeter to add this resource to.
"""
return pulumi.get(self, "perimeter_name")
@perimeter_name.setter
def perimeter_name(self, value: pulumi.Input[str]):
pulumi.set(self, "perimeter_name", value)
@property
@pulumi.getter
def resource(self) -> pulumi.Input[str]:
"""
A GCP resource that is inside of the service perimeter.
Currently only projects are allowed.
Format: projects/{project_number}
"""
return pulumi.get(self, "resource")
@resource.setter
def resource(self, value: pulumi.Input[str]):
pulumi.set(self, "resource", value)
@pulumi.input_type
class _ServicePerimeterResourceState:
def __init__(__self__, *,
perimeter_name: Optional[pulumi.Input[str]] = None,
resource: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering ServicePerimeterResource resources.
:param pulumi.Input[str] perimeter_name: The name of the Service Perimeter to add this resource to.
:param pulumi.Input[str] resource: A GCP resource that is inside of the service perimeter.
Currently only projects are allowed.
Format: projects/{project_number}
"""
if perimeter_name is not None:
pulumi.set(__self__, "perimeter_name", perimeter_name)
if resource is not None:
pulumi.set(__self__, "resource", resource)
@property
@pulumi.getter(name="perimeterName")
def perimeter_name(self) -> Optional[pulumi.Input[str]]:
"""
The name of the Service Perimeter to add this resource to.
"""
return pulumi.get(self, "perimeter_name")
@perimeter_name.setter
def perimeter_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "perimeter_name", value)
@property
@pulumi.getter
def resource(self) -> Optional[pulumi.Input[str]]:
"""
A GCP resource that is inside of the service perimeter.
Currently only projects are allowed.
Format: projects/{project_number}
"""
return pulumi.get(self, "resource")
@resource.setter
def resource(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "resource", value)
class ServicePerimeterResource(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
perimeter_name: Optional[pulumi.Input[str]] = None,
resource: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Allows configuring a single GCP resource that should be inside of a service perimeter.
This resource is intended to be used in cases where it is not possible to compile a full list
of projects to include in a `accesscontextmanager.ServicePerimeter` resource,
to enable them to be added separately.
> **Note:** If this resource is used alongside a `accesscontextmanager.ServicePerimeter` resource,
the service perimeter resource must have a `lifecycle` block with `ignore_changes = [status[0].resources]` so
they don't fight over which resources should be in the policy.
To get more information about ServicePerimeterResource, see:
* [API documentation](https://cloud.google.com/access-context-manager/docs/reference/rest/v1/accessPolicies.servicePerimeters)
* How-to Guides
* [Service Perimeter Quickstart](https://cloud.google.com/vpc-service-controls/docs/quickstart)
> **Warning:** If you are using User ADCs (Application Default Credentials) with this resource,
you must specify a `billing_project` and set `user_project_override` to true
in the provider configuration. Otherwise the ACM API will return a 403 error.
Your account must have the `serviceusage.services.use` permission on the
`billing_project` you defined.
## Example Usage
### Access Context Manager Service Perimeter Resource Basic
```python
import pulumi
import pulumi_gcp as gcp
access_policy = gcp.accesscontextmanager.AccessPolicy("access-policy",
parent="organizations/123456789",
title="my policy")
service_perimeter_resource_service_perimeter = gcp.accesscontextmanager.ServicePerimeter("service-perimeter-resourceServicePerimeter",
parent=access_policy.name.apply(lambda name: f"accessPolicies/{name}"),
title="restrict_all",
status=gcp.accesscontextmanager.ServicePerimeterStatusArgs(
restricted_services=["storage.googleapis.com"],
))
service_perimeter_resource_service_perimeter_resource = gcp.accesscontextmanager.ServicePerimeterResource("service-perimeter-resourceServicePerimeterResource",
perimeter_name=service_perimeter_resource_service_perimeter.name,
resource="projects/987654321")
```
## Import
ServicePerimeterResource can be imported using any of these accepted formats
```sh
$ pulumi import gcp:accesscontextmanager/servicePerimeterResource:ServicePerimeterResource default {{perimeter_name}}/{{resource}}
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] perimeter_name: The name of the Service Perimeter to add this resource to.
:param pulumi.Input[str] resource: A GCP resource that is inside of the service perimeter.
Currently only projects are allowed.
Format: projects/{project_number}
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: ServicePerimeterResourceArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Allows configuring a single GCP resource that should be inside of a service perimeter.
This resource is intended to be used in cases where it is not possible to compile a full list
of projects to include in a `accesscontextmanager.ServicePerimeter` resource,
to enable them to be added separately.
> **Note:** If this resource is used alongside a `accesscontextmanager.ServicePerimeter` resource,
the service perimeter resource must have a `lifecycle` block with `ignore_changes = [status[0].resources]` so
they don't fight over which resources should be in the policy.
To get more information about ServicePerimeterResource, see:
* [API documentation](https://cloud.google.com/access-context-manager/docs/reference/rest/v1/accessPolicies.servicePerimeters)
* How-to Guides
* [Service Perimeter Quickstart](https://cloud.google.com/vpc-service-controls/docs/quickstart)
> **Warning:** If you are using User ADCs (Application Default Credentials) with this resource,
you must specify a `billing_project` and set `user_project_override` to true
in the provider configuration. Otherwise the ACM API will return a 403 error.
Your account must have the `serviceusage.services.use` permission on the
`billing_project` you defined.
## Example Usage
### Access Context Manager Service Perimeter Resource Basic
```python
import pulumi
import pulumi_gcp as gcp
access_policy = gcp.accesscontextmanager.AccessPolicy("access-policy",
parent="organizations/123456789",
title="my policy")
service_perimeter_resource_service_perimeter = gcp.accesscontextmanager.ServicePerimeter("service-perimeter-resourceServicePerimeter",
parent=access_policy.name.apply(lambda name: f"accessPolicies/{name}"),
title="restrict_all",
status=gcp.accesscontextmanager.ServicePerimeterStatusArgs(
restricted_services=["storage.googleapis.com"],
))
service_perimeter_resource_service_perimeter_resource = gcp.accesscontextmanager.ServicePerimeterResource("service-perimeter-resourceServicePerimeterResource",
perimeter_name=service_perimeter_resource_service_perimeter.name,
resource="projects/987654321")
```
## Import
ServicePerimeterResource can be imported using any of these accepted formats
```sh
$ pulumi import gcp:accesscontextmanager/servicePerimeterResource:ServicePerimeterResource default {{perimeter_name}}/{{resource}}
```
:param str resource_name: The name of the resource.
:param ServicePerimeterResourceArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(ServicePerimeterResourceArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
perimeter_name: Optional[pulumi.Input[str]] = None,
resource: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = ServicePerimeterResourceArgs.__new__(ServicePerimeterResourceArgs)
if perimeter_name is None and not opts.urn:
raise TypeError("Missing required property 'perimeter_name'")
__props__.__dict__["perimeter_name"] = perimeter_name
if resource is None and not opts.urn:
raise TypeError("Missing required property 'resource'")
__props__.__dict__["resource"] = resource
super(ServicePerimeterResource, __self__).__init__(
'gcp:accesscontextmanager/servicePerimeterResource:ServicePerimeterResource',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
perimeter_name: Optional[pulumi.Input[str]] = None,
resource: Optional[pulumi.Input[str]] = None) -> 'ServicePerimeterResource':
"""
Get an existing ServicePerimeterResource resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] perimeter_name: The name of the Service Perimeter to add this resource to.
:param pulumi.Input[str] resource: A GCP resource that is inside of the service perimeter.
Currently only projects are allowed.
Format: projects/{project_number}
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _ServicePerimeterResourceState.__new__(_ServicePerimeterResourceState)
__props__.__dict__["perimeter_name"] = perimeter_name
__props__.__dict__["resource"] = resource
return ServicePerimeterResource(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="perimeterName")
def perimeter_name(self) -> pulumi.Output[str]:
"""
The name of the Service Perimeter to add this resource to.
"""
return pulumi.get(self, "perimeter_name")
@property
@pulumi.getter
def resource(self) -> pulumi.Output[str]:
"""
A GCP resource that is inside of the service perimeter.
Currently only projects are allowed.
Format: projects/{project_number}
"""
return pulumi.get(self, "resource")
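The overloaded `__init__` above accepts either a single typed `ServicePerimeterResourceArgs` object or bare keyword arguments, and dispatches to `_internal_init` accordingly. A stdlib sketch of that args-or-kwargs dispatch pattern (the names `ArgsSketch` and `make_resource` are hypothetical, not Pulumi's internal implementation):

```python
from dataclasses import dataclass, asdict


@dataclass
class ArgsSketch:
    """Stand-in for a typed resource-args object."""
    perimeter_name: str
    resource: str


def make_resource(name: str, *args, **kwargs) -> dict:
    # Accept either one typed args object or bare keyword arguments,
    # mirroring the dispatch performed before _internal_init above.
    if args and isinstance(args[0], ArgsSketch):
        kwargs = asdict(args[0])
    if "perimeter_name" not in kwargs or "resource" not in kwargs:
        raise TypeError("perimeter_name and resource are required")
    return {"name": name, **kwargs}
```

Both call styles produce the same result, which is why the generated class can expose two `@overload` signatures backed by one runtime constructor:

```python
make_resource("r", ArgsSketch("p", "projects/123"))
make_resource("r", perimeter_name="p", resource="projects/123")
```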
from . import settings
import os
try:  # sklearn.externals.joblib was removed in scikit-learn 0.23+
    import joblib
except ImportError:
    from sklearn.externals import joblib  # legacy scikit-learn (< 0.23)
__author__ = 'Andrej Palicka <andrej.palicka@merck.com>'
def load_classifier(path):
return joblib.load(path)
# File: gerritssh/internal/__init__.py (kdopen/gerritssh, Apache-2.0)
from .cmdoptions import * # noqa
# File: pygame/symbian/lib/event.py (CiubucAlexandra/Theremine-Projet-Micriprocesseurs, BSD-3-Clause)
from pygame_event import *
# File: tosh/commands/__init__.py (javitonino/tosh, BSD-3-Clause)
from .exit import ExitCommand
# File: app/gws/qgis/__init__.py (ewie/gbd-websuite, Apache-2.0)
from .types import PrintTemplate
# File: eval/__init__.py (y-vas/jict, MIT)
#!/usr/bin/env python
from .base import *
# File: modules/functional/__init__.py (Masterchef365/pvcnn, MIT)
from modules.functional.ball_query import ball_query
from modules.functional.devoxelization import trilinear_devoxelize
from modules.functional.grouping import grouping
from modules.functional.interpolatation import nearest_neighbor_interpolate
from modules.functional.loss import kl_loss, huber_loss
from modules.functional.sampling import gather, furthest_point_sample, logits_mask
from modules.functional.voxelization import avg_voxelize
# File: ISP_parallel/util/simulation.py (NREL/GANISP, BSD-3-Clause)
import numpy as np
import time
import sys
sys.path.append('util')
import importanceSplitting as isplt
from myProgressBar import printProgressBar
def run(Sim):
# Timing
t_start = time.time()
t_init_start = time.time()
Result = {}
# Sim details
Ndof = Sim['Ndof']
NSim = Sim['NSim']
h = Sim['Timestep']
tmax = Sim['Tf']
nmax = Sim['nmax']
nplt = Sim['nplt']
stepFunc = Sim['stepFunc']
qoiFunc = Sim['qoiFunc']
epsilon_init = Sim['epsilon_init']
recordSolution = Sim['Record solution']
nMonitor = int(round(nmax/100))
uu = Sim['uu']
tt = Sim['tt']
# Initial condition
u = np.transpose(Sim['u0']*np.ones((NSim,Ndof))) + epsilon_init*np.random.normal(loc=0.0, scale=1.0, size=(Ndof,NSim))
t_init_end = time.time()
# main loop
t_main_start = time.time()
t_step = 0
if recordSolution:
uu[0,:,:]=u
else:
uu[0,:,0]=u[:,0]
qoi = qoiFunc(u)
Sim['qoiTot'][0,:,:]=qoi
advancementCounter = 0
printProgressBar(0, nmax, prefix = 'Iter ' + str(0) + ' / ' +str(nmax),suffix = 'Complete', length = 50)
for n in range(1, nmax+1):
t = n*h
t_start_step = time.time()
u = stepFunc(u,Sim)
t_end_step = time.time()
t_step += t_end_step-t_start_step
qoi = qoiFunc(u)
if n%nplt == 0:
if recordSolution:
uu[n,:,:] = u
else:
uu[n,:,0] = u[:,0]
Sim['qoiTot'][n,:,:]=qoi
if n%nMonitor == 0:
advancementCounter +=1
printProgressBar(n, nmax, prefix = 'Iter ' + str(n) + ' / ' +str(nmax),suffix = 'Complete', length = 50)
t_main_end = time.time()
# Timing
t_end = time.time()
Result['tt'] = tt
Result['uu'] = uu
Result['qoiTot'] = Sim['qoiTot']
Result['timeExec'] = t_end-t_start
Result['timeExecInit'] = t_init_end-t_init_start
Result['timeExecMain'] = t_main_end-t_main_start
Result['timeExecStep'] = t_step
return Result
def runEpsClone(Sim):
# Timing
t_start = time.time()
t_init_start = time.time()
Result = {}
# Sim details
Ndof = Sim['Ndof']
NSim = Sim['NSim']
h = Sim['Timestep']
tmax = Sim['Tf']
nmax = Sim['nmax']
nplt = Sim['nplt']
stepFunc = Sim['stepFunc']
qoiFunc = Sim['qoiFunc']
epsilon_init = Sim['epsilon_init']
epsilon_clone = Sim['Epsilon clone']
recordSolution = Sim['Record solution']
nMonitor = int(round(nmax/100))
uu = Sim['uu']
tt = Sim['tt']
# Initial condition
u = np.transpose(Sim['u0']*np.ones((NSim,Ndof))) + epsilon_init*np.random.normal(loc=0.0, scale=1.0, size=(Ndof,NSim))
t_init_end = time.time()
# main loop
t_main_start = time.time()
t_step = 0
if recordSolution:
uu[0,:,:]=u
else:
uu[0,:,0]=u[:,0]
qoi = qoiFunc(u)
Sim['qoiTot'][0,:,:]=qoi
advancementCounter = 0
printProgressBar(0, nmax, prefix = 'Iter ' + str(0) + ' / ' +str(nmax),suffix = 'Complete', length = 50)
for n in range(1, nmax+1):
Sim['Ni'] += 1
t = n*h
t_start_step = time.time()
u = stepFunc(u,Sim)
t_end_step = time.time()
t_step += t_end_step-t_start_step
qoi = qoiFunc(u)
if n%nplt == 0:
if recordSolution:
uu[n,:,:] = u
else:
uu[n,:,0] = u[:,0]
Sim['qoiTot'][n,:,:]=qoi
if n%nMonitor == 0:
advancementCounter +=1
if Sim['Ni'] == Sim['NselectionThreshold']:
Sim['Ni'] = 0
u += epsilon_clone*np.random.normal(loc=0.0,scale=1.0,size=u.shape)
printProgressBar(n, nmax, prefix = 'Iter ' + str(n) + ' / ' +str(nmax),suffix = 'Complete', length = 50)
t_main_end = time.time()
# Timing
t_end = time.time()
Result['tt'] = tt
Result['uu'] = uu
Result['qoiTot'] = Sim['qoiTot']
Result['timeExec'] = t_end-t_start
Result['timeExecInit'] = t_init_end-t_init_start
Result['timeExecMain'] = t_main_end-t_main_start
Result['timeExecStep'] = t_step
return Result
def runLE(Sim):
# Timing
t_start = time.time()
t_init_start = time.time()
Result = {}
# Sim details
Ndof = Sim['Ndof']
NSim = 2
NLE = NSim - 1
h = Sim['Timestep']
tmax = Sim['Tf']
nmax = Sim['nmax']
nplt = Sim['nplt']
stepFunc = Sim['stepFunc']
qoiFunc = Sim['qoiFunc']
epsilon_init = Sim['epsilon_init']
recordSolution = Sim['Record solution']
nMonitor = int(round(nmax/100))
normInit = Sim['normPerturb']
nTimestepLE = 5
uu = Sim['uu']
tt = Sim['tt']
# Initial condition
u = np.transpose(Sim['u0']*np.ones((NSim,Ndof))) + epsilon_init*np.random.normal(loc=0.0, scale=1.0, size=(Ndof,NSim))
# Initialize perturbation
pert = np.random.uniform(-0.5,0.5)*normInit
# Normalize perturbation
norm = np.linalg.norm(pert)
# Apply perturbation
u[:,1] = u[:,0] + pert*normInit/norm
t_init_end = time.time()
# main loop
t_main_start = time.time()
t_step = 0
if recordSolution:
uu[0,:,:]=u
else:
uu[0,:,0]=u[:,0]
qoi = qoiFunc(u)
Sim['qoiTot'][0,:,:]=qoi
LEval = []
LEQval = []
LETime = []
advancementLE = 0
advancementCounter = 0
printProgressBar(0, nmax, prefix = 'Iter ' + str(0) + ' / ' +str(nmax),suffix = 'Complete', length = 50)
for n in range(1, nmax+1):
if advancementLE==0:
normQinit = abs(qoiFunc(u[:,0])-qoiFunc(u[:,1]))
advancementLE += 1
t = n*h
t_start_step = time.time()
u = stepFunc(u,Sim)
t_end_step = time.time()
t_step += t_end_step-t_start_step
qoi = qoiFunc(u)
if n%nplt == 0:
if recordSolution:
uu[n,:,:] = u
else:
uu[n,:,0] = u[:,0]
Sim['qoiTot'][n,:,:]=qoi
if n%nMonitor == 0:
advancementCounter +=1
printProgressBar(n, nmax, prefix = 'Iter ' + str(n) + ' / ' +str(nmax),suffix = 'Complete', length = 50)
if advancementLE>=nTimestepLE:
normQDiff = abs(qoiFunc(u[:,0])-qoiFunc(u[:,1]))
LEQval.append( (np.log(normQDiff)-np.log(normQinit))/(nTimestepLE*h) )
diff = u[:,1]-u[:,0]
normDiff = np.linalg.norm(diff)
LEval.append( (np.log(normDiff)-np.log(normInit))/(nTimestepLE*h) )
u[:,1] = u[:,0] + diff*normInit/normDiff
advancementLE=0
LETime.append((n-advancementLE)*h)
t_main_end = time.time()
# Timing
t_end = time.time()
Result['tt'] = tt
Result['uu'] = uu
Result['qoiTot'] = Sim['qoiTot']
Result['timeExec'] = t_end-t_start
Result['timeExecInit'] = t_init_end-t_init_start
Result['timeExecMain'] = t_main_end-t_main_start
Result['timeExecStep'] = t_step
Result['LE'] = np.array(LEval)
cumulativeLE = np.cumsum(Result['LE'])
Result['LERunAve'] = cumulativeLE/np.array(list(range(1,len(cumulativeLE)+1)))
Result['LEQ'] = np.array(LEQval)
cumulativeLEQ = np.cumsum(Result['LEQ'])
Result['LEQRunAve'] = cumulativeLEQ/np.array(list(range(1,len(cumulativeLEQ)+1)))
Result['LETime'] = np.array(LETime)
return Result
def runFrontBack(Sim):
# Timing
t_start = time.time()
t_init_start = time.time()
Result = {}
Ndof = Sim['Ndof']
NSim = Sim['NSim']
h = Sim['Timestep']
tmax = Sim['Tf']
nmax = Sim['nmax']
nplt = Sim['nplt']
stepFuncForward = Sim['forwardStepFunc']
stepFuncBackward = Sim['backwardStepFunc']
qoiFunc = Sim['qoiFunc']
epsilon_init = Sim['epsilon_init']
recordSolution = Sim['Record solution']
uu = Sim['uu']
tt = Sim['tt']
# Initial conditions
u = np.transpose(Sim['u0']*np.ones((NSim,Ndof))) + epsilon_init*np.random.normal(loc=0.0, scale=1.0, size=(Ndof,NSim))
t_init_end = time.time()
# main loop
t_main_start = time.time()
if recordSolution:
uu[0,:,:]=u
else:
uu[0,:,0]=u[:,0]
qoi = qoiFunc(u)
Sim['qoiTot'][0,:,:] = qoi
t_step = 0
for n in range(1, nmax+1):
t = n*h
t_start_step = time.time()
if n<nmax/2:
u = stepFuncForward(u,Sim)
else:
u = stepFuncBackward(u,Sim)
t_end_step = time.time()
t_step += t_end_step-t_start_step
qoi = qoiFunc(u)
if n%nplt == 0:
if recordSolution:
uu[n,:,:] = u
else:
uu[n,:,0] = u[:,0]
Sim['qoiTot'][n,:,:] = qoi
t_main_end = time.time()
# Timing
t_end = time.time()
Result['tt'] = tt
Result['uu'] = uu
Result['qoiTot'] = Sim['qoiTot']
Result['timeExec'] = t_end-t_start
Result['timeExecInit'] = t_init_end-t_init_start
Result['timeExecMain'] = t_main_end-t_main_start
Result['timeExecStep'] = t_step
return Result
def runIS(Sim):
# Timing
t_start = time.time()
t_init_start = time.time()
Result = {}
# Sim details
Ndof = Sim['Ndof']
NSim = Sim['NSim']
h = Sim['Timestep']
tmax = Sim['Tf']
nmax = Sim['nmax']
nplt = Sim['nplt']
stepFunc = Sim['stepFunc']
qoiFunc = Sim['qoiFunc']
epsilon_init = Sim['epsilon_init']
recordSolution = Sim['Record solution']
nMonitor = int(round(nmax/100))
uu = Sim['uu']
tt = Sim['tt']
t_init_end = time.time()
printProgressBar(0, Sim['nRep_'], prefix = 'ISP reps ' + str(0) + ' / ' +str(Sim['nRep_']),suffix = 'Complete', length = 50)
for iRep in range(Sim['nRep_']):
# main loop
t_main_start = time.time()
isplt.reset(Sim)
t_step = 0
# Initial condition
u = np.transpose(Sim['u0']*np.ones((NSim,Ndof))) + epsilon_init*np.random.normal(loc=0.0, scale=1.0, size=(Ndof,NSim))
if recordSolution:
uu[0,:,:]=u
else:
uu[0,:,0]=u[:,0]
qoi = qoiFunc(u)
Sim['qoiTot'][0,:,:]=qoi
advancementCounter = 0
for n in range(1, nmax+1):
isplt.prestep(u,Sim,n)
t = n*h
t_start_step = time.time()
u = stepFunc(u,Sim)
isplt.step(Sim)
t_end_step = time.time()
t_step += t_end_step-t_start_step
qoi = qoiFunc(u)
Sim['qoiTot'][n,:,:]=qoi
Sim['u'] = u
isplt.poststep(Sim,n,iRep)
u = Sim['u']
if n%nplt == 0:
if recordSolution:
uu[n,:,:] = u
else:
uu[n,:,0] = u[:,0]
if n%nMonitor == 0:
advancementCounter +=1
#print("Done " + str(round(100*float(n/(nmax)))) + "%")
# ~~~~ Finalize
isplt.finalize(Sim,iRep)
# ~~~~ Log advancement
printProgressBar(iRep+1, Sim['nRep_'], prefix = 'ISP reps ' + str(iRep+1) + ' / ' +str(Sim['nRep_']),suffix = 'Complete', length = 50)
t_main_end = time.time()
# Timing
t_end = time.time()
Result['tt'] = tt
Result['uu'] = uu
Result['qoiTot'] = Sim['qoiTot']
Result['timeExec'] = t_end-t_start
Result['timeExecInit'] = t_init_end-t_init_start
Result['timeExecMain'] = t_main_end-t_main_start
Result['timeExecStep'] = t_step
return Result
def simRun(Sim):
if Sim['Simulation name']=='KS' or Sim['Simulation name']=='L96':
return run(Sim)
if Sim['Simulation name']=='KSFrontBack' or Sim['Simulation name']=='L96FrontBack':
return runFrontBack(Sim)
def simRunLE(Sim):
if Sim['Simulation name']=='KS' or Sim['Simulation name']=='L96':
return runLE(Sim)
if Sim['Simulation name']=='KSFrontBack' or Sim['Simulation name']=='L96FrontBack':
print('not implemented, LE is done only for forward sim')
sys.exit()
return
def simRunEpsClone(Sim):
if Sim['Simulation name']=='KS' or Sim['Simulation name']=='L96':
return runEpsClone(Sim)
if Sim['Simulation name']=='KSFrontBack' or Sim['Simulation name']=='L96FrontBack':
        print('not implemented, epsilon cloning is done only for forward sim')
sys.exit()
return
def simRunIS(Sim):
if Sim['Simulation name']=='KS' or Sim['Simulation name']=='L96':
return runIS(Sim)
# File: lights/__init__.py (carderne/lifx-transitions, MIT)
from . import runner
# File: Dmail/esp/__init__.py (hmiladhia/Dmail, MIT)
from Dmail.esp.gmail import Gmail
from Dmail.esp.hotmail import Hotmail
from Dmail.esp.office365 import Office365
from Dmail.esp.yahoo import Yahoo
# File: vnpy/app/data_recorder/__init__.py (Billy-Meng/vnpy_origin, MIT)
# -*- coding:utf-8 -*-
import sys
import vnpy_datarecorder
sys.modules[__name__] = vnpy_datarecorder
# File: ENTRY_MODULE/FirstStepsInCoding/EXERCISE/01_USD_to_BGN.py (sleepychild/ProgramingBasicsPython, MIT)
print(1.79549 * float(input()))
# File: controller/telemetry.py (TUPilotsClub/TUTello, MIT)
from drone import Tello
class Telemetry:
def __init__(self, tello: Tello):
self.tello = tello
def getBattery(self):
return self.tello.battery
def getBarometer(self):
return self.tello.barometer
def getFlightTime(self):
return self.tello.flight_time
    def getPitch(self):
        return self.tello.pitch
def getRoll(self):
return self.tello.roll
def getYaw(self):
return self.tello.yaw
def getSpeedX(self):
return self.tello.speed_x
def getSpeedY(self):
return self.tello.speed_y
def getSpeedZ(self):
return self.tello.speed_z
def getTempLow(self):
return self.tello.temperature_lowest
def getTempHigh(self):
return self.tello.temperature_highest
def getFlightDistance(self):
return self.tello.distance_tof
def getHeight(self):
return self.tello.height
def getAccelerationX(self):
return self.tello.acceleration_x
def getAccelerationY(self):
return self.tello.acceleration_y
def getAccelerationZ(self):
return self.tello.acceleration_z
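A minimal usage sketch of the `Telemetry` wrapper above. Since a real `drone.Tello` connection is not available here, a `SimpleNamespace` with the same attribute names stands in for it (the attribute values are made up for illustration):

```python
from types import SimpleNamespace

class Telemetry:
    """Trimmed-down copy of the wrapper above, for illustration only."""
    def __init__(self, tello):
        self.tello = tello

    def getBattery(self):
        return self.tello.battery

    def getHeight(self):
        return self.tello.height

# A SimpleNamespace stands in for a connected drone.Tello object,
# whose attributes would normally be filled from telemetry packets.
drone = SimpleNamespace(battery=87, height=120)
telemetry = Telemetry(drone)
print(telemetry.getBattery())  # 87
print(telemetry.getHeight())   # 120
```

Each getter simply forwards to the underlying drone attribute, so swapping in a real `Tello` instance requires no other changes.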
# File: music_essentials/__init__.py (charlottepierce/music_basics, MIT)
from .note import Note
from .note import Rest
from .interval import Interval
from .scale import Scale
from .chord import Chord
# File: hiredis/__init__.py (asafkahlon/hiredis-py, BSD-3-Clause)
from .hiredis import Reader, HiredisError, ProtocolError, ReplyError
from .version import __version__
__all__ = [
"Reader", "HiredisError", "ProtocolError", "ReplyError",
"__version__"]
# File: tests/test_scatnet_fwd.py (hologerry/pytorch_wavelets, MIT)
from Transform2d_np import Transform2d
from pytorch_wavelets.scatternet import ScatLayer, ScatLayerj2
import numpy as np
import torch
import torch.nn.functional as F
import pytest
@pytest.mark.parametrize('biort', ['near_sym_a', 'near_sym_b', 'near_sym_b_bp'])
def test_equal(biort):
b = 1e-2
scat = ScatLayer(biort=biort, magbias=b)
xfm = Transform2d(biort=biort)
x = torch.randn(3, 4, 32, 32)
z = scat(x)
X = x.data.numpy()
Yl, Yh = xfm.forward(X, nlevels=1)
yl = torch.tensor(Yl)
yl2 = F.avg_pool2d(yl, 2)
M = np.sqrt(Yh[0].real**2 + Yh[0].imag**2 + b**2) - b
M = M.transpose(0, 2, 1, 3, 4)
m = torch.tensor(M)
m2 = m.view(3, 24, 16, 16)
z2 = torch.cat((yl2, m2), dim=1)
np.testing.assert_array_almost_equal(z, z2, decimal=4)
@pytest.mark.parametrize('biort', ['near_sym_a', 'near_sym_b', 'near_sym_b_bp'])
def test_equal_colour(biort):
b = 1e-2
scat = ScatLayer(biort=biort, combine_colour=True, magbias=b)
xfm = Transform2d(biort=biort)
x = torch.randn(4, 3, 32, 32)
z = scat(x)
X = x.data.numpy()
Yl, Yh = xfm.forward(X, nlevels=1)
yl = torch.tensor(Yl)
yl2 = F.avg_pool2d(yl, 2)
M = np.sqrt(Yh[0][:,0].real**2 + Yh[0][:,0].imag**2 +
Yh[0][:,1].real**2 + Yh[0][:,1].imag**2 +
Yh[0][:,2].real**2 + Yh[0][:,2].imag**2 + b**2) - b
m = torch.tensor(M)
z2 = torch.cat((yl2, m), dim=1)
np.testing.assert_array_almost_equal(z, z2, decimal=4)
@pytest.mark.parametrize('sz', [32, 30, 31, 29, 28])
def test_odd_size(sz):
scat = ScatLayer(biort='near_sym_a')
x = torch.randn(5, 5, sz, sz)
z = scat(x)
assert z.shape[-1] == (sz + 1)//2
@pytest.mark.parametrize('biort,qshift', [('near_sym_a', 'qshift_a'),
('near_sym_b', 'qshift_b'),
('near_sym_b_bp', 'qshift_b_bp')])
def test_equal_j2(biort, qshift):
b = 1e-2
scat = ScatLayerj2(biort=biort, qshift=qshift, magbias=b)
xfm = Transform2d(biort=biort, qshift=qshift)
x = torch.randn(3, 4, 32, 32)
z = scat(x)
X = x.data.numpy()
yl, yh = xfm.forward(X, nlevels=2)
# Make it a tensor to average pool
yl = torch.tensor(yl)
S0 = F.avg_pool2d(yl, 2).numpy()
# First order scatter coeffs
M1 = np.sqrt(yh[0].real**2 + yh[0].imag**2 + b**2) - b
M1 = M1.transpose(0, 2, 1, 3, 4)
M2 = np.sqrt(yh[1].real**2 + yh[1].imag**2 + b**2) - b
S1_2 = M2.transpose(0, 2, 1, 3, 4)
M1 = M1.reshape(3, 24, 16, 16)
yl, yh = xfm.forward(M1, nlevels=1)
# Make yl a tensor to average pool
yl = torch.tensor(yl)
S1_1 = F.avg_pool2d(yl, 2).numpy()
S1_1 = S1_1.reshape(3, 6, 4, 8, 8)
M2_1 = np.sqrt(yh[0].real**2 + yh[0].imag**2 + b**2) - b
S2_1 = M2_1.transpose(0, 2, 1, 3, 4)
S2_1 = S2_1.reshape(3, 36, 4, 8, 8)
z2 = np.concatenate((S0[:, None], S1_1, S1_2, S2_1), axis=1)
z2 = z2.reshape(3, (1+6+6+36)*4, 8, 8)
np.testing.assert_array_almost_equal(z.numpy(), z2, decimal=4)
@pytest.mark.parametrize('biort,qshift', [('near_sym_a', 'qshift_a'),
('near_sym_b', 'qshift_b'),
('near_sym_b_bp', 'qshift_b_bp')])
def test_equal_j2_colour(biort, qshift):
b = 1e-2
scat = ScatLayerj2(biort=biort, qshift=qshift, magbias=b,
combine_colour=True)
xfm = Transform2d(biort=biort, qshift=qshift)
x = torch.randn(4, 3, 32, 32)
z = scat(x)
X = x.data.numpy()
yl, Yh = xfm.forward(X, nlevels=2)
# Make it a tensor to average pool
yl = torch.tensor(yl)
S0 = F.avg_pool2d(yl, 2).numpy()
# First order scatter coeffs
M1 = np.sqrt(Yh[0][:,0].real**2 + Yh[0][:,0].imag**2 +
Yh[0][:,1].real**2 + Yh[0][:,1].imag**2 +
Yh[0][:,2].real**2 + Yh[0][:,2].imag**2 + b**2) - b
M2 = np.sqrt(Yh[1][:,0].real**2 + Yh[1][:,0].imag**2 +
Yh[1][:,1].real**2 + Yh[1][:,1].imag**2 +
Yh[1][:,2].real**2 + Yh[1][:,2].imag**2 + b**2) - b
yl, yh = xfm.forward(M1, nlevels=1)
# Make yl a tensor to average pool
yl = torch.tensor(yl)
S1_1 = F.avg_pool2d(yl, 2).numpy()
M2_1 = np.sqrt(yh[0].real**2 + yh[0].imag**2 + b**2) - b
S2_1 = M2_1.transpose(0, 2, 1, 3, 4)
S2_1 = S2_1.reshape(4, 36, 8, 8)
z2 = np.concatenate((S0, S1_1, M2, S2_1), axis=1)
np.testing.assert_array_almost_equal(z.numpy(), z2, decimal=4)
@pytest.mark.parametrize('sz', [32, 30, 31, 29, 28])
def test_odd_size_j2(sz):
scat = ScatLayerj2(biort='near_sym_a', qshift='qshift_a')
x = torch.randn(5, 5, sz, sz)
z = scat(x)
assert z.shape[-1] == 8
# File: tests/unit/utils/vmware/test_vm.py (guoxiaod/salt, Apache-2.0)
# -*- coding: utf-8 -*-
'''
    :codeauthor: :email:`Agnes Tevesz <agnes.tevesz@morganstanley.com>`

    Tests for virtual machine related functions in salt.utils.vmware
'''

# Import Python libraries
from __future__ import absolute_import, print_function, unicode_literals
import logging

# Import Salt testing libraries
from tests.support.unit import TestCase, skipIf
from tests.support.mock import NO_MOCK, NO_MOCK_REASON, patch, MagicMock
from salt.exceptions import VMwareRuntimeError, VMwareApiError, ArgumentValueError

# Import Salt libraries
import salt.utils.vmware as vmware

# Import third party libraries
try:
    from pyVmomi import vim, vmodl
    HAS_PYVMOMI = True
except ImportError:
    HAS_PYVMOMI = False

# Get logging started
log = logging.getLogger(__name__)
@skipIf(NO_MOCK, NO_MOCK_REASON)
class ConvertToKbTestCase(TestCase):
    '''
    Tests for converting units
    '''
    def setUp(self):
        pass

    def test_gb_conversion_call(self):
        self.assertEqual(vmware.convert_to_kb('Gb', 10),
                         {'size': 10485760, 'unit': 'KB'})

    def test_mb_conversion_call(self):
        self.assertEqual(vmware.convert_to_kb('Mb', 10),
                         {'size': 10240, 'unit': 'KB'})

    def test_kb_conversion_call(self):
        self.assertEqual(vmware.convert_to_kb('Kb', 10),
                         {'size': 10, 'unit': 'KB'})

    def test_conversion_bad_input_argument_fault(self):
        self.assertRaises(ArgumentValueError, vmware.convert_to_kb, 'test', 10)
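For reference, a hypothetical re-implementation that satisfies the expectations encoded in the conversion tests above. The real function lives in `salt.utils.vmware` and raises `ArgumentValueError`; this sketch uses the built-in `ValueError` to stay self-contained:

```python
def convert_to_kb(unit, size):
    # Conversion factors into kilobytes for the units the tests exercise.
    factors = {'gb': 1024 * 1024, 'mb': 1024, 'kb': 1}
    try:
        return {'size': int(size * factors[unit.lower()]), 'unit': 'KB'}
    except KeyError:
        raise ValueError('Invalid unit specified: {}'.format(unit))
```

Note that the unit lookup is case-insensitive, matching the `'Gb'`/`'Mb'`/`'Kb'` spellings used in the tests.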
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not HAS_PYVMOMI, 'The \'pyvmomi\' library is missing')
@patch('salt.utils.vmware.get_managed_object_name', MagicMock())
@patch('salt.utils.vmware.wait_for_task', MagicMock())
class CreateVirtualMachineTestCase(TestCase):
    '''
    Tests for salt.utils.vmware.create_vm
    '''
    def setUp(self):
        self.vm_name = 'fake_vm'
        self.mock_task = MagicMock()
        self.mock_config_spec = MagicMock()
        self.mock_resourcepool_object = MagicMock()
        self.mock_host_object = MagicMock()
        self.mock_vm_create_task = MagicMock(return_value=self.mock_task)
        self.mock_folder_object = MagicMock(CreateVM_Task=self.mock_vm_create_task)

    def test_create_vm_pool_task_call(self):
        vmware.create_vm(self.vm_name, self.mock_config_spec,
                         self.mock_folder_object, self.mock_resourcepool_object)
        self.assert_called_once(self.mock_vm_create_task)

    def test_create_vm_host_task_call(self):
        vmware.create_vm(self.vm_name, self.mock_config_spec,
                         self.mock_folder_object, self.mock_resourcepool_object,
                         host_object=self.mock_host_object)
        self.assert_called_once(self.mock_vm_create_task)

    def test_create_vm_raise_no_permission(self):
        exception = vim.fault.NoPermission()
        exception.msg = 'vim.fault.NoPermission msg'
        self.mock_folder_object.CreateVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareApiError) as exc:
            vmware.create_vm(self.vm_name, self.mock_config_spec,
                             self.mock_folder_object, self.mock_resourcepool_object)
        self.assertEqual(exc.exception.strerror,
                         'Not enough permissions. Required privilege: ')

    def test_create_vm_raise_vim_fault(self):
        exception = vim.fault.VimFault()
        exception.msg = 'vim.fault.VimFault msg'
        self.mock_folder_object.CreateVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareApiError) as exc:
            vmware.create_vm(self.vm_name, self.mock_config_spec,
                             self.mock_folder_object, self.mock_resourcepool_object)
        self.assertEqual(exc.exception.strerror, 'vim.fault.VimFault msg')

    def test_create_vm_raise_runtime_fault(self):
        exception = vmodl.RuntimeFault()
        exception.msg = 'vmodl.RuntimeFault msg'
        self.mock_folder_object.CreateVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareRuntimeError) as exc:
            vmware.create_vm(self.vm_name, self.mock_config_spec,
                             self.mock_folder_object, self.mock_resourcepool_object)
        self.assertEqual(exc.exception.strerror, 'vmodl.RuntimeFault msg')

    def test_create_vm_wait_for_task(self):
        mock_wait_for_task = MagicMock()
        with patch('salt.utils.vmware.wait_for_task', mock_wait_for_task):
            vmware.create_vm(self.vm_name, self.mock_config_spec,
                             self.mock_folder_object, self.mock_resourcepool_object)
        mock_wait_for_task.assert_called_once_with(
            self.mock_task, self.vm_name, 'CreateVM Task', 10, 'info')
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not HAS_PYVMOMI, 'The \'pyvmomi\' library is missing')
@patch('salt.utils.vmware.get_managed_object_name', MagicMock())
@patch('salt.utils.vmware.wait_for_task', MagicMock())
class RegisterVirtualMachineTestCase(TestCase):
    '''
    Tests for salt.utils.vmware.register_vm
    '''
    def setUp(self):
        self.vm_name = 'fake_vm'
        self.mock_task = MagicMock()
        self.mock_vmx_path = MagicMock()
        self.mock_resourcepool_object = MagicMock()
        self.mock_host_object = MagicMock()
        self.mock_vm_register_task = MagicMock(return_value=self.mock_task)
        self.vm_folder_object = MagicMock(RegisterVM_Task=self.mock_vm_register_task)
        self.datacenter = MagicMock(vmFolder=self.vm_folder_object)

    def test_register_vm_pool_task_call(self):
        vmware.register_vm(self.datacenter, self.vm_name, self.mock_vmx_path,
                           self.mock_resourcepool_object)
        self.assert_called_once(self.mock_vm_register_task)

    def test_register_vm_host_task_call(self):
        vmware.register_vm(self.datacenter, self.vm_name, self.mock_vmx_path,
                           self.mock_resourcepool_object,
                           host_object=self.mock_host_object)
        self.assert_called_once(self.mock_vm_register_task)

    def test_register_vm_raise_no_permission(self):
        exception = vim.fault.NoPermission()
        self.vm_folder_object.RegisterVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareApiError) as exc:
            vmware.register_vm(self.datacenter, self.vm_name, self.mock_vmx_path,
                               self.mock_resourcepool_object)
        self.assertEqual(exc.exception.strerror,
                         'Not enough permissions. Required privilege: ')

    def test_register_vm_raise_vim_fault(self):
        exception = vim.fault.VimFault()
        exception.msg = 'vim.fault.VimFault msg'
        self.vm_folder_object.RegisterVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareApiError) as exc:
            vmware.register_vm(self.datacenter, self.vm_name, self.mock_vmx_path,
                               self.mock_resourcepool_object)
        self.assertEqual(exc.exception.strerror, 'vim.fault.VimFault msg')

    def test_register_vm_raise_runtime_fault(self):
        exception = vmodl.RuntimeFault()
        exception.msg = 'vmodl.RuntimeFault msg'
        self.vm_folder_object.RegisterVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareRuntimeError) as exc:
            vmware.register_vm(self.datacenter, self.vm_name, self.mock_vmx_path,
                               self.mock_resourcepool_object)
        self.assertEqual(exc.exception.strerror, 'vmodl.RuntimeFault msg')

    def test_register_vm_wait_for_task(self):
        mock_wait_for_task = MagicMock()
        with patch('salt.utils.vmware.wait_for_task', mock_wait_for_task):
            vmware.register_vm(self.datacenter, self.vm_name, self.mock_vmx_path,
                               self.mock_resourcepool_object)
        mock_wait_for_task.assert_called_once_with(
            self.mock_task, self.vm_name, 'RegisterVM Task')
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not HAS_PYVMOMI, 'The \'pyvmomi\' library is missing')
@patch('salt.utils.vmware.get_managed_object_name', MagicMock())
@patch('salt.utils.vmware.wait_for_task', MagicMock())
class UpdateVirtualMachineTestCase(TestCase):
    '''
    Tests for salt.utils.vmware.update_vm
    '''
    def setUp(self):
        self.mock_task = MagicMock()
        self.mock_config_spec = MagicMock()
        self.mock_vm_update_task = MagicMock(return_value=self.mock_task)
        self.mock_vm_ref = MagicMock(ReconfigVM_Task=self.mock_vm_update_task)

    def test_update_vm_task_call(self):
        vmware.update_vm(self.mock_vm_ref, self.mock_config_spec)
        self.assert_called_once(self.mock_vm_update_task)

    def test_update_vm_raise_vim_fault(self):
        exception = vim.fault.VimFault()
        exception.msg = 'vim.fault.VimFault'
        self.mock_vm_ref.ReconfigVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareApiError) as exc:
            vmware.update_vm(self.mock_vm_ref, self.mock_config_spec)
        self.assertEqual(exc.exception.strerror, 'vim.fault.VimFault')

    def test_update_vm_raise_runtime_fault(self):
        exception = vmodl.RuntimeFault()
        exception.msg = 'vmodl.RuntimeFault'
        self.mock_vm_ref.ReconfigVM_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareRuntimeError) as exc:
            vmware.update_vm(self.mock_vm_ref, self.mock_config_spec)
        self.assertEqual(exc.exception.strerror, 'vmodl.RuntimeFault')

    def test_update_vm_wait_for_task(self):
        mock_wait_for_task = MagicMock()
        with patch('salt.utils.vmware.get_managed_object_name',
                   MagicMock(return_value='my_vm')):
            with patch('salt.utils.vmware.wait_for_task', mock_wait_for_task):
                vmware.update_vm(self.mock_vm_ref, self.mock_config_spec)
        mock_wait_for_task.assert_called_once_with(
            self.mock_task, 'my_vm', 'ReconfigureVM Task')
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not HAS_PYVMOMI, 'The \'pyvmomi\' library is missing')
@patch('salt.utils.vmware.get_managed_object_name', MagicMock())
@patch('salt.utils.vmware.wait_for_task', MagicMock())
class DeleteVirtualMachineTestCase(TestCase):
    '''
    Tests for salt.utils.vmware.delete_vm
    '''
    def setUp(self):
        self.mock_task = MagicMock()
        self.mock_vm_destroy_task = MagicMock(return_value=self.mock_task)
        self.mock_vm_ref = MagicMock(Destroy_Task=self.mock_vm_destroy_task)

    def test_destroy_vm_task_call(self):
        vmware.delete_vm(self.mock_vm_ref)
        self.assert_called_once(self.mock_vm_destroy_task)

    def test_destroy_vm_raise_vim_fault(self):
        exception = vim.fault.VimFault()
        exception.msg = 'vim.fault.VimFault'
        self.mock_vm_ref.Destroy_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareApiError) as exc:
            vmware.delete_vm(self.mock_vm_ref)
        self.assertEqual(exc.exception.strerror, 'vim.fault.VimFault')

    def test_destroy_vm_raise_runtime_fault(self):
        exception = vmodl.RuntimeFault()
        exception.msg = 'vmodl.RuntimeFault'
        self.mock_vm_ref.Destroy_Task = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareRuntimeError) as exc:
            vmware.delete_vm(self.mock_vm_ref)
        self.assertEqual(exc.exception.strerror, 'vmodl.RuntimeFault')

    def test_destroy_vm_wait_for_task(self):
        mock_wait_for_task = MagicMock()
        with patch('salt.utils.vmware.get_managed_object_name',
                   MagicMock(return_value='my_vm')):
            with patch('salt.utils.vmware.wait_for_task', mock_wait_for_task):
                vmware.delete_vm(self.mock_vm_ref)
        mock_wait_for_task.assert_called_once_with(
            self.mock_task, 'my_vm', 'Destroy Task')
@skipIf(NO_MOCK, NO_MOCK_REASON)
@skipIf(not HAS_PYVMOMI, 'The \'pyvmomi\' library is missing')
@patch('salt.utils.vmware.get_managed_object_name', MagicMock())
class UnregisterVirtualMachineTestCase(TestCase):
    '''
    Tests for salt.utils.vmware.unregister_vm
    '''
    def setUp(self):
        self.mock_vm_unregister = MagicMock()
        self.mock_vm_ref = MagicMock(UnregisterVM=self.mock_vm_unregister)

    def test_unregister_vm_task_call(self):
        vmware.unregister_vm(self.mock_vm_ref)
        self.assert_called_once(self.mock_vm_unregister)

    def test_unregister_vm_raise_vim_fault(self):
        exception = vim.fault.VimFault()
        exception.msg = 'vim.fault.VimFault'
        self.mock_vm_ref.UnregisterVM = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareApiError) as exc:
            vmware.unregister_vm(self.mock_vm_ref)
        self.assertEqual(exc.exception.strerror, 'vim.fault.VimFault')

    def test_unregister_vm_raise_runtime_fault(self):
        exception = vmodl.RuntimeFault()
        exception.msg = 'vmodl.RuntimeFault'
        self.mock_vm_ref.UnregisterVM = MagicMock(side_effect=exception)
        with self.assertRaises(VMwareRuntimeError) as exc:
            vmware.unregister_vm(self.mock_vm_ref)
        self.assertEqual(exc.exception.strerror, 'vmodl.RuntimeFault')
# File: lab/admin/__init__.py (betagouv/euphrosyne, MIT)
from ..objects.admin import ObjectGroupAdmin  # noqa: F401
from .institution import InstitutionAdmin  # noqa: F401
from .project import ProjectAdmin  # noqa: F401
from .run import RunAdmin  # noqa: F401
# File: sequence_classification/transformers/sequence_transformer.py
# (piotermiarer/Sequence-Classification, MIT)
# generic class, must be extended
class SequenceTransformer:
    def transform(self, data):
        raise NotImplementedError

    def fit_transform(self, data):
        raise NotImplementedError
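A minimal sketch of how the base class above is meant to be extended; `IdentityTransformer` is illustrative only and not part of the repository:

```python
class SequenceTransformer:
    def transform(self, data):
        raise NotImplementedError

    def fit_transform(self, data):
        raise NotImplementedError


class IdentityTransformer(SequenceTransformer):
    """Trivial concrete transformer: returns sequences unchanged."""

    def transform(self, data):
        return data

    def fit_transform(self, data):
        # The identity mapping has nothing to fit, so this just delegates.
        return self.transform(data)
```

Callers can then treat any transformer uniformly through the two-method interface, swapping implementations without touching downstream code.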
# File: testsuite/coiio/run.py (servantf/oiio, BSD-3-Clause)
] | null | null | null | command += run_app("cmake --config Release data -DCMAKE_PREFIX_PATH=/home/anders/code/oiio-al/dist/lib/cmake > build.txt 2>&1", silent=True)
command += run_app("cmake --build . >> build.txt 2>&1", silent=True)
command += run_app("./coiiotest > out.txt 2>&1", silent=True)
# File: typhon/arts/sensor.py (tmieslinger/typhon, MIT)
# -*- coding: utf-8 -*-
"""
Implementation of functions related to sensor settings.
"""
import numpy as np

__all__ = ['get_f_backend_rel_width',
           'get_f_backend_const_width',
           ]


def get_f_backend_rel_width(f_start, f_end, bandwidth):
    """Compute backend frequencies with relative bandwidth.

    This function computes backend frequencies for a given frequency range
    and a relative bandwidth.

    Parameters:
        f_start (float): beginning of frequency range [Hz]
        f_end (float): end of frequency range [Hz]
        bandwidth (float): relative bandwidth [dimensionless]

    Return:
        np.array, np.array: backend frequencies [Hz], channel widths [Hz]
    """
    if f_start <= 0:
        raise Exception('Start frequency must be > 0.')
    if f_start > f_end:
        raise Exception('End frequency has to be larger than start frequency.')

    f_backend = [f_start]
    while f_backend[-1] <= f_end:
        f_backend.append(f_backend[-1] * (bandwidth + 2) / (2 - bandwidth))

    # do not include last value in results as it exceeds f_end
    f_backend = np.array(f_backend[:-1])
    backend_bandwidth = f_backend * bandwidth

    return f_backend, backend_bandwidth


def get_f_backend_const_width(f_start, f_end, bandwidth):
    """Compute backend frequencies with constant bandwidth.

    This function computes backend frequencies for a given frequency range
    and a constant bandwidth.

    Parameters:
        f_start (float): beginning of frequency range [Hz]
        f_end (float): end of frequency range [Hz]
        bandwidth (float): bandwidth [Hz]

    Return:
        np.array, np.array: backend frequencies [Hz], channel widths [Hz]
    """
    if f_start <= 0:
        raise Exception('Start frequency must be > 0.')
    if f_start > f_end:
        raise Exception('End frequency has to be larger than start frequency.')

    f_backend = [f_start]
    while f_backend[-1] <= f_end:
        f_backend.append(f_backend[-1] + bandwidth)

    # do not include last value in results as it exceeds f_end
    f_backend = np.array(f_backend[:-1])
    backend_bandwidth = np.array([bandwidth])

    return f_backend, backend_bandwidth
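The relative-width recursion in `get_f_backend_rel_width` can be sanity-checked in isolation; a self-contained sketch that repeats the loop rather than importing typhon (the function name `rel_width_channels` is ours):

```python
import numpy as np

def rel_width_channels(f_start, f_end, bandwidth):
    # Adjacent channels touch when f[n+1] = f[n] * (2 + bw) / (2 - bw),
    # which keeps the relative width bw constant across the whole band.
    f = [f_start]
    while f[-1] <= f_end:
        f.append(f[-1] * (bandwidth + 2) / (2 - bandwidth))
    f = np.array(f[:-1])  # drop the value that exceeded f_end
    return f, f * bandwidth

centers, widths = rel_width_channels(100e9, 110e9, 0.01)
```

Every channel's width divided by its center frequency equals the requested relative bandwidth, which is the defining property of this grid.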
# File: src/networks.py (ajenningsfrankston/ICIAR2018, MIT)
import numpy as np
import torch.nn as nn
import torch.nn.functional as F


class BaseNetwork(nn.Module):
    def __init__(self, name, channels=1):
        super(BaseNetwork, self).__init__()
        self._name = name
        self._channels = channels

    def name(self):
        return self._name

    def initialize_weights(self):
        for m in self.modules():
            if isinstance(m, nn.Conv2d):
                nn.init.kaiming_normal_(m.weight, nonlinearity='relu')
                if m.bias is not None:
                    m.bias.data.zero_()
            elif isinstance(m, nn.BatchNorm2d):
                m.weight.data.fill_(1)
                m.bias.data.zero_()
            elif isinstance(m, nn.Linear):
                m.weight.data.normal_(0, 0.01)
                m.bias.data.zero_()
class PatchWiseNetwork(BaseNetwork):
    def __init__(self, channels=1):
        super(PatchWiseNetwork, self).__init__('pw' + str(channels), channels)

        self.features = nn.Sequential(
            # Block 1
            nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(16),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=16, out_channels=16, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(16),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=16, out_channels=16, kernel_size=2, stride=2),
            nn.BatchNorm2d(16),
            nn.ReLU(inplace=True),

            # Block 2
            nn.Conv2d(in_channels=16, out_channels=32, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=32, out_channels=32, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=32, out_channels=32, kernel_size=2, stride=2),
            nn.BatchNorm2d(32),
            nn.ReLU(inplace=True),

            # Block 3
            nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=64, out_channels=64, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=64, out_channels=64, kernel_size=2, stride=2),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),

            # Block 4
            nn.Conv2d(in_channels=64, out_channels=128, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=128, out_channels=128, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=128, out_channels=128, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),

            # Block 5
            nn.Conv2d(in_channels=128, out_channels=256, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(256),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=256, out_channels=256, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(256),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=256, out_channels=256, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(256),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=256, out_channels=channels, kernel_size=1, stride=1),
        )

        self.classifier = nn.Sequential(
            nn.Linear(channels * 64 * 64, 4),
        )

        self.initialize_weights()

    def forward(self, x):
        x = self.features(x)
        x = x.view(x.size(0), -1)
        x = self.classifier(x)
        x = F.log_softmax(x, dim=1)
        return x
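A quick consistency check on the geometry these layer stacks imply: every `kernel_size=2, stride=2` convolution halves the spatial resolution, and the `same`-padded 3x3 convolutions leave it unchanged. Assuming 512x512 input patches (inferred from `nn.Linear(channels * 64 * 64, 4)`, not stated in this file), the arithmetic works out as follows; `out_size` is an illustrative helper, not part of the repository:

```python
def out_size(size, n_halvings):
    # Each kernel_size=2, stride=2 convolution halves the spatial size;
    # the padded 3x3 convolutions in between preserve it.
    for _ in range(n_halvings):
        size //= 2
    return size
```

PatchWiseNetwork applies three stride-2 stages (512 -> 64), and ImageWiseNetwork below applies two more to its 64x64 input (64 -> 16), matching its `nn.Linear(1 * 16 * 16, 128)` classifier.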
class ImageWiseNetwork(BaseNetwork):
    def __init__(self, channels=1):
        super(ImageWiseNetwork, self).__init__('iw' + str(channels), channels)

        self.features = nn.Sequential(
            # Block 1
            nn.Conv2d(in_channels=12 * channels, out_channels=64, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=64, out_channels=64, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=64, out_channels=64, kernel_size=2, stride=2),
            nn.BatchNorm2d(64),
            nn.ReLU(inplace=True),

            # Block 2
            nn.Conv2d(in_channels=64, out_channels=128, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=128, out_channels=128, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=128, out_channels=128, kernel_size=2, stride=2),
            nn.BatchNorm2d(128),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_channels=128, out_channels=1, kernel_size=1, stride=1),
        )

        self.classifier = nn.Sequential(
            nn.Linear(1 * 16 * 16, 128),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5, inplace=True),
            nn.Linear(128, 128),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5, inplace=True),
            nn.Linear(128, 64),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5, inplace=True),
            nn.Linear(64, 4),
        )

        self.initialize_weights()

    def forward(self, x):
        x = self.features(x)
        x = x.view(x.size(0), -1)
        x = self.classifier(x)
        x = F.log_softmax(x, dim=1)
        return x
# File: faker/providers/color/en_US/__init__.py (jacksmith15/faker, MIT)
from .. import Provider as ColorProvider
class Provider(ColorProvider):
    """Implement color provider for ``en_US`` locale."""

    pass
# File: data_processing/ddh5_Plotting/gain_trace.py (PITT-HATLAB/data_processing, MIT)
# -*- coding: utf-8 -*-
"""
Created on Mon Sep 13 13:57:13 2021
@author: Hatlab-RRK
"""
from plottr.data.datadict_storage import all_datadicts_from_hdf5
import matplotlib.pyplot as plt
import numpy as np
filepath = r'Z:/Data/N25_L3_SQ/traces/gain/2022-04-25/2022-04-25_0001_bp1_gain/2022-04-25_0001_bp1_gain.ddh5'
dd = all_datadicts_from_hdf5(filepath)['data']
pows = dd.extract('power')['power']['values']
freqs = dd.extract('power')['frequency']['values']
#plot the LO leakage vs power
fig, ax = plt.subplots(figsize = (8,6))
ax.plot(freqs/1e9, pows)
ax.set_xlabel('VNA frequency (GHz)')
ax.set_ylabel('VNA Gain (dB)')
ax.legend()
ax.grid()
ax.set_title(f'0.06mA, {8.65-30-10}dBm Cryo, +277kHZ generator detuning: 6.6MHz BW')
#%%
filepath = r'Z:/Data/N25_L3_SQ/traces/sat/2022-04-25/2022-04-25_0001_bp1_sat/2022-04-25_0001_bp1_sat.ddh5'
dd = all_datadicts_from_hdf5(filepath)['data']
pows = dd.extract('power')['power']['values']
freqs = dd.extract('power')['frequency']['values']
#plot the LO leakage vs power
fig, ax = plt.subplots(figsize = (8,6))
ax.plot(freqs-77, pows)
ax.set_xlabel('VNA power (dBm Cryo)')
ax.set_ylabel('VNA Gain (dB)')
ax.legend()
ax.grid()
ax.set_title('0.06mA, 8.65dBm RT, +277kHZ generator detuning: 6.6MHz BW')
#%% vna gen power sweep
filepath = r'Z:/Data/Hakan/SH_5B1_SS_Gain_6.064GHz/vna_gain_sweep/2021-09-13/2021-09-13_0006_gain_vs_gen_power_50dBatten/2021-09-13_0006_gain_vs_gen_power_50dBatten.ddh5'
specData = all_datadicts_from_hdf5(filepath)['data']
spec_freqs = specData.extract('power')['VNA_frequency']['values']
spec_powers = specData.extract('power')['power']['values']
gen_powers = specData.extract('power')['Gen_power']['values']
#take middle value for LO leakage
detuning = -1e6
center_freq = 6.096e9
lower_sideband_freq = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq-detuning)))]
IM_spur_lower = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq-3*detuning)))]
upper_sideband_freq = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq+detuning)))]
IM_spur_upper = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq+3*detuning)))]
lower_sideband_filt = spec_freqs == lower_sideband_freq
upper_sideband_filt = spec_freqs == upper_sideband_freq
IM_spur_lower_filt = spec_freqs == IM_spur_lower
IM_spur_upper_filt = spec_freqs == IM_spur_upper
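The repeated `np.unique(...)[np.argmin(np.abs(...))]` pattern above is a nearest-value lookup on the measured frequency grid; factored out for clarity (the helper name `nearest` is ours, not part of the script):

```python
import numpy as np

def nearest(grid, target):
    # Return the grid value closest to `target`, mirroring the
    # np.unique(...)[np.argmin(np.abs(...))] idiom used for sideband selection.
    grid = np.unique(np.asarray(grid))
    return grid[np.argmin(np.abs(grid - target))]

freqs = [6.094e9, 6.095e9, 6.096e9, 6.097e9]
```

This picks the closest measured point to each requested sideband or spur frequency, which then drives the boolean filters above.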
#plot the LO leakage vs power
fig, ax = plt.subplots(figsize = (8,6))
for i, gp in enumerate(np.unique(gen_powers)):
    gp_filt = gen_powers == gp
    center_freq = spec_freqs[np.argmax(spec_powers[gp_filt])]
    if i % 5 == 0:
        ax.plot(spec_freqs[gp_filt], spec_powers[gp_filt], label=f'{gp} dBm')

highlight_freq = center_freq + detuning
ax.vlines(center_freq, 0, 30, linestyles='dashed', colors='black')
ax.vlines(highlight_freq, 0, 30, linestyles='dashed', colors='black')
# ax.plot(gen_powers[leakage_filt], spec_powers[leakage_filt], label = 'LO leakage (dBm)')
# ax.plot(gen_powers[upper_sideband_filt], spec_powers[upper_sideband_filt], label = 'Upper input tone power (dBm)')
# ax.plot(gen_powers[lower_sideband_filt], spec_powers[lower_sideband_filt], label = 'Lower input tone power (dBm)')
# ax.plot(gen_powers[IM_spur_upper_filt], spec_powers[IM_spur_upper_filt], label = 'Upper spur power (dBm)')
# ax.plot(gen_powers[IM_spur_lower_filt], spec_powers[IM_spur_lower_filt], label = 'Lower spur power (dBm)')
ax.set_xlabel('VNA Frequency (Hz)')
ax.legend()
ax.grid()
ax.set_ylabel('VNA Gain (dB)')
ax.set_title(f'Amplifier Low-power gain: 20dB, f1-f2: {np.round(detuning/1e3)} kHz ')
ax.set_ylim([0,30])
ax.annotate(f'Signal input:\n{np.round((highlight_freq)/1e9, 5)} GHz', [highlight_freq, 20], [highlight_freq-2e6, 22], arrowprops=dict(facecolor='black', shrink=0.05))
#%% vna saturation gen power sweep
filepath = r'Z:/Data/Hakan/SH_5B1_SS_Gain_6.064GHz/vna_saturation_sweep/2021-09-14/2021-09-14_0002_SH_5B1_saturation_sweep_offres_+500kHz/2021-09-14_0002_SH_5B1_saturation_sweep_offres_+500kHz.ddh5'
# sat_gen_freq = [gen_freq],
# sat_gen_power = [gen_power-gen_att],
# sat_vna_freq = [vna_cw_freq],
# sat_vna_powers = pows.reshape(1,-1)-vna_att,
# sat_gain = gains.reshape(1,-1),
# sat_phases = phases.reshape(1, -1)
specData = all_datadicts_from_hdf5(filepath)['data']
spec_freqs = specData.extract('sat_gain')['sat_vna_powers']['values']
spec_powers = specData.extract('sat_gain')['sat_gain']['values']
gen_powers = specData.extract('sat_gain')['sat_gen_power']['values']
#take middle value for LO leakage
detuning = -0.5e6
# lower_sideband_freq = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq-detuning)))]
# IM_spur_lower = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq-3*detuning)))]
# upper_sideband_freq = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq+detuning)))]
# IM_spur_upper = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq+3*detuning)))]
# lower_sideband_filt = spec_freqs == lower_sideband_freq
# upper_sideband_filt = spec_freqs == upper_sideband_freq
# IM_spur_lower_filt = spec_freqs == IM_spur_lower
# IM_spur_upper_filt = spec_freqs == IM_spur_upper
#plot the LO leakage vs power
fig, ax = plt.subplots(figsize = (8,6))
for i, gp in enumerate(np.unique(gen_powers)):
    print(i)
    gp_filt = gen_powers == gp
    print(gp_filt)
    # center_freq = spec_freqs[np.argmax(spec_powers[gp_filt])]
    print(spec_powers[gp_filt])
    ax.plot(spec_freqs[gp_filt][0], spec_powers[gp_filt][0], label=f'{gp+20} dBm')

# ax.vlines(center_freq, 0, 30, linestyles='dashed', colors='black')
# ax.vlines(center_freq-1e6, 0, 30, linestyles='dashed', colors='black')
# ax.plot(gen_powers[leakage_filt], spec_powers[leakage_filt], label = 'LO leakage (dBm)')
# ax.plot(gen_powers[upper_sideband_filt], spec_powers[upper_sideband_filt], label = 'Upper input tone power (dBm)')
# ax.plot(gen_powers[lower_sideband_filt], spec_powers[lower_sideband_filt], label = 'Lower input tone power (dBm)')
# ax.plot(gen_powers[IM_spur_upper_filt], spec_powers[IM_spur_upper_filt], label = 'Upper spur power (dBm)')
# ax.plot(gen_powers[IM_spur_lower_filt], spec_powers[IM_spur_lower_filt], label = 'Lower spur power (dBm)')
ax.set_xlabel('Input Signal Power (dBm RT)')
ax.legend()
ax.grid()
ax.set_ylabel('VNA S21 Power (dBm)')
ax.set_title(f'Amplifier Low-power gain: 15dB, f1-f2: {np.round(detuning/1e3)} kHz ')
# ax.set_ylim([0,30])
# ax.annotate(f'Signal input:\n{np.round((center_freq-1e6)/1e9, 5)} GHz', [center_freq-1e6, 15], [center_freq-3e6, 17], arrowprops=dict(facecolor='black', shrink=0.05))
#%% same thing but with phase
# vna saturation gen power sweep
filepath = r'Z:/Data/Hakan/SH_5B1_SS_Gain_6.064GHz/vna_saturation_sweep/2021-09-14/2021-09-14_0002_SH_5B1_saturation_sweep_offres_+500kHz/2021-09-14_0002_SH_5B1_saturation_sweep_offres_+500kHz.ddh5'
# sat_gen_freq = [gen_freq],
# sat_gen_power = [gen_power-gen_att],
# sat_vna_freq = [vna_cw_freq],
# sat_vna_powers = pows.reshape(1,-1)-vna_att,
# sat_gain = gains.reshape(1,-1),
# sat_phases = phases.reshape(1, -1)
specData = all_datadicts_from_hdf5(filepath)['data']
spec_freqs = specData.extract('sat_phases')['sat_vna_powers']['values']
spec_powers = specData.extract('sat_phases')['sat_phases']['values']
gen_powers = specData.extract('sat_phases')['sat_gen_power']['values']
#take middle value for LO leakage
detuning = -0.5e6
# lower_sideband_freq = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq-detuning)))]
# IM_spur_lower = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq-3*detuning)))]
# upper_sideband_freq = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq+detuning)))]
# IM_spur_upper = np.unique(spec_freqs)[np.argmin(np.abs(np.unique(spec_freqs)-(center_freq+3*detuning)))]
# lower_sideband_filt = spec_freqs == lower_sideband_freq
# upper_sideband_filt = spec_freqs == upper_sideband_freq
# IM_spur_lower_filt = spec_freqs == IM_spur_lower
# IM_spur_upper_filt = spec_freqs == IM_spur_upper
#plot the LO leakage vs power
fig, ax = plt.subplots(figsize = (8,6))
for i, gp in enumerate(np.unique(gen_powers)):
    gp_filt = gen_powers == gp
    # center_freq = spec_freqs[np.argmax(spec_powers[gp_filt])]
    ax.plot(spec_freqs[gp_filt][0], spec_powers[gp_filt][0], label=f'{gp+20} dBm')
# ax.vlines(center_freq, 0, 30, linestyles = 'dashed', colors = 'black')
# ax.vlines(center_freq-1e6, 0, 30, linestyles = 'dashed', colors = 'black')
# ax.plot(gen_powers[leakage_filt], spec_powers[leakage_filt], label = 'LO leakage (dBm)')
# ax.plot(gen_powers[upper_sideband_filt], spec_powers[upper_sideband_filt], label = 'Upper input tone power (dBm)')
# ax.plot(gen_powers[lower_sideband_filt], spec_powers[lower_sideband_filt], label = 'Lower input tone power (dBm)')
# ax.plot(gen_powers[IM_spur_upper_filt], spec_powers[IM_spur_upper_filt], label = 'Upper spur power (dBm)')
# ax.plot(gen_powers[IM_spur_lower_filt], spec_powers[IM_spur_lower_filt], label = 'Lower spur power (dBm)')
ax.set_xlabel('Input Signal Power (dBm RT)')
ax.legend()
ax.grid()
ax.set_ylabel('VNA S21 Power (dBm)')
ax.set_title(f'Amplifier Low-power phase: 15dB Gain, f1-f2: {np.round(detuning/1e3)} kHz ')
# ax.set_ylim([0,30])
# ax.annotate(f'Signal input:\n{np.round((center_freq-1e6)/1e9, 5)} GHz', [center_freq-1e6, 15], [center_freq-3e6, 17], arrowprops=dict(facecolor='black', shrink=0.05)) | 46.765854 | 198 | 0.74601 | 1,605 | 9,587 | 4.176324 | 0.11028 | 0.060421 | 0.042966 | 0.060868 | 0.930926 | 0.909294 | 0.876921 | 0.870506 | 0.845442 | 0.821274 | 0 | 0.047657 | 0.093877 | 9,587 | 205 | 199 | 46.765854 | 0.723955 | 0.469281 | 0 | 0.532609 | 0 | 0.097826 | 0.330608 | 0.163908 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.032609 | 0 | 0.032609 | 0.065217 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
043f6aaa1b283f4f1ab7eabf73d0af0d1552768c | 29 | py | Python | __init__.py | alphagov/govukurllookup | 9b31c015db86fff7b7cb46dace33c29e92d51753 | [
"MIT"
] | 1 | 2017-08-22T13:21:27.000Z | 2017-08-22T13:21:27.000Z | __init__.py | ukgovdatascience/govukurllookup | 9b31c015db86fff7b7cb46dace33c29e92d51753 | [
"MIT"
] | 9 | 2017-08-09T16:47:48.000Z | 2017-11-08T19:09:40.000Z | __init__.py | alphagov/govukurllookup | 9b31c015db86fff7b7cb46dace33c29e92d51753 | [
"MIT"
] | 2 | 2019-08-29T11:54:32.000Z | 2021-04-10T19:41:30.000Z | from govukurllookup import *
| 14.5 | 28 | 0.827586 | 3 | 29 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 29 | 1 | 29 | 29 | 0.96 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f0a1711df1d868609ab5996205e9566fc0897337 | 198 | py | Python | tests/spec/crm/extensions/test_cards.py | fakepop/hubspot-api-python | f04103a09f93f5c26c99991b25fa76801074f3d3 | [
"Apache-2.0"
] | 117 | 2020-04-06T08:22:53.000Z | 2022-03-18T03:41:29.000Z | tests/spec/crm/extensions/test_cards.py | fakepop/hubspot-api-python | f04103a09f93f5c26c99991b25fa76801074f3d3 | [
"Apache-2.0"
] | 62 | 2020-04-06T16:21:06.000Z | 2022-03-17T16:50:44.000Z | tests/spec/crm/extensions/test_cards.py | fakepop/hubspot-api-python | f04103a09f93f5c26c99991b25fa76801074f3d3 | [
"Apache-2.0"
] | 45 | 2020-04-06T16:13:52.000Z | 2022-03-30T21:33:17.000Z | from hubspot import HubSpot
from hubspot.crm.extensions.cards import CardsApi
def test_is_discoverable():
    apis = HubSpot().crm.extensions.cards
    assert isinstance(apis.cards_api, CardsApi)
| 24.75 | 49 | 0.787879 | 26 | 198 | 5.884615 | 0.576923 | 0.143791 | 0.261438 | 0.326797 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131313 | 198 | 7 | 50 | 28.285714 | 0.889535 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | false | 0 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
f0a589b9c4260996e8b37abfe120fd7fff118598 | 678 | py | Python | securionpay/cards.py | el-dot/securionpay-python | 2946c4be35483ed6466e9981e45dbb0b39302093 | [
"MIT"
] | 7 | 2017-07-11T17:01:38.000Z | 2020-06-30T10:19:53.000Z | securionpay/cards.py | el-dot/securionpay-python | 2946c4be35483ed6466e9981e45dbb0b39302093 | [
"MIT"
] | 4 | 2016-03-17T01:41:02.000Z | 2020-01-18T20:52:34.000Z | securionpay/cards.py | el-dot/securionpay-python | 2946c4be35483ed6466e9981e45dbb0b39302093 | [
"MIT"
] | 3 | 2016-02-25T15:08:35.000Z | 2021-12-01T08:28:22.000Z | from securionpay.resource import Resource
class Cards(Resource):
    def create(self, customer_id, params):
        return self._post("/customers/%s/cards" % customer_id, params)

    def get(self, customer_id, card_id):
        return self._get("/customers/%s/cards/%s" % (customer_id, card_id))

    def update(self, customer_id, card_id, params):
        return self._post("/customers/%s/cards/%s" % (customer_id, card_id), params)

    def delete(self, customer_id, card_id):
        return self._delete("/customers/%s/cards/%s" % (customer_id, card_id))

    def list(self, customer_id, params=None):
        return self._get("/customers/%s/cards" % customer_id, params)
| 35.684211 | 84 | 0.678466 | 95 | 678 | 4.621053 | 0.231579 | 0.22779 | 0.191344 | 0.218679 | 0.701595 | 0.646925 | 0.503417 | 0.366743 | 0.159453 | 0 | 0 | 0 | 0.175516 | 678 | 18 | 85 | 37.666667 | 0.785331 | 0 | 0 | 0 | 0 | 0 | 0.153392 | 0.097345 | 0 | 0 | 0 | 0 | 0 | 1 | 0.416667 | false | 0 | 0.083333 | 0.416667 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 6 |
f0ac33c386c2eb6000a9757dc4efc9e8e6994d7d | 25 | py | Python | tests/samples/example_folder/udf.py | vanguard/sql_translate | 28ae149e54a300c3337b538691be80d878a7dbf2 | [
"Apache-2.0"
] | 3 | 2021-03-19T21:39:29.000Z | 2021-03-26T14:00:24.000Z | tests/samples/example_folder/udf.py | vanguard/sql_translate | 28ae149e54a300c3337b538691be80d878a7dbf2 | [
"Apache-2.0"
] | 1 | 2021-07-07T11:45:04.000Z | 2021-07-07T11:45:04.000Z | tests/samples/example_folder/udf.py | vanguard/sql_translate | 28ae149e54a300c3337b538691be80d878a7dbf2 | [
"Apache-2.0"
] | null | null | null | def hello():
return 1 | 12.5 | 12 | 0.6 | 4 | 25 | 3.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 0.28 | 25 | 2 | 13 | 12.5 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 6 |
f0e5f97c8b1be327a9398a3d1bdf66098329e8db | 26 | py | Python | python/testData/intentions/PyConvertFormatOperatorToMethodIntentionTest/repr.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/intentions/PyConvertFormatOperatorToMethodIntentionTest/repr.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/intentions/PyConvertFormatOperatorToMethodIntentionTest/repr.py | truthiswill/intellij-community | fff88cfb0dc168eea18ecb745d3e5b93f57b0b95 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | print('%r' <caret>% '123') | 26 | 26 | 0.538462 | 4 | 26 | 3.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 0.076923 | 26 | 1 | 26 | 26 | 0.458333 | 0 | 0 | 0 | 0 | 0 | 0.185185 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 6 |
9bccfdc9ac82e5a53d6404fec54f60a6b00824e5 | 1,101 | py | Python | tests/dependencyTest.py | MarkTravers/magLabUtilities | e116c8cb627cd82c3b8ba651dd6979b66e568632 | [
"MIT"
] | null | null | null | tests/dependencyTest.py | MarkTravers/magLabUtilities | e116c8cb627cd82c3b8ba651dd6979b66e568632 | [
"MIT"
] | null | null | null | tests/dependencyTest.py | MarkTravers/magLabUtilities | e116c8cb627cd82c3b8ba651dd6979b66e568632 | [
"MIT"
] | null | null | null | #!python3
import magLabUtilities.cubitutilities.cubitmanager
import magLabUtilities.exceptions.exceptions
import magLabUtilities.fileutilities.timeDomain
import magLabUtilities.fileutilities.common
import magLabUtilities.magstromutilities.magstrommanager
import magLabUtilities.optimizerutilities.costFunctions
import magLabUtilities.optimizerutilities.gradientDescent
import magLabUtilities.optimizerutilities.parameterSpaces
import magLabUtilities.optimizerutilities.particleSwarm
import magLabUtilities.parametrictestutilities.parameterspace
import magLabUtilities.parametrictestutilities.testmanager
import magLabUtilities.signalutilities.calculus
import magLabUtilities.signalutilities.canonical1d
import magLabUtilities.signalutilities.hysteresis
import magLabUtilities.signalutilities.interpolation
import magLabUtilities.signalutilities.signals
# import magLabUtilities.uiutilities.plotting.hysteresis # Requires Matlab installation
import magLabUtilities.uiutilities.messagepipe
import magLabUtilities.vsmutilities.ezVSM
import magLabUtilities.config
| 34.40625 | 88 | 0.881017 | 84 | 1,101 | 11.547619 | 0.404762 | 0.43299 | 0.185567 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001972 | 0.079019 | 1,101 | 31 | 89 | 35.516129 | 0.954635 | 0.084469 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
9be82d1f694555f3ec9202d19339d6d981f5c4e1 | 245 | py | Python | src/servers/spigot/download.py | dudeofawesome/docker-minecraft | 11bf4b997fe45ad6f899b7cb05cdd6a3acebb372 | [
"MIT"
] | null | null | null | src/servers/spigot/download.py | dudeofawesome/docker-minecraft | 11bf4b997fe45ad6f899b7cb05cdd6a3acebb372 | [
"MIT"
] | null | null | null | src/servers/spigot/download.py | dudeofawesome/docker-minecraft | 11bf4b997fe45ad6f899b7cb05cdd6a3acebb372 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
import os
import requests
from mc_utils import get_mc_manifest_version
minecraft_version, minecraft_major_minor_version, json_url = get_mc_manifest_version(os.environ['MINECRAFT_VERSION'])
print(f"{minecraft_version}")
| 24.5 | 117 | 0.836735 | 36 | 245 | 5.305556 | 0.583333 | 0.251309 | 0.136126 | 0.209424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004425 | 0.077551 | 245 | 9 | 118 | 27.222222 | 0.840708 | 0.085714 | 0 | 0 | 0 | 0 | 0.161435 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.6 | 0 | 0.6 | 0.2 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
5020cdcd031aa6ea1d82ed7caa1d7bd8f2c82cb5 | 39 | py | Python | speaksee/evaluation/recall/__init__.py | aimagelab/speaksee | 63700a4062e2ae00132a5c77007604fdaf4bd00b | [
"BSD-3-Clause"
] | 29 | 2019-02-28T05:29:53.000Z | 2021-01-25T06:55:48.000Z | speaksee/evaluation/recall/__init__.py | aimagelab/speaksee | 63700a4062e2ae00132a5c77007604fdaf4bd00b | [
"BSD-3-Clause"
] | 2 | 2019-10-26T02:29:59.000Z | 2021-01-15T13:58:53.000Z | speaksee/evaluation/recall/__init__.py | aimagelab/speaksee | 63700a4062e2ae00132a5c77007604fdaf4bd00b | [
"BSD-3-Clause"
] | 11 | 2019-03-12T08:43:09.000Z | 2021-03-15T03:20:43.000Z | from .recall import recall, old_recall
| 19.5 | 38 | 0.820513 | 6 | 39 | 5.166667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.128205 | 39 | 1 | 39 | 39 | 0.911765 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
503ca4625f09ec82a47047e431820db3e0a272ae | 378 | py | Python | terrascript/data/sumologic.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | terrascript/data/sumologic.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | terrascript/data/sumologic.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # terrascript/data/sumologic.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:28:04 UTC)
#
# For imports without namespace, e.g.
#
# >>> import terrascript.data.sumologic
#
# instead of
#
# >>> import terrascript.data.SumoLogic.sumologic
#
# This is only available for 'official' and 'partner' providers.
from terrascript.data.SumoLogic.sumologic import *
| 25.2 | 73 | 0.743386 | 49 | 378 | 5.734694 | 0.693878 | 0.213523 | 0.341637 | 0.213523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036697 | 0.134921 | 378 | 14 | 74 | 27 | 0.82263 | 0.796296 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
504e5d4e6be3ffea48f01bcb30159adb4cc482e1 | 20 | py | Python | dataset/__init__.py | SpatialPerceptionNeuralNetwork/SOA_DORN_TF | 33814467e9135036abf28f2da19c5984c8744089 | [
"Unlicense"
] | 17 | 2019-02-17T07:39:39.000Z | 2021-08-17T05:20:19.000Z | dataset/__init__.py | SpatialPerceptionNeuralNetwork/SOA_DORN_TF | 33814467e9135036abf28f2da19c5984c8744089 | [
"Unlicense"
] | 6 | 2019-03-04T14:17:22.000Z | 2019-11-07T15:06:55.000Z | dataset/__init__.py | SpatialPerceptionNeuralNetwork/SOA_DORN_TF | 33814467e9135036abf28f2da19c5984c8744089 | [
"Unlicense"
] | 4 | 2019-02-17T07:39:47.000Z | 2019-08-13T17:13:23.000Z | from . import loader | 20 | 20 | 0.8 | 3 | 20 | 5.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 20 | 1 | 20 | 20 | 0.941176 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ac8cf468acd8580718478adf21d618edcb7fdaa0 | 47 | py | Python | enthought/pyface/image_list.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 3 | 2016-12-09T06:05:18.000Z | 2018-03-01T13:00:29.000Z | enthought/pyface/image_list.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 1 | 2020-12-02T00:51:32.000Z | 2020-12-02T08:48:55.000Z | enthought/pyface/image_list.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | null | null | null | # proxy module
from pyface.image_list import *
| 15.666667 | 31 | 0.787234 | 7 | 47 | 5.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148936 | 47 | 2 | 32 | 23.5 | 0.9 | 0.255319 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
ac945877033f84205e3de70a542dbe29f787db8c | 57 | py | Python | facilyst/models/neural_networks/__init__.py | ParthivNaresh/Facilyst | 786932b0afcf07cd300b2e6ce55ccf7f9e4c49d9 | [
"MIT"
] | null | null | null | facilyst/models/neural_networks/__init__.py | ParthivNaresh/Facilyst | 786932b0afcf07cd300b2e6ce55ccf7f9e4c49d9 | [
"MIT"
] | 3 | 2022-02-26T17:19:28.000Z | 2022-03-01T09:34:19.000Z | facilyst/models/neural_networks/__init__.py | ParthivNaresh/facilyst | 786932b0afcf07cd300b2e6ce55ccf7f9e4c49d9 | [
"MIT"
] | null | null | null | from .mlp_regressor import MultiLayerPerceptronRegressor
| 28.5 | 56 | 0.912281 | 5 | 57 | 10.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.070175 | 57 | 1 | 57 | 57 | 0.962264 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
acb5f166496e336c88cd5503528d39ce1cd15b35 | 46 | py | Python | __init__.py | espoirMur/balobi_nini | b68b9af4c84ec0f5b38ae8ba52d5f0d32b41ead3 | [
"Unlicense"
] | 1 | 2020-09-30T08:03:10.000Z | 2020-09-30T08:03:10.000Z | __init__.py | espoirMur/balobi_nini | b68b9af4c84ec0f5b38ae8ba52d5f0d32b41ead3 | [
"Unlicense"
] | 22 | 2020-09-23T14:05:33.000Z | 2021-12-04T22:40:41.000Z | __init__.py | espoirMur/balobi_nini | b68b9af4c84ec0f5b38ae8ba52d5f0d32b41ead3 | [
"Unlicense"
] | 1 | 2021-07-29T10:38:13.000Z | 2021-07-29T10:38:13.000Z | from dotenv import load_dotenv
load_dotenv() | 11.5 | 30 | 0.826087 | 7 | 46 | 5.142857 | 0.571429 | 0.555556 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 46 | 4 | 31 | 11.5 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 6 |
acf5dbd1951b079f48c88ceeb87413cb6432187d | 7,315 | py | Python | tests/test_io.py | jeffkinnison/florin | 94e76812e9fe27c86b2ce39313d07beb21c8b478 | [
"MIT"
] | 6 | 2019-06-03T19:11:05.000Z | 2021-01-13T06:35:43.000Z | tests/test_io.py | jeffkinnison/florin | 94e76812e9fe27c86b2ce39313d07beb21c8b478 | [
"MIT"
] | 4 | 2019-06-10T14:48:15.000Z | 2019-10-01T16:48:58.000Z | tests/test_io.py | jeffkinnison/florin | 94e76812e9fe27c86b2ce39313d07beb21c8b478 | [
"MIT"
] | 1 | 2019-09-25T17:57:23.000Z | 2019-09-25T17:57:23.000Z | import glob
import os
import h5py
import pytest
import numpy as np
from skimage.io import imread, imsave
from florin.io import load, load_image, load_images, load_npy, load_hdf5, \
    load_tiff, save, save_image, save_images, save_npy, \
    save_hdf5, save_tiff


@pytest.fixture(scope='module')
def load_setup(tmpdir_factory):
    """Set up a small test case for loading image data"""
    data = np.random.randint(0, high=256, size=(100, 300, 300), dtype=np.uint8)
    tmpdir = tmpdir_factory.mktemp('data')
    os.makedirs(os.path.join(str(tmpdir), 'png'), exist_ok=True)
    for i in range(data.shape[0]):
        fname = str(i).zfill(3) + '.png'
        imsave(os.path.join(str(tmpdir), 'png', fname), data[i])
    imsave(os.path.join(str(tmpdir), 'data.tif'), data, plugin='tifffile')
    imsave(os.path.join(str(tmpdir), 'data.tiff'), data, plugin='tifffile')
    with h5py.File(os.path.join(str(tmpdir), 'data.h5'), 'w') as f:
        f.create_dataset('stack', data=data)
        f.create_dataset('foo', data=data)
    np.save(os.path.join(str(tmpdir), 'data.npy'), data)
    return data, str(tmpdir)


@pytest.fixture(scope='module')
def save_setup():
    """Set up data to test save functions."""
    return np.random.randint(0, high=256, size=(100, 300, 300), dtype=np.uint8)


def test_load(load_setup):
    """Test that the load function works over all test filetypes."""
    data, tmpdir = load_setup
    loaded = load()(os.path.join(tmpdir, 'data.npy'))
    assert np.all(loaded == data)
    loaded = load()(os.path.join(tmpdir, 'data.h5'))
    assert isinstance(loaded, h5py.Dataset)
    assert np.all(loaded[:] == data)
    loaded = load()(os.path.join(tmpdir, 'data.h5'), key='foo')
    assert isinstance(loaded, h5py.Dataset)
    assert np.all(loaded[:] == data)
    loaded = load()(os.path.join(tmpdir, 'data.tif'))
    assert np.all(loaded == data)
    loaded = load()(os.path.join(tmpdir, 'data.tiff'))
    assert np.all(loaded == data)
    loaded = load()(os.path.join(tmpdir, 'png'))
    assert np.all(loaded == data)
    for i in range(data.shape[0]):
        fname = str(i).zfill(3) + '.png'
        loaded = load()(os.path.join(tmpdir, 'png', fname))
        assert np.all(loaded == data[i])
    with pytest.raises(FileNotFoundError):
        loaded = load()('/foo/bar.lksd')


def test_load_hdf5(load_setup):
    data, tmpdir = load_setup
    loaded = load_hdf5(os.path.join(tmpdir, 'data.h5'))
    assert isinstance(loaded, h5py.Dataset)
    assert np.all(loaded[:] == data)
    loaded = load_hdf5(os.path.join(tmpdir, 'data.h5'), key='foo')
    assert isinstance(loaded, h5py.Dataset)
    assert np.all(loaded[:] == data[:])


def test_load_image(load_setup):
    data, tmpdir = load_setup
    for i in range(data.shape[0]):
        fname = str(i).zfill(3) + '.png'
        loaded = load_image(os.path.join(tmpdir, 'png', fname))
        assert np.all(loaded == data[i])


def test_load_images(load_setup):
    data, tmpdir = load_setup
    loaded = load_images(os.path.join(tmpdir, 'png'))
    assert np.all(loaded == data)


def test_load_npy(load_setup):
    data, tmpdir = load_setup
    loaded = load_npy(os.path.join(tmpdir, 'data.npy'))
    assert np.all(loaded == data)


def test_load_tiff(load_setup):
    data, tmpdir = load_setup
    loaded = load_tiff(os.path.join(tmpdir, 'data.tif'))
    assert np.all(loaded == data)
    loaded = load_tiff(os.path.join(tmpdir, 'data.tiff'))
    assert np.all(loaded == data)


def test_save(save_setup, tmpdir):
    data = save_setup
    tmpdir = str(tmpdir)
    fpath = os.path.join(tmpdir, 'data.h5')
    save()(data, fpath)
    assert os.path.isfile(fpath)
    with h5py.File(fpath, 'r') as saved:
        assert 'stack' in saved
        assert np.all(saved['stack'][:] == data)
    save()(data, fpath, key='foo')
    assert os.path.isfile(fpath)
    with h5py.File(fpath, 'r') as saved:
        assert 'stack' in saved
        assert 'foo' in saved
        assert np.all(saved['stack'][:] == data)
    fpath = os.path.join(tmpdir, 'data.npy')
    save()(data, fpath)
    assert os.path.isfile(fpath)
    saved = np.load(fpath)
    assert np.all(saved == data)
    fpath = os.path.join(tmpdir, 'data.tif')
    save()(data, fpath)
    assert os.path.isfile(fpath)
    saved = imread(fpath)
    assert np.all(saved == data)
    fpath = os.path.join(tmpdir, 'data.tiff')
    save()(data, fpath)
    assert os.path.isfile(fpath)
    saved = imread(fpath)
    assert np.all(saved == data)
    fpath = os.path.join(tmpdir, 'png')
    save()(data, fpath)
    assert os.path.isdir(fpath)
    imgs = sorted(glob.glob(os.path.join(fpath, '*.png')))
    for i, img in enumerate(imgs):
        fname = '{}.png'.format(str(i).zfill(3))
        assert os.path.isfile(os.path.join(fpath, fname))
        assert os.path.join(fpath, fname) == img
        saved = imread(img)
        assert np.all(saved == data[i])
    for i in range(data.shape[0]):
        fname = '{}.png'.format(str(i).zfill(3))
        fpath = os.path.join(tmpdir, fname)
        save()(data[i], fpath)
        assert os.path.isfile(fpath)
        saved = imread(fpath)
        assert np.all(saved == data[i])


def test_save_hdf5(save_setup, tmpdir):
    data = save_setup
    tmpdir = str(tmpdir)
    fpath = os.path.join(tmpdir, 'data.h5')
    save_hdf5(data, fpath)
    assert os.path.isfile(fpath)
    with h5py.File(fpath, 'r') as saved:
        assert 'stack' in saved
        assert np.all(saved['stack'][:] == data)
    save_hdf5(data, fpath, key='foo')
    assert os.path.isfile(fpath)
    with h5py.File(fpath, 'r') as saved:
        assert 'stack' in saved
        assert 'foo' in saved
        assert np.all(saved['stack'][:] == data)


def test_save_image(save_setup, tmpdir):
    data = save_setup
    tmpdir = str(tmpdir)
    for i in range(data.shape[0]):
        fname = '{}.png'.format(str(i).zfill(3))
        fpath = os.path.join(tmpdir, fname)
        save_image(data[i], fpath)
        assert os.path.isfile(fpath)
        saved = imread(fpath)
        assert np.all(saved == data[i])


def test_save_images(save_setup, tmpdir):
    data = save_setup
    tmpdir = str(tmpdir)
    fpath = os.path.join(tmpdir, 'png')
    save_images(data, fpath)
    assert os.path.isdir(fpath)
    imgs = sorted(glob.glob(os.path.join(fpath, '*.png')))
    for i, img in enumerate(imgs):
        fname = '{}.png'.format(str(i).zfill(3))
        assert os.path.isfile(os.path.join(fpath, fname))
        assert os.path.join(fpath, fname) == img
        saved = imread(img)
        assert np.all(saved == data[i])


def test_save_npy(save_setup, tmpdir):
    data = save_setup
    tmpdir = str(tmpdir)
    fpath = os.path.join(tmpdir, 'data.npy')
    save_npy(data, fpath)
    assert os.path.isfile(fpath)
    saved = np.load(fpath)
    assert np.all(saved == data)


def test_save_tiff(save_setup, tmpdir):
    data = save_setup
    tmpdir = str(tmpdir)
    fpath = os.path.join(tmpdir, 'data.tif')
    save_tiff(data, fpath)
    assert os.path.isfile(fpath)
    saved = imread(fpath)
    assert np.all(saved == data)
    fpath = os.path.join(tmpdir, 'data.tiff')
    save_tiff(data, fpath)
    assert os.path.isfile(fpath)
    saved = imread(fpath)
    assert np.all(saved == data)
| 29.027778 | 79 | 0.62406 | 1,072 | 7,315 | 4.186567 | 0.096082 | 0.072193 | 0.08467 | 0.092692 | 0.831328 | 0.819296 | 0.784537 | 0.764706 | 0.709002 | 0.692068 | 0 | 0.011344 | 0.216678 | 7,315 | 251 | 80 | 29.143426 | 0.771902 | 0.019412 | 0 | 0.672131 | 0 | 0 | 0.0503 | 0 | 0 | 0 | 0 | 0 | 0.306011 | 1 | 0.076503 | false | 0 | 0.038251 | 0 | 0.125683 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
4a0a429a794d41a8db85bc52007405a14c503169 | 28 | py | Python | numpretty/__init__.py | obfuscatedgenerated/numpretty | d751cb967eab8f0009e4659347720d49b2771bf3 | [
"MIT"
] | 1 | 2022-02-17T22:06:26.000Z | 2022-02-17T22:06:26.000Z | numpretty/__init__.py | obfuscatedgenerated/numpretty | d751cb967eab8f0009e4659347720d49b2771bf3 | [
"MIT"
] | null | null | null | numpretty/__init__.py | obfuscatedgenerated/numpretty | d751cb967eab8f0009e4659347720d49b2771bf3 | [
"MIT"
] | null | null | null | from numpretty.main import * | 28 | 28 | 0.821429 | 4 | 28 | 5.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 28 | 1 | 28 | 28 | 0.92 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
4a17169b6d1d2a8deea1324f0d3da3d10ed64ef9 | 285 | py | Python | 1_beginner/chapter4/practice/money_check.py | code4tomorrow/Python | 035b6f5d8fd635a16caaff78bcd3f582663dadc3 | [
"MIT"
] | 4 | 2021-03-01T00:32:45.000Z | 2021-05-21T22:01:52.000Z | 1_beginner/chapter4/practice/money_check.py | code4tomorrow/Python | 035b6f5d8fd635a16caaff78bcd3f582663dadc3 | [
"MIT"
] | 29 | 2020-09-12T22:56:04.000Z | 2021-09-25T17:08:42.000Z | 1_beginner/chapter4/practice/money_check.py | code4tomorrow/Python | 035b6f5d8fd635a16caaff78bcd3f582663dadc3 | [
"MIT"
] | 7 | 2021-02-25T01:50:55.000Z | 2022-02-28T00:00:42.000Z | # Money Check
# Write a program that asks for a person's
# amount of money (floating point).
# If the person's amount of money is 0,
# print "Bankrupt". If not, print "Not Bankrupt"
# If the person's amount of money is
# greater than 1000.0, then print "Rich".
# Write your code here
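# A minimal sample solution for the exercise described above (the function
# name and structure are illustrative; the exercise itself only asks for a
# script using input() and print()):

```python
def money_check(amount: float) -> list:
    """Return the messages the exercise asks us to print for `amount`."""
    messages = []
    # 0 means bankrupt; anything else is not
    messages.append("Bankrupt" if amount == 0 else "Not Bankrupt")
    # additionally, more than 1000.0 counts as rich
    if amount > 1000.0:
        messages.append("Rich")
    return messages


# Interactive wiring, as the exercise describes:
# amount = float(input("Enter your amount of money: "))
# for line in money_check(amount):
#     print(line)
```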
| 28.5 | 48 | 0.712281 | 51 | 285 | 3.980392 | 0.568627 | 0.103448 | 0.192118 | 0.221675 | 0.364532 | 0.26601 | 0.26601 | 0.26601 | 0 | 0 | 0 | 0.026201 | 0.196491 | 285 | 9 | 49 | 31.666667 | 0.860262 | 0.936842 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0.111111 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
c57bcbdd6a2d5e52d54e97cc8c8122100921bb7f | 3,551 | py | Python | microbe_directory/comparisons/statistics.py | dcdanko/MD2 | c9f4e39e3275a1d333a86e819640f988757023fe | [
"MIT"
] | 11 | 2019-12-29T10:04:12.000Z | 2022-01-25T23:28:30.000Z | microbe_directory/comparisons/statistics.py | dcdanko/MD2 | c9f4e39e3275a1d333a86e819640f988757023fe | [
"MIT"
] | 2 | 2020-07-09T16:55:12.000Z | 2020-09-26T22:09:09.000Z | microbe_directory/comparisons/statistics.py | dcdanko/MD2 | c9f4e39e3275a1d333a86e819640f988757023fe | [
"MIT"
] | 2 | 2020-01-14T21:12:42.000Z | 2020-06-05T20:51:46.000Z |
import pandas as pd
import numpy as np
from scipy.stats import chisquare
from scipy import stats
from collections import defaultdict
from random import choices
def compare_categorical(value_being_compared, values_in_taxa_list_1, values_in_taxa_list_2):
stats1 = count_values(values_in_taxa_list_1, value_being_compared)
stats1 = pd.Series(stats1)
stats2 = count_values(values_in_taxa_list_2, value_being_compared)
stats2 = pd.Series(stats2)
a = chisquare(stats1, stats2)
return pd.Series({
'abundance_in': stats1,
'abundance_out': stats2,
'p-value': a.pvalue,
})
def compare_numeric(values_in_taxa_list_1, values_in_taxa_list_2):
"""Retun a Pandas Series with [abundance-in, abundance-out, p-value]."""
mean1 = values_in_taxa_list_1.mean()
mean2 = values_in_taxa_list_2.mean()
a = stats.ttest_ind(values_in_taxa_list_1, values_in_taxa_list_2, equal_var=False)
return pd.Series({
'abundance_in': mean1,
'abundance_out': mean2,
'p-value': a.pvalue,
})
def count_values(values, value_being_compared):
x = defaultdict(float)
for var in [True, False]:
x[var] = 1 / (1000 * 1000)
for var in values:
if var == value_being_compared:
x[True] += 1
else:
x[False] += 1
return x
def compare_categorical_abundances(value_being_compared, values_in_taxa_list_1, values_in_taxa_list_2):
stats1 = count_values_abundances(values_in_taxa_list_1, value_being_compared)
stats1 = pd.Series(stats1)
stats2 = count_values_abundances(values_in_taxa_list_2, value_being_compared)
stats2 = pd.Series(stats2)
keyslist1 = list(values_in_taxa_list_1.keys())
keyslist2 = list(values_in_taxa_list_2.keys())
valueslist1 = list(values_in_taxa_list_1.values())
valueslist2 = list(values_in_taxa_list_2.values())
tenthousand_samples1 = choices(keyslist1, valueslist1, k = 10**4)
tenthousand_samples2 = choices(keyslist2, valueslist2, k = 10**4)
a = stats.ks_2samp(tenthousand_samples1, tenthousand_samples2)
return pd.Series({
'abundance_in': stats1,
'abundance_out': stats2,
'p-value': a.pvalue,
})
def mean_ignore_nans(dictin):
num = 0
denom = 0
for key, val in dictin.items():
if not np.isnan(key):
num += key*val
denom += val
return num/denom if denom != 0 else 0

def compare_numeric_abundances(values_in_taxa_list_1, values_in_taxa_list_2):
    """Return a Pandas Series with [abundance-in, abundance-out, p-value]."""
    mean1 = mean_ignore_nans(values_in_taxa_list_1)
    mean2 = mean_ignore_nans(values_in_taxa_list_2)
    keyslist1 = list(values_in_taxa_list_1.keys())
    keyslist2 = list(values_in_taxa_list_2.keys())
    valueslist1 = list(values_in_taxa_list_1.values())
    valueslist2 = list(values_in_taxa_list_2.values())
    tenthousand_samples1 = choices(keyslist1, valueslist1, k=10**4)
    tenthousand_samples2 = choices(keyslist2, valueslist2, k=10**4)
    a = stats.ks_2samp(tenthousand_samples1, tenthousand_samples2)
    return pd.Series({
        'abundance_in': mean1,
        'abundance_out': mean2,
        'p-value': a.pvalue,
    })

def count_values_abundances(values, value_being_compared):
    x = defaultdict(float)
    for var in [True, False]:
        x[var] = 1 / (1000 * 1000)
    for var in values.keys():
        if var == value_being_compared:
            x[True] += values[var]
        else:
            x[False] += values[var]
    return x
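Both `compare_*_abundances` functions expand their abundance dicts into 10**4 weighted draws with `random.choices` before the two-sample KS test. A standalone sketch of that expansion, with a hypothetical weight dict:

```python
from random import choices, seed

seed(0)  # fixed seed for reproducibility
weights = {'low': 0.2, 'mid': 0.5, 'high': 0.3}
# Draw 10**4 weighted samples, mirroring how the abundance dicts are
# turned into sample lists for stats.ks_2samp above.
samples = choices(list(weights.keys()), list(weights.values()), k=10**4)
frac_mid = samples.count('mid') / len(samples)
# frac_mid should sit close to the 0.5 weight assigned to 'mid'
```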

# File: genre/models/__init__.py (wagnew3/Amodal-3D-Reconstruction-for-Robotic-Manipulationvia-Stability-and-Connectivity--Release, MIT)
import importlib


def get_model(alias, test=False):
    module = importlib.import_module('genre.models.' + alias)
    if test:
        return module.Model_test
    return module.Model
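`get_model` resolves a model class from a module chosen at runtime. The same `importlib.import_module` + attribute-lookup pattern can be demonstrated against the stdlib's `json` module, since the `genre.models.*` modules are not importable here:

```python
import importlib

def get_attr_from(module_name, attr):
    # Mirror of get_model: import a module by dotted name, then pull
    # an attribute off it (get_model hardcodes Model / Model_test).
    module = importlib.import_module(module_name)
    return getattr(module, attr)

dumps = get_attr_from('json', 'dumps')
out = dumps({'a': 1})  # the usual json.dumps, fetched dynamically
```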

# File: composite/tests/urls.py (takeplace/django-composite, Apache-2.0 / BSD-3-Clause)
from django.test import TestCase


class CollectionTests(TestCase):
    pass

# File: binapy/hashing/__init__.py (guillp/binapy, MIT)
from .sha import *

# File: CMake/otbTestNumpy.py (xcorail/OTB, Apache-2.0)
import numpy
print(numpy.get_include())

# File: app/poll/views.py (sargentfrancesca/newlynpolls, MIT)
] | null | null | null | from flask import Flask, render_template, redirect, url_for, abort, flash, request,\
current_app, make_response, jsonify
from flask.ext.login import login_required, current_user
from flask.ext.sqlalchemy import get_debug_queries
from . import poll
from .forms import PostForm, DrawForm
from .. import db
from ..models import Permission, Role, User, Post, Comment, Event, Prompt, PromptEvent
from ..decorators import admin_required, permission_required
from sqlalchemy.sql.expression import func, select
import base64
import os, json
@poll.route('/', methods=['GET', 'POST'])
def home():
user = User.query.filter_by(username="mediumra_re").first()
event = Event.query.filter_by(name="The General Opinions").first()
form = PostForm(user=user)
print form.prompts.data
if form.validate_on_submit():
post = Post(user=user)
post.name = form.name.data
post.age = form.age.data
post.gender = form.gender.data
post.body = form.body.data
post.passion = form.passion.data
post.event = event
post.platform = request.user_agent.platform
post.browser = request.user_agent.browser
post.prompt = Prompt.query.filter_by(id=form.prompts.data).first()
post.user = user
db.session.add(post)
db.session.commit()
flash('Submitted')
return redirect(url_for('poll.home'))
return redirect(url_for('poll.cheers_user', user=user.username, event=post.event.name_slug))
return render_template('poll/poll.html', form=form, user=user, event=event, draw=False, return_to="/poll")


@poll.route('/draw', methods=['GET', 'POST'])
def draw():
    user = User.query.filter_by(username="mediumra_re").first()
    event = Event.query.filter_by(name="The General Opinions").first()
    form = PostForm(user=user)
    print form.prompts.data
    if form.validate_on_submit():
        post = Post(user=user)
        post.name = form.name.data
        post.age = form.age.data
        post.gender = form.gender.data
        post.body = form.body.data
        post.passion = form.passion.data
        post.event = event
        post.platform = request.user_agent.platform
        post.browser = request.user_agent.browser
        post.prompt = Prompt.query.filter_by(id=form.prompts.data).first()
        post.user = user
        if form.image_uri.data > 0:
            post.image_uri = form.image_uri.data
            details_concat = ('-').join([str(form.age.data), form.gender.data, form.passion.data])
            details_space = details_concat.split(' ')
            deets = ('-').join(details_space).lower()
            image_file = post.event.name_slug + deets
            post.image_file = image_file
            data_full = form.image_uri.data
            data = data_full.split(',')[1]
            fh = open("data/images/" + image_file + ".png", "wb")
            fh.write(data.decode('base64'))
            fh.close()
        db.session.add(post)
        db.session.commit()
        flash('Submitted')
        return redirect(url_for('poll.home'))
    return render_template('poll/poll.html', form=form, user=user, event=event, draw=True, return_to="/poll/draw")
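The filename built inside `draw()` chains a hyphen-join, a space-split, and a lowercase; extracted as a pure function (with hypothetical inputs) the transformation is easier to see:

```python
def details_slug(age, gender, passion):
    # Same chain as draw(): hyphen-join the fields, then replace any
    # remaining spaces with hyphens and lowercase the result.
    details_concat = ('-').join([str(age), gender, passion])
    details_space = details_concat.split(' ')
    return ('-').join(details_space).lower()

slug = details_slug(34, 'Non Binary', 'Sea Swimming')
# '34-non-binary-sea-swimming'
```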


@poll.route('/<username>/draw', methods=['GET', 'POST'])
def draw_user(username):
    user = User.query.filter_by(username=username).first()
    form = PostForm(user=user)
    event = Event.get_current(user=user)
    print form.prompts.data
    if form.validate_on_submit():
        post = Post(user=user)
        post.name = form.name.data
        post.age = form.age.data
        post.gender = form.gender.data
        post.body = form.body.data
        post.passion = form.passion.data
        post.event = event
        post.platform = request.user_agent.platform
        post.browser = request.user_agent.browser
        post.prompt = Prompt.query.filter_by(id=form.prompts.data).first()
        post.user = user
        post.image_uri = form.image_uri.data
        details_concat = ('-').join([str(form.age.data), form.gender.data, form.passion.data])
        details_space = details_concat.split(' ')
        deets = ('-').join(details_space).lower()
        image_file = post.event.name_slug + deets
        app = Flask(__name__)
        APP_ROOT = os.path.dirname(os.path.abspath(__file__))
        UPLOAD_FOLDER = os.path.join(APP_ROOT, '/var/www/html/newlynpolls/app/static/data/images')
        app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER
        post.image_file = image_file
        data_full = form.image_uri.data
        try:
            data = data_full.split(',')[1]
        except:
            pass
        else:
            fh = open((os.path.join(app.config['UPLOAD_FOLDER'], image_file + '.png')), "wb")
            fh.write(data.decode('base64'))
            fh.close()
        db.session.add(post)
        db.session.commit()
        flash('Submitted')
        return redirect(url_for('poll.cheers_user', user=user.username, event=post.event.name_slug))
    return render_template('poll/poll.html', form=form, user=user, event=event, draw=True, return_to="/poll/" + user.username + "/draw")


@poll.route('/<username>', methods=['GET', 'POST'])
def vote(username):
    user = User.query.filter_by(username=username).first()
    form = PostForm(user=user)
    event = Event.get_current(user=user)
    print form.prompts.data
    if form.validate_on_submit():
        post = Post(user=user)
        post.name = form.name.data
        post.age = form.age.data
        post.gender = form.gender.data
        post.body = form.body.data
        post.passion = form.passion.data
        post.event = event
        post.platform = request.user_agent.platform
        post.browser = request.user_agent.browser
        print "Prompt", Prompt.query.filter_by(id=form.prompts.data).first()
        post.prompt = Prompt.query.filter_by(id=form.prompts.data).first()
        post.user = user
        db.session.add(post)
        db.session.commit()
        flash('Submitted')
        return redirect(url_for('poll.cheers_user', user=user.username, event=post.event.name_slug))
    return render_template('poll/poll.html', form=form, user=user, event=event, draw=False, return_to="/poll/" + user.username)


@poll.route('/cheers/<user>/<event>')
def cheers_user(user, event):
    user = User.query.filter_by(username=user).first()
    event = Event.get_current(user)
    posts = Post.query.filter_by(event=event).order_by(Post.id.desc()).all()
    if request.referrer is not None:
        return_to = request.referrer
    else:
        return_to = request.environ['PATH_INFO']
    try:
        if request.cookies['presentation_mode']:
            return render_template('poll/thanks.html', type="presentation", posts=posts, return_to=return_to)
    except KeyError:
        return render_template('poll/thanks.html', type="basic", posts=posts, return_to=return_to)


@poll.route('/cheers/<user>/<event>/archive')
def cheers_user_archive(user, event):
    user = User.query.filter_by(username=user).first()
    event = Event.query.filter_by(name_slug=event).first()
    posts = Post.query.filter_by(event=event).order_by(Post.id.desc()).all()
    try:
        if request.cookies['presentation_mode']:
            return render_template('poll/thanks.html', type="presentation", posts=posts)
    except KeyError:
        return render_template('poll/thanks.html', type="basic", posts=posts)


@poll.route('/cheers')
def cheers():
    try:
        if request.cookies['presentation_mode']:
            return render_template('poll/thanks.html', type="presentation")
    except KeyError:
        return render_template('poll/thanks.html', type="basic")


@poll.route('/random/all')
def randompoll():
    random_opinion = Post.query.order_by(func.rand()).first()
    request.randomtype = 'all'
    return render_template('poll/random.html', post=random_opinion)


@poll.route('/randomajax')
def randomajax():
    random_opinion = [Post.query.order_by(func.rand()).first()]
    opinions = [(r.body, r.prompt.text) for r in random_opinion]
    return jsonify(opinions=opinions)


@poll.route('/randomeventajax/<int:id>')
def randomeventajax(id):
    event = Event.query.get_or_404(id)
    random_opinion = [Post.query.filter_by(event=event).order_by(func.rand()).first()]
    opinions = [(r.body, r.prompt.text) for r in random_opinion]
    return jsonify(opinions=opinions)


@poll.route('/randomtodayajax/<username>')
def randomtodayajax(username):
    user = User.query.filter_by(username=username).first()
    event = Event.get_current(user)
    random_opinion = [Post.query.filter_by(event=event).order_by(func.rand()).first()]
    opinions = [(r.body, r.prompt.text, r.image_file) for r in random_opinion]
    return jsonify(opinions=opinions)


@poll.route('/random/event/<int:id>')
def randomeventpoll(id):
    event = Event.query.get_or_404(id)
    random_opinion = Post.query.filter_by(event=event).order_by(func.rand()).first()
    request.randomtype = 'individual'
    return render_template('poll/random.html', post=random_opinion)


@poll.route('/random/today/<username>')
def randomtoday(username):
    user = User.query.filter_by(username=username).first()
    event = Event.get_current(user)
    posts = Post.query.filter_by(event=event).order_by(func.rand()).first()
    request.randomtype = 'individual-today'
    return render_template('poll/random.html', post=posts)


@poll.route('/all')
def opinions():
    posts = Post.query.order_by(Post.id.desc()).all()
    return render_template('poll/plain_posts.html', posts=posts)


@poll.route('/all/<username>')
def opinions_user(username):
    user = User.query.filter_by(username=username).first()
    posts = Post.query.filter_by(user=user).order_by(Post.id.desc()).all()
    return render_template('poll/plain_posts.html', posts=posts)


@poll.route('/<username>/<event>')
def opinions_user_event(username, event):
    user = User.query.filter_by(username=username).first()
    event = Event.query.filter_by(user=user, name_slug=event).first()
    posts = Post.query.filter_by(event=event).order_by(Post.id.desc()).all()
    return render_template('poll/plain_posts.html', posts=posts)

# event = Event.query.get(11)
# print event.get_slug()


@poll.route('/today/<username>')
def todays_opinions(username):
    user = User.query.filter_by(username=username).first()
    event = Event.get_current(user)
    posts = Post.query.filter_by(event=event).order_by(Post.id.desc()).all()
    return render_template('poll/plain_posts.html', posts=posts)


@poll.route('/ajax/vote', methods=['POST'])
def vote_ajax():
    value = request.form['value']
    post_id = request.form['post_id']
    post = Post.query.get_or_404(post_id)
    yay = post.yay
    nay = post.nay
    if value == 'yay':
        post.yay = yay + 1
    else:
        post.nay = nay + 1
    db.session.add(post)
    db.session.commit()
    return json.dumps({'status': 'OK', 'yay': post.yay, 'nay': post.nay, 'id': post.id})
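Stripped of the ORM, the branch in `vote_ajax` is a plain tally update; a dict-based sketch:

```python
def apply_vote(post, value):
    # Any value other than 'yay' counts as a 'nay', matching the
    # else-branch of the vote endpoint above.
    if value == 'yay':
        post['yay'] = post['yay'] + 1
    else:
        post['nay'] = post['nay'] + 1
    return post

tally = apply_vote({'yay': 0, 'nay': 0}, 'yay')
```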

# File: cui_p3p.py (userElaina/rsa-p2p, Apache-2.0 / MIT)
from rsap2p import TCPp2p_CUI
TCPp2p_CUI(('0.0.0.0', 23303), 'user3', '127.0.0.1').joins()

# File: nim/agent.py (burnpiro/pong-deep-q-learning, MIT)
class Agent:
    @staticmethod
    def select_move(board, turn):
        pass

    def __str__(self):
        return self.__class__.__name__[0:-5]
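The `[0:-5]` slice in `__str__` strips the trailing "Agent" (five characters) from the concrete class's name; a quick check with a hypothetical `RandomAgent` subclass:

```python
class Agent:
    def __str__(self):
        # Drop the 5-character 'Agent' suffix from the class name.
        return self.__class__.__name__[0:-5]

class RandomAgent(Agent):
    pass

name = str(RandomAgent())  # 'Random'
```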

# File: backend/home/admin.py (crowdbotics-apps/test-001-32044, FTL / AML / RSA-MD)
from django.contrib import admin
from .models import Airplane, Book, Car, Movie
admin.site.register(Car)
admin.site.register(Movie)
admin.site.register(Book)
admin.site.register(Airplane)
# Register your models here.

# File: utils/email.py (City-of-Helsinki/berth-reservations, MIT)
from typing import Optional


def is_valid_email(email: Optional[str]) -> bool:
    # bool() keeps the return type honest when email is None or empty
    return bool(email) and "@example.com" not in email
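A few spot checks of the guard above (the function is restated so the snippet is self-contained; the original relies on truthiness, so `bool()` is added here to match the annotation, and the address is an arbitrary example):

```python
from typing import Optional

def is_valid_email(email: Optional[str]) -> bool:
    # Non-empty and not on the example.com placeholder domain.
    return bool(email) and "@example.com" not in email

ok = is_valid_email("someone@hel.fi")
```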

# File: tests/test_nfldbc.py (strandx/nflpredict, MIT)
from context import nfldbc


# Test database connection
def test_dbc():
    assert nfldbc.dbc is not None

# File: boucanpy/core/dns_record/__init__.py (bbhunter/boucanpy, MIT)
from .repos import DnsRecordRepo
from .data import DnsRecordData
from .responses import DnsRecordResponse, DnsRecordsResponse, DnsRecordsDigResponse
from .forms import DnsRecordCreateForm, DnsRecordForZoneCreateForm, DnsRecordStr

# File: bindings/python/tests/ConcurrentHelpers/__init__.py (johnbeckettn2e/sysrepo, Apache-2.0)
from .Tester import *
from .TestManager import *
from .SysrepoTester import *

# File: frarch/__init__.py (victorbadenas/frarch, Apache-2.0)
from . import datasets
from . import models
from . import modules
from . import train
from . import utils
from .parser import parse_arguments
| 20.285714 | 35 | 0.788732 | 20 | 142 | 5.55 | 0.5 | 0.45045 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.169014 | 142 | 6 | 36 | 23.666667 | 0.940678 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8c2d9d1b52a69694777061b80a24a22aeac9d7dc | 20 | py | Python | gaiachallenge/potential/__init__.py | adrn/GaiaChallenge | c9d76a28f4a6b5c47c2d1c17bb77ab24214050ca | [
"MIT"
] | null | null | null | gaiachallenge/potential/__init__.py | adrn/GaiaChallenge | c9d76a28f4a6b5c47c2d1c17bb77ab24214050ca | [
"MIT"
] | null | null | null | gaiachallenge/potential/__init__.py | adrn/GaiaChallenge | c9d76a28f4a6b5c47c2d1c17bb77ab24214050ca | [
"MIT"
] | null | null | null | from .pal5 import *

# File: dotbot/context.py (benperiton/dotbot, MIT)
import copy
import os


class Context(object):
    '''
    Contextual data and information for plugins.
    '''

    def __init__(self, base_directory):
        self._base_directory = base_directory
        self._defaults = {}

    def set_base_directory(self, base_directory):
        self._base_directory = base_directory

    def base_directory(self, canonical_path=True):
        base_directory = self._base_directory
        if canonical_path:
            base_directory = os.path.realpath(base_directory)
        return base_directory

    def set_defaults(self, defaults):
        self._defaults = defaults

    def defaults(self):
        return copy.deepcopy(self._defaults)
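A short usage sketch of the plugin context above (class restated minimally so it runs standalone): `defaults()` hands back a deep copy, so callers cannot mutate the stored defaults.

```python
import copy
import os

class Context(object):
    '''Minimal restatement of the plugin context for illustration.'''

    def __init__(self, base_directory):
        self._base_directory = base_directory
        self._defaults = {}

    def base_directory(self, canonical_path=True):
        base_directory = self._base_directory
        if canonical_path:
            base_directory = os.path.realpath(base_directory)
        return base_directory

    def set_defaults(self, defaults):
        self._defaults = defaults

    def defaults(self):
        return copy.deepcopy(self._defaults)

ctx = Context('.')
ctx.set_defaults({'link': {'create': True}})
snapshot = ctx.defaults()
snapshot['link']['create'] = False  # deep copy: stored defaults untouched
```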

# File: SMSGlobalAPI/__init__.py (pr8kerl/sns2smsglobal, MIT)
from wrapper import Wrapper

# File: src/spyd/authentication/exceptions.py (fdChasm/spyd, Zlib)
class AuthFailedException(Exception):
    pass
4fe2b3139e68f9c9004d147ba2d528b75651d323 | 15,014 | py | Python | src/fastjet/_singleevent.py | scikit-hep/fastjet | e5aebdc66167400472cd29a36f4af0f2a789a992 | [
"BSD-3-Clause"
] | 8 | 2021-04-18T07:00:19.000Z | 2022-02-16T16:07:21.000Z | src/fastjet/_singleevent.py | scikit-hep/fastjet | e5aebdc66167400472cd29a36f4af0f2a789a992 | [
"BSD-3-Clause"
] | 25 | 2021-04-15T16:35:33.000Z | 2022-03-31T20:38:57.000Z | src/fastjet/_singleevent.py | scikit-hep/fastjet | e5aebdc66167400472cd29a36f4af0f2a789a992 | [
"BSD-3-Clause"
] | 3 | 2021-07-26T23:12:36.000Z | 2021-08-24T14:57:27.000Z | import awkward as ak
import numpy as np
import fastjet._ext # noqa: F401, E402
class _classsingleevent:
    def __init__(self, data, jetdef):
        self.jetdef = jetdef
        self.data = self.single_to_jagged(data)
        px, py, pz, E, offsets = self.extract_cons(self.data)
        px = self.correct_byteorder(px)
        py = self.correct_byteorder(py)
        pz = self.correct_byteorder(pz)
        E = self.correct_byteorder(E)
        offsets = self.correct_byteorder(offsets)
        self._results = fastjet._ext.interfacemulti(px, py, pz, E, offsets, jetdef)

    def correct_byteorder(self, data):
        if data.dtype.byteorder == "=":
            pass
        else:
            # astype actually byteswaps the buffer; assigning the dtype
            # alone would not convert the array
            data = data.astype(data.dtype.newbyteorder("="))
        return data

    def check_jaggedness(self, data):
        if isinstance(data.layout, ak.layout.ListOffsetArray64):
            return 1 + self.check_jaggedness(ak.Array(data.layout.content))
        else:
            return 0
    def extract_cons(self, array):
        px = np.asarray(ak.Array(array.layout.content, behavior=array.behavior).px)
        py = np.asarray(ak.Array(array.layout.content, behavior=array.behavior).py)
        pz = np.asarray(ak.Array(array.layout.content, behavior=array.behavior).pz)
        E = np.asarray(ak.Array(array.layout.content, behavior=array.behavior).E)
        off = np.asarray(array.layout.stops)
        off = np.insert(off, 0, 0)
        return px, py, pz, E, off

    def _check_record(self, data):
        out = isinstance(
            data.layout,
            (
                ak.layout.RecordArray,
                ak.layout.NumpyArray,
            ),
        )
        return out

    def single_to_jagged(self, array):
        single = ak.Array(
            ak.layout.ListOffsetArray64(
                ak.layout.Index64(np.array([0, len(array)])),
                ak.layout.RecordArray(
                    [
                        ak.layout.NumpyArray(array.px),
                        ak.layout.NumpyArray(array.py),
                        ak.layout.NumpyArray(array.pz),
                        ak.layout.NumpyArray(array.E),
                    ],
                    ["px", "py", "pz", "E"],
                    parameters={"__record__": "Momentum4D"},
                ),
            )
        )
        return single
    def inclusive_jets(self, min_pt):
        np_results = self._results.to_numpy(min_pt)
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )

    def unclustered_particles(self):
        np_results = self._results.to_numpy_unclustered_particles()
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )
    def exclusive_jets(self, n_jets, dcut):
        # Use None as the "not computed" sentinel; comparing a NumPy result
        # against the integer 0 is ambiguous.
        np_results = None
        if n_jets == 0:
            raise ValueError("Njets cannot be 0") from None
        if dcut == -1 and n_jets != -1:
            np_results = self._results.to_numpy_exclusive_njet(n_jets)
        if n_jets == -1 and dcut != -1:
            np_results = self._results.to_numpy_exclusive_dcut(dcut)
        if np_results is None:
            raise ValueError("Either Dcut or Njets should be entered") from None
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )
    def exclusive_jets_ycut(self, ycut):
        np_results = self._results.to_numpy_exclusive_ycut(ycut)
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )

    def constituent_index(self, min_pt):
        np_results = self._results.to_numpy_with_constituents(min_pt)
        off = np.insert(np_results[-1], 0, 0)
        out = ak.Array(
            ak.layout.ListOffsetArray64(
                ak.layout.Index64(np_results[0]), ak.layout.NumpyArray(np_results[1])
            )
        )
        out = ak.Array(ak.layout.ListOffsetArray64(ak.layout.Index64(off), out.layout))
        return out[0]

    def unique_history_order(self):
        np_results = self._results.to_numpy_unique_history_order()
        return ak.Array(ak.layout.NumpyArray(np_results[0]))
    def constituents(self, min_pt):
        np_results = self._results.to_numpy_with_constituents(min_pt)
        off = np.insert(np_results[-1], 0, 0)
        out = ak.Array(
            ak.layout.ListOffsetArray64(
                ak.layout.Index64(np_results[0]), ak.layout.NumpyArray(np_results[1])
            )
        )
        outputs_to_inputs = ak.Array(
            ak.layout.ListOffsetArray64(ak.layout.Index64(off), out.layout)
        )
        shape = ak.num(outputs_to_inputs)
        total = np.sum(shape)
        duplicate = ak.unflatten(np.zeros(total, np.int64), shape)
        prepared = self.data[:, np.newaxis][duplicate]
        return prepared[outputs_to_inputs][0]
    def exclusive_dmerge(self, njets):
        np_results = self._results.to_numpy_exclusive_dmerge(njets)
        return np_results[0][0]

    def exclusive_dmerge_max(self, njets):
        np_results = self._results.to_numpy_exclusive_dmerge_max(njets)
        return np_results[0][0]

    def exclusive_ymerge_max(self, njets):
        np_results = self._results.to_numpy_exclusive_ymerge_max(njets)
        return np_results[0][0]

    def exclusive_ymerge(self, njets):
        np_results = self._results.to_numpy_exclusive_ymerge(njets)
        return np_results[0][0]

    def Q(self):
        np_results = self._results.to_numpy_q()
        return np_results[0][0]

    def Q2(self):
        np_results = self._results.to_numpy_q2()
        return np_results[0][0]
    def exclusive_subjets(self, data, dcut, nsub):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = None
        if nsub == 0:
            raise ValueError("Nsub cannot be 0")
        if dcut == -1 and nsub != -1:
            np_results = self._results.to_numpy_exclusive_subjets_nsub(
                px, py, pz, E, nsub
            )
        if nsub == -1 and dcut != -1:
            np_results = self._results.to_numpy_exclusive_subjets_dcut(
                px, py, pz, E, dcut
            )
        if np_results is None:
            raise ValueError("Either dcut or nsub should be entered") from None
        return ak.Array(
            ak.layout.RecordArray(
                [
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ],
                ["px", "py", "pz", "E"],
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )
    def exclusive_subjets_up_to(self, data, nsub):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_exclusive_subjets_up_to(px, py, pz, E, nsub)
        return ak.Array(
            ak.layout.RecordArray(
                [
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ],
                ["px", "py", "pz", "E"],
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )
    def exclusive_subdmerge(self, data, nsub):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_exclusive_subdmerge(px, py, pz, E, nsub)
        return np_results[0][0]

    def exclusive_subdmerge_max(self, data, nsub):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_exclusive_subdmerge_max(px, py, pz, E, nsub)
        return np_results[0][0]

    def n_exclusive_subjets(self, data, dcut):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_n_exclusive_subjets(px, py, pz, E, dcut)
        return np_results[0][0]

    def has_parents(self, data):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_has_parents(px, py, pz, E)
        return np_results[0][0]

    def has_child(self, data):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_has_child(px, py, pz, E)
        return np_results[0][0]

    def jet_scale_for_algorithm(self, data):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_jet_scale_for_algorithm(px, py, pz, E)
        return np_results[0][0]

    def n_particles(self):
        np_results = self._results.to_numpy_n_particles()
        return np_results[0][0]

    def n_exclusive_jets(self, dcut):
        np_results = self._results.to_numpy_n_exclusive_jets(dcut)
        return np_results[0][0]
    def childless_pseudojets(self):
        np_results = self._results.to_numpy_childless_pseudojets()
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )

    def jets(self):
        np_results = self._results.to_numpy_jets()
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )
    def get_parents(self, data):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_get_parents(px, py, pz, E)
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )

    def get_child(self, data):
        try:
            px = data.px
            py = data.py
            pz = data.pz
            E = data.E
        except AttributeError:
            raise AttributeError("Lorentz vector not found") from None
        np_results = self._results.to_numpy_get_child(px, py, pz, E)
        return ak.Array(
            ak.layout.RecordArray(
                (
                    ak.layout.NumpyArray(np_results[0]),
                    ak.layout.NumpyArray(np_results[1]),
                    ak.layout.NumpyArray(np_results[2]),
                    ak.layout.NumpyArray(np_results[3]),
                ),
                ("px", "py", "pz", "E"),
                parameters={"__record__": "Momentum4D"},
            ),
            behavior=self.data.behavior,
        )
| 34.514943 | 88 | 0.52551 | 1,681 | 15,014 | 4.493754 | 0.077335 | 0.111994 | 0.114376 | 0.113847 | 0.80421 | 0.779852 | 0.771247 | 0.735902 | 0.713265 | 0.694864 | 0 | 0.015225 | 0.365659 | 15,014 | 434 | 89 | 34.59447 | 0.777929 | 0.001066 | 0 | 0.62406 | 0 | 0 | 0.043212 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.082707 | false | 0.002506 | 0.007519 | 0 | 0.175439 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8b1f1874f92b99eec66717fc9e41a475f02be9ea | 220 | py | Python | tests/test_api.py | FlorianPignol/telluric | 285c7e195b2da630b4c76f2552465a424bcdaeb2 | [
"MIT"
] | 81 | 2018-04-12T12:29:06.000Z | 2022-03-17T09:41:55.000Z | tests/test_api.py | FlorianPignol/telluric | 285c7e195b2da630b4c76f2552465a424bcdaeb2 | [
"MIT"
] | 283 | 2018-04-09T11:32:25.000Z | 2022-03-25T22:16:38.000Z | tests/test_api.py | FlorianPignol/telluric | 285c7e195b2da630b4c76f2552465a424bcdaeb2 | [
"MIT"
] | 22 | 2018-04-09T10:53:52.000Z | 2022-02-09T10:38:33.000Z | import telluric


def test_api_exposed():
    assert hasattr(telluric, "GeoVector")
    assert hasattr(telluric, "GeoRaster2")
    assert hasattr(telluric, "GeoFeature")
    assert hasattr(telluric, "FeatureCollection")
| 24.444444 | 49 | 0.740909 | 22 | 220 | 7.318182 | 0.545455 | 0.322981 | 0.521739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005376 | 0.154545 | 220 | 8 | 50 | 27.5 | 0.860215 | 0 | 0 | 0 | 0 | 0 | 0.209091 | 0 | 0 | 0 | 0 | 0 | 0.666667 | 1 | 0.166667 | true | 0 | 0.166667 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |
8b23577718543254c33b75e313149655c700b1f2 | 33 | py | Python | code/display/__init__.py | FrederikWR/course-02443-stochastic-virus-outbreak | 4f1d7f1fa4aa197b31ed86c4daf420d5a637974e | [
"MIT"
] | null | null | null | code/display/__init__.py | FrederikWR/course-02443-stochastic-virus-outbreak | 4f1d7f1fa4aa197b31ed86c4daf420d5a637974e | [
"MIT"
] | null | null | null | code/display/__init__.py | FrederikWR/course-02443-stochastic-virus-outbreak | 4f1d7f1fa4aa197b31ed86c4daf420d5a637974e | [
"MIT"
] | null | null | null |
from .world_map import WorldMap
| 11 | 31 | 0.818182 | 5 | 33 | 5.2 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151515 | 33 | 2 | 32 | 16.5 | 0.928571 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8b2ff9678bd8f29e377c8eb6829f8ec1d0dfa4bf | 29 | py | Python | examples/cfd/__init__.py | BrunoMot/devito | b6e077857765b7b5fad812ec5774635ca4c6fbb7 | [
"MIT"
] | 204 | 2020-01-09T11:27:58.000Z | 2022-03-20T22:53:37.000Z | examples/cfd/__init__.py | BrunoMot/devito | b6e077857765b7b5fad812ec5774635ca4c6fbb7 | [
"MIT"
] | 949 | 2016-04-25T11:41:34.000Z | 2019-12-27T10:43:40.000Z | examples/cfd/__init__.py | BrunoMot/devito | b6e077857765b7b5fad812ec5774635ca4c6fbb7 | [
"MIT"
] | 131 | 2020-01-08T17:43:13.000Z | 2022-03-27T11:36:47.000Z | from .tools import * # noqa
| 14.5 | 28 | 0.655172 | 4 | 29 | 4.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.241379 | 29 | 1 | 29 | 29 | 0.863636 | 0.137931 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 6 |
8c8a96283de1d287d330b88dc1f07ee2c3af8abe | 13,462 | py | Python | idl2py/wcs/observatory.py | RapidLzj/idl2py | 193051cd8d01db0d125b8975713b885ad521a992 | [
"MIT"
] | null | null | null | idl2py/wcs/observatory.py | RapidLzj/idl2py | 193051cd8d01db0d125b8975713b885ad521a992 | [
"MIT"
] | null | null | null | idl2py/wcs/observatory.py | RapidLzj/idl2py | 193051cd8d01db0d125b8975713b885ad521a992 | [
"MIT"
] | null | null | null | """
By Dr Jie Zheng -Q, NAOC
v1 2019-04-27
"""
import numpy as np
from ..util import *


def observatory():
    pass
#pro observatory,obsname,obs_struct, print = print
#;+
#; NAME:
#; OBSERVATORY
#; PURPOSE:
#; Return longitude, latitude, altitude & time zones of an observatory
#; EXPLANATION:
#; Given an observatory name, returns a structure giving the longitude,
#; latitude, altitude, and time zone
#;
#; CALLING SEQUENCE:
#; Observatory, obsname, obs_struct, [ /PRINT ]
#;
#; INPUTS:
#; obsname - scalar or vector string giving abbreviated name(s) of
#; observatories for which location or time information is requested.
#; If obsname is an empty string, then information is returned for
#; all observatories in the database. See the NOTES: section
#; for the list of 41 recognized observatories. The case of the
#; string does not matter
#; OUTPUTS:
#; obs_struct - an IDL structure containing information on the specified
#; observatories. The structure tags are as follows:
#; .observatory - abbreviated observatory name
#; .name - full observatory name
#; .longitude - observatory longitude in degrees *west*
#; .latitude - observatory latitude in degrees
#; .altitude - observatory altitude in meters above sea level
#; .tz - time zone, number of hours *west* of Greenwich
#;
#; OPTIONAL INPUT KEYWORD:
#; /PRINT - If this keyword is set, (or if only 1 parameter is supplied)
#; then OBSERVATORY will display information about the specified
#; observatories at the terminal
#; EXAMPLE:
#; Get the latitude, longitude and altitude of Kitt Peak National Observatory
#;
#; IDL> observatory,'kpno',obs
#; IDL> print,obs.longitude ==> 111.6 degrees west
#; IDL> print,obs.latitude ==> +31.9633 degrees
#; IDL> print,obs.altitude ==> 2120 meters above sea level
#;
#; NOTES:
#; Observatory information is taken from noao$lib/obsdb.dat file in IRAF 2.11
#; Currently recognized observatory names are as follows:
#;
#; 'kpno': Kitt Peak National Observatory
#; 'ctio': Cerro Tololo Interamerican Observatory
#; 'eso': European Southern Observatory
#; 'lick': Lick Observatory
#; 'mmto': MMT Observatory
#; 'cfht': Canada-France-Hawaii Telescope
#; 'lapalma': Roque de los Muchachos, La Palma
#; 'mso': Mt. Stromlo Observatory
#; 'sso': Siding Spring Observatory
#; 'aao': Anglo-Australian Observatory
#; 'mcdonald': McDonald Observatory
#; 'lco': Las Campanas Observatory
#; 'mtbigelow': Catalina Observatory: 61 inch telescope
#; 'dao': Dominion Astrophysical Observatory
#; 'spm': Observatorio Astronomico Nacional, San Pedro Martir
#; 'tona': Observatorio Astronomico Nacional, Tonantzintla
#; 'Palomar': The Hale Telescope
#; 'mdm': Michigan-Dartmouth-MIT Observatory
#; 'NOV': National Observatory of Venezuela
#; 'bmo': Black Moshannon Observatory
#; 'BAO': Beijing XingLong Observatory
#; 'keck': W. M. Keck Observatory
#; 'ekar': Mt. Ekar 182 cm. Telescope
#; 'loiano': Bologna Astronomical Observatory, Loiano - Italy
#; 'apo': Apache Point Observatory
#; 'lowell': Lowell Observatory
#; 'vbo': Vainu Bappu Observatory
#; 'flwo': Whipple Observatory
#; 'oro': Oak Ridge Observatory
#; 'lna': Laboratorio Nacional de Astrofisica - Brazil
#; 'saao': South African Astronomical Observatory
#; 'casleo': Complejo Astronomico El Leoncito, San Juan
#; 'bosque': Estacion Astrofisica Bosque Alegre, Cordoba
#; 'rozhen': National Astronomical Observatory Rozhen - Bulgaria
#; 'irtf': NASA Infrared Telescope Facility
#; 'bgsuo': Bowling Green State Univ Observatory
#; 'ca': Calar Alto Observatory
#; 'holi': Observatorium Hoher List (Universitaet Bonn) - Germany
#; 'lmo': Leander McCormick Observatory
#; 'fmo': Fan Mountain Observatory
#; 'whitin': Whitin Observatory, Wellesley College
#; 'mgio': Mount Graham International Observatory
#;
#; PROCEDURE CALLS:
#; TEN()
#; REVISION HISTORY:
#; Written W. Landsman July 2000
#; Corrected sign error for 'holi' W.L/ Holger Israel Mar 2008
#; Correctly terminate when observatory name not recognized
#; S. Koposov, July 2008
#;-
#
# On_error,2 ;Return to caller
# compile_opt idl2
#
# if N_params() LT 1 then begin
# print,'Observatory, obsname, obs_struct, [/print]'
# return
# endif
#
#obs=[ 'kpno','ctio','eso','lick','mmto','cfht','lapalma','mso','sso','aao', $
# 'mcdonald','lco','mtbigelow','dao','spm','tona','Palomar','mdm','NOV','bmo',$
# 'BAO','keck','ekar','loiano','apo','lowell','vbo','flwo','oro','lna','saao',$
# 'casleo','bosque','rozhen','irtf','bgsuo','ca','holi','lmo','fmo','whitin',$
# 'mgio']
#
# if N_elements(obsname) EQ 1 then if obsname eq '' then obsname = obs
# nobs = N_elements(obsname)
# obs_struct = {observatory:'',name:'', longitude:0.0, latitude:0.0, $
# altitude:0.0, tz:0.0}
# if Nobs GT 1 then obs_struct = replicate(obs_struct,Nobs)
# obs_struct.observatory = obsname
#
#
#for i=0,Nobs-1 do begin
#case strlowcase(obsname[i]) of
#"kpno": begin
# name = "Kitt Peak National Observatory"
# longitude = [111,36.0]
# latitude = [31,57.8]
# altitude = 2120.
# tz = 7
# end
#"ctio": begin
# name = "Cerro Tololo Interamerican Observatory"
# longitude = 70.815
# latitude = -30.16527778
# altitude = 2215.
# tz = 4
# end
#"eso": begin
# name = "European Southern Observatory"
# longitude = [70,43.8]
# latitude = [-29,15.4]
# altitude = 2347.
# tz = 4
# end
#"lick": begin
# name = "Lick Observatory"
# longitude = [121,38.2]
# latitude = [37,20.6]
# altitude = 1290.
# tz = 8
# end
#"mmto": begin
# name = "MMT Observatory"
# longitude = [110,53.1]
# latitude = [31,41.3]
# altitude = 2600.
# tz = 7
# end
#"cfht": begin
# name = "Canada-France-Hawaii Telescope"
# longitude = [155,28.3]
# latitude = [19,49.6]
# altitude = 4215.
# tz = 10
# end
#"lapalma": begin
# name = "Roque de los Muchachos, La Palma"
# longitude = [17,52.8]
# latitude = [28,45.5]
# altitude = 2327
# tz = 0
# end
#"mso": begin
# name = "Mt. Stromlo Observatory"
# longitude = [210,58,32.4]
# latitude = [-35,19,14.34]
# altitude = 767
# tz = -10
# end
#"sso": begin
# name = "Siding Spring Observatory"
# longitude = [210,56,19.70]
# latitude = [-31,16,24.10]
# altitude = 1149
# tz = -10
# end
#"aao": begin
# name = "Anglo-Australian Observatory"
# longitude = [210,56,2.09]
# latitude = [-31,16,37.34]
# altitude = 1164
# tz = -10
# end
#"mcdonald": begin
# name = "McDonald Observatory"
# longitude = 104.0216667
# latitude = 30.6716667
# altitude = 2075
# tz = 6
# end
#"lco": begin
# name = "Las Campanas Observatory"
# longitude = [70,42.1]
# latitude = [-29,0.2]
# altitude = 2282
# tz = 4
# end
#"mtbigelow": begin
# name = "Catalina Observatory: 61 inch telescope"
# longitude = [110,43.9]
# latitude = [32,25.0]
# altitude = 2510.
# tz = 7
# end
#"dao": begin
# name = "Dominion Astrophysical Observatory"
# longitude = [123,25.0]
# latitude = [48,31.3]
# altitude = 229.
# tz = 8
# end
# "spm": begin
# name = "Observatorio Astronomico Nacional, San Pedro Martir"
# longitude = [115,29,13]
# latitude = [31,01,45]
# altitude = 2830.
# tz = 7
# end
# "tona": begin
# name = "Observatorio Astronomico Nacional, Tonantzintla"
# longitude = [98,18,50]
# latitude = [19,01,58]
# tz = 8
# altitude = -999999 ; Altitude not supplied
# end
# "palomar": begin
# name = "The Hale Telescope"
# longitude = [116,51,46.80]
# latitude = [33,21,21.6]
# altitude = 1706.
# tz = 8
# end
# "mdm": begin
# name = "Michigan-Dartmouth-MIT Observatory"
# longitude = [111,37.0]
# latitude = [31,57.0]
# altitude = 1938.5
# tz = 7
# end
# "nov": begin
# name = "National Observatory of Venezuela"
# longitude = [70,52.0]
# latitude = [8,47.4]
# altitude = 3610
# tz = 4
# end
# "bmo": begin
# name = "Black Moshannon Observatory"
# longitude = [78,00.3]
# latitude = [40,55.3]
# altitude = 738.
# tz = 5
# end
# "bao": begin
# name = "Beijing XingLong Observatory"
# longitude = [242,25.5]
# latitude = [40,23.6]
# altitude = 950.
# tz = -8
# end
# "keck": begin
# name = "W. M. Keck Observatory"
# longitude = [155,28.7]
# latitude = [19,49.7]
# altitude = 4160.
# tz = 10
# end
# "ekar": begin
# name = "Mt. Ekar 182 cm. Telescope"
# longitude = [348,25,07.92]
# latitude = [45,50,54.92]
# altitude = 1413.69
# tz = -1
# end
# "loiano": begin
# name = "Bologna Astronomical Observatory, Loiano - Italy"
# longitude = [348,39,58]
# latitude = [44,15,33]
# altitude = 785.
# tz = -1
# end
# "apo": begin
# name = "Apache Point Observatory"
# longitude = [105,49.2]
# latitude = [32,46.8]
# altitude = 2798.
# tz = 7
# end
# "lowell": begin
# name = "Lowell Observatory"
# longitude = [111,32.1]
# latitude = [35,05.8]
# altitude = 2198.
# tz = 7
# end
# "vbo": begin
# name = "Vainu Bappu Observatory"
# longitude = 281.1734
# latitude = 12.57666
# altitude = 725.
# tz = -5.5
# end
# "flwo": begin
# name = "Whipple Observatory"
# longitude = [110,52,39]
# latitude = [31,40,51.4]
# altitude = 2320.
# tz = 7
# end
# "oro": begin
# name = "Oak Ridge Observatory"
# longitude = [71,33,29.32]
# latitude = [42,30,18.94]
# altitude = 184.
# tz = 5
# end
#
# "lna": begin
# name = "Laboratorio Nacional de Astrofisica - Brazil"
# longitude = 45.5825
# latitude = [-22,32,04]
# altitude = 1864.
# tz = 3
# end
#
# "saao": begin
# name = "South African Astronomical Observatory"
# longitude = [339,11,21.5]
# latitude = [-32,22,46]
# altitude = 1798.
# tz = -2
# end
# "casleo": begin
# name = "Complejo Astronomico El Leoncito, San Juan"
# longitude = [69,18,00]
# latitude = [-31,47,57]
# altitude = 2552
# tz = 3
# end
# "bosque": begin
# name = "Estacion Astrofisica Bosque Alegre, Cordoba"
# longitude = [64,32,45]
# latitude = [-31,35,54]
# altitude = 1250
# tz = 3
# end
# "rozhen": begin
# name = "National Astronomical Observatory Rozhen - Bulgaria"
# longitude = [335,15,22]
# latitude = [41,41,35]
# altitude = 1759
# tz = -2
# end
# "irtf": begin
# name = "NASA Infrared Telescope Facility"
# longitude = 155.471999
# latitude = 19.826218
# altitude = 4168
# tz = 10
# end
# "bgsuo": begin
# name = "Bowling Green State Univ Observatory"
# longitude = [83,39,33]
# latitude = [41,22,42]
# altitude = 225.
# tz = 5
# end
# "ca": begin
# name = "Calar Alto Observatory"
# longitude = [2,32,46.5]
# latitude = [37,13,25]
# altitude = 2168
# tz = -1
# end
# "holi": begin
# name = "Observatorium Hoher List (Universitaet Bonn) - Germany"
# longitude = 353.15 ;Corrected sign error March 2008
# latitude = 50.16276
# altitude = 541
# tz = -1
# end
# "lmo": begin
# name = "Leander McCormick Observatory"
# longitude = [78,31,24]
# latitude = [38,02,00]
# altitude = 264
# tz = 5
# end
# "fmo": begin
# name = "Fan Mountain Observatory"
# longitude = [78,41,34]
# latitude = [37,52,41]
# altitude = 556
# tz = 5
# end
# "whitin": begin
# name = "Whitin Observatory, Wellesley College"
# longitude = 71.305833
# latitude = 42.295
# altitude = 32
# tz = 5
# end
# "mgio": begin
# name = "Mount Graham International Observatory"
# longitude = [109,53,31.25]
# latitude = [32,42,04.69]
# altitude = 3191.0
# tz = 7
# end
# else: message,'Unable to find observatory ' + obsname + ' in database'
# endcase
#
# obs_struct[i].longitude = ten(longitude)
# obs_struct[i].latitude = ten(latitude)
# obs_struct[i].tz = tz
# obs_struct[i].name = name
# obs_struct[i].altitude = altitude
#
# if N_params() EQ 1 or keyword_set(print) then begin
# print,' '
# print,'Observatory: ',obsname[i]
# print,'Name: ',name
# print,'longitude:',obs_struct[i].longitude
# print,'latitude:',obs_struct[i].latitude
# print,'altitude:',altitude
# print,'time zone:',tz
# endif
# endfor
#
# return
# end
| 29.586813 | 83 | 0.566855 | 1,545 | 13,462 | 4.924919 | 0.290615 | 0.049678 | 0.007097 | 0.010645 | 0.124195 | 0.039953 | 0 | 0 | 0 | 0 | 0 | 0.086006 | 0.293493 | 13,462 | 454 | 84 | 29.651982 | 0.714015 | 0.863245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | true | 0.25 | 0.5 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 6 |
8ca54aacf24d0599cb185098c75b456f3f8ddf78 | 7,890 | py | Python | negative_i18n/migrations/0001_initial.py | negative-space/negative-i18n | 3eab74ba551af85ff3072d23ea93d62ab79aae90 | [
"MIT"
] | null | null | null | negative_i18n/migrations/0001_initial.py | negative-space/negative-i18n | 3eab74ba551af85ff3072d23ea93d62ab79aae90 | [
"MIT"
] | null | null | null | negative_i18n/migrations/0001_initial.py | negative-space/negative-i18n | 3eab74ba551af85ff3072d23ea93d62ab79aae90 | [
"MIT"
] | null | null | null | # Generated by Django 2.1.7 on 2019-02-14 08:33
from django.db import migrations, models
class Migration(migrations.Migration):

    initial = True

    dependencies = [
    ]

    operations = [
        migrations.CreateModel(
            name='StringTranslation',
            fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('last_used', models.DateTimeField(blank=True, default=None, null=True)),
('context', models.CharField(blank=True, default='default', max_length=50, null=True)),
('key', models.CharField(blank=True, max_length=400, null=True)),
('translation', models.TextField(blank=True, null=True)),
('translation_af', models.TextField(blank=True, null=True)),
('translation_ar', models.TextField(blank=True, null=True)),
('translation_ast', models.TextField(blank=True, null=True)),
('translation_az', models.TextField(blank=True, null=True)),
('translation_bg', models.TextField(blank=True, null=True)),
('translation_be', models.TextField(blank=True, null=True)),
('translation_bn', models.TextField(blank=True, null=True)),
('translation_br', models.TextField(blank=True, null=True)),
('translation_bs', models.TextField(blank=True, null=True)),
('translation_ca', models.TextField(blank=True, null=True)),
('translation_cs', models.TextField(blank=True, null=True)),
('translation_cy', models.TextField(blank=True, null=True)),
('translation_da', models.TextField(blank=True, null=True)),
('translation_de', models.TextField(blank=True, null=True)),
('translation_dsb', models.TextField(blank=True, null=True)),
('translation_el', models.TextField(blank=True, null=True)),
('translation_en', models.TextField(blank=True, null=True)),
('translation_en_au', models.TextField(blank=True, null=True)),
('translation_en_gb', models.TextField(blank=True, null=True)),
('translation_eo', models.TextField(blank=True, null=True)),
('translation_es', models.TextField(blank=True, null=True)),
('translation_es_ar', models.TextField(blank=True, null=True)),
('translation_es_co', models.TextField(blank=True, null=True)),
('translation_es_mx', models.TextField(blank=True, null=True)),
('translation_es_ni', models.TextField(blank=True, null=True)),
('translation_es_ve', models.TextField(blank=True, null=True)),
('translation_et', models.TextField(blank=True, null=True)),
('translation_eu', models.TextField(blank=True, null=True)),
('translation_fa', models.TextField(blank=True, null=True)),
('translation_fi', models.TextField(blank=True, null=True)),
('translation_fr', models.TextField(blank=True, null=True)),
('translation_fy', models.TextField(blank=True, null=True)),
('translation_ga', models.TextField(blank=True, null=True)),
('translation_gd', models.TextField(blank=True, null=True)),
('translation_gl', models.TextField(blank=True, null=True)),
('translation_he', models.TextField(blank=True, null=True)),
('translation_hi', models.TextField(blank=True, null=True)),
('translation_hr', models.TextField(blank=True, null=True)),
('translation_hsb', models.TextField(blank=True, null=True)),
('translation_hu', models.TextField(blank=True, null=True)),
('translation_ia', models.TextField(blank=True, null=True)),
('translation_ind', models.TextField(blank=True, null=True)),
('translation_io', models.TextField(blank=True, null=True)),
('translation_is', models.TextField(blank=True, null=True)),
('translation_it', models.TextField(blank=True, null=True)),
('translation_ja', models.TextField(blank=True, null=True)),
('translation_ka', models.TextField(blank=True, null=True)),
('translation_kab', models.TextField(blank=True, null=True)),
('translation_kk', models.TextField(blank=True, null=True)),
('translation_km', models.TextField(blank=True, null=True)),
('translation_kn', models.TextField(blank=True, null=True)),
('translation_ko', models.TextField(blank=True, null=True)),
('translation_lb', models.TextField(blank=True, null=True)),
('translation_lt', models.TextField(blank=True, null=True)),
('translation_lv', models.TextField(blank=True, null=True)),
('translation_mk', models.TextField(blank=True, null=True)),
('translation_ml', models.TextField(blank=True, null=True)),
('translation_mn', models.TextField(blank=True, null=True)),
('translation_mr', models.TextField(blank=True, null=True)),
('translation_my', models.TextField(blank=True, null=True)),
('translation_nb', models.TextField(blank=True, null=True)),
('translation_ne', models.TextField(blank=True, null=True)),
('translation_nl', models.TextField(blank=True, null=True)),
('translation_nn', models.TextField(blank=True, null=True)),
('translation_os', models.TextField(blank=True, null=True)),
('translation_pa', models.TextField(blank=True, null=True)),
('translation_pl', models.TextField(blank=True, null=True)),
('translation_pt', models.TextField(blank=True, null=True)),
('translation_pt_br', models.TextField(blank=True, null=True)),
('translation_ro', models.TextField(blank=True, null=True)),
('translation_ru', models.TextField(blank=True, null=True)),
('translation_sk', models.TextField(blank=True, null=True)),
('translation_sl', models.TextField(blank=True, null=True)),
('translation_sq', models.TextField(blank=True, null=True)),
('translation_sr', models.TextField(blank=True, null=True)),
('translation_sr_latn', models.TextField(blank=True, null=True)),
('translation_sv', models.TextField(blank=True, null=True)),
('translation_sw', models.TextField(blank=True, null=True)),
('translation_ta', models.TextField(blank=True, null=True)),
('translation_te', models.TextField(blank=True, null=True)),
('translation_th', models.TextField(blank=True, null=True)),
('translation_tr', models.TextField(blank=True, null=True)),
('translation_tt', models.TextField(blank=True, null=True)),
('translation_udm', models.TextField(blank=True, null=True)),
('translation_uk', models.TextField(blank=True, null=True)),
('translation_ur', models.TextField(blank=True, null=True)),
('translation_vi', models.TextField(blank=True, null=True)),
('translation_zh_hans', models.TextField(blank=True, null=True)),
('translation_zh_hant', models.TextField(blank=True, null=True)),
('obsolete', models.BooleanField(default=False)),
            ],
        ),
        migrations.AlterUniqueTogether(
            name='stringtranslation',
            unique_together={('context', 'key')},
        ),
    ]
| 66.302521 | 114 | 0.598479 | 820 | 7,890 | 5.628049 | 0.17561 | 0.181365 | 0.370531 | 0.468039 | 0.84442 | 0.84442 | 0.837486 | 0.175948 | 0 | 0 | 0 | 0.003375 | 0.248923 | 7,890 | 118 | 115 | 66.864407 | 0.775397 | 0.005703 | 0 | 0.018018 | 1 | 0 | 0.176463 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.009009 | 0 | 0.045045 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 6 |