body_hash | body | docstring | path | name | repository_name | repository_stars | lang | body_without_docstring | unified
|---|---|---|---|---|---|---|---|---|---|
bfc2afbe9159ff1483811c75642736bad05360e63e0cfac8e0a5739b5533625e | def psmid_to_usi(psmid, project_id):
'\n Convert Percolator out PSMId to HUPO-PSI Universal Spectrum Identifier.\n See http://www.psidev.info/usi for more info.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
usi = ':'.join(['mzspec', project_id, '_'.join(psmid[:(- 6)]), 'scan', psmid[(- 3)]])
return usi | Convert Percolator out PSMId to HUPO-PSI Universal Spectrum Identifier.
See http://www.psidev.info/usi for more info.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147 | scripts/percolator_tools.py | psmid_to_usi | RalfG/workflow_pxd_to_speclib | 0 | python | def psmid_to_usi(psmid, project_id):
'\n Convert Percolator out PSMId to HUPO-PSI Universal Spectrum Identifier.\n See http://www.psidev.info/usi for more info.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
usi = ':'.join(['mzspec', project_id, '_'.join(psmid[:(- 6)]), 'scan', psmid[(- 3)]])
return usi | def psmid_to_usi(psmid, project_id):
'\n Convert Percolator out PSMId to HUPO-PSI Universal Spectrum Identifier.\n See http://www.psidev.info/usi for more info.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
usi = ':'.join(['mzspec', project_id, '_'.join(psmid[:(- 6)]), 'scan', psmid[(- 3)]])
return usi<|docstring|>Convert Percolator out PSMId to HUPO-PSI Universal Spectrum Identifier.
See http://www.psidev.info/usi for more info.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147<|endoftext|> |
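The first row's `psmid_to_usi` builds the USI by splitting the PSMId on underscores and rejoining selected fields. A minimal, self-contained sketch of the same logic; the PSMId and the PXD accession below are invented examples, not values from the dataset:

```python
def psmid_to_usi(psmid, project_id):
    # PSMId layout: run _ SII _ <spectrum index> _ <PSM rank> _ <scan> _ <charge> _ <rank>
    parts = psmid.split('_')
    run = '_'.join(parts[:-6])   # run names may themselves contain underscores
    scan = parts[-3]             # third field from the end is the scan number
    return ':'.join(['mzspec', project_id, run, 'scan', scan])

# hypothetical PSMId: run "sample_A", scan 1204, charge 2, rank 1
print(psmid_to_usi('sample_A_SII_1204_1_1204_2_1', 'PXD000001'))
# → mzspec:PXD000001:sample_A:scan:1204
```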
8444b0c65debc88ea2366a7db2e2ebc6146148f1a6f3852a41efb7a715bc7bb3 | def psmid_to_run(psmid):
'\n Extract run from Percolator PSMId.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
run = '_'.join(psmid[:(- 6)])
return run | Extract run from Percolator PSMId.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147 | scripts/percolator_tools.py | psmid_to_run | RalfG/workflow_pxd_to_speclib | 0 | python | def psmid_to_run(psmid):
'\n Extract run from Percolator PSMId.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
run = '_'.join(psmid[:(- 6)])
return run | def psmid_to_run(psmid):
'\n Extract run from Percolator PSMId.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
run = '_'.join(psmid[:(- 6)])
return run<|docstring|>Extract run from Percolator PSMId.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147<|endoftext|> |
078b7435dc375b9c0423d7297a748da0cb2fc6667780f4088ed072c15f2b6a91 | def psmid_to_charge(psmid):
'\n Extract charge from Percolator PSMId.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
charge = int(psmid[(- 2)])
return charge | Extract charge from Percolator PSMId.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147 | scripts/percolator_tools.py | psmid_to_charge | RalfG/workflow_pxd_to_speclib | 0 | python | def psmid_to_charge(psmid):
'\n Extract charge from Percolator PSMId.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
charge = int(psmid[(- 2)])
return charge | def psmid_to_charge(psmid):
'\n Extract charge from Percolator PSMId.\n \n Expects the following formatted PSMId:\n `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n See https://github.com/percolator/percolator/issues/147\n '
psmid = psmid.split('_')
charge = int(psmid[(- 2)])
return charge<|docstring|>Extract charge from Percolator PSMId.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147<|endoftext|> |
aecac724511a3efa0766a0bcb973c780b5c3339cafd7ad12350aa7ea3972b389 | def psmid_to_scan(psmid):
'\n    Extract scan number from Percolator PSMId.\n    \n    Expects the following formatted PSMId:\n    `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n    See https://github.com/percolator/percolator/issues/147\n    '
psmid = psmid.split('_')
scan = int(psmid[(- 3)])
return scan | Extract scan number from Percolator PSMId.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147 | scripts/percolator_tools.py | psmid_to_scan | RalfG/workflow_pxd_to_speclib | 0 | python | def psmid_to_scan(psmid):
'\n    Extract scan number from Percolator PSMId.\n    \n    Expects the following formatted PSMId:\n    `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n    See https://github.com/percolator/percolator/issues/147\n    '
psmid = psmid.split('_')
scan = int(psmid[(- 3)])
return scan | def psmid_to_scan(psmid):
'\n    Extract scan number from Percolator PSMId.\n    \n    Expects the following formatted PSMId:\n    `run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`\n    See https://github.com/percolator/percolator/issues/147\n    '
psmid = psmid.split('_')
scan = int(psmid[(- 3)])
return scan<|docstring|>Extract scan number from Percolator PSMId.
Expects the following formatted PSMId:
`run` _ `SII` _ `MSGFPlus spectrum index` _ `PSM rank` _ `scan number` _ `MSGFPlus-assigned charge` _ `rank`
See https://github.com/percolator/percolator/issues/147<|endoftext|> |
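`psmid_to_run`, `psmid_to_charge`, and `psmid_to_scan` all index the same underscore-delimited layout from the right. A combined sketch; the PSMId below is a made-up example:

```python
def parse_psmid(psmid):
    # layout: run _ SII _ <spectrum index> _ <PSM rank> _ <scan> _ <charge> _ <rank>
    parts = psmid.split('_')
    return {
        'run': '_'.join(parts[:-6]),   # run names may themselves contain underscores
        'scan': int(parts[-3]),        # third field from the end
        'charge': int(parts[-2]),      # second field from the end
    }

info = parse_psmid('run_01_SII_42_1_42_3_1')
print(info)  # → {'run': 'run_01', 'scan': 42, 'charge': 3}
```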
08ca31030655d69e77955e653c02ed713afeb6cade065d5dcda0312f9beec4d2 | def fix_pin_tabs(path, prot_sep='|||'):
'\n Take a pin file and rewrite it, replacing the tabs that separate the\n Proteins column with a different separator\n '
f = open(path)
rows = f.readlines()
outfile = (path + '_fixed')
out = open(outfile, 'w+')
for (i, row) in enumerate(rows):
if ((i == 0) and row.startswith('SpecId')):
numcol = len(row.split('\t'))
out.write(row)
elif ((i == 1) and row.startswith('DefaultDirection')):
out.write(row)
else:
r = row.strip().split('\t')
r_cols = r[:(numcol - 1)]
r_proteins = r[(numcol - 1):]
r_cols.append(prot_sep.join(r_proteins))
out.write(('\t'.join(r_cols) + '\n'))
f.close()
out.close()
return None | Take a pin file and rewrite it, replacing the tabs that separate the
Proteins column with a different separator | scripts/percolator_tools.py | fix_pin_tabs | RalfG/workflow_pxd_to_speclib | 0 | python | def fix_pin_tabs(path, prot_sep='|||'):
'\n Take a pin file and rewrite it, replacing the tabs that separate the\n Proteins column with a different separator\n '
f = open(path)
rows = f.readlines()
outfile = (path + '_fixed')
out = open(outfile, 'w+')
for (i, row) in enumerate(rows):
if ((i == 0) and row.startswith('SpecId')):
numcol = len(row.split('\t'))
out.write(row)
elif ((i == 1) and row.startswith('DefaultDirection')):
out.write(row)
else:
r = row.strip().split('\t')
r_cols = r[:(numcol - 1)]
r_proteins = r[(numcol - 1):]
r_cols.append(prot_sep.join(r_proteins))
out.write(('\t'.join(r_cols) + '\n'))
f.close()
out.close()
return None | def fix_pin_tabs(path, prot_sep='|||'):
'\n Take a pin file and rewrite it, replacing the tabs that separate the\n Proteins column with a different separator\n '
f = open(path)
rows = f.readlines()
outfile = (path + '_fixed')
out = open(outfile, 'w+')
for (i, row) in enumerate(rows):
if ((i == 0) and row.startswith('SpecId')):
numcol = len(row.split('\t'))
out.write(row)
elif ((i == 1) and row.startswith('DefaultDirection')):
out.write(row)
else:
r = row.strip().split('\t')
r_cols = r[:(numcol - 1)]
r_proteins = r[(numcol - 1):]
r_cols.append(prot_sep.join(r_proteins))
out.write(('\t'.join(r_cols) + '\n'))
f.close()
out.close()
return None<|docstring|>Take a pin file and rewrite it, replacing the tabs that separate the
Proteins column with a different separator<|endoftext|> |
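`fix_pin_tabs` collapses the trailing tab-separated Proteins columns of each pin row into a single column. A self-contained sketch of the same row-rewriting logic, operating on an in-memory list of rows instead of a file so it can be exercised directly; the sample rows are invented:

```python
def fix_pin_tabs_rows(rows, prot_sep='|||'):
    """Join the trailing tab-separated Proteins columns of each data row with prot_sep."""
    fixed = []
    numcol = None
    for i, row in enumerate(rows):
        if i == 0 and row.startswith('SpecId'):
            numcol = len(row.split('\t'))  # column count taken from the header
            fixed.append(row)
        elif i == 1 and row.startswith('DefaultDirection'):
            fixed.append(row)
        else:
            r = row.strip().split('\t')
            cols = r[:numcol - 1]
            cols.append(prot_sep.join(r[numcol - 1:]))  # collapse trailing protein columns
            fixed.append('\t'.join(cols) + '\n')
    return fixed

rows = ['SpecId\tLabel\tProteins\n', 'psm_1\t1\tprotA\tprotB\n']
print(fix_pin_tabs_rows(rows))
# → ['SpecId\tLabel\tProteins\n', 'psm_1\t1\tprotA|||protB\n']
```

The original writes the result to a sibling `<path>_fixed` file rather than returning a list.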
08175f7b6e195b477f614036aba359222e6932ff2120175928b16cb18a26244f | def extract_seq_mods(df, mods):
'\n Extract PEPREC-style modifications and sequence from Percolator-\n style peptide notation.\n '
modifications = {}
for mod in mods:
modifications[str(mod['unimod_accession'])] = mod['name']
modlist = []
for (_, row) in df.iterrows():
if ('UNIMOD' in row['modified_peptide']):
pep = row['modified_peptide'].split('.')[1]
mods = re.findall('\\[([^]]*)\\]', pep)
modstring = ''
for mod in mods:
mod = (('[' + mod) + ']')
key = mod.split(':')[1].rstrip(']')
try:
if (key == '21'):
phospholoc = pep[(pep.find(mod) - 1)]
modstring += ((((str(pep.find(mod)) + '|') + modifications[key]) + phospholoc) + '|')
pep = pep.replace(mod, '', 1)
else:
modstring += (((str(pep.find(mod)) + '|') + modifications[key]) + '|')
pep = pep.replace(mod, '', 1)
except:
print('Modification not expected: {}'.format(mod))
modlist.append(modstring.rstrip('|'))
else:
modlist.append('')
peplist = []
for (_, row) in df.iterrows():
pep = row['modified_peptide']
pep = pep.split('.')[1]
if ('UNIMOD' in pep):
mods = re.findall('\\[([^]]*)\\]', pep)
for mod in mods:
pep = pep.replace((('[' + mod) + ']'), '', 1)
peplist.append(pep)
df_out = pd.DataFrame({'peptide': peplist, 'modifications': modlist})
return df_out | Extract PEPREC-style modifications and sequence from Percolator-
style peptide notation. | scripts/percolator_tools.py | extract_seq_mods | RalfG/workflow_pxd_to_speclib | 0 | python | def extract_seq_mods(df, mods):
'\n Extract PEPREC-style modifications and sequence from Percolator-\n style peptide notation.\n '
modifications = {}
for mod in mods:
modifications[str(mod['unimod_accession'])] = mod['name']
modlist = []
for (_, row) in df.iterrows():
if ('UNIMOD' in row['modified_peptide']):
pep = row['modified_peptide'].split('.')[1]
mods = re.findall('\\[([^]]*)\\]', pep)
modstring = ''
for mod in mods:
mod = (('[' + mod) + ']')
key = mod.split(':')[1].rstrip(']')
try:
if (key == '21'):
phospholoc = pep[(pep.find(mod) - 1)]
modstring += ((((str(pep.find(mod)) + '|') + modifications[key]) + phospholoc) + '|')
pep = pep.replace(mod, '', 1)
else:
modstring += (((str(pep.find(mod)) + '|') + modifications[key]) + '|')
pep = pep.replace(mod, '', 1)
except:
print('Modification not expected: {}'.format(mod))
modlist.append(modstring.rstrip('|'))
else:
modlist.append('')
peplist = []
for (_, row) in df.iterrows():
pep = row['modified_peptide']
pep = pep.split('.')[1]
if ('UNIMOD' in pep):
mods = re.findall('\\[([^]]*)\\]', pep)
for mod in mods:
pep = pep.replace((('[' + mod) + ']'), '', 1)
peplist.append(pep)
df_out = pd.DataFrame({'peptide': peplist, 'modifications': modlist})
return df_out | def extract_seq_mods(df, mods):
'\n Extract PEPREC-style modifications and sequence from Percolator-\n style peptide notation.\n '
modifications = {}
for mod in mods:
modifications[str(mod['unimod_accession'])] = mod['name']
modlist = []
for (_, row) in df.iterrows():
if ('UNIMOD' in row['modified_peptide']):
pep = row['modified_peptide'].split('.')[1]
mods = re.findall('\\[([^]]*)\\]', pep)
modstring = ''
for mod in mods:
mod = (('[' + mod) + ']')
key = mod.split(':')[1].rstrip(']')
try:
if (key == '21'):
phospholoc = pep[(pep.find(mod) - 1)]
modstring += ((((str(pep.find(mod)) + '|') + modifications[key]) + phospholoc) + '|')
pep = pep.replace(mod, '', 1)
else:
modstring += (((str(pep.find(mod)) + '|') + modifications[key]) + '|')
pep = pep.replace(mod, '', 1)
except:
print('Modification not expected: {}'.format(mod))
modlist.append(modstring.rstrip('|'))
else:
modlist.append('')
peplist = []
for (_, row) in df.iterrows():
pep = row['modified_peptide']
pep = pep.split('.')[1]
if ('UNIMOD' in pep):
mods = re.findall('\\[([^]]*)\\]', pep)
for mod in mods:
pep = pep.replace((('[' + mod) + ']'), '', 1)
peplist.append(pep)
df_out = pd.DataFrame({'peptide': peplist, 'modifications': modlist})
return df_out<|docstring|>Extract PEPREC-style modifications and sequence from Percolator-
style peptide notation.<|endoftext|> |
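The heart of `extract_seq_mods` is pulling `[UNIMOD:n]` tags out of Percolator's dot-delimited peptide notation with `re.findall` and stripping them one at a time. A reduced sketch using the same regex; the peptide string is a made-up example:

```python
import re

def strip_unimod(modified_peptide):
    # take the middle part of the X.PEPTIDE.X notation
    pep = modified_peptide.split('.')[1]
    accessions = re.findall(r'\[([^]]*)\]', pep)  # e.g. ['UNIMOD:4']
    for acc in accessions:
        pep = pep.replace('[' + acc + ']', '', 1)  # remove one tag per match
    return pep, accessions

pep, accs = strip_unimod('K.AC[UNIMOD:4]DEFK.R')
print(pep, accs)  # → ACDEFK ['UNIMOD:4']
```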
b7b32a2a8d78392597f9b90bb34e976c65ccf85611473f5b6acc2b28a67417ab | def createDimScreenEffect(self):
'Fill splashScreen with black color and reduce the widget opacity to create dim screen effect'
primScreenGeo = QtGui.QGuiApplication.primaryScreen().geometry()
screenPixMap = QtGui.QPixmap(primScreenGeo.width(), primScreenGeo.height())
screenPixMap.fill(QtGui.QColor(0, 0, 0))
self.setPixmap(screenPixMap)
self.setWindowState(QtCore.Qt.WindowFullScreen)
self.setWindowOpacity(0.4) | Fill splashScreen with black color and reduce the widget opacity to create dim screen effect | src/screenshot/CaptureScreen.py | createDimScreenEffect | mathewthe2/Game2Text-Lightning | 0 | python | def createDimScreenEffect(self):
primScreenGeo = QtGui.QGuiApplication.primaryScreen().geometry()
screenPixMap = QtGui.QPixmap(primScreenGeo.width(), primScreenGeo.height())
screenPixMap.fill(QtGui.QColor(0, 0, 0))
self.setPixmap(screenPixMap)
self.setWindowState(QtCore.Qt.WindowFullScreen)
self.setWindowOpacity(0.4) | def createDimScreenEffect(self):
primScreenGeo = QtGui.QGuiApplication.primaryScreen().geometry()
screenPixMap = QtGui.QPixmap(primScreenGeo.width(), primScreenGeo.height())
screenPixMap.fill(QtGui.QColor(0, 0, 0))
self.setPixmap(screenPixMap)
self.setWindowState(QtCore.Qt.WindowFullScreen)
self.setWindowOpacity(0.4)<|docstring|>Fill splashScreen with black color and reduce the widget opacity to create dim screen effect<|endoftext|> |
552f66200b51bbb3f6e415fc88de59a9668e0bac2a2224a1fec32730354f26f2 | def mousePressEvent(self, event):
'Show rectangle at mouse position when left-clicked'
if (event.button() == QtCore.Qt.LeftButton):
self.origin = event.pos()
self.rubberBand.setGeometry(QtCore.QRect(self.origin, QtCore.QSize()))
self.rubberBand.show() | Show rectangle at mouse position when left-clicked | src/screenshot/CaptureScreen.py | mousePressEvent | mathewthe2/Game2Text-Lightning | 0 | python | def mousePressEvent(self, event):
if (event.button() == QtCore.Qt.LeftButton):
self.origin = event.pos()
self.rubberBand.setGeometry(QtCore.QRect(self.origin, QtCore.QSize()))
self.rubberBand.show() | def mousePressEvent(self, event):
if (event.button() == QtCore.Qt.LeftButton):
self.origin = event.pos()
self.rubberBand.setGeometry(QtCore.QRect(self.origin, QtCore.QSize()))
self.rubberBand.show()<|docstring|>Show rectangle at mouse position when left-clicked<|endoftext|> |
fb1043ff66cb52e952650a5ee260fdc48630e9280d207bed0884a8451cb67796 | def mouseMoveEvent(self, event):
'Resize rectangle as we move mouse, after left-clicked.'
self.rubberBand.setGeometry(QtCore.QRect(self.origin, event.pos()).normalized()) | Resize rectangle as we move mouse, after left-clicked. | src/screenshot/CaptureScreen.py | mouseMoveEvent | mathewthe2/Game2Text-Lightning | 0 | python | def mouseMoveEvent(self, event):
self.rubberBand.setGeometry(QtCore.QRect(self.origin, event.pos()).normalized()) | def mouseMoveEvent(self, event):
self.rubberBand.setGeometry(QtCore.QRect(self.origin, event.pos()).normalized())<|docstring|>Resize rectangle as we move mouse, after left-clicked.<|endoftext|> |
f12a14c9414b7f5bfc9b11c325066bd28111c6341cae16fa4c015c2de2b373b0 | def mouseReleaseEvent(self, event):
"Upon mouse release, ask the main desktop's QScreen to capture screen on defined area."
if (event.button() == QtCore.Qt.LeftButton):
self.end = event.pos()
self.rubberBand.hide()
self.close()
primaryScreen = QtGui.QGuiApplication.primaryScreen()
grabbedPixMap = primaryScreen.grabWindow(0, self.origin.x(), self.origin.y(), (self.end.x() - self.origin.x()), (self.end.y() - self.origin.y()))
img = self.Pixmap_to_Opencv(grabbedPixMap)
if (self.onSnippingCompleted is not None):
self.onSnippingCompleted((img, self.origin, self.end))
self.close() | Upon mouse release, ask the main desktop's QScreen to capture screen on defined area. | src/screenshot/CaptureScreen.py | mouseReleaseEvent | mathewthe2/Game2Text-Lightning | 0 | python | def mouseReleaseEvent(self, event):
if (event.button() == QtCore.Qt.LeftButton):
self.end = event.pos()
self.rubberBand.hide()
self.close()
primaryScreen = QtGui.QGuiApplication.primaryScreen()
grabbedPixMap = primaryScreen.grabWindow(0, self.origin.x(), self.origin.y(), (self.end.x() - self.origin.x()), (self.end.y() - self.origin.y()))
img = self.Pixmap_to_Opencv(grabbedPixMap)
if (self.onSnippingCompleted is not None):
self.onSnippingCompleted((img, self.origin, self.end))
self.close() | def mouseReleaseEvent(self, event):
if (event.button() == QtCore.Qt.LeftButton):
self.end = event.pos()
self.rubberBand.hide()
self.close()
primaryScreen = QtGui.QGuiApplication.primaryScreen()
grabbedPixMap = primaryScreen.grabWindow(0, self.origin.x(), self.origin.y(), (self.end.x() - self.origin.x()), (self.end.y() - self.origin.y()))
img = self.Pixmap_to_Opencv(grabbedPixMap)
if (self.onSnippingCompleted is not None):
self.onSnippingCompleted((img, self.origin, self.end))
self.close()<|docstring|>Upon mouse release, ask the main desktop's QScreen to capture screen on defined area.<|endoftext|> |
732a18999d4e8c8ebdd7e7f849c1bf5d29a9c087594b2cb2d4ae322c6ba66e6c | def suite():
'Define all the tests of the module.'
suite = unittest.TestSuite()
suite.addTest(unittest.makeSuite(EndV1TestCase))
return suite | Define all the tests of the module. | tests/zoomus/components/meeting/test_end.py | suite | ROMBOTics/zoomus | 178 | python | def suite():
suite = unittest.TestSuite()
suite.addTest(unittest.makeSuite(EndV1TestCase))
return suite | def suite():
suite = unittest.TestSuite()
suite.addTest(unittest.makeSuite(EndV1TestCase))
return suite<|docstring|>Define all the tests of the module.<|endoftext|> |
dc1ed5a6894d929353ae3669319467852b4e60e3ea9d141815b25206ffe8b754 | def static_sol(mat, rhs):
'Solve a static problem [mat]{u_sol} = {rhs}\n '
if (type(mat) is csr_matrix):
u_sol = spsolve(mat, rhs)
elif (type(mat) is ndarray):
u_sol = solve(mat, rhs)
else:
raise Exception('Not supported matrix storage scheme!')
return u_sol | Solve a static problem [mat]{u_sol} = {rhs} | solidspydyn/solutil.py | static_sol | jgomezc1/SOLIDSPy_DYN | 1 | python | def static_sol(mat, rhs):
'\n '
if (type(mat) is csr_matrix):
u_sol = spsolve(mat, rhs)
elif (type(mat) is ndarray):
u_sol = solve(mat, rhs)
else:
raise Exception('Not supported matrix storage scheme!')
return u_sol | def static_sol(mat, rhs):
'\n '
if (type(mat) is csr_matrix):
u_sol = spsolve(mat, rhs)
elif (type(mat) is ndarray):
u_sol = solve(mat, rhs)
else:
raise Exception('Not supported matrix storage scheme!')
return u_sol<|docstring|>Solve a static problem [mat]{u_sol} = {rhs}<|endoftext|> |
32836570723ec197bb309530e19f905a64475213cd6891b1861136eab85cc281 | def initial_conds(ninc, neq, RHSG, MG, KG, CG):
'\n Currently homogeneous initial conditions only\n '
Up = np.zeros([neq], dtype=np.float)
Vp = np.zeros([neq], dtype=np.float)
Ap = np.zeros([neq], dtype=np.float)
U = np.zeros([neq, ninc], dtype=np.float)
V = np.zeros([neq, 2], dtype=np.float)
A = np.zeros([neq, 2], dtype=np.float)
F = np.zeros([neq], dtype=np.float)
FE = np.zeros([neq], dtype=np.float)
F = RHSG[:, 0]
FS = KG.dot(Up)
FD = CG.dot(Vp)
FE = ((F - FD) - FS)
Ap = static_sol(MG, FE)
A[:, 0] = Ap
return (U, V, A) | Currently homogeneous initial conditions only | solidspydyn/solutil.py | initial_conds | jgomezc1/SOLIDSPy_DYN | 1 | python | def initial_conds(ninc, neq, RHSG, MG, KG, CG):
'\n \n '
Up = np.zeros([neq], dtype=np.float)
Vp = np.zeros([neq], dtype=np.float)
Ap = np.zeros([neq], dtype=np.float)
U = np.zeros([neq, ninc], dtype=np.float)
V = np.zeros([neq, 2], dtype=np.float)
A = np.zeros([neq, 2], dtype=np.float)
F = np.zeros([neq], dtype=np.float)
FE = np.zeros([neq], dtype=np.float)
F = RHSG[:, 0]
FS = KG.dot(Up)
FD = CG.dot(Vp)
FE = ((F - FD) - FS)
Ap = static_sol(MG, FE)
A[:, 0] = Ap
return (U, V, A) | def initial_conds(ninc, neq, RHSG, MG, KG, CG):
'\n \n '
Up = np.zeros([neq], dtype=np.float)
Vp = np.zeros([neq], dtype=np.float)
Ap = np.zeros([neq], dtype=np.float)
U = np.zeros([neq, ninc], dtype=np.float)
V = np.zeros([neq, 2], dtype=np.float)
A = np.zeros([neq, 2], dtype=np.float)
F = np.zeros([neq], dtype=np.float)
FE = np.zeros([neq], dtype=np.float)
F = RHSG[:, 0]
FS = KG.dot(Up)
FD = CG.dot(Vp)
FE = ((F - FD) - FS)
Ap = static_sol(MG, FE)
A[:, 0] = Ap
return (U, V, A)<|docstring|>Currently homogeneous initial conditions only<|endoftext|> |
6d036da2830e0e768f5fd2a564c7bc25231e519fb8efe6bd0ab2480d69979bdd | def time_implicit(icount, m, dt, theta, ass, U, V, A, F, MG, CG, KE):
'Uses the Wilson theta method to perform\n implicit time integration.\n Parameters\n ----------\n icount : Integer. Number of equations\n m : Integer. Number of time increments\n dt : Float. Time step.\n ass : Float array. Integration constants.\n theta : Float. Integration parameter.\n U, V, A: Float arrays with the nodal point displacement,\n velocity and acceleration. Must be passed with the\n initial conditions.\n F : Float array with the point load amplitudes.\n MG : Float array. Mass matrix.\n CG : Float array. Dammping matrix.\n KG : Float array. Stiffness matrix.\n \n '
a_0 = ass[0]
a_1 = ass[1]
a_2 = ass[2]
a_3 = ass[3]
a_4 = ass[4]
a_5 = ass[5]
a_6 = ass[6]
a_7 = ass[7]
a_8 = ass[8]
LU = splu(KE)
for k in range((m - 1)):
start_time = datetime.now()
VV = np.zeros([icount], dtype=np.float)
AA = np.zeros([icount], dtype=np.float)
RHS = np.zeros([icount], dtype=np.float)
FE = np.zeros([icount], dtype=np.float)
for i in range(0, icount):
RHS[i] = (F[(i, k)] + (theta * (F[(i, (1 + k))] - F[(i, k)])))
AA[i] = (((a_0 * U[(i, k)]) + (a_2 * V[(i, 0)])) + (2.0 * A[(i, 0)]))
VV[i] = (((a_1 * U[(i, k)]) + (2 * V[(i, 0)])) + (a_3 * A[(i, 0)]))
FI = MG.dot(AA)
FD = CG.dot(VV)
FE = ((RHS + FI) + FD)
Up = LU.solve(FE)
end_time = datetime.now()
print('Increment number....: {}'.format((k + 1)))
print('Duration for this increment....: {}'.format((end_time - start_time)))
for i in range(0, icount):
A[(i, 1)] = (((((- a_4) * U[(i, k)]) + (a_5 * V[(i, 0)])) + (a_6 * A[(i, 0)])) + (a_4 * Up[i]))
V[(i, 1)] = ((V[(i, 0)] + (a_7 * A[(i, 0)])) + (a_7 * A[(i, 1)]))
U[(i, (k + 1))] = (((U[(i, k)] + (dt * V[(i, 0)])) + ((2.0 * a_8) * A[(i, 0)])) + (a_8 * A[(i, 1)]))
A[:, 0] = A[:, 1]
V[:, 0] = V[:, 1]
return U | Uses the Wilson theta method to perform
implicit time integration.
Parameters
----------
icount : Integer. Number of equations
m : Integer. Number of time increments
dt : Float. Time step.
ass : Float array. Integration constants.
theta : Float. Integration parameter.
U, V, A: Float arrays with the nodal point displacement,
velocity and acceleration. Must be passed with the
initial conditions.
F : Float array with the point load amplitudes.
MG : Float array. Mass matrix.
CG : Float array. Damping matrix.
KG : Float array. Stiffness matrix. | solidspydyn/solutil.py | time_implicit | jgomezc1/SOLIDSPy_DYN | 1 | python | def time_implicit(icount, m, dt, theta, ass, U, V, A, F, MG, CG, KE):
'Uses the Wilson theta method to perform\n implicit time integration.\n Parameters\n ----------\n icount : Integer. Number of equations\n m : Integer. Number of time increments\n dt : Float. Time step.\n ass : Float array. Integration constants.\n theta : Float. Integration parameter.\n U, V, A: Float arrays with the nodal point displacement,\n velocity and acceleration. Must be passed with the\n initial conditions.\n F : Float array with the point load amplitudes.\n MG : Float array. Mass matrix.\n CG : Float array. Dammping matrix.\n KG : Float array. Stiffness matrix.\n \n '
a_0 = ass[0]
a_1 = ass[1]
a_2 = ass[2]
a_3 = ass[3]
a_4 = ass[4]
a_5 = ass[5]
a_6 = ass[6]
a_7 = ass[7]
a_8 = ass[8]
LU = splu(KE)
for k in range((m - 1)):
start_time = datetime.now()
VV = np.zeros([icount], dtype=np.float)
AA = np.zeros([icount], dtype=np.float)
RHS = np.zeros([icount], dtype=np.float)
FE = np.zeros([icount], dtype=np.float)
for i in range(0, icount):
RHS[i] = (F[(i, k)] + (theta * (F[(i, (1 + k))] - F[(i, k)])))
AA[i] = (((a_0 * U[(i, k)]) + (a_2 * V[(i, 0)])) + (2.0 * A[(i, 0)]))
VV[i] = (((a_1 * U[(i, k)]) + (2 * V[(i, 0)])) + (a_3 * A[(i, 0)]))
FI = MG.dot(AA)
FD = CG.dot(VV)
FE = ((RHS + FI) + FD)
Up = LU.solve(FE)
end_time = datetime.now()
print('Increment number....: {}'.format((k + 1)))
print('Duration for this increment....: {}'.format((end_time - start_time)))
for i in range(0, icount):
A[(i, 1)] = (((((- a_4) * U[(i, k)]) + (a_5 * V[(i, 0)])) + (a_6 * A[(i, 0)])) + (a_4 * Up[i]))
V[(i, 1)] = ((V[(i, 0)] + (a_7 * A[(i, 0)])) + (a_7 * A[(i, 1)]))
U[(i, (k + 1))] = (((U[(i, k)] + (dt * V[(i, 0)])) + ((2.0 * a_8) * A[(i, 0)])) + (a_8 * A[(i, 1)]))
A[:, 0] = A[:, 1]
V[:, 0] = V[:, 1]
return U | def time_implicit(icount, m, dt, theta, ass, U, V, A, F, MG, CG, KE):
'Uses the Wilson theta method to perform\n implicit time integration.\n Parameters\n ----------\n icount : Integer. Number of equations\n m : Integer. Number of time increments\n dt : Float. Time step.\n ass : Float array. Integration constants.\n theta : Float. Integration parameter.\n U, V, A: Float arrays with the nodal point displacement,\n velocity and acceleration. Must be passed with the\n initial conditions.\n F : Float array with the point load amplitudes.\n MG : Float array. Mass matrix.\n CG : Float array. Dammping matrix.\n KG : Float array. Stiffness matrix.\n \n '
a_0 = ass[0]
a_1 = ass[1]
a_2 = ass[2]
a_3 = ass[3]
a_4 = ass[4]
a_5 = ass[5]
a_6 = ass[6]
a_7 = ass[7]
a_8 = ass[8]
LU = splu(KE)
for k in range((m - 1)):
start_time = datetime.now()
VV = np.zeros([icount], dtype=np.float)
AA = np.zeros([icount], dtype=np.float)
RHS = np.zeros([icount], dtype=np.float)
FE = np.zeros([icount], dtype=np.float)
for i in range(0, icount):
RHS[i] = (F[(i, k)] + (theta * (F[(i, (1 + k))] - F[(i, k)])))
AA[i] = (((a_0 * U[(i, k)]) + (a_2 * V[(i, 0)])) + (2.0 * A[(i, 0)]))
VV[i] = (((a_1 * U[(i, k)]) + (2 * V[(i, 0)])) + (a_3 * A[(i, 0)]))
FI = MG.dot(AA)
FD = CG.dot(VV)
FE = ((RHS + FI) + FD)
Up = LU.solve(FE)
end_time = datetime.now()
print('Increment number....: {}'.format((k + 1)))
print('Duration for this increment....: {}'.format((end_time - start_time)))
for i in range(0, icount):
A[(i, 1)] = (((((- a_4) * U[(i, k)]) + (a_5 * V[(i, 0)])) + (a_6 * A[(i, 0)])) + (a_4 * Up[i]))
V[(i, 1)] = ((V[(i, 0)] + (a_7 * A[(i, 0)])) + (a_7 * A[(i, 1)]))
U[(i, (k + 1))] = (((U[(i, k)] + (dt * V[(i, 0)])) + ((2.0 * a_8) * A[(i, 0)])) + (a_8 * A[(i, 1)]))
A[:, 0] = A[:, 1]
V[:, 0] = V[:, 1]
return U<|docstring|>Uses the Wilson theta method to perform
implicit time integration.
Parameters
----------
icount : Integer. Number of equations
m : Integer. Number of time increments
dt : Float. Time step.
ass : Float array. Integration constants.
theta : Float. Integration parameter.
U, V, A: Float arrays with the nodal point displacement,
velocity and acceleration. Must be passed with the
initial conditions.
F : Float array with the point load amplitudes.
MG : Float array. Mass matrix.
CG : Float array. Damping matrix.
KG : Float array. Stiffness matrix.<|endoftext|> |
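`time_implicit` consumes nine integration constants `ass[0..8]` that are not computed in this row. The way they are used (inertia term `a_0*U + a_2*V + 2*A`, velocity term `a_1*U + 2*V + a_3*A`, displacement update with `a_8`) matches the standard Wilson-θ coefficients; a sketch of how such a helper might look — this function is an assumption, not code from the dataset:

```python
def wilson_theta_constants(dt, theta=1.4):
    # Standard Wilson-theta integration constants (a0..a8), with tau = theta * dt
    tau = theta * dt
    a0 = 6.0 / tau ** 2
    a1 = 3.0 / tau
    a2 = 2.0 * a1
    a3 = tau / 2.0
    a4 = a0 / theta
    a5 = -a2 / theta
    a6 = 1.0 - 3.0 / theta
    a7 = dt / 2.0
    a8 = dt ** 2 / 6.0
    return [a0, a1, a2, a3, a4, a5, a6, a7, a8]

ass = wilson_theta_constants(dt=0.01, theta=1.4)
```

With these constants, the effective stiffness `KE` factorized by `splu` above would presumably be `K + a0*M + a1*C`.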
581fb4120cc8213e0e89bb61ab2ae4f4ca42070955e790b6ce4593c20cf71aab | def wrap_mod_func():
'\n\n .. wikisection:: faq\n :title: From func?\n\n Yes.\n '
mod_func()
pass | .. wikisection:: faq
:title: From func?
Yes. | tests/refs/some_pkg/other_mod.py | wrap_mod_func | amirkdv/sphinxcontrib-wiki | 1 | python | def wrap_mod_func():
'\n\n .. wikisection:: faq\n :title: From func?\n\n Yes.\n '
mod_func()
pass | def wrap_mod_func():
'\n\n .. wikisection:: faq\n :title: From func?\n\n Yes.\n '
mod_func()
pass<|docstring|>.. wikisection:: faq
:title: From func?
Yes.<|endoftext|> |
2b73c4333eba1ffbac0cfef03ff54c867164e72aef1194c9ffe8bcc6c0cf12f2 | def getStepSpace(stepName):
'\n _getStepSpace_\n\n Util to get the runtime step space.\n This imports dynamic runtime libraries so be careful how\n you use it\n\n '
modName = 'WMTaskSpace'
if (modName in sys.modules.keys()):
taskspace = sys.modules[modName]
else:
try:
taskspace = __import__(modName, globals(), locals(), ['taskSpace'])
except ImportError as ex:
msg = 'Unable to load WMTaskSpace module:\n'
msg += str(ex)
raise RuntimeError(msg)
try:
stepSpace = taskspace.taskSpace.stepSpace(stepName)
except Exception as ex:
msg = 'Error retrieving stepSpace from TaskSpace:\n'
msg += str(ex)
raise RuntimeError(msg)
return stepSpace | _getStepSpace_
Util to get the runtime step space.
This imports dynamic runtime libraries so be careful how
you use it | src/python/WMCore/WMSpec/Steps/Executor.py | getStepSpace | hufnagel/WMCore | 1 | python | def getStepSpace(stepName):
'\n _getStepSpace_\n\n Util to get the runtime step space.\n This imports dynamic runtime libraries so be careful how\n you use it\n\n '
modName = 'WMTaskSpace'
if (modName in sys.modules.keys()):
taskspace = sys.modules[modName]
else:
try:
taskspace = __import__(modName, globals(), locals(), ['taskSpace'])
except ImportError as ex:
msg = 'Unable to load WMTaskSpace module:\n'
msg += str(ex)
raise RuntimeError(msg)
try:
stepSpace = taskspace.taskSpace.stepSpace(stepName)
except Exception as ex:
msg = 'Error retrieving stepSpace from TaskSpace:\n'
msg += str(ex)
raise RuntimeError(msg)
return stepSpace | def getStepSpace(stepName):
'\n _getStepSpace_\n\n Util to get the runtime step space.\n This imports dynamic runtime libraries so be careful how\n you use it\n\n '
modName = 'WMTaskSpace'
if (modName in sys.modules.keys()):
taskspace = sys.modules[modName]
else:
try:
taskspace = __import__(modName, globals(), locals(), ['taskSpace'])
except ImportError as ex:
msg = 'Unable to load WMTaskSpace module:\n'
msg += str(ex)
raise RuntimeError(msg)
try:
stepSpace = taskspace.taskSpace.stepSpace(stepName)
except Exception as ex:
msg = 'Error retrieving stepSpace from TaskSpace:\n'
msg += str(ex)
raise RuntimeError(msg)
return stepSpace<|docstring|>_getStepSpace_
Util to get the runtime step space.
This imports dynamic runtime libraries so be careful how
you use it<|endoftext|> |
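The `getStepSpace` record above imports the `WMTaskSpace` runtime module lazily, reusing it from `sys.modules` when it was already loaded and wrapping import failures in a `RuntimeError`. A minimal sketch of that lazy-import-with-caching pattern, using the stdlib `json` module as a stand-in since `WMTaskSpace` only exists inside a WMCore runtime sandbox:

```python
import sys

def lazy_import(mod_name):
    # Reuse the module if it was already imported, mirroring the
    # sys.modules check in getStepSpace above.
    if mod_name in sys.modules:
        return sys.modules[mod_name]
    try:
        # Note: for dotted names __import__ returns the top-level package;
        # flat module names are enough for this sketch.
        return __import__(mod_name)
    except ImportError as ex:
        raise RuntimeError("Unable to load %s module:\n%s" % (mod_name, ex))
```

Calling it twice returns the exact same cached module object.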
32af8fc9ff308297ba36c81b334dc3476d97b26a8d19e1bc2a8ffd8a4b2ec7ca | def initialise(self, step, job):
'\n _initialise_\n\n\n Initialise the executor attributes\n\n '
self.step = step
self.job = job
self.stepName = getStepName(self.step)
self.stepSpace = getStepSpace(self.stepName)
self.task = self.stepSpace.getWMTask()
self.workload = self.stepSpace.taskSpace.workload
self.report = Report(self.stepName)
self.report.data.task = self.task.name()
self.report.data.workload = self.stepSpace.taskSpace.workloadName()
self.report.data.id = job['id']
self.errorDestination = getStepErrorDestination(self.step)
self.step.section_('execution')
self.step.execution.exitStatus = 0
self.step.execution.reportLocation = ('%s/Report.pkl' % (self.stepSpace.location,))
self.report.setStepStatus(stepName=self.stepName, status=1)
emulatorName = getattr(self.step.emulator, 'emulatorName', None)
if (emulatorName != None):
self.emulator = getStepEmulator(emulatorName)
self.emulator.initialise(self)
self.emulationMode = True
return | _initialise_
Initialise the executor attributes | src/python/WMCore/WMSpec/Steps/Executor.py | initialise | hufnagel/WMCore | 1 | python | def initialise(self, step, job):
'\n _initialise_\n\n\n Initialise the executor attributes\n\n '
self.step = step
self.job = job
self.stepName = getStepName(self.step)
self.stepSpace = getStepSpace(self.stepName)
self.task = self.stepSpace.getWMTask()
self.workload = self.stepSpace.taskSpace.workload
self.report = Report(self.stepName)
self.report.data.task = self.task.name()
self.report.data.workload = self.stepSpace.taskSpace.workloadName()
self.report.data.id = job['id']
self.errorDestination = getStepErrorDestination(self.step)
self.step.section_('execution')
self.step.execution.exitStatus = 0
self.step.execution.reportLocation = ('%s/Report.pkl' % (self.stepSpace.location,))
self.report.setStepStatus(stepName=self.stepName, status=1)
emulatorName = getattr(self.step.emulator, 'emulatorName', None)
if (emulatorName != None):
self.emulator = getStepEmulator(emulatorName)
self.emulator.initialise(self)
self.emulationMode = True
return | def initialise(self, step, job):
'\n _initialise_\n\n\n Initialise the executor attributes\n\n '
self.step = step
self.job = job
self.stepName = getStepName(self.step)
self.stepSpace = getStepSpace(self.stepName)
self.task = self.stepSpace.getWMTask()
self.workload = self.stepSpace.taskSpace.workload
self.report = Report(self.stepName)
self.report.data.task = self.task.name()
self.report.data.workload = self.stepSpace.taskSpace.workloadName()
self.report.data.id = job['id']
self.errorDestination = getStepErrorDestination(self.step)
self.step.section_('execution')
self.step.execution.exitStatus = 0
self.step.execution.reportLocation = ('%s/Report.pkl' % (self.stepSpace.location,))
self.report.setStepStatus(stepName=self.stepName, status=1)
emulatorName = getattr(self.step.emulator, 'emulatorName', None)
if (emulatorName != None):
self.emulator = getStepEmulator(emulatorName)
self.emulator.initialise(self)
self.emulationMode = True
return<|docstring|>_initialise_
Initialise the executor attributes<|endoftext|> |
66ef48ea5ad94018691afb2a7f9e78fa2529e6a3322a81616307648ef3a90bb3 | def saveReport(self):
'\n _saveReport_\n\n Save the job report\n\n '
self.report.persist(self.step.execution.reportLocation)
return | _saveReport_
Save the job report | src/python/WMCore/WMSpec/Steps/Executor.py | saveReport | hufnagel/WMCore | 1 | python | def saveReport(self):
'\n _saveReport_\n\n Save the job report\n\n '
self.report.persist(self.step.execution.reportLocation)
return | def saveReport(self):
'\n _saveReport_\n\n Save the job report\n\n '
self.report.persist(self.step.execution.reportLocation)
return<|docstring|>_saveReport_
Save the job report<|endoftext|> |
87c2c5de622b0f4f9faea0fb074fed92e138c71b4b165e24dd74e4e1bf898d9f | def pre(self, emulator=None):
'\n _pre_\n\n pre execution checks. Can alter flow of execution by returning\n a different step in the task. If None, then current step will\n be passed to execute.\n\n TODO: Define better how to switch to different step within the task\n\n '
return None | _pre_
pre execution checks. Can alter flow of execution by returning
a different step in the task. If None, then current step will
be passed to execute.
TODO: Define better how to switch to different step within the task | src/python/WMCore/WMSpec/Steps/Executor.py | pre | hufnagel/WMCore | 1 | python | def pre(self, emulator=None):
'\n _pre_\n\n pre execution checks. Can alter flow of execution by returning\n a different step in the task. If None, then current step will\n be passed to execute.\n\n TODO: Define better how to switch to different step within the task\n\n '
return None | def pre(self, emulator=None):
'\n _pre_\n\n pre execution checks. Can alter flow of execution by returning\n a different step in the task. If None, then current step will\n be passed to execute.\n\n TODO: Define better how to switch to different step within the task\n\n '
return None<|docstring|>_pre_
pre execution checks. Can alter flow of execution by returning
a different step in the task. If None, then current step will
be passed to execute.
TODO: Define better how to switch to different step within the task<|endoftext|> |
d7af67c4ad1143bb4de0b71cef82249be5006f923b8467be21e3d2cc90fc35e8 | def execute(self, emulator=None):
'\n _execute_\n\n Override behaviour to execute this step type.\n If Emulator is provided, execute the emulator instead.\n Return a framework job report instance\n\n '
msg = 'WMSpec.Steps.Executor.execute method not overridden in '
msg += ('implementation: %s\n' % self.__class__.__name__)
raise NotImplementedError(msg) | _execute_
Override behaviour to execute this step type.
If Emulator is provided, execute the emulator instead.
Return a framework job report instance | src/python/WMCore/WMSpec/Steps/Executor.py | execute | hufnagel/WMCore | 1 | python | def execute(self, emulator=None):
'\n _execute_\n\n Override behaviour to execute this step type.\n If Emulator is provided, execute the emulator instead.\n Return a framework job report instance\n\n '
msg = 'WMSpec.Steps.Executor.execute method not overridden in '
msg += ('implementation: %s\n' % self.__class__.__name__)
raise NotImplementedError(msg) | def execute(self, emulator=None):
'\n _execute_\n\n Override behaviour to execute this step type.\n If Emulator is provided, execute the emulator instead.\n Return a framework job report instance\n\n '
msg = 'WMSpec.Steps.Executor.execute method not overridden in '
msg += ('implementation: %s\n' % self.__class__.__name__)
raise NotImplementedError(msg)<|docstring|>_execute_
Override behaviour to execute this step type.
If Emulator is provided, execute the emulator instead.
Return a framework job report instance<|endoftext|> |
3c114e1bbdbbedcaeaf9123806f33c97b4f4355a80672452c7ea499a38345780 | def post(self, emulator=None):
'\n _post_\n\n post execution checks. Can alter flow of execution by returning\n a different step in the task. If None, then the next step in the task\n will be used next.\n\n TODO: Define better how to switch to different step within the task\n\n '
return None | _post_
post execution checks. Can alter flow of execution by returning
a different step in the task. If None, then the next step in the task
will be used next.
TODO: Define better how to switch to different step within the task | src/python/WMCore/WMSpec/Steps/Executor.py | post | hufnagel/WMCore | 1 | python | def post(self, emulator=None):
'\n _post_\n\n post execution checks. Can alter flow of execution by returning\n a different step in the task. If None, then the next step in the task\n will be used next.\n\n TODO: Define better how to switch to different step within the task\n\n '
return None | def post(self, emulator=None):
'\n _post_\n\n post execution checks. Can alter flow of execution by returning\n a different step in the task. If None, then the next step in the task\n will be used next.\n\n TODO: Define better how to switch to different step within the task\n\n '
return None<|docstring|>_post_
post execution checks. Can alter flow of execution by returning
a different step in the task. If None, then the next step in the task
will be used next.
TODO: Define better how to switch to different step within the task<|endoftext|> |
61ca2a792221fecf87284608118b7a5f6cbe1bbe6274aa86d99980e4c7435660 | def setCondorChirpAttrDelayed(self, key, value, compress=False, maxLen=5120):
'\n _setCondorChirpAttrDelayed_\n\n Util to call condor_chirp and publish the key/value pair\n\n '
if compress:
value = zipEncodeStr(value, maxLen=maxLen)
condor_chirp_bin = None
condor_config = os.getenv('CONDOR_CONFIG', None)
if condor_config:
condor_config_dir = os.path.dirname(condor_config)
condor_chirp_bin = os.path.join(condor_config_dir, 'main/condor/libexec/condor_chirp')
if ((not condor_chirp_bin) or (not os.path.isfile(condor_chirp_bin))):
condor_chirp_bin = getFullPath('condor_chirp')
if (condor_chirp_bin and os.access(condor_chirp_bin, os.X_OK)):
args = [condor_chirp_bin, 'set_job_attr_delayed', key, json.dumps(value)]
subprocess.call(args)
else:
if (condor_chirp_bin and (not os.access(condor_chirp_bin, os.X_OK))):
msg = ('condor_chirp was found in: %s, but it was not an executable.' % condor_chirp_bin)
else:
msg = 'condor_chirp was not found in the system.'
self.logger.warning(msg)
return | _setCondorChirpAttrDelayed_
Util to call condor_chirp and publish the key/value pair | src/python/WMCore/WMSpec/Steps/Executor.py | setCondorChirpAttrDelayed | hufnagel/WMCore | 1 | python | def setCondorChirpAttrDelayed(self, key, value, compress=False, maxLen=5120):
'\n _setCondorChirpAttrDelayed_\n\n Util to call condor_chirp and publish the key/value pair\n\n '
if compress:
value = zipEncodeStr(value, maxLen=maxLen)
condor_chirp_bin = None
condor_config = os.getenv('CONDOR_CONFIG', None)
if condor_config:
condor_config_dir = os.path.dirname(condor_config)
condor_chirp_bin = os.path.join(condor_config_dir, 'main/condor/libexec/condor_chirp')
if ((not condor_chirp_bin) or (not os.path.isfile(condor_chirp_bin))):
condor_chirp_bin = getFullPath('condor_chirp')
if (condor_chirp_bin and os.access(condor_chirp_bin, os.X_OK)):
args = [condor_chirp_bin, 'set_job_attr_delayed', key, json.dumps(value)]
subprocess.call(args)
else:
if (condor_chirp_bin and (not os.access(condor_chirp_bin, os.X_OK))):
msg = ('condor_chirp was found in: %s, but it was not an executable.' % condor_chirp_bin)
else:
msg = 'condor_chirp was not found in the system.'
self.logger.warning(msg)
return | def setCondorChirpAttrDelayed(self, key, value, compress=False, maxLen=5120):
'\n _setCondorChirpAttrDelayed_\n\n Util to call condor_chirp and publish the key/value pair\n\n '
if compress:
value = zipEncodeStr(value, maxLen=maxLen)
condor_chirp_bin = None
condor_config = os.getenv('CONDOR_CONFIG', None)
if condor_config:
condor_config_dir = os.path.dirname(condor_config)
condor_chirp_bin = os.path.join(condor_config_dir, 'main/condor/libexec/condor_chirp')
if ((not condor_chirp_bin) or (not os.path.isfile(condor_chirp_bin))):
condor_chirp_bin = getFullPath('condor_chirp')
if (condor_chirp_bin and os.access(condor_chirp_bin, os.X_OK)):
args = [condor_chirp_bin, 'set_job_attr_delayed', key, json.dumps(value)]
subprocess.call(args)
else:
if (condor_chirp_bin and (not os.access(condor_chirp_bin, os.X_OK))):
msg = ('condor_chirp was found in: %s, but it was not an executable.' % condor_chirp_bin)
else:
msg = 'condor_chirp was not found in the system.'
self.logger.warning(msg)
return<|docstring|>_setCondorChirpAttrDelayed_
Util to call condor_chirp and publish the key/value pair<|endoftext|> |
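`setCondorChirpAttrDelayed` above first probes a directory derived from `CONDOR_CONFIG`, then falls back to a PATH lookup, and finally verifies the binary with `os.access(..., os.X_OK)` before calling it. A sketch of the same discovery logic, with `shutil.which` standing in for the WMCore-internal `getFullPath` helper:

```python
import os
import shutil

def find_executable(name, fallback_dir=None):
    """Locate `name`, preferring fallback_dir, and verify it is executable."""
    candidate = None
    if fallback_dir:
        path = os.path.join(fallback_dir, name)
        if os.path.isfile(path):
            candidate = path
    if candidate is None:
        # Stand-in for getFullPath(): search the directories on PATH.
        candidate = shutil.which(name)
    if candidate and os.access(candidate, os.X_OK):
        return candidate
    # Found nothing, or found a non-executable file.
    return None
```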
16fd10dd11461db6437098b0a028196c93c7e2e8dffad203a962c2ce96950fb9 | def get_args():
'Get command-line arguments'
parser = argparse.ArgumentParser(description='Picnic game', formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument('food', metavar='str', nargs='+', help='Item(s) to bring')
parser.add_argument('-s', '--sorted', help='Sort the items', action='store_true')
parser.add_argument('-nc', '--nocomma', help='Eliminate the Oxford comma', action='store_true')
parser.add_argument('-sep', '--separator', metavar='str', help='User-defined separator', type=str, default=',')
return parser.parse_args() | Get command-line arguments | 03_picnic/picnic.py | get_args | FabrizioPe/tiny_python_projects | 0 | python | def get_args():
parser = argparse.ArgumentParser(description='Picnic game', formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument('food', metavar='str', nargs='+', help='Item(s) to bring')
parser.add_argument('-s', '--sorted', help='Sort the items', action='store_true')
parser.add_argument('-nc', '--nocomma', help='Eliminate the Oxford comma', action='store_true')
parser.add_argument('-sep', '--separator', metavar='str', help='User-defined separator', type=str, default=',')
return parser.parse_args() | def get_args():
parser = argparse.ArgumentParser(description='Picnic game', formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument('food', metavar='str', nargs='+', help='Item(s) to bring')
parser.add_argument('-s', '--sorted', help='Sort the items', action='store_true')
parser.add_argument('-nc', '--nocomma', help='Eliminate the Oxford comma', action='store_true')
parser.add_argument('-sep', '--separator', metavar='str', help='User-defined separator', type=str, default=',')
return parser.parse_args()<|docstring|>Get command-line arguments<|endoftext|> |
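The `get_args` record above builds its CLI from a required `nargs='+'` positional plus boolean and string options. Passing an explicit argv list to `parse_args` shows how those declarations behave (a trimmed-down parser, not the full one from the record):

```python
import argparse

parser = argparse.ArgumentParser(description="Picnic game")
parser.add_argument("food", metavar="str", nargs="+", help="Item(s) to bring")
parser.add_argument("-s", "--sorted", action="store_true", help="Sort the items")
parser.add_argument("-sep", "--separator", metavar="str", type=str, default=",")

# nargs='+' collects every positional token into a list; the flag and the
# default-valued option are parsed independently of their position.
args = parser.parse_args(["chips", "apples", "-s"])
```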
cce4452f40174fbdb1a951a46cfcd4cab3d96ba64f5897648afcecb8961f74a1 | def main():
'Make a jazz noise here'
args = get_args()
foods = args.food
sep = args.separator
if args.sorted:
foods.sort()
l = len(foods)
bringings = ''
if (l == 1):
bringings = foods[0]
elif (l == 2):
bringings = ' and '.join(foods)
else:
bringings = (f'{sep} '.join(foods[:(- 1)]) + f'{sep} ')
if args.nocomma:
bringings = (bringings[:(- 2)] + ' ')
bringings += f'and {foods[(- 1)]}'
print(f'You are bringing {bringings}.') | Make a jazz noise here | 03_picnic/picnic.py | main | FabrizioPe/tiny_python_projects | 0 | python | def main():
args = get_args()
foods = args.food
sep = args.separator
if args.sorted:
foods.sort()
l = len(foods)
bringings =
if (l == 1):
bringings = foods[0]
elif (l == 2):
bringings = ' and '.join(foods)
else:
bringings = (f'{sep} '.join(foods[:(- 1)]) + f'{sep} ')
if args.nocomma:
bringings = (bringings[:(- 2)] + ' ')
bringings += f'and {foods[(- 1)]}'
print(f'You are bringing {bringings}.') | def main():
args = get_args()
foods = args.food
sep = args.separator
if args.sorted:
foods.sort()
l = len(foods)
bringings =
if (l == 1):
bringings = foods[0]
elif (l == 2):
bringings = ' and '.join(foods)
else:
bringings = (f'{sep} '.join(foods[:(- 1)]) + f'{sep} ')
if args.nocomma:
bringings = (bringings[:(- 2)] + ' ')
bringings += f'and {foods[(- 1)]}'
print(f'You are bringing {bringings}.')<|docstring|>Make a jazz noise here<|endoftext|> |
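The joining logic in `main` above (one item verbatim, two items joined with "and", three or more with a separator and an optional Oxford comma) can be isolated into a pure function, which also makes the `nocomma` branch easier to check:

```python
def join_items(items, sep=",", oxford=True):
    # One item: no conjunction needed.
    if len(items) == 1:
        return items[0]
    # Two items: plain "and", no separator.
    if len(items) == 2:
        return " and ".join(items)
    # Three or more: separator-joined head, then "and" before the last item,
    # with or without the final (Oxford) separator.
    head = (sep + " ").join(items[:-1])
    glue = sep + " " if oxford else " "
    return head + glue + "and " + items[-1]
```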
7c3603cc9f07b226cd031d95279311e80305c495c40dd7bfa062e57f87a47011 | def are_xml_element_equal(e1, e2):
'test for equality of two given xml elements'
if (len(e1) != len(e2)):
return False
if (e1.tag != e2.tag):
return False
if (expandpath(e1.text) != expandpath(e2.text)):
return False
if (e1.tail != e2.tail):
return False
if (e1.attrib != e2.attrib):
return False
return True | test for equality of two given xml elements | tests/utils/test_converters.py | are_xml_element_equal | patrickkesper/amira_blender_rendering | 26 | python | def are_xml_element_equal(e1, e2):
if (len(e1) != len(e2)):
return False
if (e1.tag != e2.tag):
return False
if (expandpath(e1.text) != expandpath(e2.text)):
return False
if (e1.tail != e2.tail):
return False
if (e1.attrib != e2.attrib):
return False
return True | def are_xml_element_equal(e1, e2):
if (len(e1) != len(e2)):
return False
if (e1.tag != e2.tag):
return False
if (expandpath(e1.text) != expandpath(e2.text)):
return False
if (e1.tail != e2.tail):
return False
if (e1.attrib != e2.attrib):
return False
return True<|docstring|>test for equality of two given xml elements<|endoftext|> |
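`are_xml_element_equal` above compares child count, tag, text, tail, and attributes (additionally running the project-specific `expandpath` over the text field). A self-contained version of the same shallow comparison with `xml.etree.ElementTree`, omitting the path expansion:

```python
import xml.etree.ElementTree as ET

def elements_equal(e1, e2):
    # Shallow comparison: child count, tag, text, tail, and attributes.
    # Children themselves are not compared recursively, matching the original.
    return (len(e1) == len(e2)
            and e1.tag == e2.tag
            and e1.text == e2.text
            and e1.tail == e2.tail
            and e1.attrib == e2.attrib)

a = ET.fromstring('<item id="1">foo</item>')
b = ET.fromstring('<item id="1">foo</item>')
c = ET.fromstring('<item id="2">foo</item>')
```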
1b70911d827c5431a8149ddc4207cff8c792b353a9a9dd1f4a61749421006ee7 | def __init__(self, app=None):
'\n An alternative way to pass your :class:`flask.Flask` application\n object to Flask-Compress. :meth:`init_app` also takes care of some\n default `settings`_.\n\n :param app: the :class:`flask.Flask` application object.\n '
self.app = app
if (app is not None):
self.init_app(app) | An alternative way to pass your :class:`flask.Flask` application
object to Flask-Compress. :meth:`init_app` also takes care of some
default `settings`_.
:param app: the :class:`flask.Flask` application object. | venv/lib/python3.9/site-packages/flask_compress/flask_compress.py | __init__ | pierceoneill/covid19dashboard | 70 | python | def __init__(self, app=None):
'\n An alternative way to pass your :class:`flask.Flask` application\n object to Flask-Compress. :meth:`init_app` also takes care of some\n default `settings`_.\n\n :param app: the :class:`flask.Flask` application object.\n '
self.app = app
if (app is not None):
self.init_app(app) | def __init__(self, app=None):
'\n An alternative way to pass your :class:`flask.Flask` application\n object to Flask-Compress. :meth:`init_app` also takes care of some\n default `settings`_.\n\n :param app: the :class:`flask.Flask` application object.\n '
self.app = app
if (app is not None):
self.init_app(app)<|docstring|>An alternative way to pass your :class:`flask.Flask` application
object to Flask-Compress. :meth:`init_app` also takes care of some
default `settings`_.
:param app: the :class:`flask.Flask` application object.<|endoftext|> |
8969f22d55f6e06fdfc57c965f6cf1e953892294a955546cb01902fb3890f91c | def _choose_compress_algorithm(self, accept_encoding_header):
'\n Determine which compression algorithm we\'re going to use based on the\n client request. The `Accept-Encoding` header may list one or more desired\n algorithms, together with a "quality factor" for each one (higher quality\n means the client prefers that algorithm more).\n\n :param accept_encoding_header: Content of the `Accept-Encoding` header\n :return: name of a compression algorithm (`gzip`, `deflate`, `br`) or `None` if\n the client and server don\'t agree on any.\n '
fallback_to_any = False
algos_by_quality = defaultdict(set)
server_algos_set = set(self.enabled_algorithms)
for part in accept_encoding_header.lower().split(','):
part = part.strip()
if (';q=' in part):
algo = part.split(';')[0].strip()
try:
quality = float(part.split('=')[1].strip())
except ValueError:
quality = 1.0
else:
algo = part
quality = 1.0
if (algo == '*'):
if (quality > 0):
fallback_to_any = True
elif (algo == 'identity'):
algos_by_quality[quality].add(None)
elif (algo in server_algos_set):
algos_by_quality[quality].add(algo)
for (_, viable_algos) in sorted(algos_by_quality.items(), reverse=True):
if (len(viable_algos) == 1):
return viable_algos.pop()
elif (len(viable_algos) > 1):
for server_algo in self.enabled_algorithms:
if (server_algo in viable_algos):
return server_algo
if fallback_to_any:
return self.enabled_algorithms[0]
return None | Determine which compression algorithm we're going to use based on the
client request. The `Accept-Encoding` header may list one or more desired
algorithms, together with a "quality factor" for each one (higher quality
means the client prefers that algorithm more).
:param accept_encoding_header: Content of the `Accept-Encoding` header
:return: name of a compression algorithm (`gzip`, `deflate`, `br`) or `None` if
the client and server don't agree on any. | venv/lib/python3.9/site-packages/flask_compress/flask_compress.py | _choose_compress_algorithm | pierceoneill/covid19dashboard | 70 | python | def _choose_compress_algorithm(self, accept_encoding_header):
'\n Determine which compression algorithm we\'re going to use based on the\n client request. The `Accept-Encoding` header may list one or more desired\n algorithms, together with a "quality factor" for each one (higher quality\n means the client prefers that algorithm more).\n\n :param accept_encoding_header: Content of the `Accept-Encoding` header\n :return: name of a compression algorithm (`gzip`, `deflate`, `br`) or `None` if\n the client and server don\'t agree on any.\n '
fallback_to_any = False
algos_by_quality = defaultdict(set)
server_algos_set = set(self.enabled_algorithms)
for part in accept_encoding_header.lower().split(','):
part = part.strip()
if (';q=' in part):
algo = part.split(';')[0].strip()
try:
quality = float(part.split('=')[1].strip())
except ValueError:
quality = 1.0
else:
algo = part
quality = 1.0
if (algo == '*'):
if (quality > 0):
fallback_to_any = True
elif (algo == 'identity'):
algos_by_quality[quality].add(None)
elif (algo in server_algos_set):
algos_by_quality[quality].add(algo)
for (_, viable_algos) in sorted(algos_by_quality.items(), reverse=True):
if (len(viable_algos) == 1):
return viable_algos.pop()
elif (len(viable_algos) > 1):
for server_algo in self.enabled_algorithms:
if (server_algo in viable_algos):
return server_algo
if fallback_to_any:
return self.enabled_algorithms[0]
return None | def _choose_compress_algorithm(self, accept_encoding_header):
'\n Determine which compression algorithm we\'re going to use based on the\n client request. The `Accept-Encoding` header may list one or more desired\n algorithms, together with a "quality factor" for each one (higher quality\n means the client prefers that algorithm more).\n\n :param accept_encoding_header: Content of the `Accept-Encoding` header\n :return: name of a compression algorithm (`gzip`, `deflate`, `br`) or `None` if\n the client and server don\'t agree on any.\n '
fallback_to_any = False
algos_by_quality = defaultdict(set)
server_algos_set = set(self.enabled_algorithms)
for part in accept_encoding_header.lower().split(','):
part = part.strip()
if (';q=' in part):
algo = part.split(';')[0].strip()
try:
quality = float(part.split('=')[1].strip())
except ValueError:
quality = 1.0
else:
algo = part
quality = 1.0
if (algo == '*'):
if (quality > 0):
fallback_to_any = True
elif (algo == 'identity'):
algos_by_quality[quality].add(None)
elif (algo in server_algos_set):
algos_by_quality[quality].add(algo)
for (_, viable_algos) in sorted(algos_by_quality.items(), reverse=True):
if (len(viable_algos) == 1):
return viable_algos.pop()
elif (len(viable_algos) > 1):
for server_algo in self.enabled_algorithms:
if (server_algo in viable_algos):
return server_algo
if fallback_to_any:
return self.enabled_algorithms[0]
return None<|docstring|>Determine which compression algorithm we're going to use based on the
client request. The `Accept-Encoding` header may list one or more desired
algorithms, together with a "quality factor" for each one (higher quality
means the client prefers that algorithm more).
:param accept_encoding_header: Content of the `Accept-Encoding` header
:return: name of a compression algorithm (`gzip`, `deflate`, `br`) or `None` if
the client and server don't agree on any.<|endoftext|> |
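The `_choose_compress_algorithm` record above parses the `Accept-Encoding` header into algorithms grouped by quality factor. The header-parsing step on its own can be sketched as a small function that maps each advertised encoding to its q-value (defaulting to 1.0 when none is given or when it fails to parse, as in the original):

```python
def parse_accept_encoding(header):
    """Map each advertised encoding to its quality factor (default 1.0)."""
    quality = {}
    for part in header.lower().split(","):
        part = part.strip()
        if not part:
            continue
        if ";q=" in part:
            algo, _, q = part.partition(";q=")
            try:
                quality[algo.strip()] = float(q.strip())
            except ValueError:
                # Malformed q-value: fall back to full quality.
                quality[algo.strip()] = 1.0
        else:
            quality[part] = 1.0
    return quality
```

The caller can then pick the server-supported algorithm with the highest quality, treating `*` as a wildcard fallback.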
07265b794207cfe6fc5b6c89f29177487b8e98b9e3b98fbc5d90c9476cbdd427 | def solve_problem(problem, procs):
"\n if (problem.intbeta>2):\n raise(NotImplementedError('For multipool only beta>2 implemented so far'))\n \n if(problem.rands == []):\n raise(ValueError('Set rands before proceeding'))\n "
jobs = []
sizeSegment = ((((problem.Kplus + problem.Kmin) + 1) - (((problem.Kplus + problem.Kmin) + 1) % procs)) / procs)
for i in range(0, procs):
jobs.append((((i * sizeSegment) + 1), ((i + 1) * sizeSegment)))
jobs[(- 1)] = (jobs[(- 1)][0], (jobs[(- 1)][1] + (((problem.Kplus + problem.Kmin) + 1) % procs)))
jobs = [((job[0] - (problem.Kmin + 1)), (job[1] - (problem.Kmin + 1)), problem) for job in jobs]
Pool_main = Pool(procs)
pool = Pool_main.map(subsolver, jobs)
result = ((((sum(pool) * 2) * problem.k) * sin((pi * problem.beta))) / pi)
Pool_main.close()
return result | if (problem.intbeta>2):
raise(NotImplementedError('For multipool only beta>2 implemented so far'))
if(problem.rands == []):
raise(ValueError('Set rands before proceeding')) | utils.py | solve_problem | erik-grennberg-jansson/matern_sfem | 0 | python | def solve_problem(problem, procs):
"\n if (problem.intbeta>2):\n raise(NotImplementedError('For multipool only beta>2 implemented so far'))\n \n if(problem.rands == []):\n raise(ValueError('Set rands before proceeding'))\n "
jobs = []
sizeSegment = ((((problem.Kplus + problem.Kmin) + 1) - (((problem.Kplus + problem.Kmin) + 1) % procs)) / procs)
for i in range(0, procs):
jobs.append((((i * sizeSegment) + 1), ((i + 1) * sizeSegment)))
jobs[(- 1)] = (jobs[(- 1)][0], (jobs[(- 1)][1] + (((problem.Kplus + problem.Kmin) + 1) % procs)))
jobs = [((job[0] - (problem.Kmin + 1)), (job[1] - (problem.Kmin + 1)), problem) for job in jobs]
Pool_main = Pool(procs)
pool = Pool_main.map(subsolver, jobs)
result = ((((sum(pool) * 2) * problem.k) * sin((pi * problem.beta))) / pi)
Pool_main.close()
return result | def solve_problem(problem, procs):
"\n if (problem.intbeta>2):\n raise(NotImplementedError('For multipool only beta>2 implemented so far'))\n \n if(problem.rands == []):\n raise(ValueError('Set rands before proceeding'))\n "
jobs = []
sizeSegment = ((((problem.Kplus + problem.Kmin) + 1) - (((problem.Kplus + problem.Kmin) + 1) % procs)) / procs)
for i in range(0, procs):
jobs.append((((i * sizeSegment) + 1), ((i + 1) * sizeSegment)))
jobs[(- 1)] = (jobs[(- 1)][0], (jobs[(- 1)][1] + (((problem.Kplus + problem.Kmin) + 1) % procs)))
jobs = [((job[0] - (problem.Kmin + 1)), (job[1] - (problem.Kmin + 1)), problem) for job in jobs]
Pool_main = Pool(procs)
pool = Pool_main.map(subsolver, jobs)
result = ((((sum(pool) * 2) * problem.k) * sin((pi * problem.beta))) / pi)
Pool_main.close()
return result<|docstring|>if (problem.intbeta>2):
raise(NotImplementedError('For multipool only beta>2 implemented so far'))
if(problem.rands == []):
raise(ValueError('Set rands before proceeding'))<|endoftext|> |
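`solve_problem` above partitions `Kplus + Kmin + 1` indices into one contiguous segment per worker process, folding the remainder into the last segment. That segment arithmetic, separated from the `Pool` machinery, looks like this (1-indexed inclusive ranges, matching the original):

```python
def split_segments(total, procs):
    """Split `total` units of work into `procs` contiguous (start, end)
    segments, 1-indexed inclusive; the remainder goes to the last segment."""
    size = (total - total % procs) // procs
    jobs = [(i * size + 1, (i + 1) * size) for i in range(procs)]
    # Fold the leftover `total % procs` items into the final segment.
    last_start, last_end = jobs[-1]
    jobs[-1] = (last_start, last_end + total % procs)
    return jobs
```

For example, 10 items over 3 processes yields segments of 3, 3, and 4 items that together cover the whole range exactly once.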
5b114b1646b418e8f284c1525c8f3a290012a2d6b8fe655e7170d76e73e39eef | async def async_setup_platform(hass: HomeAssistant, config: ConfigType, async_add_entities: Callable[([Iterable[Entity]], None)], discovery_info: (DiscoveryInfoType | None)=None) -> None:
'Set up cover(s) for KNX platform.'
entities = []
for device in hass.data[DOMAIN].xknx.devices:
if isinstance(device, XknxCover):
entities.append(KNXCover(device))
async_add_entities(entities) | Set up cover(s) for KNX platform. | homeassistant/components/knx/cover.py | async_setup_platform | dn0sar/core | 1 | python | async def async_setup_platform(hass: HomeAssistant, config: ConfigType, async_add_entities: Callable[([Iterable[Entity]], None)], discovery_info: (DiscoveryInfoType | None)=None) -> None:
entities = []
for device in hass.data[DOMAIN].xknx.devices:
if isinstance(device, XknxCover):
entities.append(KNXCover(device))
async_add_entities(entities) | async def async_setup_platform(hass: HomeAssistant, config: ConfigType, async_add_entities: Callable[([Iterable[Entity]], None)], discovery_info: (DiscoveryInfoType | None)=None) -> None:
entities = []
for device in hass.data[DOMAIN].xknx.devices:
if isinstance(device, XknxCover):
entities.append(KNXCover(device))
async_add_entities(entities)<|docstring|>Set up cover(s) for KNX platform.<|endoftext|> |
b91d28f2dcedc36d5cc1c0381dfe4d23482c40d674338e088d27efa90790bc70 | def __init__(self, device: XknxCover):
'Initialize the cover.'
self._device: XknxCover
super().__init__(device)
self._unsubscribe_auto_updater: (Callable[([], None)] | None) = None | Initialize the cover. | homeassistant/components/knx/cover.py | __init__ | dn0sar/core | 1 | python | def __init__(self, device: XknxCover):
self._device: XknxCover
super().__init__(device)
self._unsubscribe_auto_updater: (Callable[([], None)] | None) = None | def __init__(self, device: XknxCover):
self._device: XknxCover
super().__init__(device)
self._unsubscribe_auto_updater: (Callable[([], None)] | None) = None<|docstring|>Initialize the cover.<|endoftext|> |
36447f0c91a6f37eb1f87b56cae7863c087e95ec27911f9f4ca07a45aef50f0a | @callback
async def after_update_callback(self, device: XknxDevice) -> None:
'Call after device was updated.'
self.async_write_ha_state()
if self._device.is_traveling():
self.start_auto_updater() | Call after device was updated. | homeassistant/components/knx/cover.py | after_update_callback | dn0sar/core | 1 | python | @callback
async def after_update_callback(self, device: XknxDevice) -> None:
self.async_write_ha_state()
if self._device.is_traveling():
self.start_auto_updater() | @callback
async def after_update_callback(self, device: XknxDevice) -> None:
self.async_write_ha_state()
if self._device.is_traveling():
self.start_auto_updater()<|docstring|>Call after device was updated.<|endoftext|> |
6d500ce62abb40345a441d6b195b7aa1596195f2ddb3a4bbd13ade8325ba60f4 | @property
def device_class(self) -> (str | None):
'Return the class of this device, from component DEVICE_CLASSES.'
if (self._device.device_class in DEVICE_CLASSES):
return self._device.device_class
if self._device.supports_angle:
return DEVICE_CLASS_BLIND
return None | Return the class of this device, from component DEVICE_CLASSES. | homeassistant/components/knx/cover.py | device_class | dn0sar/core | 1 | python | @property
def device_class(self) -> (str | None):
if (self._device.device_class in DEVICE_CLASSES):
return self._device.device_class
if self._device.supports_angle:
return DEVICE_CLASS_BLIND
return None | @property
def device_class(self) -> (str | None):
if (self._device.device_class in DEVICE_CLASSES):
return self._device.device_class
if self._device.supports_angle:
return DEVICE_CLASS_BLIND
return None<|docstring|>Return the class of this device, from component DEVICE_CLASSES.<|endoftext|> |
9f42a6bb061f0eba2ca705dabac2693a7ad9084075f25bc5c6831bb1bab58d96 | @property
def supported_features(self) -> int:
'Flag supported features.'
supported_features = ((SUPPORT_OPEN | SUPPORT_CLOSE) | SUPPORT_SET_POSITION)
if self._device.supports_stop:
supported_features |= SUPPORT_STOP
if self._device.supports_angle:
supported_features |= (((SUPPORT_SET_TILT_POSITION | SUPPORT_OPEN_TILT) | SUPPORT_CLOSE_TILT) | SUPPORT_STOP_TILT)
return supported_features | Flag supported features. | homeassistant/components/knx/cover.py | supported_features | dn0sar/core | 1 | python | @property
def supported_features(self) -> int:
supported_features = ((SUPPORT_OPEN | SUPPORT_CLOSE) | SUPPORT_SET_POSITION)
if self._device.supports_stop:
supported_features |= SUPPORT_STOP
if self._device.supports_angle:
supported_features |= (((SUPPORT_SET_TILT_POSITION | SUPPORT_OPEN_TILT) | SUPPORT_CLOSE_TILT) | SUPPORT_STOP_TILT)
return supported_features | @property
def supported_features(self) -> int:
supported_features = ((SUPPORT_OPEN | SUPPORT_CLOSE) | SUPPORT_SET_POSITION)
if self._device.supports_stop:
supported_features |= SUPPORT_STOP
if self._device.supports_angle:
supported_features |= (((SUPPORT_SET_TILT_POSITION | SUPPORT_OPEN_TILT) | SUPPORT_CLOSE_TILT) | SUPPORT_STOP_TILT)
return supported_features<|docstring|>Flag supported features.<|endoftext|> |
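The `supported_features` property above combines capability flags with bitwise OR and conditionally adds more with `|=`. A sketch of that flag arithmetic with illustrative constant values (the real Home Assistant `SUPPORT_*` constants are defined in the cover component, so the numbers here are assumptions):

```python
# Illustrative power-of-two flag values; each capability gets its own bit.
SUPPORT_OPEN = 1
SUPPORT_CLOSE = 2
SUPPORT_SET_POSITION = 4
SUPPORT_STOP = 8

# Base capabilities, then a conditional extension as in the property above.
features = SUPPORT_OPEN | SUPPORT_CLOSE | SUPPORT_SET_POSITION
supports_stop = True
if supports_stop:
    features |= SUPPORT_STOP
```

Membership is then tested with a bitwise AND, e.g. `features & SUPPORT_STOP`.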
body_hash: 33e0c78d095b34e386668ca0fca133555c1596e6f8a2bf7bef4dc16796b1cdd0
path: homeassistant/components/knx/cover.py
name: current_cover_position
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
@property
def current_cover_position(self) -> int | None:
    """Return the current position of the cover.

    None is unknown, 0 is closed, 100 is fully open.
    """
    pos = self._device.current_position()
    return 100 - pos if pos is not None else None

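The `100 - pos` flip suggests the underlying KNX device reports position with the opposite convention (0 = open) from Home Assistant (100 = open). The conversion in isolation:

```python
from typing import Optional

def ha_position(knx_pos: Optional[int]) -> Optional[int]:
    """Map a device position (0 = open, 100 = closed) to Home Assistant's
    convention (100 = open, 0 = closed); None stays None (unknown)."""
    return 100 - knx_pos if knx_pos is not None else None

print(ha_position(0), ha_position(100), ha_position(None))  # 100 0 None
```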
body_hash: 9a77f4322167dcf4267a16b985be62aca0754029041b49efccf79a1edb469536
path: homeassistant/components/knx/cover.py
name: is_closed
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
@property
def is_closed(self) -> bool | None:
    """Return if the cover is closed."""
    if self._device.current_position() is None:
        return None
    return self._device.is_closed()

body_hash: 3d19a3c76fc11ba036db2ecb4a85fc48613320e1c63633def64ac266e7ddaf30
path: homeassistant/components/knx/cover.py
name: is_opening
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
@property
def is_opening(self) -> bool:
    """Return if the cover is opening or not."""
    return self._device.is_opening()

body_hash: 325ae5feb5154732c237630c18c6d238be6212009e9279ef71e1ac3c7a4770f1
path: homeassistant/components/knx/cover.py
name: is_closing
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
@property
def is_closing(self) -> bool:
    """Return if the cover is closing or not."""
    return self._device.is_closing()

body_hash: 5600f7b6451b61bf57d97d5781d6bd77ce3a4a2dcb5c80ff8d8fb764f2004e18
path: homeassistant/components/knx/cover.py
name: async_close_cover
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_close_cover(self, **kwargs: Any) -> None:
    """Close the cover."""
    await self._device.set_down()

body_hash: 1d30d226ff05eeb7139415894aaa19bce2c320c567e45c2d26a6018a08d8fec8
path: homeassistant/components/knx/cover.py
name: async_open_cover
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_open_cover(self, **kwargs: Any) -> None:
    """Open the cover."""
    await self._device.set_up()

body_hash: 1ff17091dbf7d9f9bb5c80dc3983f20067155b2b3f615256affee7656e137739
path: homeassistant/components/knx/cover.py
name: async_set_cover_position
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_set_cover_position(self, **kwargs: Any) -> None:
    """Move the cover to a specific position."""
    knx_position = 100 - kwargs[ATTR_POSITION]
    await self._device.set_position(knx_position)

body_hash: ffb5b11660e7e261fe335be693b50b322c57176f2b2c84c27255a3d0f6202249
path: homeassistant/components/knx/cover.py
name: async_stop_cover
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_stop_cover(self, **kwargs: Any) -> None:
    """Stop the cover."""
    await self._device.stop()
    self.stop_auto_updater()

body_hash: 736b0c9298ffa5b8802d04b87190cb6fd70767dc23963b28591679ce4b9d8444
path: homeassistant/components/knx/cover.py
name: current_cover_tilt_position
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
@property
def current_cover_tilt_position(self) -> int | None:
    """Return current tilt position of cover."""
    if not self._device.supports_angle:
        return None
    ang = self._device.current_angle()
    return 100 - ang if ang is not None else None

body_hash: 644804d51e33f7508f2ff36d3e28e3d57433497db64620441aabb97ec2e73685
path: homeassistant/components/knx/cover.py
name: async_set_cover_tilt_position
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_set_cover_tilt_position(self, **kwargs: Any) -> None:
    """Move the cover tilt to a specific position."""
    knx_tilt_position = 100 - kwargs[ATTR_TILT_POSITION]
    await self._device.set_angle(knx_tilt_position)

body_hash: e60e71085665ccc4ca2f6febcff1a4bd33a26f99beedb734bd7e6830d71d6cb0
path: homeassistant/components/knx/cover.py
name: async_open_cover_tilt
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_open_cover_tilt(self, **kwargs: Any) -> None:
    """Open the cover tilt."""
    await self._device.set_short_up()

body_hash: 52f10477e3af42d5d685a8fd4bd31554a13b1198fd62a80924c1bfec9953c867
path: homeassistant/components/knx/cover.py
name: async_close_cover_tilt
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_close_cover_tilt(self, **kwargs: Any) -> None:
    """Close the cover tilt."""
    await self._device.set_short_down()

body_hash: 94bac232aa28883a5836c3aaebfa668af1215f05e6dbf21216191e95867e19a7
path: homeassistant/components/knx/cover.py
name: async_stop_cover_tilt
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
async def async_stop_cover_tilt(self, **kwargs: Any) -> None:
    """Stop the cover tilt."""
    await self._device.stop()
    self.stop_auto_updater()

body_hash: 49087ebcdbbca0257c5624f567e5305fba41ab48411c8154c877b31ff807ecce
path: homeassistant/components/knx/cover.py
name: start_auto_updater
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
def start_auto_updater(self) -> None:
    """Start the autoupdater to update Home Assistant while cover is moving."""
    if self._unsubscribe_auto_updater is None:
        self._unsubscribe_auto_updater = async_track_utc_time_change(
            self.hass, self.auto_updater_hook
        )

body_hash: 9e4ecc8dcd42e6792ca912ce702cd1db45e8a5d47313140a0e628aa1057609a3
path: homeassistant/components/knx/cover.py
name: stop_auto_updater
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
def stop_auto_updater(self) -> None:
    """Stop the autoupdater."""
    if self._unsubscribe_auto_updater is not None:
        self._unsubscribe_auto_updater()
        self._unsubscribe_auto_updater = None

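The start/stop pair guards a single subscription handle with a `None` check so that repeated calls are harmless. The same None-guard pattern in isolation, with a stand-in subscribe function:

```python
class AutoUpdater:
    """Minimal stand-in for the None-guarded subscribe/unsubscribe pattern."""

    def __init__(self, subscribe):
        self._subscribe = subscribe      # returns an unsubscribe callable
        self._unsubscribe = None

    def start(self):
        if self._unsubscribe is None:    # a second start() is a no-op
            self._unsubscribe = self._subscribe()

    def stop(self):
        if self._unsubscribe is not None:  # a second stop() is a no-op
            self._unsubscribe()
            self._unsubscribe = None

calls = {"sub": 0, "unsub": 0}

def fake_subscribe():
    calls["sub"] += 1
    def unsubscribe():
        calls["unsub"] += 1
    return unsubscribe

u = AutoUpdater(fake_subscribe)
u.start(); u.start()   # subscribes once
u.stop(); u.stop()     # unsubscribes once
print(calls)           # {'sub': 1, 'unsub': 1}
```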
body_hash: d60b71cf1555148bc0704bc679a98209830a500d21f3c79f29dbfb6ca9138be2
path: homeassistant/components/knx/cover.py
name: auto_updater_hook
repository_name: dn0sar/core
repository_stars: 1
lang: python
body:
@callback
def auto_updater_hook(self, now: datetime) -> None:
    """Call for the autoupdater."""
    self.async_write_ha_state()
    if self._device.position_reached():
        self.hass.async_create_task(self._device.auto_stop_if_necessary())
        self.stop_auto_updater()

body_hash: f13319334402478b1ce9c1ff98b68b4fca78b8f3aef338d20c9b7aff80e1575c
path: preprocess.py
name: catfile
repository_name: tbm/GSWL-book
repository_stars: 130
lang: python
body:
def catfile(s):
    """
    Find all occurences of 'Catfile <filepath>' and replace them by the content
    of <filepath>.
    """
    while True:
        try:
            m = next(re.finditer('\nCatxfile .*\n', s))
            filepath = s[m.start() + 9:m.end() - 1]
            with open(filepath, 'r') as fh:
                content = fh.read()
            content = TAB + content.replace('\n', '\n' + TAB)
            content = content[:-len(TAB)]
            s = ''.join([s[:m.start() + 1], content, s[m.end():]])
        except StopIteration:
            break
    return s

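The loop splices file contents into the text, indenting every inserted line with `TAB` so the include renders as a verbatim block. A self-contained sketch of the same splice; unlike the original (which matches `Catxfile` but slices with offsets sized for `Catfile`), this version matches the literal directive `Catfile` and extracts the path with a capture group:

```python
import re
import tempfile

TAB = '    '

def catfile(s: str) -> str:
    """Replace each '\\nCatfile <path>\\n' line with the indented file content."""
    while True:
        m = re.search(r'\nCatfile (.*)\n', s)
        if m is None:
            break
        with open(m.group(1), 'r') as fh:
            content = fh.read()
        content = TAB + content.replace('\n', '\n' + TAB)
        content = content[:-len(TAB)]   # drop the indent added after the final newline
        s = s[:m.start() + 1] + content + s[m.end():]
    return s

# Build a throwaway include file for the demo.
with tempfile.NamedTemporaryFile('w', suffix='.txt', delete=False) as f:
    f.write('ledger balance\n')
    path = f.name

print(catfile('Intro\nCatfile %s\nOutro\n' % path))
```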
body_hash: 232c4bd623c5d11f96c276d5d715f6b5d8822026ac83de18fa61062adae46e85
path: preprocess.py
name: caturl
repository_name: tbm/GSWL-book
repository_stars: 130
lang: python
body:
def caturl(s):
    """
    Find all occurences of 'Caturl <url>' and replace them by the content
    of <url>.
    """
    while True:
        try:
            m = next(re.finditer('\nCaturl .*\n', s))
            url = s[m.start() + 8:m.end() - 1]
            f = urllib.urlopen(url)
            content = f.read()
            s = ''.join([s[:m.start() + 1], content, s[m.end():]])
        except StopIteration:
            break
    return s

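`caturl` relies on the Python 2 `urllib.urlopen` API. A hedged Python 3 sketch of the same idea, with the fetch function injected so it can be exercised without network access (in production it could be, e.g., `lambda url: urllib.request.urlopen(url).read().decode()`):

```python
import re

def caturl(s: str, fetch) -> str:
    """Replace each '\\nCaturl <url>\\n' line with fetch(url)."""
    while True:
        m = re.search(r'\nCaturl (.*)\n', s)
        if m is None:
            break
        content = fetch(m.group(1))
        s = s[:m.start() + 1] + content + s[m.end():]
    return s

fake_fetch = lambda url: 'fetched:%s\n' % url
print(caturl('x\nCaturl http://example.com/a\ny\n', fake_fetch))
```

Note that, as in the original, fetched content containing another `Caturl` line would itself be expanded on the next pass of the loop.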
body_hash: ec57f537d62052783451a12cbc3e0fbd5467bcfc2813fcfe858a38587f1c065f
path: FEM/NonLinealExample.py
name: __init__
repository_name: ZibraMax/FEM
repository_stars: 10
lang: python
body:
def __init__(self, geometry: Geometry, a: Callable, f: Callable, **kargs) -> None:
    r"""Creates a nonlineal 1D equation with the form:

    .. math::
        -\frac{d}{dx}\left(a(x)u\frac{du}{dx}\right)=f(x)

    Args:
        geometry (Geometry): Input lineal geometry
        a (Callable): Function a
        f (Callable): Function f
    """
    self.a = a
    self.f = f
    Core.__init__(self, geometry, solver=NoLineal.Newton, **kargs)

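For reference, multiplying the strong form above by a test function v and integrating by parts gives the weak form that the element matrices discretize (a standard Galerkin step; boundary terms omitted for brevity):

```latex
\int_{x_a}^{x_b} a(x)\, u \,\frac{du}{dx}\,\frac{dv}{dx}\, dx
  = \int_{x_a}^{x_b} f(x)\, v \, dx
```

The product u·du/dx inside the integral is what makes the discrete system nonlinear in the nodal values, hence the Newton solver passed to `Core.__init__`.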
body_hash: 36bb70943c6c8c34e58e4a2e525c3d1fbbb0516d67d46983f979e49fb8f9a826
path: FEM/NonLinealExample.py
name: elementMatrices
repository_name: ZibraMax/FEM
repository_stars: 10
lang: python
body:
def elementMatrices(self) -> None:
    """Calculate the element matrices using Reddy's non lineal finite element model.
    Element matrices and forces are calculated with Gauss-Legendre quadrature.
    Point number depends on element discretization.
    """
    for e in tqdm(self.elements, unit='Element'):
        e.Te = np.zeros(e.Ke.shape)
        e.Fe = np.zeros(e.Fe.shape)
        e.Ke = np.zeros(e.Ke.shape)
        _x, _p = e.T(e.Z.T)
        jac, dpz = e.J(e.Z.T)
        detjac = np.linalg.det(jac)
        _j = np.linalg.inv(jac)
        dpx = _j @ dpz
        for i in range(e.n):
            for j in range(e.n):
                for k in range(len(e.Z)):
                    e.Ke[i, j] += (self.a(_x[k]) * e.Ue[0] @ _p[k]) * dpx[k][0][i] * dpx[k][0][j] * detjac[k] * e.W[k]
                    e.Te[i, j] += (_p[k][j] * dpx[k][0][i] * e.Ue[0] @ dpx[k][0]) * detjac[k] * e.W[k]
            for k in range(len(e.Z)):
                e.Fe[i][0] += self.f(_x[k]) * _p[k][i] * detjac[k] * e.W[k]
        e.Te += e.Ke

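The element integrals above are evaluated at Gauss-Legendre points `e.Z` with weights `e.W`. A minimal numpy illustration of that quadrature rule, independent of the FEM package:

```python
import numpy as np

# 3-point Gauss-Legendre rule on [-1, 1]: exact for polynomials up to degree 5.
points, weights = np.polynomial.legendre.leggauss(3)

# Integrate x**2 over [-1, 1]; the exact value is 2/3.
integral = sum(w * z**2 for z, w in zip(points, weights))
print(integral)  # ≈ 2/3
```

In the element routine the same sum runs over `k`, with the Jacobian determinant `detjac[k]` mapping the reference interval to the physical element.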
body_hash: 656ed387db847eb4c3674c95c76cddda9892881d845057a55f6f1a74337f3004
path: FEM/NonLinealExample.py
name: postProcess
repository_name: ZibraMax/FEM
repository_stars: 10
lang: python
body:
def postProcess(self) -> None:
    """Generate graph of solution and solution derivative."""
    X = []
    U1 = []
    U2 = []
    fig = plt.figure()
    ax1 = fig.add_subplot(1, 2, 1)
    ax2 = fig.add_subplot(1, 2, 2)
    for e in tqdm(self.elements, unit='Element'):
        _x, _u, du = e.giveSolution(True)
        X += _x.T[0].tolist()
        U1 += _u[0].tolist()
        U2 += du[:, 0, 0].tolist()
    ax1.plot(X, U1)
    ax2.plot(X, np.array(U2))
    ax1.grid()
    ax2.grid()
    ax1.set_title('$U(x)$')
    ax2.set_title(r'$\frac{dU}{dx}$')

body_hash: c6041dfc3b8d9ac738c61320cc89ab458ac0a590db61b1be13d771aaff3cd585
path: core/controllers/controller_aux.py
name: block_diag
repository_name: PastorD/ensemblempc
repository_stars: 2
lang: python
body:
def block_diag(M, n):
    """bd creates a sparse block diagonal matrix by repeating M n times

    Args:
        M (2d numpy array): matrix to be repeated
        n (float): number of times to repeat
    """
    return sp.sparse.block_diag([M for i in range(n)])

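`block_diag` delegates to `scipy.sparse.block_diag`. The dense equivalent, spelled out with plain lists to show the layout this produces (a sketch, not the scipy implementation):

```python
def block_diag_dense(M, n):
    """Tile matrix M n times along the diagonal; off-diagonal blocks are zero."""
    rows, cols = len(M), len(M[0])
    out = [[0] * (cols * n) for _ in range(rows * n)]
    for k in range(n):                    # k-th diagonal block
        for i in range(rows):
            for j in range(cols):
                out[k * rows + i][k * cols + j] = M[i][j]
    return out

print(block_diag_dense([[1, 2], [3, 4]], 2))
# [[1, 2, 0, 0], [3, 4, 0, 0], [0, 0, 1, 2], [0, 0, 3, 4]]
```

The sparse version stores only the nonzero blocks, which matters when the repeated matrix is large or `n` is big (e.g. one block per prediction step in an MPC horizon).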
body_hash: 3d1317b4188f9a619f0df9b1cbf01c41d1d6634ad69ab0f668fa3e7c818ddead
path: moneta/repositories/flat_files.py
name: update_element
repository_name: pombredanne/Moneta
repository_stars: 6
lang: python
body:
def update_element(self, element):
    """
    Extract some information from element to prepare the repository
    :param element: Element to add to the repository
    :return: Unicode string containing meta-data

    ar -x control.tar.gz
    tar -xf control.tar.gz control
    """
    if element.archive:
        element.name = element.archive.rpartition('.')[2]

body_hash: c3437bcd4ae5703d4848b77c0429ddef280ddf88dcb8ff1b5621aa26e63fc54f
path: moneta/repositories/flat_files.py
name: finish_element
repository_name: pombredanne/Moneta
repository_stars: 6
lang: python
body:
def finish_element(self, element: Element, states: list):
    """
    Called after the .save() operations, with all states associated to this new element.
    Remove previous versions from target states
    :param element: Element
    :param states: list of ArchiveState
    """
    Element.states.through.objects.exclude(element__version=element.version).filter(
        archivestate__in=states, element__archive=element.archive
    ).delete()
    super().finish_element(element, states)

body_hash: ece28dfe62680eb5960a0d9cbec43eaff60ce23de53c80ad9a90bf8bb0c0252b
path: rabbitmq/asyncpub.py
name: __init__
repository_name: irr/python-labs
repository_stars: 4
lang: python
body:
def __init__(self, amqp_url):
    """Setup the example publisher object, passing in the URL we will use
    to connect to RabbitMQ.

    :param str amqp_url: The URL for connecting to RabbitMQ

    """
    self._connection = None
    self._channel = None
    self._deliveries = None
    self._acked = None
    self._nacked = None
    self._message_number = None
    self._stopping = False
    self._url = amqp_url

body_hash: ac11654cfdbdbbcd9b71d5f3994d04aed5be1a2d20ffe963d4517fa8011abb63
path: rabbitmq/asyncpub.py
name: connect
repository_name: irr/python-labs
repository_stars: 4
lang: python
body:
def connect(self):
    """This method connects to RabbitMQ, returning the connection handle.
    When the connection is established, the on_connection_open method
    will be invoked by pika. If you want the reconnection to work, make
    sure you set stop_ioloop_on_close to False, which is not the default
    behavior of this adapter.

    :rtype: pika.SelectConnection

    """
    LOGGER.info('Connecting to %s', self._url)
    return pika.SelectConnection(
        pika.URLParameters(self._url),
        on_open_callback=self.on_connection_open,
        on_close_callback=self.on_connection_closed,
        stop_ioloop_on_close=False,
    )

body_hash: 5b9eb494164b386b53e74b2383441cd18aac401ba733274591f7d373b803dc83
path: rabbitmq/asyncpub.py
name: on_connection_open
repository_name: irr/python-labs
repository_stars: 4
lang: python
body:
def on_connection_open(self, unused_connection):
    """This method is called by pika once the connection to RabbitMQ has
    been established. It passes the handle to the connection object in
    case we need it, but in this case, we'll just mark it unused.

    :type unused_connection: pika.SelectConnection

    """
    LOGGER.info('Connection opened')
    self.open_channel()

body_hash: 1895966243365db08bc4e865023732634a933ecdcebe31bcd0b38f0547b14a6b
path: rabbitmq/asyncpub.py
name: on_connection_closed
repository_name: irr/python-labs
repository_stars: 4
lang: python
body:
def on_connection_closed(self, connection, reply_code, reply_text):
    """This method is invoked by pika when the connection to RabbitMQ is
    closed unexpectedly. Since it is unexpected, we will reconnect to
    RabbitMQ if it disconnects.

    :param pika.connection.Connection connection: The closed connection obj
    :param int reply_code: The server provided reply_code if given
    :param str reply_text: The server provided reply_text if given

    """
    self._channel = None
    if self._stopping:
        self._connection.ioloop.stop()
    else:
        LOGGER.warning('Connection closed, reopening in 5 seconds: (%s) %s',
                       reply_code, reply_text)
        self._connection.add_timeout(5, self._connection.ioloop.stop)

def open_channel(self):
    """This method will open a new channel with RabbitMQ by issuing the
    Channel.Open RPC command. When RabbitMQ confirms the channel is open
    by sending the Channel.OpenOK RPC reply, the on_channel_open method
    will be invoked.

    """
    LOGGER.info('Creating a new channel')
    self._connection.channel(on_open_callback=self.on_channel_open)
def on_channel_open(self, channel):
    """This method is invoked by pika when the channel has been opened.
    The channel object is passed in so we can make use of it.

    Since the channel is now open, we'll declare the exchange to use.

    :param pika.channel.Channel channel: The channel object

    """
    LOGGER.info('Channel opened')
    self._channel = channel
    self.add_on_channel_close_callback()
    self.setup_exchange(self.EXCHANGE)
def add_on_channel_close_callback(self):
    """This method tells pika to call the on_channel_closed method if
    RabbitMQ unexpectedly closes the channel.

    """
    LOGGER.info('Adding channel close callback')
    self._channel.add_on_close_callback(self.on_channel_closed)
def on_channel_closed(self, channel, reply_code, reply_text):
    """Invoked by pika when RabbitMQ unexpectedly closes the channel.
    Channels are usually closed if you attempt to do something that
    violates the protocol, such as re-declare an exchange or queue with
    different parameters. In this case, we'll close the connection
    to shutdown the object.

    :param pika.channel.Channel channel: The closed channel
    :param int reply_code: The numeric reason the channel was closed
    :param str reply_text: The text reason the channel was closed

    """
    LOGGER.warning('Channel was closed: (%s) %s', reply_code, reply_text)
    self._channel = None
    if not self._stopping:
        self._connection.close()
def setup_exchange(self, exchange_name):
    """Setup the exchange on RabbitMQ by invoking the Exchange.Declare RPC
    command. When it is complete, the on_exchange_declareok method will
    be invoked by pika.

    :param str|unicode exchange_name: The name of the exchange to declare

    """
    LOGGER.info('Declaring exchange %s', exchange_name)
    self._channel.exchange_declare(self.on_exchange_declareok,
                                   exchange_name,
                                   self.EXCHANGE_TYPE)
def on_exchange_declareok(self, unused_frame):
    """Invoked by pika when RabbitMQ has finished the Exchange.Declare RPC
    command.

    :param pika.Frame.Method unused_frame: Exchange.DeclareOk response frame

    """
    LOGGER.info('Exchange declared')
    self.setup_queue(self.QUEUE)
def setup_queue(self, queue_name):
    """Setup the queue on RabbitMQ by invoking the Queue.Declare RPC
    command. When it is complete, the on_queue_declareok method will
    be invoked by pika.

    :param str|unicode queue_name: The name of the queue to declare.

    """
    LOGGER.info('Declaring queue %s', queue_name)
    self._channel.queue_declare(self.on_queue_declareok, queue_name)
def on_queue_declareok(self, method_frame):
    """Method invoked by pika when the Queue.Declare RPC call made in
    setup_queue has completed. In this method we will bind the queue
    and exchange together with the routing key by issuing the Queue.Bind
    RPC command. When this command is complete, the on_bindok method will
    be invoked by pika.

    :param pika.frame.Method method_frame: The Queue.DeclareOk frame

    """
    LOGGER.info('Binding %s to %s with %s', self.EXCHANGE, self.QUEUE,
                self.ROUTING_KEY)
    self._channel.queue_bind(self.on_bindok, self.QUEUE, self.EXCHANGE,
                             self.ROUTING_KEY)
def on_bindok(self, unused_frame):
    """This method is invoked by pika when it receives the Queue.BindOk
    response from RabbitMQ. Since we know we're now setup and bound, it's
    time to start publishing.

    """
    LOGGER.info('Queue bound')
    self.start_publishing()
def start_publishing(self):
    """This method will enable delivery confirmations and schedule the
    first message to be sent to RabbitMQ.

    """
    LOGGER.info('Issuing consumer related RPC commands')
    self.enable_delivery_confirmations()
    self.schedule_next_message()
def enable_delivery_confirmations(self):
    """Send the Confirm.Select RPC method to RabbitMQ to enable delivery
    confirmations on the channel. The only way to turn this off is to close
    the channel and create a new one.

    When the message is confirmed from RabbitMQ, the
    on_delivery_confirmation method will be invoked passing in a Basic.Ack
    or Basic.Nack method from RabbitMQ that will indicate which messages it
    is confirming or rejecting.

    """
    LOGGER.info('Issuing Confirm.Select RPC command')
    self._channel.confirm_delivery(self.on_delivery_confirmation)
def on_delivery_confirmation(self, method_frame):
    """Invoked by pika when RabbitMQ responds to a Basic.Publish RPC
    command, passing in either a Basic.Ack or Basic.Nack frame with
    the delivery tag of the message that was published. The delivery tag
    is an integer counter indicating the message number that was sent
    on the channel via Basic.Publish. Here we're just doing housekeeping
    to keep track of stats and remove message numbers that we expect
    a delivery confirmation of from the list used to keep track of messages
    that are pending confirmation.

    :param pika.frame.Method method_frame: Basic.Ack or Basic.Nack frame

    """
    confirmation_type = method_frame.method.NAME.split('.')[1].lower()
    LOGGER.info('Received %s for delivery tag: %i', confirmation_type,
                method_frame.method.delivery_tag)
    if confirmation_type == 'ack':
        self._acked += 1
    elif confirmation_type == 'nack':
        self._nacked += 1
    self._deliveries.remove(method_frame.method.delivery_tag)
    LOGGER.info('Published %i messages, %i have yet to be confirmed, '
                '%i were acked and %i were nacked', self._message_number,
                len(self._deliveries), self._acked, self._nacked)
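The ack/nack bookkeeping above can be exercised without a broker. The sketch below is illustrative, not part of pika's API: `_Method` and `_Frame` are hypothetical stubs standing in for pika's Basic.Ack / Basic.Nack method frames, and `ConfirmationTracker` isolates just the counter and pending-delivery logic.

```python
class _Method:
    """Stub for a pika method object (e.g. Basic.Ack or Basic.Nack)."""

    def __init__(self, name, delivery_tag):
        self.NAME = name  # e.g. 'Basic.Ack' or 'Basic.Nack'
        self.delivery_tag = delivery_tag


class _Frame:
    """Stub for a pika method frame wrapping a method object."""

    def __init__(self, name, delivery_tag):
        self.method = _Method(name, delivery_tag)


class ConfirmationTracker:
    """Tracks which published delivery tags RabbitMQ has confirmed."""

    def __init__(self):
        self.deliveries = []  # delivery tags still awaiting confirmation
        self.acked = 0
        self.nacked = 0

    def publish(self, message_number):
        self.deliveries.append(message_number)

    def on_delivery_confirmation(self, method_frame):
        # Same parsing as above: 'Basic.Ack' -> 'ack', 'Basic.Nack' -> 'nack'
        confirmation_type = method_frame.method.NAME.split('.')[1].lower()
        if confirmation_type == 'ack':
            self.acked += 1
        elif confirmation_type == 'nack':
            self.nacked += 1
        self.deliveries.remove(method_frame.method.delivery_tag)


tracker = ConfirmationTracker()
tracker.publish(1)
tracker.publish(2)
tracker.on_delivery_confirmation(_Frame('Basic.Ack', 1))
tracker.on_delivery_confirmation(_Frame('Basic.Nack', 2))
```

After those two confirmations, one message is acked, one is nacked, and no delivery tags remain pending.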
def schedule_next_message(self):
    """If we are not closing our connection to RabbitMQ, schedule another
    message to be delivered in PUBLISH_INTERVAL seconds.

    """
    LOGGER.info('Scheduling next message for %0.1f seconds',
                self.PUBLISH_INTERVAL)
    self._connection.add_timeout(self.PUBLISH_INTERVAL,
                                 self.publish_message)
9049e772fad9ec54cbeb2883470e142d005d0e9ba2feb75e6cab936fe2e023ff | def publish_message(self):
'If the class is not stopping, publish a message to RabbitMQ,\n appending a list of deliveries with the message number that was sent.\n This list will be used to check for delivery confirmations in the\n on_delivery_confirmations method.\n\n Once the message has been sent, schedule another message to be sent.\n The main reason I put scheduling in was just so you can get a good idea\n of how the process is flowing by slowing down and speeding up the\n delivery intervals by changing the PUBLISH_INTERVAL constant in the\n class.\n\n '
if ((self._channel is None) or (not self._channel.is_open)):
return
message = {u'مفتاح': u' قيمة', u'键': u'值', u'キー': u'値'}
properties = pika.BasicProperties(app_id='example-publisher', content_type='application/json', headers=message)
self._channel.basic_publish(self.EXCHANGE, self.ROUTING_KEY, json.dumps(message, ensure_ascii=False), properties)
self._message_number += 1
self._deliveries.append(self._message_number)
LOGGER.info('Published message # %i', self._message_number)
self.schedule_next_message() | If the class is not stopping, publish a message to RabbitMQ,
appending a list of deliveries with the message number that was sent.
This list will be used to check for delivery confirmations in the
on_delivery_confirmations method.
Once the message has been sent, schedule another message to be sent.
The main reason I put scheduling in was just so you can get a good idea
of how the process is flowing by slowing down and speeding up the
delivery intervals by changing the PUBLISH_INTERVAL constant in the
class. | rabbitmq/asyncpub.py | publish_message | irr/python-labs | 4 | python | def publish_message(self):
'If the class is not stopping, publish a message to RabbitMQ,\n appending a list of deliveries with the message number that was sent.\n This list will be used to check for delivery confirmations in the\n on_delivery_confirmations method.\n\n Once the message has been sent, schedule another message to be sent.\n The main reason I put scheduling in was just so you can get a good idea\n of how the process is flowing by slowing down and speeding up the\n delivery intervals by changing the PUBLISH_INTERVAL constant in the\n class.\n\n '
if ((self._channel is None) or (not self._channel.is_open)):
return
message = {u'مفتاح': u' قيمة', u'键': u'值', u'キー': u'値'}
properties = pika.BasicProperties(app_id='example-publisher', content_type='application/json', headers=message)
self._channel.basic_publish(self.EXCHANGE, self.ROUTING_KEY, json.dumps(message, ensure_ascii=False), properties)
self._message_number += 1
self._deliveries.append(self._message_number)
LOGGER.info('Published message # %i', self._message_number)
self.schedule_next_message() | def publish_message(self):
'If the class is not stopping, publish a message to RabbitMQ,\n appending a list of deliveries with the message number that was sent.\n This list will be used to check for delivery confirmations in the\n on_delivery_confirmations method.\n\n Once the message has been sent, schedule another message to be sent.\n The main reason I put scheduling in was just so you can get a good idea\n of how the process is flowing by slowing down and speeding up the\n delivery intervals by changing the PUBLISH_INTERVAL constant in the\n class.\n\n '
if ((self._channel is None) or (not self._channel.is_open)):
return
message = {u'مفتاح': u' قيمة', u'键': u'值', u'キー': u'値'}
properties = pika.BasicProperties(app_id='example-publisher', content_type='application/json', headers=message)
self._channel.basic_publish(self.EXCHANGE, self.ROUTING_KEY, json.dumps(message, ensure_ascii=False), properties)
self._message_number += 1
self._deliveries.append(self._message_number)
LOGGER.info('Published message # %i', self._message_number)
self.schedule_next_message()<|docstring|>If the class is not stopping, publish a message to RabbitMQ,
appending a list of deliveries with the message number that was sent.
This list will be used to check for delivery confirmations in the
on_delivery_confirmations method.
Once the message has been sent, schedule another message to be sent.
The main reason I put scheduling in was just so you can get a good idea
of how the process is flowing by slowing down and speeding up the
delivery intervals by changing the PUBLISH_INTERVAL constant in the
class.<|endoftext|> |
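The delivery bookkeeping this docstring describes — append each published message number to a list, then reconcile it when the broker confirms delivery — can be sketched without RabbitMQ at all. The `DeliveryTracker` class and its method names below are illustrative, not part of pika:

```python
class DeliveryTracker:
    """Minimal sketch of the publisher's delivery bookkeeping (no broker needed)."""

    def __init__(self):
        self._message_number = 0
        self._deliveries = []   # message numbers still awaiting confirmation
        self._acked = 0

    def publish(self):
        # Mirrors publish_message: bump the counter and remember the delivery.
        self._message_number += 1
        self._deliveries.append(self._message_number)
        return self._message_number

    def confirm(self, delivery_tag):
        # Mirrors what a delivery-confirmation callback would do for an ack.
        self._deliveries.remove(delivery_tag)
        self._acked += 1


tracker = DeliveryTracker()
first = tracker.publish()
tracker.publish()
tracker.confirm(first)
```

After one confirmation, only the second message number remains outstanding in `_deliveries`.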
ee574f0da2abb01749aae9907f42bfd3413656f24509ea57eaf115269933c91e | def run(self):
'Run the example code by connecting and then starting the IOLoop.\n\n '
while (not self._stopping):
self._connection = None
self._deliveries = []
self._acked = 0
self._nacked = 0
self._message_number = 0
try:
self._connection = self.connect()
self._connection.ioloop.start()
except KeyboardInterrupt:
self.stop()
if ((self._connection is not None) and (not self._connection.is_closed)):
self._connection.ioloop.start()
LOGGER.info('Stopped') | Run the example code by connecting and then starting the IOLoop. | rabbitmq/asyncpub.py | run | irr/python-labs | 4 | python | def run(self):
'\n\n '
while (not self._stopping):
self._connection = None
self._deliveries = []
self._acked = 0
self._nacked = 0
self._message_number = 0
try:
self._connection = self.connect()
self._connection.ioloop.start()
except KeyboardInterrupt:
self.stop()
if ((self._connection is not None) and (not self._connection.is_closed)):
self._connection.ioloop.start()
LOGGER.info('Stopped') | def run(self):
'\n\n '
while (not self._stopping):
self._connection = None
self._deliveries = []
self._acked = 0
self._nacked = 0
self._message_number = 0
try:
self._connection = self.connect()
self._connection.ioloop.start()
except KeyboardInterrupt:
self.stop()
if ((self._connection is not None) and (not self._connection.is_closed)):
self._connection.ioloop.start()
LOGGER.info('Stopped')<|docstring|>Run the example code by connecting and then starting the IOLoop.<|endoftext|> |
1fca1bca13cef8ea47f0a4c4dc10bdf30aa5ba35adc0fd236ceb78af1a5d5a3e | def stop(self):
'Stop the example by closing the channel and connection. We\n set a flag here so that we stop scheduling new messages to be\n published. The IOLoop is started because this method is\n invoked by the Try/Catch below when KeyboardInterrupt is caught.\n Starting the IOLoop again will allow the publisher to cleanly\n disconnect from RabbitMQ.\n\n '
LOGGER.info('Stopping')
self._stopping = True
self.close_channel()
self.close_connection() | Stop the example by closing the channel and connection. We
set a flag here so that we stop scheduling new messages to be
published. The IOLoop is started because this method is
invoked by the Try/Catch below when KeyboardInterrupt is caught.
Starting the IOLoop again will allow the publisher to cleanly
disconnect from RabbitMQ. | rabbitmq/asyncpub.py | stop | irr/python-labs | 4 | python | def stop(self):
'Stop the example by closing the channel and connection. We\n set a flag here so that we stop scheduling new messages to be\n published. The IOLoop is started because this method is\n invoked by the Try/Catch below when KeyboardInterrupt is caught.\n Starting the IOLoop again will allow the publisher to cleanly\n disconnect from RabbitMQ.\n\n '
LOGGER.info('Stopping')
self._stopping = True
self.close_channel()
self.close_connection() | def stop(self):
'Stop the example by closing the channel and connection. We\n set a flag here so that we stop scheduling new messages to be\n published. The IOLoop is started because this method is\n invoked by the Try/Catch below when KeyboardInterrupt is caught.\n Starting the IOLoop again will allow the publisher to cleanly\n disconnect from RabbitMQ.\n\n '
LOGGER.info('Stopping')
self._stopping = True
self.close_channel()
self.close_connection()<|docstring|>Stop the example by closing the channel and connection. We
set a flag here so that we stop scheduling new messages to be
published. The IOLoop is started because this method is
invoked by the Try/Catch below when KeyboardInterrupt is caught.
Starting the IOLoop again will allow the publisher to cleanly
disconnect from RabbitMQ.<|endoftext|> |
f6fb2d83ef95fbf5ea84a371347bef0595e9482223d3b8c3356a319bbd8cfc8c | def close_channel(self):
'Invoke this command to close the channel with RabbitMQ by sending\n the Channel.Close RPC command.\n\n '
if (self._channel is not None):
LOGGER.info('Closing the channel')
self._channel.close() | Invoke this command to close the channel with RabbitMQ by sending
the Channel.Close RPC command. | rabbitmq/asyncpub.py | close_channel | irr/python-labs | 4 | python | def close_channel(self):
'Invoke this command to close the channel with RabbitMQ by sending\n the Channel.Close RPC command.\n\n '
if (self._channel is not None):
LOGGER.info('Closing the channel')
self._channel.close() | def close_channel(self):
'Invoke this command to close the channel with RabbitMQ by sending\n the Channel.Close RPC command.\n\n '
if (self._channel is not None):
LOGGER.info('Closing the channel')
self._channel.close()<|docstring|>Invoke this command to close the channel with RabbitMQ by sending
the Channel.Close RPC command.<|endoftext|> |
9118492884dd3c05f4f601a4e1a58738d27c6272898a3ebd0f9cebfee9c3d988 | def close_connection(self):
'This method closes the connection to RabbitMQ.'
if (self._connection is not None):
LOGGER.info('Closing connection')
self._connection.close() | This method closes the connection to RabbitMQ. | rabbitmq/asyncpub.py | close_connection | irr/python-labs | 4 | python | def close_connection(self):
if (self._connection is not None):
LOGGER.info('Closing connection')
self._connection.close() | def close_connection(self):
if (self._connection is not None):
LOGGER.info('Closing connection')
self._connection.close()<|docstring|>This method closes the connection to RabbitMQ.<|endoftext|> |
c55aa5a653e96763264f663b1b663ef26e9683c94e6bc51c9103a9dc38d43171 | def save_secret_key(key: str) -> str:
'\n Saves the secret key to the .env file.\n '
env_file_path = _get_env_file_path()
print('Secret key does not exist; saving to', env_file_path, file=sys.stderr)
with env_file_path.open('a', encoding='UTF-8') as env_file:
env_file.write(f'''SECRET_KEY={key}\n''')
return key | Saves the secret key to the .env file. | src/morphodict/site/save_secret_key.py | save_secret_key | sarahrmoeller/morphodict | 8 | python | def save_secret_key(key: str) -> str:
'\n \n '
env_file_path = _get_env_file_path()
print('Secret key does not exist; saving to', env_file_path, file=sys.stderr)
with env_file_path.open('a', encoding='UTF-8') as env_file:
env_file.write(f'SECRET_KEY={key}\n')
return key | def save_secret_key(key: str) -> str:
'\n \n '
env_file_path = _get_env_file_path()
print('Secret key does not exist; saving to', env_file_path, file=sys.stderr)
with env_file_path.open('a', encoding='UTF-8') as env_file:
env_file.write(f'SECRET_KEY={key}\n')
return key<|docstring|>Saves the secret key to the .env file.<|endoftext|> |
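A self-contained variant of `save_secret_key` with the target path made an explicit parameter (a hypothetical signature, not morphodict's actual API) so the append-to-`.env` behaviour can be exercised against a scratch directory:

```python
import tempfile
from pathlib import Path


def save_secret_key(key: str, env_file_path: Path) -> str:
    """Append SECRET_KEY=<key> to the given .env file and return the key."""
    with env_file_path.open('a', encoding='UTF-8') as env_file:
        env_file.write(f'SECRET_KEY={key}\n')
    return key


with tempfile.TemporaryDirectory() as tmp:
    env_path = Path(tmp) / '.env'
    returned = save_secret_key('abc123', env_path)
    content = env_path.read_text(encoding='UTF-8')
```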
85649fe9c4b474160971b69854e5a3d498fb82188f076b75621f848c4727f684 | def _get_env_file_path() -> Path:
'\n Return the path to the .env file at the root of the repository\n '
if ('MORPHODICT_ENV_FILE_PATH' in os.environ):
return Path(os.environ['MORPHODICT_ENV_FILE_PATH'])
path = Path(__file__)
while ((path != path.parent) and (not (path / 'pyproject.toml').is_file())):
path = path.parent
assert (path / 'pyproject.toml').is_file()
base_dir = path
return (base_dir / '.env') | Return the path to the .env file at the root of the repository | src/morphodict/site/save_secret_key.py | _get_env_file_path | sarahrmoeller/morphodict | 8 | python | def _get_env_file_path() -> Path:
'\n \n '
if ('MORPHODICT_ENV_FILE_PATH' in os.environ):
return Path(os.environ['MORPHODICT_ENV_FILE_PATH'])
path = Path(__file__)
while ((path != path.parent) and (not (path / 'pyproject.toml').is_file())):
path = path.parent
assert (path / 'pyproject.toml').is_file()
base_dir = path
return (base_dir / '.env') | def _get_env_file_path() -> Path:
'\n \n '
if ('MORPHODICT_ENV_FILE_PATH' in os.environ):
return Path(os.environ['MORPHODICT_ENV_FILE_PATH'])
path = Path(__file__)
while ((path != path.parent) and (not (path / 'pyproject.toml').is_file())):
path = path.parent
assert (path / 'pyproject.toml').is_file()
base_dir = path
return (base_dir / '.env')<|docstring|>Return the path to the .env file at the root of the repository<|endoftext|> |
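The upward walk in `_get_env_file_path` can be isolated and tested against a throwaway directory tree; `find_project_root` here is a hypothetical helper, not part of morphodict:

```python
import tempfile
from pathlib import Path


def find_project_root(start: Path) -> Path:
    """Walk up from `start` until a directory containing pyproject.toml is found."""
    path = start
    while path != path.parent and not (path / 'pyproject.toml').is_file():
        path = path.parent
    if not (path / 'pyproject.toml').is_file():
        raise FileNotFoundError(f'no pyproject.toml above {start}')
    return path


with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / 'pyproject.toml').write_text('[tool.example]\n')
    nested = root / 'src' / 'pkg'
    nested.mkdir(parents=True)
    found = find_project_root(nested)
```

Unlike the original, the sketch raises instead of asserting when no `pyproject.toml` exists anywhere above the start directory.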
668736639075ad9a23975048d456ca2089c58b23b4daf4eb50ba0d0ac9aec771 | def convert_to_datetime(line):
'TODO 1:\n Given a log line extract its timestamp and convert it to a datetime object. \n For example calling the function with:\n INFO 2014-07-03T23:27:51 supybot Shutdown complete.\n returns:\n datetime(2014, 7, 3, 23, 27, 51)'
(a, b, c, d, e, f) = (int(digit) for digit in re.findall('\\d+', line.split()[1]))
return datetime(a, b, c, d, e, f) | TODO 1:
Given a log line extract its timestamp and convert it to a datetime object.
For example calling the function with:
INFO 2014-07-03T23:27:51 supybot Shutdown complete.
returns:
datetime(2014, 7, 3, 23, 27, 51) | days/01-03-datetimes/code/parse_date_from_logs.py | convert_to_datetime | VladimirLind/100daysofcode-with-python-course | 0 | python | def convert_to_datetime(line):
'TODO 1:\n Given a log line extract its timestamp and convert it to a datetime object. \n For example calling the function with:\n INFO 2014-07-03T23:27:51 supybot Shutdown complete.\n returns:\n datetime(2014, 7, 3, 23, 27, 51)'
(a, b, c, d, e, f) = (int(digit) for digit in re.findall('\\d+', line.split()[1]))
return datetime(a, b, c, d, e, f) | def convert_to_datetime(line):
'TODO 1:\n Given a log line extract its timestamp and convert it to a datetime object. \n For example calling the function with:\n INFO 2014-07-03T23:27:51 supybot Shutdown complete.\n returns:\n datetime(2014, 7, 3, 23, 27, 51)'
(a, b, c, d, e, f) = (int(digit) for digit in re.findall('\\d+', line.split()[1]))
return datetime(a, b, c, d, e, f)<|docstring|>TODO 1:
Given a log line extract its timestamp and convert it to a datetime object.
For example calling the function with:
INFO 2014-07-03T23:27:51 supybot Shutdown complete.
returns:
datetime(2014, 7, 3, 23, 27, 51)<|endoftext|> |
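An equivalent of `convert_to_datetime` using `datetime.strptime` instead of `re.findall`, which also validates the timestamp's shape while parsing:

```python
from datetime import datetime


def convert_to_datetime(line: str) -> datetime:
    """Parse the second whitespace-separated field as an ISO-like timestamp."""
    timestamp = line.split()[1]
    return datetime.strptime(timestamp, '%Y-%m-%dT%H:%M:%S')


parsed = convert_to_datetime('INFO 2014-07-03T23:27:51 supybot Shutdown complete.')
```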
01b174f2a1192ef0deaecf80465174143750dab095810ed8bab7d74ed159e026 | def time_between_shutdowns(loglines):
'TODO 2:\n Extract shutdown events ("Shutdown initiated") from loglines and calculate the \n timedelta between the first and last one. \n Return this datetime.timedelta object.'
shutdown_timestamps = [convert_to_datetime(line) for line in loglines if ('Shutdown initiated' in line)]
return (shutdown_timestamps[(- 1)] - shutdown_timestamps[0]) | TODO 2:
Extract shutdown events ("Shutdown initiated") from loglines and calculate the
timedelta between the first and last one.
Return this datetime.timedelta object. | days/01-03-datetimes/code/parse_date_from_logs.py | time_between_shutdowns | VladimirLind/100daysofcode-with-python-course | 0 | python | def time_between_shutdowns(loglines):
'TODO 2:\n Extract shutdown events ("Shutdown initiated") from loglines and calculate the \n timedelta between the first and last one. \n Return this datetime.timedelta object.'
shutdown_timestamps = [convert_to_datetime(line) for line in loglines if ('Shutdown initiated' in line)]
return (shutdown_timestamps[(- 1)] - shutdown_timestamps[0]) | def time_between_shutdowns(loglines):
'TODO 2:\n Extract shutdown events ("Shutdown initiated") from loglines and calculate the \n timedelta between the first and last one. \n Return this datetime.timedelta object.'
shutdown_timestamps = [convert_to_datetime(line) for line in loglines if ('Shutdown initiated' in line)]
return (shutdown_timestamps[(- 1)] - shutdown_timestamps[0])<|docstring|>TODO 2:
Extract shutdown events ("Shutdown initiated") from loglines and calculate the
timedelta between the first and last one.
Return this datetime.timedelta object.<|endoftext|> |
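The same filter-then-subtract logic, exercised on a few made-up log lines (the sample timestamps below are fabricated for illustration). Indexing the last event rather than the second matches the docstring's "first and last" even when more than two shutdowns occur:

```python
from datetime import datetime, timedelta


def time_between_shutdowns(loglines):
    """Timedelta between the first and last 'Shutdown initiated' events."""
    stamps = [datetime.strptime(line.split()[1], '%Y-%m-%dT%H:%M:%S')
              for line in loglines
              if 'Shutdown initiated' in line]
    return stamps[-1] - stamps[0]


sample_log = [
    'INFO 2014-07-03T23:27:51 supybot Shutdown initiated.',
    'INFO 2014-07-03T23:31:22 supybot Shutdown complete.',
    'INFO 2014-07-04T01:15:00 supybot Shutdown initiated.',
]
delta = time_between_shutdowns(sample_log)
```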
fb8ba642be489dc65f8a615331195f3cb45200d11e8197100dd0a78a7771dab6 | def build_python_file(self):
'\n Build a simple Python "program" that ctags can use.\n\n :returns: Path to a constructed, valid Python source file\n '
path = ''
with tempfile.NamedTemporaryFile(delete=False, suffix='.py') as temp:
try:
path = temp.name
temp.writelines([b'def my_definition():\n', b'\toutput = "Hello, world!"\n', b'\tprint(output)\n'])
finally:
temp.close()
return path | Build a simple Python "program" that ctags can use.
:returns: Path to a constructed, valid Python source file | .sublime/Packages/CTags/test_ctags.py | build_python_file | teedoo/dotfiles | 1 | python | def build_python_file(self):
'\n Build a simple Python "program" that ctags can use.\n\n :returns: Path to a constructed, valid Python source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.py') as temp:
try:
path = temp.name
temp.writelines([b'def my_definition():\n', b'\toutput = "Hello, world!"\n', b'\tprint(output)\n'])
finally:
temp.close()
return path | def build_python_file(self):
'\n Build a simple Python "program" that ctags can use.\n\n :returns: Path to a constructed, valid Python source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.py') as temp:
try:
path = temp.name
temp.writelines([b'def my_definition():\n', b'\toutput = "Hello, world!"\n', b'\tprint(output)\n'])
finally:
temp.close()
return path<|docstring|>Build a simple Python "program" that ctags can use.
:returns: Path to a constructed, valid Python source file<|endoftext|> |
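The tempfile pattern shared by all of these `build_*_file` helpers — write byte lines, keep the file on disk via `delete=False`, hand back the path, remove it later — can be factored into one helper. `build_source_file` is a hypothetical refactoring, not a function in the test suite:

```python
import os
import tempfile


def build_source_file(suffix, lines):
    """Write byte lines to a kept-on-disk temp file and return its path.

    The caller is responsible for os.remove()-ing the file afterwards,
    just as the tests do.
    """
    with tempfile.NamedTemporaryFile(delete=False, suffix=suffix) as temp:
        temp.writelines(lines)
        return temp.name


path = build_source_file('.py', [b'def my_definition():\n', b'    pass\n'])
with open(path, 'rb') as fh:
    content = fh.read()
os.remove(path)
removed = not os.path.exists(path)
```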
209c3f84f3a834508fdefd2fb1b2ef532e058340c3ba6e5cb6145a45297afd99 | def build_python_file__extended(self):
'\n Build a Python "program" demonstrating all common CTag types\n\n Build a Python program that demonstrates the following CTag types:\n - ``f`` - function definitions\n - ``v`` - variable definitions\n - ``c`` - classes\n - ``m`` - class, struct, and union members\n - ``i`` - import\n\n This is mainly intended to regression test for issue #209.\n\n :returns: Path to a constructed, valid Python source file\n '
path = ''
with tempfile.NamedTemporaryFile(delete=False, suffix='.py') as temp:
try:
path = temp.name
temp.writelines([b'import os\n', b'\n', b'COLOR_RED = "\\c800080FF;"\t#red\n', b'\n', b'def my_function(first_name):\n', b'\tprint("Hello {0}".format(first_name))\n', b'\n', b'class MyClass(object):\n', b'\tlast_name = None\n', b'\taddress = None\t# comment preceded by a tab\n', b'\n', b'\tdef my_method(self, last_name):\n', b'\t\tself.last_name = last_name\n', b'\t\tprint("Hello again, {0}".format(self.last_name))\n'])
finally:
temp.close()
return path | Build a Python "program" demonstrating all common CTag types
Build a Python program that demonstrates the following CTag types:
- ``f`` - function definitions
- ``v`` - variable definitions
- ``c`` - classes
- ``m`` - class, struct, and union members
- ``i`` - import
This is mainly intended to regression test for issue #209.
:returns: Path to a constructed, valid Python source file | .sublime/Packages/CTags/test_ctags.py | build_python_file__extended | teedoo/dotfiles | 1 | python | def build_python_file__extended(self):
'\n Build a Python "program" demonstrating all common CTag types\n\n Build a Python program that demonstrates the following CTag types:\n - ``f`` - function definitions\n - ``v`` - variable definitions\n - ``c`` - classes\n - ``m`` - class, struct, and union members\n - ``i`` - import\n\n This is mainly intended to regression test for issue #209.\n\n :returns: Path to a constructed, valid Python source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.py') as temp:
try:
path = temp.name
temp.writelines([b'import os\n', b'\n', b'COLOR_RED = "\\c800080FF;"\t#red\n', b'\n', b'def my_function(first_name):\n', b'\tprint("Hello {0}".format(first_name))\n', b'\n', b'class MyClass(object):\n', b'\tlast_name = None\n', b'\taddress = None\t# comment preceded by a tab\n', b'\n', b'\tdef my_method(self, last_name):\n', b'\t\tself.last_name = last_name\n', b'\t\tprint("Hello again, {0}".format(self.last_name))\n'])
finally:
temp.close()
return path | def build_python_file__extended(self):
'\n Build a Python "program" demonstrating all common CTag types\n\n Build a Python program that demonstrates the following CTag types:\n - ``f`` - function definitions\n - ``v`` - variable definitions\n - ``c`` - classes\n - ``m`` - class, struct, and union members\n - ``i`` - import\n\n This is mainly intended to regression test for issue #209.\n\n :returns: Path to a constructed, valid Python source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.py') as temp:
try:
path = temp.name
temp.writelines([b'import os\n', b'\n', b'COLOR_RED = "\\c800080FF;"\t#red\n', b'\n', b'def my_function(first_name):\n', b'\tprint("Hello {0}".format(first_name))\n', b'\n', b'class MyClass(object):\n', b'\tlast_name = None\n', b'\taddress = None\t# comment preceded by a tab\n', b'\n', b'\tdef my_method(self, last_name):\n', b'\t\tself.last_name = last_name\n', b'\t\tprint("Hello again, {0}".format(self.last_name))\n'])
finally:
temp.close()
return path<|docstring|>Build a Python "program" demonstrating all common CTag types
Build a Python program that demonstrates the following CTag types:
- ``f`` - function definitions
- ``v`` - variable definitions
- ``c`` - classes
- ``m`` - class, struct, and union members
- ``i`` - import
This is mainly intended to regression test for issue #209.
:returns: Path to a constructed, valid Python source file<|endoftext|> |
fd71a8f6bebe83a5946d9c5b82f671add1daa0e7e239127f81020fcb9aeb9d6e | def build_java_file(self):
'\n Build a slightly detailed Java "program" that ctags can use.\n\n Build a slightly more detailed program than \'build_python_file\' does,\n in order to test more advanced functionality of ctags.py, or ctags.exe\n\n :returns: Path to a constructed, valid Java source file\n '
path = ''
with tempfile.NamedTemporaryFile(delete=False, suffix='.java') as temp:
try:
path = temp.name
temp.writelines([b'public class DemoClass {\n', b'\tpublic static void main(String args[]) {\n', b'\t\tSystem.out.println("Hello, World");\n', b'\n', b'\t\tDemoClass demo = new DemoClass();\n', b'\t\tSystem.out.printf("Sum %d\n", demo.getSum(5,6));\n', b'\t}\n', b'\n', b'\tprivate int getSum(int a, int b) {\n', b'\t\treturn (a + b);\n', b'\t}\n', b'}\n'])
finally:
temp.close()
return path | Build a slightly detailed Java "program" that ctags can use.
Build a slightly more detailed program than 'build_python_file' does,
in order to test more advanced functionality of ctags.py, or ctags.exe
:returns: Path to a constructed, valid Java source file | .sublime/Packages/CTags/test_ctags.py | build_java_file | teedoo/dotfiles | 1 | python | def build_java_file(self):
'\n Build a slightly detailed Java "program" that ctags can use.\n\n Build a slightly more detailed program than \'build_python_file\' does,\n in order to test more advanced functionality of ctags.py, or ctags.exe\n\n :returns: Path to a constructed, valid Java source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.java') as temp:
try:
path = temp.name
temp.writelines([b'public class DemoClass {\n', b'\tpublic static void main(String args[]) {\n', b'\t\tSystem.out.println("Hello, World");\n', b'\n', b'\t\tDemoClass demo = new DemoClass();\n', b'\t\tSystem.out.printf("Sum %d\n", demo.getSum(5,6));\n', b'\t}\n', b'\n', b'\tprivate int getSum(int a, int b) {\n', b'\t\treturn (a + b);\n', b'\t}\n', b'}\n'])
finally:
temp.close()
return path | def build_java_file(self):
'\n Build a slightly detailed Java "program" that ctags can use.\n\n Build a slightly more detailed program than \'build_python_file\' does,\n in order to test more advanced functionality of ctags.py, or ctags.exe\n\n :returns: Path to a constructed, valid Java source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.java') as temp:
try:
path = temp.name
temp.writelines([b'public class DemoClass {\n', b'\tpublic static void main(String args[]) {\n', b'\t\tSystem.out.println("Hello, World");\n', b'\n', b'\t\tDemoClass demo = new DemoClass();\n', b'\t\tSystem.out.printf("Sum %d\n", demo.getSum(5,6));\n', b'\t}\n', b'\n', b'\tprivate int getSum(int a, int b) {\n', b'\t\treturn (a + b);\n', b'\t}\n', b'}\n'])
finally:
temp.close()
return path<|docstring|>Build a slightly detailed Java "program" that ctags can use.
Build a slightly more detailed program than 'build_python_file' does,
in order to test more advanced functionality of ctags.py, or ctags.exe
:returns: Path to a constructed, valid Java source file<|endoftext|> |
83ad53d956a3440c768eb56026d733f7b0473dbeb34c77c2cfdab31133867afd | def build_c_file(self):
'\n Build a simple C "program" that ctags can use.\n\n This is mainly intended to regression test for issue #213.\n\n :returns: Path to a constructed, valid C source file\n '
path = ''
with tempfile.NamedTemporaryFile(delete=False, suffix='.c') as temp:
try:
path = temp.name
temp.writelines([b'#define foo(x,y) x+y\n#define foobar 1\n\nvoid bar()\n{\n\tfoo(10,2);\n#if foobar\n\tfoo(2,3); \n}\n'])
finally:
temp.close()
return path | Build a simple C "program" that ctags can use.
This is mainly intended to regression test for issue #213.
:returns: Path to a constructed, valid C source file | .sublime/Packages/CTags/test_ctags.py | build_c_file | teedoo/dotfiles | 1 | python | def build_c_file(self):
'\n Build a simple C "program" that ctags can use.\n\n This is mainly intended to regression test for issue #213.\n\n :returns: Path to a constructed, valid C source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.c') as temp:
try:
path = temp.name
temp.writelines([b'#define foo(x,y) x+y\n#define foobar 1\n\nvoid bar()\n{\n\tfoo(10,2);\n#if foobar\n\tfoo(2,3); \n}\n'])
finally:
temp.close()
return path | def build_c_file(self):
'\n Build a simple C "program" that ctags can use.\n\n This is mainly intended to regression test for issue #213.\n\n :returns: Path to a constructed, valid C source file\n '
path =
with tempfile.NamedTemporaryFile(delete=False, suffix='.c') as temp:
try:
path = temp.name
temp.writelines([b'#define foo(x,y) x+y\n#define foobar 1\n\nvoid bar()\n{\n\tfoo(10,2);\n#if foobar\n\tfoo(2,3); \n}\n'])
finally:
temp.close()
return path<|docstring|>Build a simple C "program" that ctags can use.
This is mainly intended to regression test for issue #213.
:returns: Path to a constructed, valid C source file<|endoftext|> |
a3772bfa6569c2500b61884163fe86bdd293a791464ed5a210d02208163d5629 | def setUp(self):
'\n Set up test environment.\n\n Ensures the ``ctags_on_path`` test is run first, and all other\n tests are skipped if this fails. If ctags is not installed, no test\n will pass.\n '
self.test_build_ctags__ctags_on_path() | Set up test environment.
Ensures the ``ctags_on_path`` test is run first, and all other
tests are skipped if this fails. If ctags is not installed, no test
will pass. | .sublime/Packages/CTags/test_ctags.py | setUp | teedoo/dotfiles | 1 | python | def setUp(self):
'\n Set up test environment.\n\n Ensures the ``ctags_on_path`` test is run first, and all other\n tests are skipped if this fails. If ctags is not installed, no test\n will pass.\n '
self.test_build_ctags__ctags_on_path() | def setUp(self):
'\n Set up test environment.\n\n Ensures the ``ctags_on_path`` test is run first, and all other\n tests are skipped if this fails. If ctags is not installed, no test\n will pass.\n '
self.test_build_ctags__ctags_on_path()<|docstring|>Set up test environment.
Ensures the ``ctags_on_path`` test is run first, and all other
tests are skipped if this fails. If ctags is not installed, no test
will pass.<|endoftext|> |
437c27110faacfa159a253cd1489c8acd2ea29d5f61648c2831ed99feeeeff5d | def test_build_ctags__ctags_on_path(self):
'\n Checks that ``ctags`` is in ``PATH``.\n '
with tempfile.NamedTemporaryFile() as temp:
try:
ctags.build_ctags(path=temp.name)
except EnvironmentError:
self.fail('build_ctags() raised EnvironmentError. ctags not on path') | Checks that ``ctags`` is in ``PATH``. | .sublime/Packages/CTags/test_ctags.py | test_build_ctags__ctags_on_path | teedoo/dotfiles | 1 | python | def test_build_ctags__ctags_on_path(self):
'\n \n '
with tempfile.NamedTemporaryFile() as temp:
try:
ctags.build_ctags(path=temp.name)
except EnvironmentError:
self.fail('build_ctags() raised EnvironmentError. ctags not on path') | def test_build_ctags__ctags_on_path(self):
'\n \n '
with tempfile.NamedTemporaryFile() as temp:
try:
ctags.build_ctags(path=temp.name)
except EnvironmentError:
self.fail('build_ctags() raised EnvironmentError. ctags not on path')<|docstring|>Checks that ``ctags`` is in ``PATH``.<|endoftext|> |
cf249b95e6431384c71a172fd7e25fee8b1e563e5449075aa44ea91eaca83d47 | def test_build_ctags__custom_command(self):
'\n Checks for support of simple custom command to execute ctags.\n '
with tempfile.NamedTemporaryFile() as temp:
try:
ctags.build_ctags(path=temp.name, cmd='ctags')
except EnvironmentError:
self.fail('build_ctags() raised EnvironmentError. ctags not on path') | Checks for support of simple custom command to execute ctags. | .sublime/Packages/CTags/test_ctags.py | test_build_ctags__custom_command | teedoo/dotfiles | 1 | python | def test_build_ctags__custom_command(self):
'\n \n '
with tempfile.NamedTemporaryFile() as temp:
try:
ctags.build_ctags(path=temp.name, cmd='ctags')
except EnvironmentError:
self.fail('build_ctags() raised EnvironmentError. ctags not on path') | def test_build_ctags__custom_command(self):
'\n \n '
with tempfile.NamedTemporaryFile() as temp:
try:
ctags.build_ctags(path=temp.name, cmd='ctags')
except EnvironmentError:
self.fail('build_ctags() raised EnvironmentError. ctags not on path')<|docstring|>Checks for support of simple custom command to execute ctags.<|endoftext|> |
76f677f4b59beba90d815c2634a335633383e5052b69adf0695f4f39517744be | def test_build_ctags__invalid_custom_command(self):
'\n Checks for failure for invalid custom command to execute ctags.\n '
with tempfile.NamedTemporaryFile() as temp:
with self.assertRaises(CalledProcessError):
ctags.build_ctags(path=temp.name, cmd='ccttaaggss') | Checks for failure for invalid custom command to execute ctags. | .sublime/Packages/CTags/test_ctags.py | test_build_ctags__invalid_custom_command | teedoo/dotfiles | 1 | python | def test_build_ctags__invalid_custom_command(self):
'\n \n '
with tempfile.NamedTemporaryFile() as temp:
with self.assertRaises(CalledProcessError):
ctags.build_ctags(path=temp.name, cmd='ccttaaggss') | def test_build_ctags__invalid_custom_command(self):
'\n \n '
with tempfile.NamedTemporaryFile() as temp:
with self.assertRaises(CalledProcessError):
ctags.build_ctags(path=temp.name, cmd='ccttaaggss')<|docstring|>Checks for failure for invalid custom command to execute ctags.<|endoftext|> |
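The failure this test expects from `build_ctags` — a child process exit status surfacing as `subprocess.CalledProcessError` — can be reproduced with the standard library alone. This sketch substitutes a deliberately failing Python child for ctags itself; note that a completely missing executable invoked without a shell raises `FileNotFoundError` rather than `CalledProcessError`:

```python
import subprocess
import sys

# Run a child that exits non-zero; check=True converts that exit status
# into a CalledProcessError, the exception the test above asserts on.
try:
    subprocess.run([sys.executable, '-c', 'import sys; sys.exit(2)'], check=True)
    raised = False
    returncode = 0
except subprocess.CalledProcessError as exc:
    raised = True
    returncode = exc.returncode
```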
d81b4f63f4966673745a26854d50c3444674adb7ddcaa53bd40785c1cad58e08 | def test_build_ctags__single_file(self):
'\n Test execution of ctags using a single temporary file.\n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path)
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
self.assertEqual(content[(- 1)], 'my_definition\t{0}\t/^def my_definition():$/;"\tf{1}'.format(filename, os.linesep))
finally:
output.close()
os.remove(path)
os.remove(tag_file) | Test execution of ctags using a single temporary file. | .sublime/Packages/CTags/test_ctags.py | test_build_ctags__single_file | teedoo/dotfiles | 1 | python | def test_build_ctags__single_file(self):
'\n \n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path)
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
self.assertEqual(content[(- 1)], 'my_definition\t{0}\t/^def my_definition():$/;"\tf{1}'.format(filename, os.linesep))
finally:
output.close()
os.remove(path)
os.remove(tag_file) | def test_build_ctags__single_file(self):
'\n \n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path)
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
self.assertEqual(content[(- 1)], 'my_definition\t{0}\t/^def my_definition():$/;"\tf{1}'.format(filename, os.linesep))
finally:
output.close()
os.remove(path)
os.remove(tag_file)<|docstring|>Test execution of ctags using a single temporary file.<|endoftext|> |
7fcf9e0c02ab3be9a99be410802cbee6e699e532acae651988614930b5a5fb30 | def test_build_ctags__custom_tag_file(self):
'\n Test execution of ctags using a custom tag file.\n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path, tag_file='my_tag_file')
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
self.assertEqual(content[(- 1)], 'my_definition\t{0}\t/^def my_definition():$/;"\tf{1}'.format(filename, os.linesep))
finally:
output.close()
os.remove(path)
os.remove(tag_file) | Test execution of ctags using a custom tag file. | .sublime/Packages/CTags/test_ctags.py | test_build_ctags__custom_tag_file | teedoo/dotfiles | 1 | python | def test_build_ctags__custom_tag_file(self):
'\n \n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path, tag_file='my_tag_file')
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
self.assertEqual(content[(- 1)], 'my_definition\t{0}\t/^def my_definition():$/;"\tf{1}'.format(filename, os.linesep))
finally:
output.close()
os.remove(path)
os.remove(tag_file) | def test_build_ctags__custom_tag_file(self):
'\n \n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path, tag_file='my_tag_file')
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
self.assertEqual(content[(- 1)], 'my_definition\t{0}\t/^def my_definition():$/;"\tf{1}'.format(filename, os.linesep))
finally:
output.close()
os.remove(path)
os.remove(tag_file)<|docstring|>Test execution of ctags using a custom tag file.<|endoftext|> |
be89983a7391158090ffa511d32295864a5fea9ba0d8bffc4406ab1f39a25984 | def test_build_ctags__additional_options(self):
'\n Test execution of ctags using additional ctags options.\n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path, tag_file='my_tag_file', opts='--language-force=java')
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
self.assertEqual(content[(- 1)][:2], '!_')
finally:
output.close()
os.remove(path)
os.remove(tag_file) | Test execution of ctags using additional ctags options. | .sublime/Packages/CTags/test_ctags.py | test_build_ctags__additional_options | teedoo/dotfiles | 1 | python | def test_build_ctags__additional_options(self):
'\n \n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path, tag_file='my_tag_file', opts='--language-force=java')
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
self.assertEqual(content[(- 1)][:2], '!_')
finally:
output.close()
os.remove(path)
os.remove(tag_file) | def test_build_ctags__additional_options(self):
'\n \n '
path = self.build_python_file()
tag_file = ctags.build_ctags(path=path, tag_file='my_tag_file', opts='--language-force=java')
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
self.assertEqual(content[(- 1)][:2], '!_')
finally:
output.close()
os.remove(path)
os.remove(tag_file)<|docstring|>Test execution of ctags using additional ctags options.<|endoftext|> |
e2189d83a0de90a8ced09899d08b490fa97299f6ffb6b426808fb5517eb0bd3d | def test_post_process_tag__line_numbers(self):
'\n Test ``post_process_tag`` with a line number ``excmd`` variable.\n\n Test function with a sample tag from a Python file. This in turn tests\n the supporting functions.\n '
tag = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'ex_command': '99', 'type': 'f', 'fields': None}
expected_output = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'tag_path': ('.\\a_folder\\a_script.py', 'acme_function'), 'ex_command': '99', 'type': 'f', 'fields': None}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output) | Test ``post_process_tag`` with a line number ``excmd`` variable.
Test function with a sample tag from a Python file. This in turn tests
the supporting functions. | .sublime/Packages/CTags/test_ctags.py | test_post_process_tag__line_numbers | teedoo/dotfiles | 1 | python | def test_post_process_tag__line_numbers(self):
'\n        Test ``post_process_tag`` with a line number ``excmd`` variable.\n\n        Test function with a sample tag from a Python file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'ex_command': '99', 'type': 'f', 'fields': None}
expected_output = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'tag_path': ('.\\a_folder\\a_script.py', 'acme_function'), 'ex_command': '99', 'type': 'f', 'fields': None}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output) | def test_post_process_tag__line_numbers(self):
'\n        Test ``post_process_tag`` with a line number ``excmd`` variable.\n\n        Test function with a sample tag from a Python file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'ex_command': '99', 'type': 'f', 'fields': None}
expected_output = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'tag_path': ('.\\a_folder\\a_script.py', 'acme_function'), 'ex_command': '99', 'type': 'f', 'fields': None}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output)<|docstring|>Test ``post_process_tag`` with a line number ``excmd`` variable.
Test function with a sample tag from a Python file. This in turn tests
the supporting functions.<|endoftext|> |
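The `post_process_tag` rows above distinguish two `ex_command` shapes: a bare line number, which passes through unchanged, and a `/^...$/` search pattern, which has its anchors stripped; the expected outputs also show `tag_path` inserting the `class` field when present. A minimal sketch of both behaviors (hypothetical helpers reverse-engineered from the expected outputs in these rows, not the plugin's actual code):

```python
def clean_ex_command(ex_command):
    # Line-number ex commands are plain digits and pass through unchanged;
    # search-pattern commands have the leading /^ and trailing $/ stripped.
    if ex_command.isdigit():
        return ex_command
    return ex_command.lstrip('/^').rstrip('$/')

def tag_path(tag):
    # tag_path is (filename, [class,] symbol): the class component is
    # only present when the tag carries a 'class' extension field.
    parts = [tag['filename']]
    if tag.get('class'):
        parts.append(tag['class'])
    parts.append(tag['symbol'])
    return tuple(parts)
```

Note the `lstrip`/`rstrip` calls strip character sets, which is adequate for these examples but would mangle patterns that genuinely start or end with `/`, `^`, or `$`.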
2c6bdca28560373d5df4aa676f41a0fab10adcd37a5ed9ec79b316648abe22f3 | def test_post_process_tag__regex_no_fields(self):
'\n        Test ``post_process_tag`` with a regex ``excmd`` variable.\n\n        Test function with a sample tag from a Python file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'ex_command': '/^def acme_function(tag):$/', 'type': 'f', 'fields': None}
expected_output = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'tag_path': ('.\\a_folder\\a_script.py', 'acme_function'), 'ex_command': 'def acme_function(tag):', 'type': 'f', 'fields': None}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output) | Test ``post_process_tag`` with a regex ``excmd`` variable.
Test function with a sample tag from a Python file. This in turn tests
the supporting functions. | .sublime/Packages/CTags/test_ctags.py | test_post_process_tag__regex_no_fields | teedoo/dotfiles | 1 | python | def test_post_process_tag__regex_no_fields(self):
'\n        Test ``post_process_tag`` with a regex ``excmd`` variable.\n\n        Test function with a sample tag from a Python file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'ex_command': '/^def acme_function(tag):$/', 'type': 'f', 'fields': None}
expected_output = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'tag_path': ('.\\a_folder\\a_script.py', 'acme_function'), 'ex_command': 'def acme_function(tag):', 'type': 'f', 'fields': None}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output) | def test_post_process_tag__regex_no_fields(self):
'\n        Test ``post_process_tag`` with a regex ``excmd`` variable.\n\n        Test function with a sample tag from a Python file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'ex_command': '/^def acme_function(tag):$/', 'type': 'f', 'fields': None}
expected_output = {'symbol': 'acme_function', 'filename': '.\\a_folder\\a_script.py', 'tag_path': ('.\\a_folder\\a_script.py', 'acme_function'), 'ex_command': 'def acme_function(tag):', 'type': 'f', 'fields': None}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output)<|docstring|>Test ``post_process_tag`` with a regex ``excmd`` variable.
Test function with a sample tag from a Python file. This in turn tests
the supporting functions.<|endoftext|> |
ea6a612cf0de8810b4ec444697f90e1de20b5af260e0a12a4b40fa129ec2d682 | def test_post_process_tag__fields(self):
'\n        Test ``post_process_tag`` with a number of ``field`` variables.\n\n        Test function with a sample tag from a Java file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'getSum', 'filename': '.\\a_folder\\DemoClass.java', 'ex_command': '/^\tprivate int getSum(int a, int b) {$/', 'type': 'm', 'fields': 'class:DemoClass\tfile:'}
expected_output = {'symbol': 'getSum', 'filename': '.\\a_folder\\DemoClass.java', 'tag_path': ('.\\a_folder\\DemoClass.java', 'DemoClass', 'getSum'), 'ex_command': '\tprivate int getSum(int a, int b) {', 'type': 'm', 'fields': 'class:DemoClass\tfile:', 'field_keys': ['class', 'file'], 'class': 'DemoClass', 'file': ''}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output) | Test ``post_process_tag`` with a number of ``field`` variables.
Test function with a sample tag from a Java file. This in turn tests
the supporting functions. | .sublime/Packages/CTags/test_ctags.py | test_post_process_tag__fields | teedoo/dotfiles | 1 | python | def test_post_process_tag__fields(self):
'\n        Test ``post_process_tag`` with a number of ``field`` variables.\n\n        Test function with a sample tag from a Java file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'getSum', 'filename': '.\\a_folder\\DemoClass.java', 'ex_command': '/^\tprivate int getSum(int a, int b) {$/', 'type': 'm', 'fields': 'class:DemoClass\tfile:'}
expected_output = {'symbol': 'getSum', 'filename': '.\\a_folder\\DemoClass.java', 'tag_path': ('.\\a_folder\\DemoClass.java', 'DemoClass', 'getSum'), 'ex_command': '\tprivate int getSum(int a, int b) {', 'type': 'm', 'fields': 'class:DemoClass\tfile:', 'field_keys': ['class', 'file'], 'class': 'DemoClass', 'file': ''}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output) | def test_post_process_tag__fields(self):
'\n        Test ``post_process_tag`` with a number of ``field`` variables.\n\n        Test function with a sample tag from a Java file. This in turn tests\n        the supporting functions.\n        '
tag = {'symbol': 'getSum', 'filename': '.\\a_folder\\DemoClass.java', 'ex_command': '/^\tprivate int getSum(int a, int b) {$/', 'type': 'm', 'fields': 'class:DemoClass\tfile:'}
expected_output = {'symbol': 'getSum', 'filename': '.\\a_folder\\DemoClass.java', 'tag_path': ('.\\a_folder\\DemoClass.java', 'DemoClass', 'getSum'), 'ex_command': '\tprivate int getSum(int a, int b) {', 'type': 'm', 'fields': 'class:DemoClass\tfile:', 'field_keys': ['class', 'file'], 'class': 'DemoClass', 'file': ''}
result = ctags.post_process_tag(tag)
self.assertEqual(result, expected_output)<|docstring|>Test ``post_process_tag`` with a number of ``field`` variables.
Test function with a sample tag from a Java file. This in turn tests
the supporting functions.<|endoftext|> |
5a0c8ec75d9ea4d1cb3b512eade588c5deb7c5e6f829f3abe1612925b2a41fa8 | def test_parse_tag_lines__python(self):
'\n Test ``parse_tag_lines`` with a sample Python file.\n '
path = self.build_python_file__extended()
tag_file = ctags.build_ctags(path=path, opts=['--python-kinds=-i'])
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
except IOError:
self.fail('Setup of files for test failed')
finally:
output.close()
os.remove(path)
os.remove(tag_file)
expected_outputs = {'MyClass': [{'symbol': 'MyClass', 'filename': filename, 'ex_command': 'class MyClass(object):', 'tag_path': (filename, 'MyClass'), 'type': 'c', 'fields': None}], 'address': [{'symbol': 'address', 'filename': filename, 'ex_command': '\taddress = None\t# comment preceded by a tab', 'tag_path': (filename, 'MyClass', 'address'), 'type': 'v', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'last_name': [{'symbol': 'last_name', 'filename': filename, 'ex_command': '\tlast_name = None', 'tag_path': (filename, 'MyClass', 'last_name'), 'type': 'v', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'my_function': [{'symbol': 'my_function', 'filename': filename, 'ex_command': 'def my_function(first_name):', 'tag_path': (filename, 'my_function'), 'type': 'f', 'fields': None}], 'my_method': [{'symbol': 'my_method', 'filename': filename, 'ex_command': '\tdef my_method(self, last_name):', 'tag_path': (filename, 'MyClass', 'my_method'), 'type': 'm', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'COLOR_RED': [{'symbol': 'COLOR_RED', 'filename': filename, 'ex_command': 'COLOR_RED = "\\c800080FF;"\t#red', 'tag_path': (filename, 'COLOR_RED'), 'type': 'v', 'fields': None}]}
result = ctags.parse_tag_lines(content)
for key in expected_outputs:
self.assertEqual(result[key], expected_outputs[key])
for key in result:
self.assertEqual(expected_outputs[key], result[key]) | Test ``parse_tag_lines`` with a sample Python file. | .sublime/Packages/CTags/test_ctags.py | test_parse_tag_lines__python | teedoo/dotfiles | 1 | python | def test_parse_tag_lines__python(self):
'\n \n '
path = self.build_python_file__extended()
tag_file = ctags.build_ctags(path=path, opts=['--python-kinds=-i'])
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
except IOError:
self.fail('Setup of files for test failed')
finally:
output.close()
os.remove(path)
os.remove(tag_file)
expected_outputs = {'MyClass': [{'symbol': 'MyClass', 'filename': filename, 'ex_command': 'class MyClass(object):', 'tag_path': (filename, 'MyClass'), 'type': 'c', 'fields': None}], 'address': [{'symbol': 'address', 'filename': filename, 'ex_command': '\taddress = None\t# comment preceded by a tab', 'tag_path': (filename, 'MyClass', 'address'), 'type': 'v', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'last_name': [{'symbol': 'last_name', 'filename': filename, 'ex_command': '\tlast_name = None', 'tag_path': (filename, 'MyClass', 'last_name'), 'type': 'v', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'my_function': [{'symbol': 'my_function', 'filename': filename, 'ex_command': 'def my_function(first_name):', 'tag_path': (filename, 'my_function'), 'type': 'f', 'fields': None}], 'my_method': [{'symbol': 'my_method', 'filename': filename, 'ex_command': '\tdef my_method(self, last_name):', 'tag_path': (filename, 'MyClass', 'my_method'), 'type': 'm', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'COLOR_RED': [{'symbol': 'COLOR_RED', 'filename': filename, 'ex_command': 'COLOR_RED = "\\c800080FF;"\t#red', 'tag_path': (filename, 'COLOR_RED'), 'type': 'v', 'fields': None}]}
result = ctags.parse_tag_lines(content)
for key in expected_outputs:
self.assertEqual(result[key], expected_outputs[key])
for key in result:
self.assertEqual(expected_outputs[key], result[key]) | def test_parse_tag_lines__python(self):
'\n \n '
path = self.build_python_file__extended()
tag_file = ctags.build_ctags(path=path, opts=['--python-kinds=-i'])
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
except IOError:
self.fail('Setup of files for test failed')
finally:
output.close()
os.remove(path)
os.remove(tag_file)
expected_outputs = {'MyClass': [{'symbol': 'MyClass', 'filename': filename, 'ex_command': 'class MyClass(object):', 'tag_path': (filename, 'MyClass'), 'type': 'c', 'fields': None}], 'address': [{'symbol': 'address', 'filename': filename, 'ex_command': '\taddress = None\t# comment preceded by a tab', 'tag_path': (filename, 'MyClass', 'address'), 'type': 'v', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'last_name': [{'symbol': 'last_name', 'filename': filename, 'ex_command': '\tlast_name = None', 'tag_path': (filename, 'MyClass', 'last_name'), 'type': 'v', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'my_function': [{'symbol': 'my_function', 'filename': filename, 'ex_command': 'def my_function(first_name):', 'tag_path': (filename, 'my_function'), 'type': 'f', 'fields': None}], 'my_method': [{'symbol': 'my_method', 'filename': filename, 'ex_command': '\tdef my_method(self, last_name):', 'tag_path': (filename, 'MyClass', 'my_method'), 'type': 'm', 'fields': 'class:MyClass', 'field_keys': ['class'], 'class': 'MyClass'}], 'COLOR_RED': [{'symbol': 'COLOR_RED', 'filename': filename, 'ex_command': 'COLOR_RED = "\\c800080FF;"\t#red', 'tag_path': (filename, 'COLOR_RED'), 'type': 'v', 'fields': None}]}
result = ctags.parse_tag_lines(content)
for key in expected_outputs:
self.assertEqual(result[key], expected_outputs[key])
for key in result:
self.assertEqual(expected_outputs[key], result[key])<|docstring|>Test ``parse_tag_lines`` with a sample Python file.<|endoftext|> |
d7c13a32c2ebee6d81f81837cf7c19e8cbc8f0ef85371cc891eecd07086a41f7 | def test_parse_tag_lines__c(self):
'\n Test ``parse_tag_lines`` with a sample C file.\n '
path = self.build_c_file()
tag_file = ctags.build_ctags(path=path)
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
except IOError:
self.fail('Setup of files for test failed')
finally:
output.close()
os.remove(path)
os.remove(tag_file)
expected_outputs = {'bar': [{'symbol': 'bar', 'filename': filename, 'ex_command': 'void bar()', 'tag_path': (filename, 'bar'), 'type': 'f', 'fields': None}], 'foo': [{'symbol': 'foo', 'filename': filename, 'ex_command': '1', 'tag_path': (filename, 'foo'), 'type': 'd', 'fields': 'file:', 'field_keys': ['file'], 'file': ''}], 'foobar': [{'symbol': 'foobar', 'filename': filename, 'ex_command': '2', 'tag_path': (filename, 'foobar'), 'type': 'd', 'fields': 'file:', 'field_keys': ['file'], 'file': ''}]}
result = ctags.parse_tag_lines(content)
for key in expected_outputs:
self.assertEqual(result[key], expected_outputs[key])
for key in result:
self.assertEqual(expected_outputs[key], result[key]) | Test ``parse_tag_lines`` with a sample C file. | .sublime/Packages/CTags/test_ctags.py | test_parse_tag_lines__c | teedoo/dotfiles | 1 | python | def test_parse_tag_lines__c(self):
'\n \n '
path = self.build_c_file()
tag_file = ctags.build_ctags(path=path)
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
except IOError:
self.fail('Setup of files for test failed')
finally:
output.close()
os.remove(path)
os.remove(tag_file)
expected_outputs = {'bar': [{'symbol': 'bar', 'filename': filename, 'ex_command': 'void bar()', 'tag_path': (filename, 'bar'), 'type': 'f', 'fields': None}], 'foo': [{'symbol': 'foo', 'filename': filename, 'ex_command': '1', 'tag_path': (filename, 'foo'), 'type': 'd', 'fields': 'file:', 'field_keys': ['file'], 'file': ''}], 'foobar': [{'symbol': 'foobar', 'filename': filename, 'ex_command': '2', 'tag_path': (filename, 'foobar'), 'type': 'd', 'fields': 'file:', 'field_keys': ['file'], 'file': ''}]}
result = ctags.parse_tag_lines(content)
for key in expected_outputs:
self.assertEqual(result[key], expected_outputs[key])
for key in result:
self.assertEqual(expected_outputs[key], result[key]) | def test_parse_tag_lines__c(self):
'\n \n '
path = self.build_c_file()
tag_file = ctags.build_ctags(path=path)
with codecs.open(tag_file, encoding='utf-8') as output:
try:
content = output.readlines()
filename = os.path.basename(path)
except IOError:
self.fail('Setup of files for test failed')
finally:
output.close()
os.remove(path)
os.remove(tag_file)
expected_outputs = {'bar': [{'symbol': 'bar', 'filename': filename, 'ex_command': 'void bar()', 'tag_path': (filename, 'bar'), 'type': 'f', 'fields': None}], 'foo': [{'symbol': 'foo', 'filename': filename, 'ex_command': '1', 'tag_path': (filename, 'foo'), 'type': 'd', 'fields': 'file:', 'field_keys': ['file'], 'file': ''}], 'foobar': [{'symbol': 'foobar', 'filename': filename, 'ex_command': '2', 'tag_path': (filename, 'foobar'), 'type': 'd', 'fields': 'file:', 'field_keys': ['file'], 'file': ''}]}
result = ctags.parse_tag_lines(content)
for key in expected_outputs:
self.assertEqual(result[key], expected_outputs[key])
for key in result:
self.assertEqual(expected_outputs[key], result[key])<|docstring|>Test ``parse_tag_lines`` with a sample C file.<|endoftext|> |
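Both `parse_tag_lines` rows check a mapping from symbol name to a list of tag dicts. A hypothetical sketch of that grouping step for the tab-separated ctags file format (the `!_` metadata prefix and the `;"` extension-field separator come from the ctags file format itself, not from this chunk, and this is not the plugin's actual `parse_tag_lines`):

```python
def parse_tag_lines_sketch(lines):
    """Group raw tags-file lines into {symbol: [tag dicts]}; a sketch only."""
    tags = {}
    for line in lines:
        line = line.rstrip('\n')
        if not line or line.startswith('!_'):
            continue  # skip blank lines and tags-file metadata headers
        symbol, filename, rest = line.split('\t', 2)
        # Extension fields, when present, follow a ;" separator.
        ex_command, _, fields = rest.partition(';"')
        tags.setdefault(symbol, []).append({
            'symbol': symbol,
            'filename': filename,
            'ex_command': ex_command,
            'fields': fields.strip() or None,
        })
    return tags
```

Grouping by symbol (rather than storing one tag per symbol) matters because a name like `foo` can be defined in several files, as the list-valued entries in the expected outputs above show.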
8b33b17b71173a94bf11b38c85b03a07aae6befeb1a6f337de9b394983869afa | def __init__(self, top_dir, ensemble_num):
'\n Small class for manipulating a standard directory structure for EBMetaD runs.\n :param top_dir: the path to the directory containing all the ensemble members.\n :param param_dict: a dictionary specifying the ensemble number, the iteration,\n and the phase of the simulation.\n '
self._top_dir = top_dir
self._ensemble_num = ensemble_num | Small class for manipulating a standard directory structure for EBMetaD runs.
:param top_dir: the path to the directory containing all the ensemble members.
:param param_dict: a dictionary specifying the ensemble number, the iteration,
and the phase of the simulation. | run_ebmetad/directory_helper.py | __init__ | jmhays/run-EBMetaD | 1 | python | def __init__(self, top_dir, ensemble_num):
'\n Small class for manipulating a standard directory structure for EBMetaD runs.\n :param top_dir: the path to the directory containing all the ensemble members.\n :param param_dict: a dictionary specifying the ensemble number, the iteration,\n and the phase of the simulation.\n '
self._top_dir = top_dir
self._ensemble_num = ensemble_num | def __init__(self, top_dir, ensemble_num):
'\n Small class for manipulating a standard directory structure for EBMetaD runs.\n :param top_dir: the path to the directory containing all the ensemble members.\n :param param_dict: a dictionary specifying the ensemble number, the iteration,\n and the phase of the simulation.\n '
self._top_dir = top_dir
self._ensemble_num = ensemble_num<|docstring|>Small class for manipulating a standard directory structure for EBMetaD runs.
:param top_dir: the path to the directory containing all the ensemble members.
:param param_dict: a dictionary specifying the ensemble number, the iteration,
and the phase of the simulation.<|endoftext|> |
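The `DirectoryHelper` row above only stores `top_dir` and `ensemble_num`, while its docstring mentions iterations and phases. As a purely hypothetical illustration of how per-member paths might be derived from those attributes (the `mem_<n>` naming scheme and the `phase_dir` layout are assumptions, not taken from the row):

```python
import os

class DirectoryHelper:
    """Sketch of a per-ensemble-member path helper; layout is assumed."""

    def __init__(self, top_dir, ensemble_num):
        self._top_dir = top_dir
        self._ensemble_num = ensemble_num

    def member_dir(self):
        # Hypothetical layout: <top_dir>/mem_<ensemble_num>
        return os.path.join(self._top_dir, 'mem_{}'.format(self._ensemble_num))

    def phase_dir(self, iteration, phase):
        # Hypothetical layout: <member_dir>/<iteration>/<phase>
        return os.path.join(self.member_dir(), str(iteration), phase)
```

Keeping the path logic in one small class means the rest of the run code never hard-codes directory names, which is presumably the point of the helper.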