Dataset column schema (recovered from the viewer header; type and value-length stats per column):

    nwo                string, 5-106 chars
    sha                string, 40 chars
    path               string, 4-174 chars
    language           string, 1 class
    identifier         string, 1-140 chars
    parameters         string, 0-87.7k chars
    argument_list      string, 1 class
    return_statement   string, 0-426k chars
    docstring          string, 0-64.3k chars
    docstring_summary  string, 0-26.3k chars
    docstring_tokens   list
    function           string, 18-4.83M chars
    function_tokens    list
    url                string, 83-304 chars
arsaboo/homeassistant-config
53c998986fbe84d793a0b174757154ab30e676e4
custom_components/futures_cnn/sensor.py
python
CNNFuturesData.__init__
(self, resource)
Initialize the data object.
Initialize the data object.
[ "Initialize", "the", "data", "object", "." ]
def __init__(self, resource):
    """Initialize the data object."""
    self._resource = resource
    self.data = None
    self.available = True
[ "def", "__init__", "(", "self", ",", "resource", ")", ":", "self", ".", "_resource", "=", "resource", "self", ".", "data", "=", "None", "self", ".", "available", "=", "True" ]
https://github.com/arsaboo/homeassistant-config/blob/53c998986fbe84d793a0b174757154ab30e676e4/custom_components/futures_cnn/sensor.py#L183-L187
IJDykeman/wangTiles
7c1ee2095ebdf7f72bce07d94c6484915d5cae8b
experimental_code/tiles_3d/venv_mac/lib/python2.7/site-packages/setuptools/command/sdist.py
python
sdist.make_distribution
(self)
Workaround for #516
Workaround for #516
[ "Workaround", "for", "#516" ]
def make_distribution(self):
    """
    Workaround for #516
    """
    with self._remove_os_link():
        orig.sdist.make_distribution(self)
[ "def", "make_distribution", "(", "self", ")", ":", "with", "self", ".", "_remove_os_link", "(", ")", ":", "orig", ".", "sdist", ".", "make_distribution", "(", "self", ")" ]
https://github.com/IJDykeman/wangTiles/blob/7c1ee2095ebdf7f72bce07d94c6484915d5cae8b/experimental_code/tiles_3d/venv_mac/lib/python2.7/site-packages/setuptools/command/sdist.py#L73-L78
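The `_remove_os_link` context manager used above temporarily hides `os.link` so that the base `sdist` falls back to copying files instead of hard-linking. A minimal self-contained sketch of that pattern (the helper name here is ours, not setuptools'):

```python
import contextlib
import os


@contextlib.contextmanager
def remove_os_link():
    """Temporarily remove os.link so callers fall back to copying,
    mirroring the idea behind setuptools' sdist workaround for #516."""
    saved = getattr(os, "link", None)
    try:
        if saved is not None:
            del os.link  # module attribute, safe to delete and restore
        yield
    finally:
        if saved is not None:
            os.link = saved


with remove_os_link():
    # inside the context, os.link is gone
    assert not hasattr(os, "link")
```

The try/finally guarantees `os.link` is restored even if the wrapped build raises.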
oddt/oddt
8cf555820d97a692ade81c101ebe10e28bcb3722
oddt/scoring/functions/RFScore.py
python
rfscore.__init__
(self, protein=None, n_jobs=-1, version=1, spr=0, **kwargs)
Scoring function implementing RF-Score variants. It predicts the binding affinity (pKi/d) of a ligand in a complex utilizing simple descriptors (close contacts of atoms <12A) with a sophisticated machine-learning model (random forest). The third variant supplements those contacts with Vina partial scores. For further details see RF-Score publications v1[1]_, v2[2]_, v3[3]_.

Parameters
----------
protein : oddt.toolkit.Molecule object
    Receptor for the scored ligands
n_jobs : int (default=-1)
    Number of cores to use for scoring and training. By default (-1)
    all cores are allocated.
version : int (default=1)
    Scoring function variant. The default is the simplest one (v1).
spr : int (default=0)
    The minimum number of contacts in each pair of atom types in the
    training set for the column to be included in training. This is a
    way of removing infrequent and empty contacts.

References
----------
.. [1] Ballester PJ, Mitchell JBO. A machine learning approach to
   predicting protein-ligand binding affinity with applications to
   molecular docking. Bioinformatics. 2010;26: 1169-1175.
   doi:10.1093/bioinformatics/btq112
.. [2] Ballester PJ, Schreyer A, Blundell TL. Does a more precise
   chemical description of protein-ligand complexes lead to more
   accurate prediction of binding affinity? J Chem Inf Model. 2014;54:
   944-955. doi:10.1021/ci500091r
.. [3] Li H, Leung K-S, Wong M-H, Ballester PJ. Improving AutoDock Vina
   Using Random Forest: The Growing Accuracy of Binding Affinity
   Prediction by the Effective Exploitation of Larger Data Sets.
   Mol Inform. WILEY-VCH Verlag; 2015;34: 115-126. doi:10.1002/minf.201400132
Scoring function implementing RF-Score variants. It predicts the binding affinity (pKi/d) of a ligand in a complex utilizing simple descriptors (close contacts of atoms <12A) with a sophisticated machine-learning model (random forest). The third variant supplements those contacts with Vina partial scores. For further details see RF-Score publications v1[1]_, v2[2]_, v3[3]_.
[ "Scoring", "function", "implementing", "RF", "-", "Score", "variants", ".", "It", "predicts", "the", "binding", "affinity", "(", "pKi", "/", "d", ")", "of", "ligand", "in", "a", "complex", "utilizng", "simple", "descriptors", "(", "close", "contacts", "of", ...
def __init__(self, protein=None, n_jobs=-1, version=1, spr=0, **kwargs):
    """Scoring function implementing RF-Score variants.

    It predicts the binding affinity (pKi/d) of a ligand in a complex
    utilizing simple descriptors (close contacts of atoms <12A) with a
    sophisticated machine-learning model (random forest). The third
    variant supplements those contacts with Vina partial scores. For
    further details see RF-Score publications v1[1]_, v2[2]_, v3[3]_.

    Parameters
    ----------
    protein : oddt.toolkit.Molecule object
        Receptor for the scored ligands
    n_jobs : int (default=-1)
        Number of cores to use for scoring and training. By default (-1)
        all cores are allocated.
    version : int (default=1)
        Scoring function variant. The default is the simplest one (v1).
    spr : int (default=0)
        The minimum number of contacts in each pair of atom types in the
        training set for the column to be included in training. This is
        a way of removing infrequent and empty contacts.

    References
    ----------
    .. [1] Ballester PJ, Mitchell JBO. A machine learning approach to
       predicting protein-ligand binding affinity with applications to
       molecular docking. Bioinformatics. 2010;26: 1169-1175.
       doi:10.1093/bioinformatics/btq112
    .. [2] Ballester PJ, Schreyer A, Blundell TL. Does a more precise
       chemical description of protein-ligand complexes lead to more
       accurate prediction of binding affinity? J Chem Inf Model.
       2014;54: 944-955. doi:10.1021/ci500091r
    .. [3] Li H, Leung K-S, Wong M-H, Ballester PJ. Improving AutoDock
       Vina Using Random Forest: The Growing Accuracy of Binding
       Affinity Prediction by the Effective Exploitation of Larger Data
       Sets. Mol Inform. WILEY-VCH Verlag; 2015;34: 115-126.
       doi:10.1002/minf.201400132
    """
    self.protein = protein
    self.n_jobs = n_jobs
    self.version = version
    self.spr = spr
    if version == 1:
        cutoff = 12
        mtry = 6
        descriptors = close_contacts_descriptor(
            protein,
            cutoff=cutoff,
            protein_types=protein_atomic_nums,
            ligand_types=ligand_atomic_nums)
    elif version == 2:
        cutoff = np.array([0, 2, 4, 6, 8, 10, 12])
        mtry = 14
        descriptors = close_contacts_descriptor(
            protein,
            cutoff=cutoff,
            protein_types=protein_atomic_nums,
            ligand_types=ligand_atomic_nums)
    elif version == 3:
        cutoff = 12
        mtry = 6
        cc = close_contacts_descriptor(
            protein,
            cutoff=cutoff,
            protein_types=protein_atomic_nums,
            ligand_types=ligand_atomic_nums)
        vina_scores = ['vina_gauss1', 'vina_gauss2', 'vina_repulsion',
                       'vina_hydrophobic', 'vina_hydrogen',
                       'vina_num_rotors']
        vina = oddt_vina_descriptor(protein, vina_scores=vina_scores)
        descriptors = ensemble_descriptor((vina, cc))
    model = randomforest(n_estimators=500,
                         oob_score=True,
                         n_jobs=n_jobs,
                         max_features=mtry,
                         bootstrap=True,
                         min_samples_split=6,
                         **kwargs)
    super(rfscore, self).__init__(model, descriptors,
                                  score_title='rfscore_v%i' % self.version)
[ "def", "__init__", "(", "self", ",", "protein", "=", "None", ",", "n_jobs", "=", "-", "1", ",", "version", "=", "1", ",", "spr", "=", "0", ",", "*", "*", "kwargs", ")", ":", "self", ".", "protein", "=", "protein", "self", ".", "n_jobs", "=", "n...
https://github.com/oddt/oddt/blob/8cf555820d97a692ade81c101ebe10e28bcb3722/oddt/scoring/functions/RFScore.py#L34-L123
sagemath/sage
f9b2db94f675ff16963ccdefba4f1a3393b3fe0d
src/sage/schemes/toric/library.py
python
ToricVarietyFactory.Cube_deformation
(self,k, names=None, base_ring=QQ)
return ToricVariety(fan, coordinate_names=names)
r"""
Construct, for each `k\in\ZZ_{\geq 0}`, a toric variety with
`\ZZ_k`-torsion in the Chow group.

The fans of this sequence of toric varieties all equal the face fan of
a unit cube topologically, but the ``(1,1,1)``-vertex is moved to
``(1,1,2k+1)``. This example was studied in [FS1994]_.

INPUT:

- ``k`` -- integer. The case ``k=0`` is the same as :meth:`Cube_face_fan`.

- ``names`` -- string. Names for the homogeneous coordinates. See
  :func:`~sage.schemes.toric.variety.normalize_names` for acceptable formats.

- ``base_ring`` -- a ring (default: `\QQ`). The base ring for the toric variety.

OUTPUT:

A :class:`toric variety <sage.schemes.toric.variety.ToricVariety_field>`
`X_k`. Its Chow group is `A_1(X_k)=\ZZ_k`.

EXAMPLES::

    sage: X_2 = toric_varieties.Cube_deformation(2)
    sage: X_2
    3-d toric variety covered by 6 affine patches
    sage: X_2.fan().rays()
    N( 1,  1,  5),
    N( 1, -1,  1),
    N(-1,  1,  1),
    N(-1, -1,  1),
    N(-1, -1, -1),
    N(-1,  1, -1),
    N( 1, -1, -1),
    N( 1,  1, -1)
    in 3-d lattice N
    sage: X_2.gens()
    (z0, z1, z2, z3, z4, z5, z6, z7)
r""" Construct, for each `k\in\ZZ_{\geq 0}`, a toric variety with `\ZZ_k`-torsion in the Chow group.
[ "r", "Construct", "for", "each", "k", "\\", "in", "\\", "ZZ_", "{", "\\", "geq", "0", "}", "a", "toric", "variety", "with", "\\", "ZZ_k", "-", "torsion", "in", "the", "Chow", "group", "." ]
def Cube_deformation(self, k, names=None, base_ring=QQ):
    r"""
    Construct, for each `k\in\ZZ_{\geq 0}`, a toric variety with
    `\ZZ_k`-torsion in the Chow group.

    The fans of this sequence of toric varieties all equal the face fan
    of a unit cube topologically, but the ``(1,1,1)``-vertex is moved to
    ``(1,1,2k+1)``. This example was studied in [FS1994]_.

    INPUT:

    - ``k`` -- integer. The case ``k=0`` is the same as
      :meth:`Cube_face_fan`.

    - ``names`` -- string. Names for the homogeneous coordinates. See
      :func:`~sage.schemes.toric.variety.normalize_names` for acceptable
      formats.

    - ``base_ring`` -- a ring (default: `\QQ`). The base ring for the
      toric variety.

    OUTPUT:

    A :class:`toric variety
    <sage.schemes.toric.variety.ToricVariety_field>` `X_k`. Its Chow
    group is `A_1(X_k)=\ZZ_k`.

    EXAMPLES::

        sage: X_2 = toric_varieties.Cube_deformation(2)
        sage: X_2
        3-d toric variety covered by 6 affine patches
        sage: X_2.fan().rays()
        N( 1,  1,  5),
        N( 1, -1,  1),
        N(-1,  1,  1),
        N(-1, -1,  1),
        N(-1, -1, -1),
        N(-1,  1, -1),
        N( 1, -1, -1),
        N( 1,  1, -1)
        in 3-d lattice N
        sage: X_2.gens()
        (z0, z1, z2, z3, z4, z5, z6, z7)
    """
    # We are going to eventually switch off consistency checks, so we need
    # to be sure that the input is acceptable.
    try:
        k = ZZ(k)   # make sure that we got a "mathematical" integer
    except TypeError:
        raise TypeError("cube deformations X_k are defined only for "
                        "non-negative integer k!\nGot: %s" % k)
    if k < 0:
        raise ValueError("cube deformations X_k are defined only for "
                         "non-negative k!\nGot: %s" % k)
    rays = lambda kappa: matrix([[1, 1, 2*kappa+1], [1, -1, 1],
                                 [-1, 1, 1], [-1, -1, 1],
                                 [-1, -1, -1], [-1, 1, -1],
                                 [1, -1, -1], [1, 1, -1]])
    cones = [[0, 1, 2, 3], [4, 5, 6, 7], [0, 1, 7, 6],
             [4, 5, 3, 2], [0, 2, 5, 7], [4, 6, 1, 3]]
    fan = Fan(cones, rays(k))
    return ToricVariety(fan, coordinate_names=names)
[ "def", "Cube_deformation", "(", "self", ",", "k", ",", "names", "=", "None", ",", "base_ring", "=", "QQ", ")", ":", "# We are going to eventually switch off consistency checks, so we need", "# to be sure that the input is acceptable.", "try", ":", "k", "=", "ZZ", "(", ...
https://github.com/sagemath/sage/blob/f9b2db94f675ff16963ccdefba4f1a3393b3fe0d/src/sage/schemes/toric/library.py#L985-L1046
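The coerce-then-validate pattern at the top of `Cube_deformation` (coerce with `ZZ`, then reject negatives) can be sketched without Sage, with plain `int` standing in for `ZZ`; `validate_k` is our name for the sketch, not a Sage function:

```python
def validate_k(k):
    """Sketch of Cube_deformation's input check: coerce to an integer
    first (int here, ZZ in the Sage original), then reject negatives."""
    try:
        k = int(k)
    except (TypeError, ValueError):
        # non-integers become a TypeError, matching the original message
        raise TypeError("cube deformations X_k are defined only for "
                        "non-negative integer k! Got: %r" % (k,))
    if k < 0:
        raise ValueError("cube deformations X_k are defined only for "
                         "non-negative k! Got: %r" % (k,))
    return k
```

Coercing before the sign check means string or numpy inputs are normalized once, and the two failure modes (wrong type vs. wrong value) raise distinct exception types.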
SanPen/GridCal
d3f4566d2d72c11c7e910c9d162538ef0e60df31
src/GridCal/Engine/Simulations/PowerFlow/derivatives.py
python
derivatives_Beq_csc_fast
(nb, nl, iBeqx, F, T, V, ma, k2)
return dSbus_dBeqx, dSf_dBeqx, dSt_dBeqx
Compute the derivatives of:
- dSbus_dBeqz, dSf_dBeqz, dSt_dBeqz -> iBeqx=iBeqz
- dSbus_dBeqv, dSf_dBeqv, dSt_dBeqv -> iBeqx=iBeqv

:param nb: Number of buses
:param nl: Number of branches
:param iBeqx: array of indices {iBeqz, iBeqv}
:param F: Array of branch "from" bus indices
:param T: Array of branch "to" bus indices
:param V: Array of complex voltages
:param ma: Array of branch tap modules
:param k2: Array of "k2" parameters
:return:
- dSbus_dBeqz, dSf_dBeqz, dSt_dBeqz -> if iBeqx=iBeqz
- dSbus_dBeqv, dSf_dBeqv, dSt_dBeqv -> if iBeqx=iBeqv
Compute the derivatives of: - dSbus_dBeqz, dSf_dBeqz, dSt_dBeqz -> iBeqx=iBeqz - dSbus_dBeqv, dSf_dBeqv, dSt_dBeqv -> iBeqx=iBeqv
[ "Compute", "the", "derivatives", "of", ":", "-", "dSbus_dBeqz", "dSf_dBeqz", "dSt_dBeqz", "-", ">", "iBeqx", "=", "iBeqz", "-", "dSbus_dBeqv", "dSf_dBeqv", "dSt_dBeqv", "-", ">", "iBeqx", "=", "iBeqv" ]
def derivatives_Beq_csc_fast(nb, nl, iBeqx, F, T, V, ma, k2):
    """
    Compute the derivatives of:
    - dSbus_dBeqz, dSf_dBeqz, dSt_dBeqz -> iBeqx=iBeqz
    - dSbus_dBeqv, dSf_dBeqv, dSt_dBeqv -> iBeqx=iBeqv
    :param nb: Number of buses
    :param nl: Number of branches
    :param iBeqx: array of indices {iBeqz, iBeqv}
    :param F: Array of branch "from" bus indices
    :param T: Array of branch "to" bus indices
    :param V: Array of complex voltages
    :param ma: Array of branch tap modules
    :param k2: Array of "k2" parameters
    :return:
    - dSbus_dBeqz, dSf_dBeqz, dSt_dBeqz -> if iBeqx=iBeqz
    - dSbus_dBeqv, dSf_dBeqv, dSt_dBeqv -> if iBeqx=iBeqv
    """
    ndev = len(iBeqx)

    dSbus_dBeq_data, dSbus_dBeq_indices, dSbus_dBeq_indptr, \
        dSf_dBeqx_data, dSf_dBeqx_indices, dSf_dBeqx_indptr = \
        derivatives_Beq_csc_numba(iBeqx, F, V, ma, k2)

    dSbus_dBeqx = sp.csc_matrix((dSbus_dBeq_data, dSbus_dBeq_indices,
                                 dSbus_dBeq_indptr), shape=(nb, ndev))
    dSf_dBeqx = sp.csc_matrix((dSf_dBeqx_data, dSf_dBeqx_indices,
                               dSf_dBeqx_indptr), shape=(nl, ndev))
    dSt_dBeqx = sp.csc_matrix((nl, ndev), dtype=complex)

    return dSbus_dBeqx, dSf_dBeqx, dSt_dBeqx
[ "def", "derivatives_Beq_csc_fast", "(", "nb", ",", "nl", ",", "iBeqx", ",", "F", ",", "T", ",", "V", ",", "ma", ",", "k2", ")", ":", "ndev", "=", "len", "(", "iBeqx", ")", "dSbus_dBeq_data", ",", "dSbus_dBeq_indices", ",", "dSbus_dBeq_indptr", ",", "dS...
https://github.com/SanPen/GridCal/blob/d3f4566d2d72c11c7e910c9d162538ef0e60df31/src/GridCal/Engine/Simulations/PowerFlow/derivatives.py#L831-L860
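The function above assembles its results from raw CSC arrays, the same `(data, indices, indptr)` triple that `scipy.sparse.csc_matrix` consumes. As a reminder of that layout, a small pure-Python sketch (no scipy needed) that expands such a triple into a dense matrix:

```python
def csc_to_dense(data, indices, indptr, shape):
    """Expand CSC arrays into a dense row-major list of lists.

    For column j, the entries live in data[indptr[j]:indptr[j+1]],
    and indices[k] gives the row of data[k].
    """
    nrows, ncols = shape
    dense = [[0] * ncols for _ in range(nrows)]
    for col in range(ncols):
        # indptr[col]:indptr[col + 1] delimits this column's entries
        for k in range(indptr[col], indptr[col + 1]):
            dense[indices[k]][col] = data[k]
    return dense


# a 3x2 matrix with entries (0,0)=5, (2,0)=7, (1,1)=3
dense = csc_to_dense([5, 7, 3], [0, 2, 1], [0, 2, 3], (3, 2))
# dense == [[5, 0], [0, 3], [7, 0]]
```

The `indptr` array has one entry per column plus one, which is why an all-zero column costs nothing but a repeated pointer.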
misterch0c/shadowbroker
e3a069bea47a2c1009697941ac214adc6f90aa8d
windows/Resources/Python/Core/Lib/lib-tk/Tkinter.py
python
Menu.activate
(self, index)
Activate entry at INDEX.
Activate entry at INDEX.
[ "Activate", "entry", "at", "INDEX", "." ]
def activate(self, index):
    """Activate entry at INDEX."""
    self.tk.call(self._w, 'activate', index)
[ "def", "activate", "(", "self", ",", "index", ")", ":", "self", ".", "tk", ".", "call", "(", "self", ".", "_w", ",", "'activate'", ",", "index", ")" ]
https://github.com/misterch0c/shadowbroker/blob/e3a069bea47a2c1009697941ac214adc6f90aa8d/windows/Resources/Python/Core/Lib/lib-tk/Tkinter.py#L2936-L2938
deepfakes/faceswap
09c7d8aca3c608d1afad941ea78e9fd9b64d9219
lib/gui/wrapper.py
python
FaceswapControl.terminate
(self)
Terminate the running process in a LongRunningTask so we can still output to console
Terminate the running process in a LongRunningTask so we can still output to console
[ "Terminate", "the", "running", "process", "in", "a", "LongRunningTask", "so", "we", "can", "still", "output", "to", "console" ]
def terminate(self):
    """ Terminate the running process in a LongRunningTask so we can still
    output to console """
    if self.thread is None:
        logger.debug("Terminating wrapper in LongRunningTask")
        self.thread = LongRunningTask(target=self.terminate_in_thread,
                                      args=(self.command, self.process))
        if self.command == "train":
            self.wrapper.tk_vars["istraining"].set(False)
        self.thread.start()
        self.config.root.after(1000, self.terminate)
    elif not self.thread.complete.is_set():
        logger.debug("Not finished terminating")
        self.config.root.after(1000, self.terminate)
    else:
        logger.debug("Termination Complete. Cleaning up")
        _ = self.thread.get_result()  # Terminate the LongRunningTask object
        self.thread = None
[ "def", "terminate", "(", "self", ")", ":", "if", "self", ".", "thread", "is", "None", ":", "logger", ".", "debug", "(", "\"Terminating wrapper in LongRunningTask\"", ")", "self", ".", "thread", "=", "LongRunningTask", "(", "target", "=", "self", ".", "termin...
https://github.com/deepfakes/faceswap/blob/09c7d8aca3c608d1afad941ea78e9fd9b64d9219/lib/gui/wrapper.py#L370-L387
EricZgw/PyramidBox
03f67d8422f2f03e5ee580e2f92d4040c24bbdb4
preprocessing/tf_image.py
python
fix_image_flip_shape
(image, result)
return result
Set the shape to 3 dimensional if we don't know anything else.

Args:
    image: original image size
    result: flipped or transformed image

Returns:
    An image whose shape is at least None,None,None.
Set the shape to 3 dimensional if we don't know anything else. Args: image: original image size result: flipped or transformed image Returns: An image whose shape is at least None,None,None.
[ "Set", "the", "shape", "to", "3", "dimensional", "if", "we", "don", "t", "know", "anything", "else", ".", "Args", ":", "image", ":", "original", "image", "size", "result", ":", "flipped", "or", "transformed", "image", "Returns", ":", "An", "image", "whos...
def fix_image_flip_shape(image, result):
    """Set the shape to 3 dimensional if we don't know anything else.

    Args:
        image: original image size
        result: flipped or transformed image

    Returns:
        An image whose shape is at least None,None,None.
    """
    image_shape = image.get_shape()
    if image_shape == tensor_shape.unknown_shape():
        result.set_shape([None, None, None])
    else:
        result.set_shape(image_shape)
    return result
[ "def", "fix_image_flip_shape", "(", "image", ",", "result", ")", ":", "image_shape", "=", "image", ".", "get_shape", "(", ")", "if", "image_shape", "==", "tensor_shape", ".", "unknown_shape", "(", ")", ":", "result", ".", "set_shape", "(", "[", "None", ","...
https://github.com/EricZgw/PyramidBox/blob/03f67d8422f2f03e5ee580e2f92d4040c24bbdb4/preprocessing/tf_image.py#L119-L132
SublimeHaskell/SublimeHaskell
733c7c4fa6eaf34a6544b888345f23ab4f461fcb
internals/backend.py
python
HaskellBackend.is_available
(**_kwargs)
return False
Test if the backend is available. For most backends, this verifies that an executable exists and possibly that the version is supported (see the `hsdev` version of this method.) Return True if the backend is available. The base class returns False.
Test if the backend is available. For most backends, this verifies that an executable exists and possibly that the version is supported (see the `hsdev` version of this method.)
[ "Test", "if", "the", "backend", "is", "available", ".", "For", "most", "backends", "this", "verifies", "that", "an", "executable", "exists", "and", "possibly", "that", "the", "version", "is", "supported", "(", "see", "the", "hsdev", "version", "of", "this", ...
def is_available(**_kwargs):
    '''Test if the backend is available. For most backends, this verifies
    that an executable exists and possibly that the version is supported
    (see the `hsdev` version of this method.)

    Return True if the backend is available. The base class returns False.
    '''
    return False
[ "def", "is_available", "(", "*", "*", "_kwargs", ")", ":", "return", "False" ]
https://github.com/SublimeHaskell/SublimeHaskell/blob/733c7c4fa6eaf34a6544b888345f23ab4f461fcb/internals/backend.py#L29-L35
windelbouwman/ppci
915c069e0667042c085ec42c78e9e3c9a5295324
ppci/api.py
python
c3c
( sources, includes, march, opt_level=0, reporter=None, debug=False, outstream=None, )
return ir_to_object( [ir_module], march, debug=debug, reporter=reporter, opt=opt_cg, outstream=outstream, )
Compile a set of sources into binary format for the given target.

Args:
    sources: a collection of sources that will be compiled.
    includes: a collection of sources that will be used for type
        and function information.
    march: the architecture for which to compile.
    reporter: reporter to write compilation report to
    debug: include debugging information

Returns:
    An object file

.. doctest::

    >>> import io
    >>> from ppci.api import c3c
    >>> source_file = io.StringIO("module main; var int a;")
    >>> obj = c3c([source_file], [], 'arm')
    >>> print(obj)
    CodeObject of 4 bytes
Compile a set of sources into binary format for the given target.
[ "Compile", "a", "set", "of", "sources", "into", "binary", "format", "for", "the", "given", "target", "." ]
def c3c(
    sources,
    includes,
    march,
    opt_level=0,
    reporter=None,
    debug=False,
    outstream=None,
):
    """Compile a set of sources into binary format for the given target.

    Args:
        sources: a collection of sources that will be compiled.
        includes: a collection of sources that will be used for type
            and function information.
        march: the architecture for which to compile.
        reporter: reporter to write compilation report to
        debug: include debugging information

    Returns:
        An object file

    .. doctest::

        >>> import io
        >>> from ppci.api import c3c
        >>> source_file = io.StringIO("module main; var int a;")
        >>> obj = c3c([source_file], [], 'arm')
        >>> print(obj)
        CodeObject of 4 bytes
    """
    reporter = get_reporter(reporter)
    march = get_arch(march)
    ir_module = c3_to_ir(sources, includes, march, reporter=reporter)
    optimize(ir_module, level=opt_level, reporter=reporter)
    opt_cg = "size" if opt_level == "s" else "speed"
    return ir_to_object(
        [ir_module],
        march,
        debug=debug,
        reporter=reporter,
        opt=opt_cg,
        outstream=outstream,
    )
[ "def", "c3c", "(", "sources", ",", "includes", ",", "march", ",", "opt_level", "=", "0", ",", "reporter", "=", "None", ",", "debug", "=", "False", ",", "outstream", "=", "None", ",", ")", ":", "reporter", "=", "get_reporter", "(", "reporter", ")", "m...
https://github.com/windelbouwman/ppci/blob/915c069e0667042c085ec42c78e9e3c9a5295324/ppci/api.py#L395-L440
textX/textX
452b684d434c557a63ff7ae2b014770c29669e4c
textx/lang.py
python
python_type
(textx_type_name)
return { 'ID': text, 'BOOL': bool, 'INT': int, 'FLOAT': float, 'STRICTFLOAT': float, 'STRING': text, 'NUMBER': float, 'BASETYPE': text, }.get(textx_type_name, textx_type_name)
Return Python type from the name of base textx type.
Return Python type from the name of base textx type.
[ "Return", "Python", "type", "from", "the", "name", "of", "base", "textx", "type", "." ]
def python_type(textx_type_name):
    """Return Python type from the name of base textx type."""
    return {
        'ID': text,
        'BOOL': bool,
        'INT': int,
        'FLOAT': float,
        'STRICTFLOAT': float,
        'STRING': text,
        'NUMBER': float,
        'BASETYPE': text,
    }.get(textx_type_name, textx_type_name)
[ "def", "python_type", "(", "textx_type_name", ")", ":", "return", "{", "'ID'", ":", "text", ",", "'BOOL'", ":", "bool", ",", "'INT'", ":", "int", ",", "'FLOAT'", ":", "float", ",", "'STRICTFLOAT'", ":", "float", ",", "'STRING'", ":", "text", ",", "'NUM...
https://github.com/textX/textX/blob/452b684d434c557a63ff7ae2b014770c29669e4c/textx/lang.py#L136-L147
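The function above leans on `dict.get`'s default argument: unknown names fall through unchanged. A self-contained variant of the same lookup (here we assume `text` is textX's string type, i.e. `str` on Python 3, and pass it as a parameter rather than import it):

```python
def python_type(textx_type_name, text=str):
    """Map a base textX type name to a Python type; unknown names are
    returned unchanged via dict.get's default. `text=str` is our
    assumption for textX's string type on Python 3."""
    return {
        'ID': text,
        'BOOL': bool,
        'INT': int,
        'FLOAT': float,
        'STRICTFLOAT': float,
        'STRING': text,
        'NUMBER': float,
        'BASETYPE': text,
    }.get(textx_type_name, textx_type_name)


assert python_type('INT') is int
assert python_type('MyRule') == 'MyRule'  # non-base names pass through
```

Returning the key itself as the fallback lets callers treat user-defined rule names and base types uniformly.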
Erotemic/ubelt
221d5f6262d5c8e78638e1a38e3adcc9cc9a15e9
ubelt/util_hash.py
python
HashableExtensions.lookup
(self, data)
return hash_func
Returns an appropriate function to hash ``data`` if one has been registered.

Raises:
    TypeError : if data has no registered hash methods

Example:
    >>> import ubelt as ub
    >>> import pytest
    >>> if not ub.modname_to_modpath('numpy'):
    ...     raise pytest.skip('numpy is optional')
    >>> self = HashableExtensions()
    >>> self._register_numpy_extensions()
    >>> self._register_builtin_class_extensions()
    >>> import numpy as np
    >>> data = np.array([1, 2, 3])
    >>> self.lookup(data[0])
    >>> class Foo(object):
    >>>     def __init__(f):
    >>>         f.attr = 1
    >>> data = Foo()
    >>> assert pytest.raises(TypeError, self.lookup, data)
    >>> # If ub.hash_data does not support your object,
    >>> # then you can register it.
    >>> @self.register(Foo)
    >>> def _hashfoo(data):
    >>>     return b'FOO', data.attr
    >>> func = self.lookup(data)
    >>> assert func(data)[1] == 1
    >>> import uuid
    >>> data = uuid.uuid4()
    >>> self.lookup(data)
Returns an appropriate function to hash ``data`` if one has been registered.
[ "Returns", "an", "appropriate", "function", "to", "hash", "data", "if", "one", "has", "been", "registered", "." ]
def lookup(self, data):
    """
    Returns an appropriate function to hash ``data`` if one has been
    registered.

    Raises:
        TypeError : if data has no registered hash methods

    Example:
        >>> import ubelt as ub
        >>> import pytest
        >>> if not ub.modname_to_modpath('numpy'):
        ...     raise pytest.skip('numpy is optional')
        >>> self = HashableExtensions()
        >>> self._register_numpy_extensions()
        >>> self._register_builtin_class_extensions()
        >>> import numpy as np
        >>> data = np.array([1, 2, 3])
        >>> self.lookup(data[0])
        >>> class Foo(object):
        >>>     def __init__(f):
        >>>         f.attr = 1
        >>> data = Foo()
        >>> assert pytest.raises(TypeError, self.lookup, data)
        >>> # If ub.hash_data does not support your object,
        >>> # then you can register it.
        >>> @self.register(Foo)
        >>> def _hashfoo(data):
        >>>     return b'FOO', data.attr
        >>> func = self.lookup(data)
        >>> assert func(data)[1] == 1
        >>> import uuid
        >>> data = uuid.uuid4()
        >>> self.lookup(data)
    """
    # Evaluate the lazy queue if anything is in it
    if self._lazy_queue:
        for func in self._lazy_queue:
            func()
        self._lazy_queue = []

    # Maybe try using functools.singledispatch instead?
    # First try O(1) lookup
    query_hash_type = data.__class__
    key = (query_hash_type.__module__, query_hash_type.__name__)
    try:
        hash_type, hash_func = self.keyed_extensions[key]
    except KeyError:
        # if issubclass(query_hash_type, dict):
        #     # TODO: In Python 3.7+ dicts are ordered by default, so
        #     # perhaps we should allow hashing them by default
        #     import warnings
        #     warnings.warn(
        #         'It looks like you are trying to hash an unordered dict. '
        #         'By default this is not allowed, but if you REALLY need '
        #         'to do this, then call '
        #         'ubelt.util_hash._HASHABLE_EXTENSIONS._register_agressive_extensions() '
        #         'beforehand')
        raise TypeError(
            'No registered hash func for hashable type={!r}'.format(
                query_hash_type))
    return hash_func
[ "def", "lookup", "(", "self", ",", "data", ")", ":", "# Evaluate the lazy queue if anything is in it", "if", "self", ".", "_lazy_queue", ":", "for", "func", "in", "self", ".", "_lazy_queue", ":", "func", "(", ")", "self", ".", "_lazy_queue", "=", "[", "]", ...
https://github.com/Erotemic/ubelt/blob/221d5f6262d5c8e78638e1a38e3adcc9cc9a15e9/ubelt/util_hash.py#L480-L545
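The core of `lookup` is a registry keyed by `(module, name)` of the object's exact class, with registration via a decorator and a `TypeError` when nothing matches. A minimal self-contained sketch of that dispatch pattern (class and handler names here are ours):

```python
class Registry:
    """Sketch of a (module, name)-keyed dispatch table: register one
    handler per exact class, look it up in O(1), raise TypeError when
    no handler is registered."""

    def __init__(self):
        self.keyed = {}

    def register(self, cls):
        def decorator(func):
            self.keyed[(cls.__module__, cls.__name__)] = func
            return func
        return decorator

    def lookup(self, data):
        # exact-class dispatch: subclasses do NOT inherit handlers
        key = (data.__class__.__module__, data.__class__.__name__)
        try:
            return self.keyed[key]
        except KeyError:
            raise TypeError('No registered func for type=%r' % (key,))


reg = Registry()


@reg.register(complex)
def _hash_complex(data):
    # toy handler: a tag plus the value's components
    return b'COMPLEX', (data.real, data.imag)
```

Keying on `(module, name)` rather than the class object itself makes the table stable across reloads, at the cost of treating same-named classes from the same module as identical.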
getnikola/nikola
2da876e9322e42a93f8295f950e336465c6a4ee5
nikola/plugins/command/serve.py
python
CommandServe.shutdown
(self, signum=None, _frame=None)
Shut down the server that is running detached.
Shut down the server that is running detached.
[ "Shut", "down", "the", "server", "that", "is", "running", "detached", "." ]
def shutdown(self, signum=None, _frame=None):
    """Shut down the server that is running detached."""
    if self.dns_sd:
        self.dns_sd.Reset()
    if os.path.exists(self.serve_pidfile):
        os.remove(self.serve_pidfile)
    if not self.detached:
        self.logger.info("Server is shutting down.")
    if signum:
        sys.exit(0)
[ "def", "shutdown", "(", "self", ",", "signum", "=", "None", ",", "_frame", "=", "None", ")", ":", "if", "self", ".", "dns_sd", ":", "self", ".", "dns_sd", ".", "Reset", "(", ")", "if", "os", ".", "path", ".", "exists", "(", "self", ".", "serve_pi...
https://github.com/getnikola/nikola/blob/2da876e9322e42a93f8295f950e336465c6a4ee5/nikola/plugins/command/serve.py#L100-L109
prkumar/uplink
3472806f68a60a93f7cb555d36365551a5411cc5
uplink/clients/io/interfaces.py
python
RequestExecution.state
(self)
The current state of the request.
The current state of the request.
[ "The", "current", "state", "of", "the", "request", "." ]
def state(self):
    """The current state of the request."""
    raise NotImplementedError
[ "def", "state", "(", "self", ")", ":", "raise", "NotImplementedError" ]
https://github.com/prkumar/uplink/blob/3472806f68a60a93f7cb555d36365551a5411cc5/uplink/clients/io/interfaces.py#L86-L88
missionpinball/mpf
8e6b74cff4ba06d2fec9445742559c1068b88582
mpf/core/utility_functions.py
python
Util.pwm32_to_int
(source_int: int)
Convert a PWM32 value to int.
Convert a PWM32 value to int.
[ "Convert", "a", "PWM32", "value", "to", "int", "." ]
def pwm32_to_int(source_int: int) -> int:
    """Convert a PWM32 value to int."""
    # generated by the int_to_pwm.py script in the mpf/tools folder
    lookup_table = {
        0: 0,            # 00000000000000000000000000000000
        1: 2,            # 00000000000000000000000000000010
        2: 131074,       # 00000000000000100000000000000010
        3: 4196354,      # 00000000010000000000100000000010
        4: 33686018,     # 00000010000000100000001000000010
        5: 68165762,     # 00000100000100000010000010000010
        6: 138545218,    # 00001000010000100000100001000010
        7: 277365794,    # 00010000100010000100010000100010
        8: 572662306,    # 00100010001000100010001000100010
        9: 574916882,    # 00100010010001001000100100010010
        10: 613557394,   # 00100100100100100010010010010010
        11: 1227133514,  # 01001001001001001001001001001010
        12: 1246382666,  # 01001010010010100100101001001010
        13: 1385473322,  # 01010010100101001010010100101010
        14: 1420448938,  # 01010100101010100101010010101010
        15: 1431612074,  # 01010101010101001010101010101010
        16: 2863311530,  # 10101010101010101010101010101010
        17: 2863355222,  # 10101010101010110101010101010110
        18: 2874583894,  # 10101011010101101010101101010110
        19: 2909493974,  # 10101101011010110101101011010110
        20: 3065427638,  # 10110110101101101011011010110110
        21: 3067833782,  # 10110110110110110110110110110110
        22: 3681475438,  # 11011011011011101101101101101110
        23: 3720050414,  # 11011101101110110111011011101110
        24: 4008636142,  # 11101110111011101110111011101110
        25: 4017601502,  # 11101111011101111011101111011110
        26: 4156487614,  # 11110111101111101111011110111110
        27: 4226801534,  # 11111011111011111101111101111110
        28: 4278124286,  # 11111110111111101111111011111110
        29: 4290770942,  # 11111111101111111111011111111110
        30: 4294901758,  # 11111111111111101111111111111110
        31: 4294967294,  # 11111111111111111111111111111110
        32: 4294967295,  # 11111111111111111111111111111111
    }
    if 0 <= source_int <= 32:
        return lookup_table[source_int]

    raise ValueError("%s is invalid pwm int value. (Expected value "
                     "0-32)" % source_int)
[ "def", "pwm32_to_int", "(", "source_int", ":", "int", ")", "->", "int", ":", "# generated by the int_to_pwm.py script in the mpf/tools folder", "lookup_table", "=", "{", "0", ":", "0", ",", "# 00000000000000000000000000000000", "1", ":", "2", ",", "# 0000000000000000000...
https://github.com/missionpinball/mpf/blob/8e6b74cff4ba06d2fec9445742559c1068b88582/mpf/core/utility_functions.py#L423-L467
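The table above encodes a duty cycle: entry `n` is a 32-bit pattern with exactly `n` bits set, spread as evenly as possible. The guard-then-lookup contract can be sketched with a small excerpt of the real table (`checked_lookup` is our name for the sketch, not an MPF function):

```python
def checked_lookup(table, source_int, max_key=32):
    """Sketch of pwm32_to_int's guard: validate the range first, then
    do a plain dict lookup. `table` here is an excerpt, not the full
    generated table."""
    if 0 <= source_int <= max_key:
        return table[source_int]
    raise ValueError("%s is invalid pwm int value. (Expected value 0-%d)"
                     % (source_int, max_key))


# excerpt of the real table; note each entry n has exactly n bits set
excerpt = {0: 0, 1: 2, 16: 2863311530, 32: 4294967295}
for n, value in excerpt.items():
    assert bin(value).count("1") == n  # the duty-cycle invariant
```

Checking the range before indexing turns an out-of-range key into a descriptive `ValueError` instead of a bare `KeyError`.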
wxWidgets/Phoenix
b2199e299a6ca6d866aa6f3d0888499136ead9d6
wx/lib/agw/aui/tabart.py
python
AuiSimpleTabArt.DrawTab
(self, dc, wnd, page, in_rect, close_button_state, paint_control=False)
return out_tab_rect, out_button_rect, x_extent
Draws a single tab.

:param `dc`: a :class:`wx.DC` device context;
:param `wnd`: a :class:`wx.Window` instance object;
:param `page`: the tab control page associated with the tab;
:param wx.Rect `in_rect`: rectangle the tab should be confined to;
:param integer `close_button_state`: the state of the close button on the tab;
:param bool `paint_control`: whether to draw the control inside a tab (if any) on a :class:`MemoryDC`.
Draws a single tab.
[ "Draws", "a", "single", "tab", "." ]
def DrawTab(self, dc, wnd, page, in_rect, close_button_state, paint_control=False):
    """
    Draws a single tab.

    :param `dc`: a :class:`wx.DC` device context;
    :param `wnd`: a :class:`wx.Window` instance object;
    :param `page`: the tab control page associated with the tab;
    :param wx.Rect `in_rect`: rectangle the tab should be confined to;
    :param integer `close_button_state`: the state of the close button on the tab;
    :param bool `paint_control`: whether to draw the control inside a tab
     (if any) on a :class:`MemoryDC`.
    """

    # if the caption is empty, measure some temporary text
    caption = page.caption
    if caption == "":
        caption = "Xj"

    agwFlags = self.GetAGWFlags()

    dc.SetFont(self._selected_font)
    selected_textx, selected_texty, dummy = dc.GetFullMultiLineTextExtent(caption)

    dc.SetFont(self._normal_font)
    normal_textx, normal_texty, dummy = dc.GetFullMultiLineTextExtent(caption)

    control = page.control

    # figure out the size of the tab
    tab_size, x_extent = self.GetTabSize(dc, wnd, page.caption, page.bitmap,
                                         page.active, close_button_state, control)

    tab_height = tab_size[1]
    tab_width = tab_size[0]
    tab_x = in_rect.x
    tab_y = in_rect.y + in_rect.height - tab_height

    caption = page.caption

    # select pen, brush and font for the tab to be drawn
    if page.active:
        dc.SetPen(self._selected_bkpen)
        dc.SetBrush(self._selected_bkbrush)
        dc.SetFont(self._selected_font)
        textx = selected_textx
        texty = selected_texty
    else:
        dc.SetPen(self._normal_bkpen)
        dc.SetBrush(self._normal_bkbrush)
        dc.SetFont(self._normal_font)
        textx = normal_textx
        texty = normal_texty

    if not page.enabled:
        dc.SetTextForeground(wx.SystemSettings.GetColour(wx.SYS_COLOUR_GRAYTEXT))
    else:
        dc.SetTextForeground(page.text_colour)

    # -- draw line --
    points = [wx.Point() for i in range(7)]
    points[0].x = tab_x
    points[0].y = tab_y + tab_height - 1
    points[1].x = tab_x + tab_height - 3
    points[1].y = tab_y + 2
    points[2].x = tab_x + tab_height + 3
    points[2].y = tab_y
    points[3].x = tab_x + tab_width - 2
    points[3].y = tab_y
    points[4].x = tab_x + tab_width
    points[4].y = tab_y + 2
    points[5].x = tab_x + tab_width
    points[5].y = tab_y + tab_height - 1
    points[6] = points[0]

    dc.SetClippingRegion(in_rect)
    dc.DrawPolygon(points)

    dc.SetPen(wx.GREY_PEN)
    dc.DrawLines(points)

    close_button_width = 0
    if close_button_state != AUI_BUTTON_STATE_HIDDEN:
        close_button_width = self._active_close_bmp.GetWidth()
        if agwFlags & AUI_NB_CLOSE_ON_TAB_LEFT:
            if control:
                text_offset = tab_x + (tab_height//2) + close_button_width - (textx//2) - 2
            else:
                text_offset = tab_x + (tab_height//2) + ((tab_width+close_button_width)//2) - (textx//2) - 2
        else:
            if control:
                text_offset = tab_x + (tab_height//2) + close_button_width - (textx//2)
            else:
                text_offset = tab_x + (tab_height//2) + ((tab_width-close_button_width)//2) - (textx//2)
    else:
        text_offset = tab_x + (tab_height//3) + (tab_width//2) - (textx//2)
        if control:
            if agwFlags & AUI_NB_CLOSE_ON_TAB_LEFT:
                text_offset = tab_x + (tab_height//3) - (textx//2) + close_button_width + 2
            else:
                text_offset = tab_x + (tab_height//3) - (textx//2)

    # set minimum text offset
    if text_offset < tab_x + tab_height:
        text_offset = tab_x + tab_height

    # chop text if necessary
    if agwFlags & AUI_NB_CLOSE_ON_TAB_LEFT:
        draw_text = ChopText(dc, caption, tab_width - (text_offset-tab_x))
    else:
        draw_text = ChopText(dc, caption,
                             tab_width - (text_offset-tab_x) - close_button_width)

    ypos = (tab_y + tab_height)//2 - (texty//2) + 1

    if control:
        if control.GetPosition() != wx.Point(text_offset+1, ypos):
            control.SetPosition(wx.Point(text_offset+1, ypos))
        if not control.IsShown():
            control.Show()
        if paint_control:
            bmp = TakeScreenShot(control.GetScreenRect())
            dc.DrawBitmap(bmp, text_offset+1, ypos, True)
        controlW, controlH = control.GetSize()
        text_offset += controlW + 4

    # draw tab text
    rectx, recty, dummy = dc.GetFullMultiLineTextExtent(draw_text)
    dc.DrawLabel(draw_text, wx.Rect(text_offset, ypos, rectx, recty))

    # draw focus rectangle
    if page.active and wx.Window.FindFocus() == wnd and (agwFlags & AUI_NB_NO_TAB_FOCUS) == 0:
        focusRect
= wx.Rect(text_offset, ((tab_y + tab_height)//2 - (texty//2) + 1), selected_textx, selected_texty) focusRect.Inflate(2, 2) # TODO: # This should be uncommented when DrawFocusRect will become # available in wxPython # wx.RendererNative.Get().DrawFocusRect(wnd, dc, focusRect, 0) out_button_rect = wx.Rect() # draw close button if necessary if close_button_state != AUI_BUTTON_STATE_HIDDEN: if page.active: bmp = self._active_close_bmp else: bmp = self._disabled_close_bmp if agwFlags & AUI_NB_CLOSE_ON_TAB_LEFT: rect = wx.Rect(tab_x + tab_height - 2, tab_y + (tab_height//2) - (bmp.GetHeight()//2) + 1, close_button_width, tab_height - 1) else: rect = wx.Rect(tab_x + tab_width - close_button_width - 1, tab_y + (tab_height//2) - (bmp.GetHeight()//2) + 1, close_button_width, tab_height - 1) self.DrawButtons(dc, rect, bmp, wx.WHITE, close_button_state) out_button_rect = wx.Rect(*rect) out_tab_rect = wx.Rect(tab_x, tab_y, tab_width, tab_height) dc.DestroyClippingRegion() return out_tab_rect, out_button_rect, x_extent
[ "def", "DrawTab", "(", "self", ",", "dc", ",", "wnd", ",", "page", ",", "in_rect", ",", "close_button_state", ",", "paint_control", "=", "False", ")", ":", "# if the caption is empty, measure some temporary text", "caption", "=", "page", ".", "caption", "if", "c...
https://github.com/wxWidgets/Phoenix/blob/b2199e299a6ca6d866aa6f3d0888499136ead9d6/wx/lib/agw/aui/tabart.py#L1151-L1326
zhl2008/awd-platform
0416b31abea29743387b10b3914581fbe8e7da5e
web_flaskbb/Python-2.7.9/Lib/plat-mac/ic.py
python
IC.settypecreator
(self, file)
[]
def settypecreator(self, file): file = Carbon.File.pathname(file) record = self.mapfile(os.path.split(file)[1]) MacOS.SetCreatorAndType(file, record[2], record[1]) macostools.touched(file)
[ "def", "settypecreator", "(", "self", ",", "file", ")", ":", "file", "=", "Carbon", ".", "File", ".", "pathname", "(", "file", ")", "record", "=", "self", ".", "mapfile", "(", "os", ".", "path", ".", "split", "(", "file", ")", "[", "1", "]", ")",...
https://github.com/zhl2008/awd-platform/blob/0416b31abea29743387b10b3914581fbe8e7da5e/web_flaskbb/Python-2.7.9/Lib/plat-mac/ic.py#L226-L230
saltstack/salt
fae5bc757ad0f1716483ce7ae180b451545c2058
salt/modules/zabbix.py
python
usergroup_list
(**connection_args)
Retrieve all enabled user groups. .. versionadded:: 2016.3.0 :param _connection_user: Optional - zabbix user (can also be set in opts or pillar, see module's docstring) :param _connection_password: Optional - zabbix password (can also be set in opts or pillar, see module's docstring) :param _connection_url: Optional - url of zabbix frontend (can also be set in opts, pillar, see module's docstring) :return: Array with enabled user groups details, False on failure. CLI Example: .. code-block:: bash salt '*' zabbix.usergroup_list
Retrieve all enabled user groups.
[ "Retrieve", "all", "enabled", "user", "groups", "." ]
def usergroup_list(**connection_args): """ Retrieve all enabled user groups. .. versionadded:: 2016.3.0 :param _connection_user: Optional - zabbix user (can also be set in opts or pillar, see module's docstring) :param _connection_password: Optional - zabbix password (can also be set in opts or pillar, see module's docstring) :param _connection_url: Optional - url of zabbix frontend (can also be set in opts, pillar, see module's docstring) :return: Array with enabled user groups details, False on failure. CLI Example: .. code-block:: bash salt '*' zabbix.usergroup_list """ conn_args = _login(**connection_args) ret = False try: if conn_args: method = "usergroup.get" params = { "output": "extend", } ret = _query(method, params, conn_args["url"], conn_args["auth"]) return ret["result"] else: raise KeyError except KeyError: return ret
[ "def", "usergroup_list", "(", "*", "*", "connection_args", ")", ":", "conn_args", "=", "_login", "(", "*", "*", "connection_args", ")", "ret", "=", "False", "try", ":", "if", "conn_args", ":", "method", "=", "\"usergroup.get\"", "params", "=", "{", "\"outp...
https://github.com/saltstack/salt/blob/fae5bc757ad0f1716483ce7ae180b451545c2058/salt/modules/zabbix.py#L1110-L1141
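Under the hood, `usergroup.get` is a Zabbix JSON-RPC 2.0 call. A hedged sketch of the request body that `_query` would send (the field names follow the public Zabbix API; the builder function itself is ours, not salt code):

```python
import json

def build_zabbix_request(method, params, auth, request_id=1):
    # Shape of a Zabbix JSON-RPC 2.0 request body.
    return {
        'jsonrpc': '2.0',
        'method': method,
        'params': params,
        'auth': auth,
        'id': request_id,
    }

body = build_zabbix_request('usergroup.get', {'output': 'extend'}, 'TOKEN')
payload = json.dumps(body)
```

The `auth` token would come from a prior `user.login` call, which is what `_login` handles in the module above.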
truenas/middleware
b11ec47d6340324f5a32287ffb4012e5d709b934
src/middlewared/middlewared/plugins/smb_/groupmap.py
python
SMBService.validate_groupmap_hwm
(self, low_range)
return must_reload
Middleware forces allocation of GIDs for Users, Groups, and Administrators to be deterministic with the default idmap backend. Bump up the idmap_tdb high-water mark to avoid conflicts with these and remove any mappings that conflict. Winbindd will regenerate the removed ones as-needed.
Middleware forces allocation of GIDs for Users, Groups, and Administrators to be deterministic with the default idmap backend. Bump up the idmap_tdb high-water mark to avoid conflicts with these and remove any mappings that conflict. Winbindd will regenerate the removed ones as-needed.
[ "Middleware", "forces", "allocation", "of", "GIDs", "for", "Users", "Groups", "and", "Administrators", "to", "be", "deterministic", "with", "the", "default", "idmap", "backend", ".", "Bump", "up", "the", "idmap_tdb", "high", "-", "water", "mark", "to", "avoid"...
def validate_groupmap_hwm(self, low_range): """ Middleware forces allocation of GIDs for Users, Groups, and Administrators to be deterministic with the default idmap backend. Bump up the idmap_tdb high-water mark to avoid conflicts with these and remove any mappings that conflict. Winbindd will regenerate the removed ones as-needed. """ must_reload = False try: tdb_handle = tdb.open(f"{SMBPath.STATEDIR.platform()}/winbindd_idmap.tdb") except FileNotFoundError: return must_reload try: group_hwm_bytes = tdb_handle.get(b'GROUP HWM\00') hwm = struct.unpack("<L", group_hwm_bytes)[0] if hwm < low_range + 2: tdb_handle.transaction_start() new_hwm_bytes = struct.pack("<L", low_range + 2) tdb_handle.store(b'GROUP HWM\00', new_hwm_bytes) tdb_handle.transaction_commit() self.middleware.call_sync('idmap.snapshot_samba4_dataset') must_reload = True for key in tdb_handle.keys(): if key[:3] == b'GID' and int(key.decode()[4:-3]) < (low_range + 2): reverse = tdb_handle.get(key) tdb_handle.transaction_start() tdb_handle.delete(key) tdb_handle.delete(reverse) tdb_handle.transaction_commit() if not must_reload: self.middleware.call_sync('idmap.snapshot_samba4_dataset') must_reload = True except Exception as e: self.logger.warning("TDB maintenace failed: %s", e) finally: tdb_handle.close() return must_reload
[ "def", "validate_groupmap_hwm", "(", "self", ",", "low_range", ")", ":", "must_reload", "=", "False", "try", ":", "tdb_handle", "=", "tdb", ".", "open", "(", "f\"{SMBPath.STATEDIR.platform()}/winbindd_idmap.tdb\"", ")", "except", "FileNotFoundError", ":", "return", ...
https://github.com/truenas/middleware/blob/b11ec47d6340324f5a32287ffb4012e5d709b934/src/middlewared/middlewared/plugins/smb_/groupmap.py#L182-L224
securityclippy/elasticintel
aa08d3e9f5ab1c000128e95161139ce97ff0e334
whois_lambda/requests/packages/urllib3/fields.py
python
RequestField.from_tuples
(cls, fieldname, value)
return request_param
A :class:`~urllib3.fields.RequestField` factory from old-style tuple parameters. Supports constructing :class:`~urllib3.fields.RequestField` from parameter of key/value strings AND key/filetuple. A filetuple is a (filename, data, MIME type) tuple where the MIME type is optional. For example:: 'foo': 'bar', 'fakefile': ('foofile.txt', 'contents of foofile'), 'realfile': ('barfile.txt', open('realfile').read()), 'typedfile': ('bazfile.bin', open('bazfile').read(), 'image/jpeg'), 'nonamefile': 'contents of nonamefile field', Field names and filenames must be unicode.
A :class:`~urllib3.fields.RequestField` factory from old-style tuple parameters.
[ "A", ":", "class", ":", "~urllib3", ".", "fields", ".", "RequestField", "factory", "from", "old", "-", "style", "tuple", "parameters", "." ]
def from_tuples(cls, fieldname, value): """ A :class:`~urllib3.fields.RequestField` factory from old-style tuple parameters. Supports constructing :class:`~urllib3.fields.RequestField` from parameter of key/value strings AND key/filetuple. A filetuple is a (filename, data, MIME type) tuple where the MIME type is optional. For example:: 'foo': 'bar', 'fakefile': ('foofile.txt', 'contents of foofile'), 'realfile': ('barfile.txt', open('realfile').read()), 'typedfile': ('bazfile.bin', open('bazfile').read(), 'image/jpeg'), 'nonamefile': 'contents of nonamefile field', Field names and filenames must be unicode. """ if isinstance(value, tuple): if len(value) == 3: filename, data, content_type = value else: filename, data = value content_type = guess_content_type(filename) else: filename = None content_type = None data = value request_param = cls(fieldname, data, filename=filename) request_param.make_multipart(content_type=content_type) return request_param
[ "def", "from_tuples", "(", "cls", ",", "fieldname", ",", "value", ")", ":", "if", "isinstance", "(", "value", ",", "tuple", ")", ":", "if", "len", "(", "value", ")", "==", "3", ":", "filename", ",", "data", ",", "content_type", "=", "value", "else", ...
https://github.com/securityclippy/elasticintel/blob/aa08d3e9f5ab1c000128e95161139ce97ff0e334/whois_lambda/requests/packages/urllib3/fields.py#L72-L103
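The tuple-unpacking logic in `from_tuples` above can be exercised independently of urllib3. The sketch below mirrors its branching; `guess_content_type` here is a stand-in built on the stdlib `mimetypes` module, an assumption rather than urllib3's exact helper:

```python
import mimetypes

def guess_content_type(filename, default='application/octet-stream'):
    # Stand-in helper: guess a MIME type from the filename, else a default.
    if filename:
        return mimetypes.guess_type(filename)[0] or default
    return default

def split_field_tuple(value):
    # Mirror RequestField.from_tuples branching:
    # (filename, data, content_type) | (filename, data) | bare data.
    if isinstance(value, tuple):
        if len(value) == 3:
            filename, data, content_type = value
        else:
            filename, data = value
            content_type = guess_content_type(filename)
    else:
        filename, data, content_type = None, value, None
    return filename, data, content_type
```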
dropbox/dropbox-sdk-python
015437429be224732990041164a21a0501235db1
dropbox/team_log.py
python
EventDetails.get_shared_folder_decline_invitation_details
(self)
return self._value
Only call this if :meth:`is_shared_folder_decline_invitation_details` is true. :rtype: SharedFolderDeclineInvitationDetails
Only call this if :meth:`is_shared_folder_decline_invitation_details` is true.
[ "Only", "call", "this", "if", ":", "meth", ":", "is_shared_folder_decline_invitation_details", "is", "true", "." ]
def get_shared_folder_decline_invitation_details(self): """ Only call this if :meth:`is_shared_folder_decline_invitation_details` is true. :rtype: SharedFolderDeclineInvitationDetails """ if not self.is_shared_folder_decline_invitation_details(): raise AttributeError("tag 'shared_folder_decline_invitation_details' not set") return self._value
[ "def", "get_shared_folder_decline_invitation_details", "(", "self", ")", ":", "if", "not", "self", ".", "is_shared_folder_decline_invitation_details", "(", ")", ":", "raise", "AttributeError", "(", "\"tag 'shared_folder_decline_invitation_details' not set\"", ")", "return", "...
https://github.com/dropbox/dropbox-sdk-python/blob/015437429be224732990041164a21a0501235db1/dropbox/team_log.py#L20019-L20027
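The Dropbox SDK accessor above follows a tagged-union pattern: check the active tag, raise `AttributeError` if it is not set, otherwise return the stored payload. A minimal sketch of that pattern (the class and names here are illustrative, not the SDK's):

```python
class TaggedUnion:
    """Minimal tagged union: one active tag, one payload."""

    def __init__(self, tag, value=None):
        self._tag = tag
        self._value = value

    def is_tag(self, tag):
        return self._tag == tag

    def get(self, tag):
        # Mirror the SDK's guard: raise if the requested tag is not active.
        if not self.is_tag(tag):
            raise AttributeError("tag %r not set" % tag)
        return self._value
```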
openedx/edx-platform
68dd185a0ab45862a2a61e0f803d7e03d2be71b5
cms/djangoapps/contentstore/courseware_index.py
python
indexing_is_enabled
()
return settings.FEATURES.get('ENABLE_COURSEWARE_INDEX', False)
Checks to see if the indexing feature is enabled
Checks to see if the indexing feature is enabled
[ "Checks", "to", "see", "if", "the", "indexing", "feature", "is", "enabled" ]
def indexing_is_enabled(): """ Checks to see if the indexing feature is enabled """ return settings.FEATURES.get('ENABLE_COURSEWARE_INDEX', False)
[ "def", "indexing_is_enabled", "(", ")", ":", "return", "settings", ".", "FEATURES", ".", "get", "(", "'ENABLE_COURSEWARE_INDEX'", ",", "False", ")" ]
https://github.com/openedx/edx-platform/blob/68dd185a0ab45862a2a61e0f803d7e03d2be71b5/cms/djangoapps/contentstore/courseware_index.py#L48-L52
veusz/veusz
5a1e2af5f24df0eb2a2842be51f2997c4999c7fb
veusz/dataimport/defn_nd.py
python
ImportFileND
( comm, filename, dataset, shape=None, transpose=False, mode='text', csvdelimiter=',', csvtextdelimiter='"', csvlocale='en_US', prefix="", suffix="", encoding='utf_8', linked=False)
return op.outnames
Import n-dimensional data from a file. filename is the name of the file to read dataset is the dataset to read if shape is set, the dataset is reshaped to these dimensions after loading if transpose=True, then rows and columns, etc, are swapped mode is either 'text' or 'csv' csvdelimiter is the csv delimiter for csv csvtextdelimiter is the csv text delimiter for csv csvlocale is locale to use when reading csv data prefix and suffix are prepended and appended to dataset names encoding is encoding character set if linked=True then the dataset is linked to the file Returns: list of imported datasets
Import n-dimensional data from a file. filename is the name of the file to read dataset is the dataset to read
[ "Import", "n", "-", "dimensional", "data", "from", "a", "file", ".", "filename", "is", "the", "name", "of", "the", "file", "to", "read", "dataset", "is", "the", "dataset", "to", "read" ]
def ImportFileND( comm, filename, dataset, shape=None, transpose=False, mode='text', csvdelimiter=',', csvtextdelimiter='"', csvlocale='en_US', prefix="", suffix="", encoding='utf_8', linked=False): """Import n-dimensional data from a file. filename is the name of the file to read dataset is the dataset to read if shape is set, the dataset is reshaped to these dimensions after loading if transpose=True, then rows and columns, etc, are swapped mode is either 'text' or 'csv' csvdelimiter is the csv delimiter for csv csvtextdelimiter is the csv text delimiter for csv csvlocale is locale to use when reading csv data prefix and suffix are prepended and appended to dataset names encoding is encoding character set if linked=True then the dataset is linked to the file Returns: list of imported datasets """ # look up filename on path realfilename = comm.findFileOnImportPath(filename) params = ImportParamsND( dataset=dataset, filename=realfilename, transpose=transpose, mode=mode, csvdelimiter=csvdelimiter, csvtextdelimiter=csvtextdelimiter, csvlocale=csvlocale, prefix=prefix, suffix=suffix, linked=linked) op = OperationDataImportND(params) comm.document.applyOperation(op) if comm.verbose: print("Imported datasets %s" % ', '.join(op.outnames)) return op.outnames
[ "def", "ImportFileND", "(", "comm", ",", "filename", ",", "dataset", ",", "shape", "=", "None", ",", "transpose", "=", "False", ",", "mode", "=", "'text'", ",", "csvdelimiter", "=", "','", ",", "csvtextdelimiter", "=", "'\"'", ",", "csvlocale", "=", "'en...
https://github.com/veusz/veusz/blob/5a1e2af5f24df0eb2a2842be51f2997c4999c7fb/veusz/dataimport/defn_nd.py#L97-L145
JaniceWuo/MovieRecommend
4c86db64ca45598917d304f535413df3bc9fea65
movierecommend/venv1/Lib/site-packages/pip-9.0.1-py3.6.egg/pip/exceptions.py
python
HashError.__str__
(self)
return '%s\n%s' % (self.head, self.body())
[]
def __str__(self): return '%s\n%s' % (self.head, self.body())
[ "def", "__str__", "(", "self", ")", ":", "return", "'%s\\n%s'", "%", "(", "self", ".", "head", ",", "self", ".", "body", "(", ")", ")" ]
https://github.com/JaniceWuo/MovieRecommend/blob/4c86db64ca45598917d304f535413df3bc9fea65/movierecommend/venv1/Lib/site-packages/pip-9.0.1-py3.6.egg/pip/exceptions.py#L110-L111
JaniceWuo/MovieRecommend
4c86db64ca45598917d304f535413df3bc9fea65
movierecommend/venv1/Lib/site-packages/django/contrib/messages/storage/cookie.py
python
MessageEncoder.default
(self, obj)
return super(MessageEncoder, self).default(obj)
[]
def default(self, obj): if isinstance(obj, Message): # Using 0/1 here instead of False/True to produce more compact json is_safedata = 1 if isinstance(obj.message, SafeData) else 0 message = [self.message_key, is_safedata, obj.level, obj.message] if obj.extra_tags: message.append(obj.extra_tags) return message return super(MessageEncoder, self).default(obj)
[ "def", "default", "(", "self", ",", "obj", ")", ":", "if", "isinstance", "(", "obj", ",", "Message", ")", ":", "# Using 0/1 here instead of False/True to produce more compact json", "is_safedata", "=", "1", "if", "isinstance", "(", "obj", ".", "message", ",", "S...
https://github.com/JaniceWuo/MovieRecommend/blob/4c86db64ca45598917d304f535413df3bc9fea65/movierecommend/venv1/Lib/site-packages/django/contrib/messages/storage/cookie.py#L17-L25
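The encoder above compacts each `Message` into a JSON list, using `0`/`1` instead of `false`/`true` to keep the serialized cookie shorter. A self-contained sketch of the same idea with a plain stand-in class (the `__json_message` key and field layout mimic Django's but are assumptions here):

```python
import json

MESSAGE_KEY = '__json_message'

class Message:
    def __init__(self, level, message, extra_tags=None):
        self.level = level
        self.message = message
        self.extra_tags = extra_tags

class MessageEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, Message):
            # 0/1 instead of false/true produces more compact JSON.
            encoded = [MESSAGE_KEY, 0, obj.level, obj.message]
            if obj.extra_tags:
                encoded.append(obj.extra_tags)
            return encoded
        return super().default(obj)
```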
zhl2008/awd-platform
0416b31abea29743387b10b3914581fbe8e7da5e
web_flaskbb/lib/python2.7/site-packages/sqlalchemy/sql/elements.py
python
FunctionFilter.type
(self)
return self.func.type
[]
def type(self): return self.func.type
[ "def", "type", "(", "self", ")", ":", "return", "self", ".", "func", ".", "type" ]
https://github.com/zhl2008/awd-platform/blob/0416b31abea29743387b10b3914581fbe8e7da5e/web_flaskbb/lib/python2.7/site-packages/sqlalchemy/sql/elements.py#L3484-L3485
aws-samples/aws-kube-codesuite
ab4e5ce45416b83bffb947ab8d234df5437f4fca
src/kubernetes/client/models/v1_config_map_volume_source.py
python
V1ConfigMapVolumeSource.default_mode
(self, default_mode)
Sets the default_mode of this V1ConfigMapVolumeSource. Optional: mode bits to use on created files by default. Must be a value between 0 and 0777. Defaults to 0644. Directories within the path are not affected by this setting. This might be in conflict with other options that affect the file mode, like fsGroup, and the result can be other mode bits set. :param default_mode: The default_mode of this V1ConfigMapVolumeSource. :type: int
Sets the default_mode of this V1ConfigMapVolumeSource. Optional: mode bits to use on created files by default. Must be a value between 0 and 0777. Defaults to 0644. Directories within the path are not affected by this setting. This might be in conflict with other options that affect the file mode, like fsGroup, and the result can be other mode bits set.
[ "Sets", "the", "default_mode", "of", "this", "V1ConfigMapVolumeSource", ".", "Optional", ":", "mode", "bits", "to", "use", "on", "created", "files", "by", "default", ".", "Must", "be", "a", "value", "between", "0", "and", "0777", ".", "Defaults", "to", "06...
def default_mode(self, default_mode): """ Sets the default_mode of this V1ConfigMapVolumeSource. Optional: mode bits to use on created files by default. Must be a value between 0 and 0777. Defaults to 0644. Directories within the path are not affected by this setting. This might be in conflict with other options that affect the file mode, like fsGroup, and the result can be other mode bits set. :param default_mode: The default_mode of this V1ConfigMapVolumeSource. :type: int """ self._default_mode = default_mode
[ "def", "default_mode", "(", "self", ",", "default_mode", ")", ":", "self", ".", "_default_mode", "=", "default_mode" ]
https://github.com/aws-samples/aws-kube-codesuite/blob/ab4e5ce45416b83bffb947ab8d234df5437f4fca/src/kubernetes/client/models/v1_config_map_volume_source.py#L64-L73
freqtrade/freqtrade
13651fd3be8d5ce8dcd7c94b920bda4e00b75aca
freqtrade/strategy/hyper.py
python
HyperStrategyMixin.detect_all_parameters
(cls)
return params
Detect all parameters and return them as a list
Detect all parameters and return them as a list
[ "Detect", "all", "parameters", "and", "return", "them", "as", "a", "list" ]
def detect_all_parameters(cls) -> Dict: """ Detect all parameters and return them as a list""" params: Dict = { 'buy': list(cls.detect_parameters('buy')), 'sell': list(cls.detect_parameters('sell')), 'protection': list(cls.detect_parameters('protection')), } params.update({ 'count': len(params['buy'] + params['sell'] + params['protection']) }) return params
[ "def", "detect_all_parameters", "(", "cls", ")", "->", "Dict", ":", "params", ":", "Dict", "=", "{", "'buy'", ":", "list", "(", "cls", ".", "detect_parameters", "(", "'buy'", ")", ")", ",", "'sell'", ":", "list", "(", "cls", ".", "detect_parameters", "...
https://github.com/freqtrade/freqtrade/blob/13651fd3be8d5ce8dcd7c94b920bda4e00b75aca/freqtrade/strategy/hyper.py#L346-L357
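The aggregation in `detect_all_parameters` above (one list per space, plus a total count) can be sketched without freqtrade; `detect_parameters` here is a hypothetical stand-in that just yields pre-collected names from a registry dict:

```python
from typing import Dict, Iterable

def detect_parameters(space: str, registry: Dict[str, list]) -> Iterable[str]:
    # Hypothetical stand-in: yield the parameter names registered for a space.
    yield from registry.get(space, [])

def detect_all_parameters(registry: Dict[str, list]) -> Dict:
    params: Dict = {
        'buy': list(detect_parameters('buy', registry)),
        'sell': list(detect_parameters('sell', registry)),
        'protection': list(detect_parameters('protection', registry)),
    }
    # Total across all spaces, as in the original.
    params['count'] = len(params['buy'] + params['sell'] + params['protection'])
    return params
```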
geopython/pycsw
43a5c92fa819a3a3fdc8a8e3ef075d784dff73fc
pycsw/core/util.py
python
wktenvelope2bbox
(envelope)
return bbox
returns bbox string of WKT ENVELOPE definition
returns bbox string of WKT ENVELOPE definition
[ "returns", "bbox", "string", "of", "WKT", "ENVELOPE", "definition" ]
def wktenvelope2bbox(envelope): """returns bbox string of WKT ENVELOPE definition""" tmparr = [x.strip() for x in envelope.split('(')[1].split(')')[0].split(',')] bbox = '%s,%s,%s,%s' % (tmparr[0], tmparr[3], tmparr[1], tmparr[2]) return bbox
[ "def", "wktenvelope2bbox", "(", "envelope", ")", ":", "tmparr", "=", "[", "x", ".", "strip", "(", ")", "for", "x", "in", "envelope", ".", "split", "(", "'('", ")", "[", "1", "]", ".", "split", "(", "')'", ")", "[", "0", "]", ".", "split", "(", ...
https://github.com/geopython/pycsw/blob/43a5c92fa819a3a3fdc8a8e3ef075d784dff73fc/pycsw/core/util.py#L169-L174
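Note the coordinate reorder above: WKT `ENVELOPE(minx, maxx, maxy, miny)` becomes a `minx,miny,maxx,maxy` bbox string, hence the remapped indices. A standalone copy for experimentation:

```python
def wktenvelope2bbox(envelope):
    """Return a 'minx,miny,maxx,maxy' bbox string from a WKT ENVELOPE.

    WKT ENVELOPE order is (minx, maxx, maxy, miny), so indices are remapped.
    """
    tmparr = [x.strip() for x in envelope.split('(')[1].split(')')[0].split(',')]
    return '%s,%s,%s,%s' % (tmparr[0], tmparr[3], tmparr[1], tmparr[2])
```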
MongoEngine/django-mongoengine
e8a75e8e5860545ecfbadaf1b1285495022bd7cb
django_mongoengine/views/embedded.py
python
EmbeddedFormMixin.get_form
(self, form_class=None)
return form_class(self.object, **self.get_form_kwargs())
Returns an instance of the form to be used in this view.
Returns an instance of the form to be used in this view.
[ "Returns", "an", "instance", "of", "the", "form", "to", "be", "used", "in", "this", "view", "." ]
def get_form(self, form_class=None): """ Returns an instance of the form to be used in this view. """ if form_class is None: form_class = self.get_form_class() return form_class(self.object, **self.get_form_kwargs())
[ "def", "get_form", "(", "self", ",", "form_class", "=", "None", ")", ":", "if", "form_class", "is", "None", ":", "form_class", "=", "self", ".", "get_form_class", "(", ")", "return", "form_class", "(", "self", ".", "object", ",", "*", "*", "self", ".",...
https://github.com/MongoEngine/django-mongoengine/blob/e8a75e8e5860545ecfbadaf1b1285495022bd7cb/django_mongoengine/views/embedded.py#L24-L30
segmentio/analytics-python
40b2bad5c2539d310dc06cf19d20b372b0b1b3eb
analytics/consumer.py
python
Consumer.run
(self)
Runs the consumer.
Runs the consumer.
[ "Runs", "the", "consumer", "." ]
def run(self): """Runs the consumer.""" self.log.debug('consumer is running...') while self.running: self.upload() self.log.debug('consumer exited.')
[ "def", "run", "(", "self", ")", ":", "self", ".", "log", ".", "debug", "(", "'consumer is running...'", ")", "while", "self", ".", "running", ":", "self", ".", "upload", "(", ")", "self", ".", "log", ".", "debug", "(", "'consumer exited.'", ")" ]
https://github.com/segmentio/analytics-python/blob/40b2bad5c2539d310dc06cf19d20b372b0b1b3eb/analytics/consumer.py#L49-L55
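The `run` loop above is a common worker-thread shape: spin on a `running` flag and do one unit of work per iteration. A minimal self-contained sketch, with none of Segment's batching internals; `upload` here just drains a queue:

```python
import queue
import threading

class Consumer(threading.Thread):
    def __init__(self):
        super().__init__(daemon=True)
        self.queue = queue.Queue()
        self.uploaded = []
        self.running = True

    def upload(self):
        # One unit of work: take at most one queued item per call.
        try:
            self.uploaded.append(self.queue.get(timeout=0.01))
        except queue.Empty:
            pass

    def run(self):
        while self.running:
            self.upload()

    def pause(self):
        self.running = False
```

The short `get` timeout keeps the loop responsive to `pause()` instead of blocking forever on an empty queue.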
iotaledger/iota.py
f596c1ac0d9bcbceda1cf6109cd921943a6599b3
iota/types.py
python
TryteString.decode
(self, errors: str = 'strict', strip_padding: bool = True)
return bytes_.decode('utf-8', errors)
Decodes the TryteString into a higher-level abstraction (usually Unicode characters). :param str errors: How to handle trytes that can't be converted, or bytes that can't be decoded using UTF-8: 'strict' raise an exception (recommended). 'replace' replace with a placeholder character. 'ignore' omit the invalid tryte/byte sequence. :param bool strip_padding: Whether to strip trailing null trytes before converting. :raises: - :py:class:`iota.codecs.TrytesDecodeError` if the trytes cannot be decoded into bytes. - :py:class:`UnicodeDecodeError` if the resulting bytes cannot be decoded using UTF-8. :return: ``Unicode string`` object. Example usage:: from iota import TryteString trytes = TryteString(b'RBTC9D9DCDQAEASBYBCCKBFA') message = trytes.decode()
Decodes the TryteString into a higher-level abstraction (usually Unicode characters).
[ "Decodes", "the", "TryteString", "into", "a", "higher", "-", "level", "abstraction", "(", "usually", "Unicode", "characters", ")", "." ]
def decode(self, errors: str = 'strict', strip_padding: bool = True) -> str: """ Decodes the TryteString into a higher-level abstraction (usually Unicode characters). :param str errors: How to handle trytes that can't be converted, or bytes that can't be decoded using UTF-8: 'strict' raise an exception (recommended). 'replace' replace with a placeholder character. 'ignore' omit the invalid tryte/byte sequence. :param bool strip_padding: Whether to strip trailing null trytes before converting. :raises: - :py:class:`iota.codecs.TrytesDecodeError` if the trytes cannot be decoded into bytes. - :py:class:`UnicodeDecodeError` if the resulting bytes cannot be decoded using UTF-8. :return: ``Unicode string`` object. Example usage:: from iota import TryteString trytes = TryteString(b'RBTC9D9DCDQAEASBYBCCKBFA') message = trytes.decode() """ trytes = self._trytes if strip_padding and (trytes[-1] == ord(b'9')): trytes = trytes.rstrip(b'9') # Put one back to preserve even length for ASCII codec. trytes += b'9' * (len(trytes) % 2) bytes_ = decode(trytes, AsciiTrytesCodec.name, errors) return bytes_.decode('utf-8', errors)
[ "def", "decode", "(", "self", ",", "errors", ":", "str", "=", "'strict'", ",", "strip_padding", ":", "bool", "=", "True", ")", "->", "str", ":", "trytes", "=", "self", ".", "_trytes", "if", "strip_padding", "and", "(", "trytes", "[", "-", "1", "]", ...
https://github.com/iotaledger/iota.py/blob/f596c1ac0d9bcbceda1cf6109cd921943a6599b3/iota/types.py#L617-L666
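The padding dance in `decode` above — strip trailing `b'9'` null trytes, then pad back to an even length because two trytes encode one byte for the ASCII codec — can be isolated into a small helper:

```python
def strip_tryte_padding(trytes: bytes) -> bytes:
    """Strip trailing null trytes (b'9'), then re-pad to an even length.

    Two trytes encode one byte, so the ASCII codec needs an even count.
    """
    if trytes and trytes[-1] == ord(b'9'):
        trytes = trytes.rstrip(b'9')
    return trytes + b'9' * (len(trytes) % 2)
```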
zhl2008/awd-platform
0416b31abea29743387b10b3914581fbe8e7da5e
web_flaskbb/Python-2.7.9/Demo/pdist/client.py
python
Client._flush
(self)
[]
def _flush(self): self._wf.flush()
[ "def", "_flush", "(", "self", ")", ":", "self", ".", "_wf", ".", "flush", "(", ")" ]
https://github.com/zhl2008/awd-platform/blob/0416b31abea29743387b10b3914581fbe8e7da5e/web_flaskbb/Python-2.7.9/Demo/pdist/client.py#L125-L126
504ensicsLabs/DAMM
60e7ec7dacd6087cd6320b3615becca9b4cf9b24
volatility/obj.py
python
Object
(theType, offset, vm, name = None, **kwargs)
A function which instantiates the object named in theType (as a string) from the type in profile passing optional args of kwargs.
A function which instantiates the object named in theType (as a string) from the type in profile passing optional args of kwargs.
[ "A", "function", "which", "instantiates", "the", "object", "named", "in", "theType", "(", "as", "a", "string", ")", "from", "the", "type", "in", "profile", "passing", "optional", "args", "of", "kwargs", "." ]
def Object(theType, offset, vm, name = None, **kwargs): """ A function which instantiates the object named in theType (as a string) from the type in profile passing optional args of kwargs. """ name = name or theType offset = int(offset) try: if vm.profile.has_type(theType): result = vm.profile.types[theType](offset = offset, vm = vm, name = name, **kwargs) return result except InvalidOffsetError: ## If we cant instantiate the object here, we just error out: return NoneObject("Invalid Address 0x{0:08X}, instantiating {1}".format(offset, name), strict = vm.profile.strict) ## If we get here we have no idea what the type is supposed to be? ## This is a serious error. debug.warning("Cant find object {0} in profile {1}?".format(theType, vm.profile))
[ "def", "Object", "(", "theType", ",", "offset", ",", "vm", ",", "name", "=", "None", ",", "*", "*", "kwargs", ")", ":", "name", "=", "name", "or", "theType", "offset", "=", "int", "(", "offset", ")", "try", ":", "if", "vm", ".", "profile", ".", ...
https://github.com/504ensicsLabs/DAMM/blob/60e7ec7dacd6087cd6320b3615becca9b4cf9b24/volatility/obj.py#L164-L183
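The `Object` factory above dispatches on a type-name string and degrades to a falsy sentinel instead of raising, so lookups can chain safely over bad addresses. A stripped-down sketch of that factory-with-sentinel pattern (class and function names here are illustrative, not Volatility's):

```python
class InvalidOffsetError(Exception):
    pass

class NoneObject:
    """Falsy sentinel returned instead of raising, so lookups can chain."""
    def __init__(self, reason):
        self.reason = reason
    def __bool__(self):
        return False

def make_object(registry, type_name, offset, **kwargs):
    # Dispatch on the type name; fall back to a sentinel on any failure.
    ctor = registry.get(type_name)
    if ctor is None:
        return NoneObject('unknown type %s' % type_name)
    try:
        return ctor(offset=offset, **kwargs)
    except InvalidOffsetError:
        return NoneObject('invalid address 0x%08X' % offset)
```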
macanv/BERT-BiLSTM-CRF-NER
ccf3f093f0ac803e435cb8e8598fdddc2ba1105d
bert_base/train/models.py
python
DataProcessor.get_labels
(self)
Gets the list of labels for this data set.
Gets the list of labels for this data set.
[ "Gets", "the", "list", "of", "labels", "for", "this", "data", "set", "." ]
def get_labels(self): """Gets the list of labels for this data set.""" raise NotImplementedError()
[ "def", "get_labels", "(", "self", ")", ":", "raise", "NotImplementedError", "(", ")" ]
https://github.com/macanv/BERT-BiLSTM-CRF-NER/blob/ccf3f093f0ac803e435cb8e8598fdddc2ba1105d/bert_base/train/models.py#L60-L62
scipy/scipy
e0a749f01e79046642ccfdc419edbf9e7ca141ad
scipy/signal/_signaltools.py
python
_numeric_arrays
(arrays, kinds='buifc')
return True
See if a list of arrays are all numeric. Parameters ---------- arrays : array or list of arrays arrays to check if numeric. kinds : string-like The dtypes of the arrays to be checked. If the dtype.kind of the ndarrays are not in this string the function returns False and otherwise returns True.
See if a list of arrays are all numeric.
[ "See", "if", "a", "list", "of", "arrays", "are", "all", "numeric", "." ]
def _numeric_arrays(arrays, kinds='buifc'): """ See if a list of arrays are all numeric. Parameters ---------- arrays : array or list of arrays arrays to check if numeric. kinds : string-like The dtypes of the arrays to be checked. If the dtype.kind of the ndarrays are not in this string the function returns False and otherwise returns True. """ if type(arrays) == np.ndarray: return arrays.dtype.kind in kinds for array_ in arrays: if array_.dtype.kind not in kinds: return False return True
[ "def", "_numeric_arrays", "(", "arrays", ",", "kinds", "=", "'buifc'", ")", ":", "if", "type", "(", "arrays", ")", "==", "np", ".", "ndarray", ":", "return", "arrays", ".", "dtype", ".", "kind", "in", "kinds", "for", "array_", "in", "arrays", ":", "i...
https://github.com/scipy/scipy/blob/e0a749f01e79046642ccfdc419edbf9e7ca141ad/scipy/signal/_signaltools.py#L990-L1008
securesystemslab/zippy
ff0e84ac99442c2c55fe1d285332cfd4e185e089
zippy/benchmarks/src/benchmarks/sympy/sympy/polys/polytools.py
python
cancel
(f, *gens, **args)
Cancel common factors in a rational function ``f``. Examples ======== >>> from sympy import cancel, sqrt, Symbol >>> from sympy.abc import x >>> A = Symbol('A', commutative=False) >>> cancel((2*x**2 - 2)/(x**2 - 2*x + 1)) (2*x + 2)/(x - 1) >>> cancel((sqrt(3) + sqrt(15)*A)/(sqrt(2) + sqrt(10)*A)) sqrt(6)/2
Cancel common factors in a rational function ``f``.
[ "Cancel", "common", "factors", "in", "a", "rational", "function", "f", "." ]
def cancel(f, *gens, **args): """ Cancel common factors in a rational function ``f``. Examples ======== >>> from sympy import cancel, sqrt, Symbol >>> from sympy.abc import x >>> A = Symbol('A', commutative=False) >>> cancel((2*x**2 - 2)/(x**2 - 2*x + 1)) (2*x + 2)/(x - 1) >>> cancel((sqrt(3) + sqrt(15)*A)/(sqrt(2) + sqrt(10)*A)) sqrt(6)/2 """ from sympy.core.exprtools import factor_terms options.allowed_flags(args, ['polys']) f = sympify(f) if not isinstance(f, (tuple, Tuple)): if f.is_Number or isinstance(f, Relational) or not isinstance(f, Expr): return f f = factor_terms(f, radical=True) p, q = f.as_numer_denom() elif len(f) == 2: p, q = f elif isinstance(f, Tuple): return factor_terms(f) else: raise ValueError('unexpected argument: %s' % f) try: (F, G), opt = parallel_poly_from_expr((p, q), *gens, **args) except PolificationFailed: if not isinstance(f, (tuple, Tuple)): return f else: return S.One, p, q except PolynomialError as msg: if f.is_commutative and not f.has(Piecewise): raise PolynomialError(msg) # Handling of noncommutative and/or piecewise expressions if f.is_Add or f.is_Mul: sifted = sift(f.args, lambda x: x.is_commutative and not x.has(Piecewise)) c, nc = sifted[True], sifted[False] nc = [cancel(i) for i in nc] return f.func(cancel(f.func._from_args(c)), *nc) else: reps = [] pot = preorder_traversal(f) next(pot) for e in pot: # XXX: This should really skip anything that's not Expr. if isinstance(e, (tuple, Tuple, BooleanAtom)): continue try: reps.append((e, cancel(e))) pot.skip() # this was handled successfully except NotImplementedError: pass return f.xreplace(dict(reps)) c, P, Q = F.cancel(G) if not isinstance(f, (tuple, Tuple)): return c*(P.as_expr()/Q.as_expr()) else: if not opt.polys: return c, P.as_expr(), Q.as_expr() else: return c, P, Q
[ "def", "cancel", "(", "f", ",", "*", "gens", ",", "*", "*", "args", ")", ":", "from", "sympy", ".", "core", ".", "exprtools", "import", "factor_terms", "options", ".", "allowed_flags", "(", "args", ",", "[", "'polys'", "]", ")", "f", "=", "sympify", ...
https://github.com/securesystemslab/zippy/blob/ff0e84ac99442c2c55fe1d285332cfd4e185e089/zippy/benchmarks/src/benchmarks/sympy/sympy/polys/polytools.py#L6196-L6269
sagemath/sage
f9b2db94f675ff16963ccdefba4f1a3393b3fe0d
src/sage/modules/tensor_operations.py
python
TensorOperation.preimage
(self)
return self._index_map.keys()
A choice of pre-image multi-indices. OUTPUT: A list of multi-indices (tuples of integers) whose image is the entire image under the :meth:`index_map`. EXAMPLES:: sage: from sage.modules.tensor_operations import \ ....: VectorCollection, TensorOperation sage: R = VectorCollection([(1,0), (0,1), (-2,-3)], QQ, 2) sage: detR = TensorOperation([R]*2, 'antisymmetric') sage: sorted(detR.preimage()) [(0, 1), (0, 2), (1, 2)] sage: sorted(detR.codomain()) [0, 1, 2]
A choice of pre-image multi-indices.
[ "A", "choice", "of", "pre", "-", "image", "multi", "-", "indices", "." ]
def preimage(self): """ A choice of pre-image multi-indices. OUTPUT: A list of multi-indices (tuples of integers) whose image is the entire image under the :meth:`index_map`. EXAMPLES:: sage: from sage.modules.tensor_operations import \ ....: VectorCollection, TensorOperation sage: R = VectorCollection([(1,0), (0,1), (-2,-3)], QQ, 2) sage: detR = TensorOperation([R]*2, 'antisymmetric') sage: sorted(detR.preimage()) [(0, 1), (0, 2), (1, 2)] sage: sorted(detR.codomain()) [0, 1, 2] """ return self._index_map.keys()
[ "def", "preimage", "(", "self", ")", ":", "return", "self", ".", "_index_map", ".", "keys", "(", ")" ]
https://github.com/sagemath/sage/blob/f9b2db94f675ff16963ccdefba4f1a3393b3fe0d/src/sage/modules/tensor_operations.py#L543-L563
compas-dev/compas
0b33f8786481f710115fb1ae5fe79abc2a9a5175
src/compas/datastructures/halfedge/halfedge.py
python
HalfEdge.face_attribute
(self, key, name, value=None)
Get or set an attribute of a face. Parameters ---------- key : int The face identifier. name : str The name of the attribute. value : obj, optional The value of the attribute. Returns ------- object or None The value of the attribute, or ``None`` when the function is used as a "setter". Raises ------ KeyError If the face does not exist.
Get or set an attribute of a face.
[ "Get", "or", "set", "an", "attribute", "of", "a", "face", "." ]
def face_attribute(self, key, name, value=None): """Get or set an attribute of a face. Parameters ---------- key : int The face identifier. name : str The name of the attribute. value : obj, optional The value of the attribute. Returns ------- object or None The value of the attribute, or ``None`` when the function is used as a "setter". Raises ------ KeyError If the face does not exist. """ if key not in self.face: raise KeyError(key) if value is not None: if key not in self.facedata: self.facedata[key] = {} self.facedata[key][name] = value return if key in self.facedata and name in self.facedata[key]: return self.facedata[key][name] if name in self.default_face_attributes: return self.default_face_attributes[name]
[ "def", "face_attribute", "(", "self", ",", "key", ",", "name", ",", "value", "=", "None", ")", ":", "if", "key", "not", "in", "self", ".", "face", ":", "raise", "KeyError", "(", "key", ")", "if", "value", "is", "not", "None", ":", "if", "key", "n...
https://github.com/compas-dev/compas/blob/0b33f8786481f710115fb1ae5fe79abc2a9a5175/src/compas/datastructures/halfedge/halfedge.py#L1123-L1155
wistbean/learn_python3_spider
73c873f4845f4385f097e5057407d03dd37a117b
stackoverflow/venv/lib/python3.6/site-packages/pip-19.0.3-py3.6.egg/pip/_vendor/cachecontrol/caches/file_cache.py
python
url_to_file_path
(url, filecache)
return filecache._fn(key)
Return the file cache path based on the URL. This does not ensure the file exists!
Return the file cache path based on the URL.
[ "Return", "the", "file", "cache", "path", "based", "on", "the", "URL", "." ]
def url_to_file_path(url, filecache): """Return the file cache path based on the URL. This does not ensure the file exists! """ key = CacheController.cache_url(url) return filecache._fn(key)
[ "def", "url_to_file_path", "(", "url", ",", "filecache", ")", ":", "key", "=", "CacheController", ".", "cache_url", "(", "url", ")", "return", "filecache", ".", "_fn", "(", "key", ")" ]
https://github.com/wistbean/learn_python3_spider/blob/73c873f4845f4385f097e5057407d03dd37a117b/stackoverflow/venv/lib/python3.6/site-packages/pip-19.0.3-py3.6.egg/pip/_vendor/cachecontrol/caches/file_cache.py#L140-L146
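The record above maps a URL to a cache file path via `CacheController.cache_url` and `FileCache._fn`, neither of which is shown here. The sketch below illustrates the general idea (hash the URL, shard leading hex digits into subdirectories); the `url_to_cache_path` name, the sha224 choice, and the two-level split are assumptions for illustration, not the library's exact layout.

```python
import hashlib
import os

def url_to_cache_path(url, root="/tmp/webcache"):
    # Hash the URL so arbitrary characters never reach the filesystem.
    digest = hashlib.sha224(url.encode("utf-8")).hexdigest()
    # Shard leading hex digits into subdirectories to keep any one directory small.
    parts = list(digest[:4]) + [digest[4:]]
    return os.path.join(root, *parts)

path = url_to_cache_path("https://example.com/a?b=1")
```

The function is pure, so the same URL always yields the same path; note that, like the original, it does not create the file or its directories.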
stopstalk/stopstalk-deployment
10c3ab44c4ece33ae515f6888c15033db2004bb1
aws_lambda/spoj_aws_lambda_function/lambda_code/requests/packages/urllib3/packages/ordered_dict.py
python
OrderedDict.items
(self)
return [(key, self[key]) for key in self]
od.items() -> list of (key, value) pairs in od
od.items() -> list of (key, value) pairs in od
[ "od", ".", "items", "()", "-", ">", "list", "of", "(", "key", "value", ")", "pairs", "in", "od" ]
def items(self): 'od.items() -> list of (key, value) pairs in od' return [(key, self[key]) for key in self]
[ "def", "items", "(", "self", ")", ":", "return", "[", "(", "key", ",", "self", "[", "key", "]", ")", "for", "key", "in", "self", "]" ]
https://github.com/stopstalk/stopstalk-deployment/blob/10c3ab44c4ece33ae515f6888c15033db2004bb1/aws_lambda/spoj_aws_lambda_function/lambda_code/requests/packages/urllib3/packages/ordered_dict.py#L124-L126
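The record above is from a vendored `OrderedDict` backport; the standard-library `collections.OrderedDict` gives the same insertion-order guarantee for `items()`:

```python
from collections import OrderedDict

od = OrderedDict()
od["b"] = 2
od["a"] = 1
od["c"] = 3

# items() yields (key, value) pairs in insertion order, not sorted key order.
pairs = list(od.items())
```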
kupferlauncher/kupfer
1c1e9bcbce05a82f503f68f8b3955c20b02639b3
kupfer/plugin/screen.py
python
get_username
()
return info[0]
Return username for current user
Return username for current user
[ "Return", "username", "for", "current", "user" ]
def get_username(): """Return username for current user""" import pwd info = pwd.getpwuid(os.geteuid()) return info[0]
[ "def", "get_username", "(", ")", ":", "import", "pwd", "info", "=", "pwd", ".", "getpwuid", "(", "os", ".", "geteuid", "(", ")", ")", "return", "info", "[", "0", "]" ]
https://github.com/kupferlauncher/kupfer/blob/1c1e9bcbce05a82f503f68f8b3955c20b02639b3/kupfer/plugin/screen.py#L31-L35
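A minimal restatement of the `get_username` helper above, using the named `pw_name` field instead of index 0; `pwd` is POSIX-only, so this sketch assumes a Unix-like host:

```python
import os
import pwd

def get_username():
    # Look up the passwd entry for the effective user id; pw_name is the
    # same value the original accesses as info[0].
    return pwd.getpwuid(os.geteuid()).pw_name
```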
Tivix/django-common
407d208121011a8425139e541629554114d96c18
django_common/email_backends.py
python
TestEmailBackend._send
(self, email_message)
return True
A helper method that does the actual sending.
A helper method that does the actual sending.
[ "A", "helper", "method", "that", "does", "the", "actual", "sending", "." ]
def _send(self, email_message): """A helper method that does the actual sending.""" if not email_message.recipients(): return False from_email = email_message.from_email if hasattr(message, 'sanitize_address'): from_email = message.sanitize_address(email_message.from_email, email_message.encoding) if hasattr(settings, 'TEST_EMAIL_TO'): email_message.to = settings.TEST_EMAIL_TO else: email_message.to = dict(getattr(settings, 'ADMINS', ())).values() email_message.cc = getattr(settings, 'TEST_EMAIL_CC', []) email_message.bcc = getattr(settings, 'TEST_EMAIL_BCC', []) if hasattr(message, 'sanitize_address'): recipients = [message.sanitize_address(addr, email_message.encoding) for addr in email_message.recipients()] else: recipients = email_message.recipients() try: self.connection.sendmail(from_email, recipients, email_message.message().as_string()) except: if not self.fail_silently: raise return False return True
[ "def", "_send", "(", "self", ",", "email_message", ")", ":", "if", "not", "email_message", ".", "recipients", "(", ")", ":", "return", "False", "from_email", "=", "email_message", ".", "from_email", "if", "hasattr", "(", "message", ",", "'sanitize_address'", ...
https://github.com/Tivix/django-common/blob/407d208121011a8425139e541629554114d96c18/django_common/email_backends.py#L24-L50
compas-dev/compas
0b33f8786481f710115fb1ae5fe79abc2a9a5175
src/compas/robots/model/robot.py
python
RobotModel._rebuild_tree
(self)
Store tree structure from link and joint lists.
Store tree structure from link and joint lists.
[ "Store", "tree", "structure", "from", "link", "and", "joint", "lists", "." ]
def _rebuild_tree(self): """Store tree structure from link and joint lists.""" self._adjacency = dict() self._links = dict() self._joints = dict() for link in self.links: link.joints = self.find_children_joints(link) link.parent_joint = self.find_parent_joint(link) self._links[link.name] = link self._adjacency[link.name] = [joint.name for joint in link.joints] if not link.parent_joint: self.root = link for joint in self.joints: child_name = joint.child.link joint.child_link = self.get_link_by_name(child_name) self._joints[joint.name] = joint self._adjacency[joint.name] = [child_name]
[ "def", "_rebuild_tree", "(", "self", ")", ":", "self", ".", "_adjacency", "=", "dict", "(", ")", "self", ".", "_links", "=", "dict", "(", ")", "self", ".", "_joints", "=", "dict", "(", ")", "for", "link", "in", "self", ".", "links", ":", "link", ...
https://github.com/compas-dev/compas/blob/0b33f8786481f710115fb1ae5fe79abc2a9a5175/src/compas/robots/model/robot.py#L123-L144
GNS3/gns3-gui
da8adbaa18ab60e053af2a619efd468f4c8950f3
gns3/graphics_view.py
python
GraphicsView.consoleToNode
(self, node, aux=False)
return True
Start a console application to connect to a node. :param node: Node instance :param aux: auxiliary console mode :returns: False if the console application could not be started
Start a console application to connect to a node.
[ "Start", "a", "console", "application", "to", "connect", "to", "a", "node", "." ]
def consoleToNode(self, node, aux=False): """ Start a console application to connect to a node. :param node: Node instance :param aux: auxiliary console mode :returns: False if the console application could not be started """ if not hasattr(node, "console") or not node.initialized() or node.status() != Node.started: # returns True to ignore this node. return True if aux and not hasattr(node, "auxConsole"): # returns True to ignore this node. return True # TightVNC has lack support of IPv6 host at this moment if "vncviewer" in node.consoleCommand() and ":" in node.consoleHost(): QtWidgets.QMessageBox.warning(self, "TightVNC", "TightVNC (vncviewer) may not start because of lack of IPv6 support.") try: node.openConsole(aux=aux) except (OSError, ValueError) as e: QtWidgets.QMessageBox.critical(self, "Console", "Cannot start console application: {}".format(e)) return False return True
[ "def", "consoleToNode", "(", "self", ",", "node", ",", "aux", "=", "False", ")", ":", "if", "not", "hasattr", "(", "node", ",", "\"console\"", ")", "or", "not", "node", ".", "initialized", "(", ")", "or", "node", ".", "status", "(", ")", "!=", "Nod...
https://github.com/GNS3/gns3-gui/blob/da8adbaa18ab60e053af2a619efd468f4c8950f3/gns3/graphics_view.py#L1077-L1104
aiven/pghoard
1de0d2e33bf087b7ce3b6af556bbf941acfac3a4
pghoard/rohmu/dates.py
python
now
()
return datetime.datetime.now(datetime.timezone.utc)
[]
def now(): return datetime.datetime.now(datetime.timezone.utc)
[ "def", "now", "(", ")", ":", "return", "datetime", ".", "datetime", ".", "now", "(", "datetime", ".", "timezone", ".", "utc", ")" ]
https://github.com/aiven/pghoard/blob/1de0d2e33bf087b7ce3b6af556bbf941acfac3a4/pghoard/rohmu/dates.py#L44-L45
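The one-liner above returns a timezone-aware UTC timestamp; a standalone sketch of the same pattern:

```python
import datetime

def utc_now():
    # A timezone-aware "now"; unlike naive datetime.utcnow(), the result
    # carries tzinfo and compares safely against other aware datetimes.
    return datetime.datetime.now(datetime.timezone.utc)

t = utc_now()
```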
zhl2008/awd-platform
0416b31abea29743387b10b3914581fbe8e7da5e
web_hxb2/lib/python3.5/site-packages/pip/_vendor/colorama/ansi.py
python
clear_screen
(mode=2)
return CSI + str(mode) + 'J'
[]
def clear_screen(mode=2): return CSI + str(mode) + 'J'
[ "def", "clear_screen", "(", "mode", "=", "2", ")", ":", "return", "CSI", "+", "str", "(", "mode", ")", "+", "'J'" ]
https://github.com/zhl2008/awd-platform/blob/0416b31abea29743387b10b3914581fbe8e7da5e/web_hxb2/lib/python3.5/site-packages/pip/_vendor/colorama/ansi.py#L18-L19
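The `clear_screen` record above builds an ANSI "Erase in Display" sequence; `CSI` is defined elsewhere in that module as the standard Control Sequence Introducer, reproduced here to make the sketch self-contained:

```python
CSI = "\x1b["  # ANSI Control Sequence Introducer (ESC followed by '[')

def clear_screen(mode=2):
    # ED (Erase in Display): mode 0 clears from the cursor to the end of
    # the screen, mode 1 to the beginning, mode 2 the whole screen.
    return CSI + str(mode) + "J"
```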
materialsproject/pymatgen
8128f3062a334a2edd240e4062b5b9bdd1ae6f58
pymatgen/io/abinit/inputs.py
python
BasicAbinitInput.__init__
( self, structure, pseudos, pseudo_dir=None, comment=None, abi_args=None, abi_kwargs=None, )
Args: structure: Parameters defining the crystalline structure. Accepts |Structure| object file with structure (CIF, netcdf file, ...) or dictionary with ABINIT geo variables. pseudos: Pseudopotentials to be used for the calculation. Accepts: string or list of strings with the name of the pseudopotential files, list of |Pseudo| objects or |PseudoTable| object. pseudo_dir: Name of the directory where the pseudopotential files are located. ndtset: Number of datasets. comment: Optional string with a comment that will be placed at the beginning of the file. abi_args: list of tuples (key, value) with the initial set of variables. Default: Empty abi_kwargs: Dictionary with the initial set of variables. Default: Empty
Args: structure: Parameters defining the crystalline structure. Accepts |Structure| object file with structure (CIF, netcdf file, ...) or dictionary with ABINIT geo variables. pseudos: Pseudopotentials to be used for the calculation. Accepts: string or list of strings with the name of the pseudopotential files, list of |Pseudo| objects or |PseudoTable| object. pseudo_dir: Name of the directory where the pseudopotential files are located. ndtset: Number of datasets. comment: Optional string with a comment that will be placed at the beginning of the file. abi_args: list of tuples (key, value) with the initial set of variables. Default: Empty abi_kwargs: Dictionary with the initial set of variables. Default: Empty
[ "Args", ":", "structure", ":", "Parameters", "defining", "the", "crystalline", "structure", ".", "Accepts", "|Structure|", "object", "file", "with", "structure", "(", "CIF", "netcdf", "file", "...", ")", "or", "dictionary", "with", "ABINIT", "geo", "variables", ...
def __init__( self, structure, pseudos, pseudo_dir=None, comment=None, abi_args=None, abi_kwargs=None, ): """ Args: structure: Parameters defining the crystalline structure. Accepts |Structure| object file with structure (CIF, netcdf file, ...) or dictionary with ABINIT geo variables. pseudos: Pseudopotentials to be used for the calculation. Accepts: string or list of strings with the name of the pseudopotential files, list of |Pseudo| objects or |PseudoTable| object. pseudo_dir: Name of the directory where the pseudopotential files are located. ndtset: Number of datasets. comment: Optional string with a comment that will be placed at the beginning of the file. abi_args: list of tuples (key, value) with the initial set of variables. Default: Empty abi_kwargs: Dictionary with the initial set of variables. Default: Empty """ # Internal dict with variables. we use an ordered dict so that # variables will be likely grouped by `topics` when we fill the input. abi_args = [] if abi_args is None else abi_args for key, value in abi_args: self._check_varname(key) abi_kwargs = {} if abi_kwargs is None else abi_kwargs for key in abi_kwargs: self._check_varname(key) args = list(abi_args)[:] args.extend(list(abi_kwargs.items())) self._vars = OrderedDict(args) self.set_structure(structure) if pseudo_dir is not None: pseudo_dir = os.path.abspath(pseudo_dir) if not os.path.exists(pseudo_dir): raise self.Error("Directory %s does not exist" % pseudo_dir) pseudos = [os.path.join(pseudo_dir, p) for p in list_strings(pseudos)] try: self._pseudos = PseudoTable.as_table(pseudos).get_pseudos_for_structure(self.structure) except ValueError as exc: raise self.Error(str(exc)) if comment is not None: self.set_comment(comment)
[ "def", "__init__", "(", "self", ",", "structure", ",", "pseudos", ",", "pseudo_dir", "=", "None", ",", "comment", "=", "None", ",", "abi_args", "=", "None", ",", "abi_kwargs", "=", "None", ",", ")", ":", "# Internal dict with variables. we use an ordered dict so...
https://github.com/materialsproject/pymatgen/blob/8128f3062a334a2edd240e4062b5b9bdd1ae6f58/pymatgen/io/abinit/inputs.py#L720-L770
open-mmlab/mmclassification
5232965b17b6c050f9b328b3740c631ed4034624
mmcls/models/backbones/res2net.py
python
Bottle2neck.__init__
(self, in_channels, out_channels, scales=4, base_width=26, base_channels=64, stage_type='normal', **kwargs)
Bottle2neck block for Res2Net.
Bottle2neck block for Res2Net.
[ "Bottle2neck", "block", "for", "Res2Net", "." ]
def __init__(self, in_channels, out_channels, scales=4, base_width=26, base_channels=64, stage_type='normal', **kwargs): """Bottle2neck block for Res2Net.""" super(Bottle2neck, self).__init__(in_channels, out_channels, **kwargs) assert scales > 1, 'Res2Net degenerates to ResNet when scales = 1.' mid_channels = out_channels // self.expansion width = int(math.floor(mid_channels * (base_width / base_channels))) self.norm1_name, norm1 = build_norm_layer( self.norm_cfg, width * scales, postfix=1) self.norm3_name, norm3 = build_norm_layer( self.norm_cfg, self.out_channels, postfix=3) self.conv1 = build_conv_layer( self.conv_cfg, self.in_channels, width * scales, kernel_size=1, stride=self.conv1_stride, bias=False) self.add_module(self.norm1_name, norm1) if stage_type == 'stage': self.pool = nn.AvgPool2d( kernel_size=3, stride=self.conv2_stride, padding=1) self.convs = ModuleList() self.bns = ModuleList() for i in range(scales - 1): self.convs.append( build_conv_layer( self.conv_cfg, width, width, kernel_size=3, stride=self.conv2_stride, padding=self.dilation, dilation=self.dilation, bias=False)) self.bns.append( build_norm_layer(self.norm_cfg, width, postfix=i + 1)[1]) self.conv3 = build_conv_layer( self.conv_cfg, width * scales, self.out_channels, kernel_size=1, bias=False) self.add_module(self.norm3_name, norm3) self.stage_type = stage_type self.scales = scales self.width = width delattr(self, 'conv2') delattr(self, self.norm2_name)
[ "def", "__init__", "(", "self", ",", "in_channels", ",", "out_channels", ",", "scales", "=", "4", ",", "base_width", "=", "26", ",", "base_channels", "=", "64", ",", "stage_type", "=", "'normal'", ",", "*", "*", "kwargs", ")", ":", "super", "(", "Bottl...
https://github.com/open-mmlab/mmclassification/blob/5232965b17b6c050f9b328b3740c631ed4034624/mmcls/models/backbones/res2net.py#L18-L79
Infinidat/infi.clickhouse_orm
232a8d29ad237c833d3c851e1d61989ff55d7bd8
src/infi/clickhouse_orm/query.py
python
Q.is_empty
(self)
return not bool(self._conds or self._children)
Checks if there are any conditions in Q object Returns: Boolean
Checks if there are any conditions in Q object Returns: Boolean
[ "Checks", "if", "there", "are", "any", "conditions", "in", "Q", "object", "Returns", ":", "Boolean" ]
def is_empty(self): """ Checks if there are any conditions in Q object Returns: Boolean """ return not bool(self._conds or self._children)
[ "def", "is_empty", "(", "self", ")", ":", "return", "not", "bool", "(", "self", ".", "_conds", "or", "self", ".", "_children", ")" ]
https://github.com/Infinidat/infi.clickhouse_orm/blob/232a8d29ad237c833d3c851e1d61989ff55d7bd8/src/infi/clickhouse_orm/query.py#L203-L208
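A minimal stand-in for the `Q.is_empty` check above; the `_conds`/`_children` attribute names mirror the record, but this toy class omits the real constructor and operator logic:

```python
class Q:
    def __init__(self, *conds):
        self._conds = list(conds)    # direct conditions on this node
        self._children = []          # nested Q objects combined under it

    def is_empty(self):
        # Empty only when there are neither direct conditions nor children.
        return not bool(self._conds or self._children)
```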
sagemath/sage
f9b2db94f675ff16963ccdefba4f1a3393b3fe0d
src/sage/combinat/cluster_algebra_quiver/quiver.py
python
ClusterQuiver.reorient
(self, data)
Reorient ``self`` with respect to the given total order, or with respect to an iterator of edges in ``self`` to be reverted. .. WARNING:: This operation might change the mutation type of ``self``. INPUT: - ``data`` -- an iterator defining a total order on ``self.vertices()``, or an iterator of edges in ``self`` to be reoriented. EXAMPLES:: sage: Q = ClusterQuiver(['A',(2,3),1]) sage: Q.mutation_type() ['A', [2, 3], 1] sage: Q.reorient([(0,1),(1,2),(2,3),(3,4)]) sage: Q.mutation_type() ['D', 5] sage: Q.reorient([0,1,2,3,4]) sage: Q.mutation_type() ['A', [1, 4], 1] TESTS:: sage: Q = ClusterQuiver(['A',2]) sage: Q.reorient([]) Traceback (most recent call last): ... ValueError: empty input sage: Q.reorient([3,4]) Traceback (most recent call last): ... ValueError: not a total order on the vertices of the quiver or a list of edges to be oriented
Reorient ``self`` with respect to the given total order, or with respect to an iterator of edges in ``self`` to be reverted.
[ "Reorient", "self", "with", "respect", "to", "the", "given", "total", "order", "or", "with", "respect", "to", "an", "iterator", "of", "edges", "in", "self", "to", "be", "reverted", "." ]
def reorient(self, data): """ Reorient ``self`` with respect to the given total order, or with respect to an iterator of edges in ``self`` to be reverted. .. WARNING:: This operation might change the mutation type of ``self``. INPUT: - ``data`` -- an iterator defining a total order on ``self.vertices()``, or an iterator of edges in ``self`` to be reoriented. EXAMPLES:: sage: Q = ClusterQuiver(['A',(2,3),1]) sage: Q.mutation_type() ['A', [2, 3], 1] sage: Q.reorient([(0,1),(1,2),(2,3),(3,4)]) sage: Q.mutation_type() ['D', 5] sage: Q.reorient([0,1,2,3,4]) sage: Q.mutation_type() ['A', [1, 4], 1] TESTS:: sage: Q = ClusterQuiver(['A',2]) sage: Q.reorient([]) Traceback (most recent call last): ... ValueError: empty input sage: Q.reorient([3,4]) Traceback (most recent call last): ... ValueError: not a total order on the vertices of the quiver or a list of edges to be oriented """ if not data: raise ValueError('empty input') first = data[0] if set(data) == set(range(self._n + self._m)): dg_new = DiGraph() for edge in self._digraph.edges(): if data.index(edge[0]) < data.index(edge[1]): dg_new.add_edge(edge[0], edge[1], edge[2]) else: dg_new.add_edge(edge[1], edge[0], edge[2]) self._digraph = dg_new self._M = _edge_list_to_matrix(dg_new.edges(), self._nlist, self._mlist) self._M.set_immutable() self._mutation_type = None elif isinstance(first, (list, tuple)) and len(first) == 2: edges = self._digraph.edges(labels=False) for edge in data: if (edge[1], edge[0]) in edges: label = self._digraph.edge_label(edge[1], edge[0]) self._digraph.delete_edge(edge[1], edge[0]) self._digraph.add_edge(edge[0], edge[1], label) self._M = _edge_list_to_matrix(self._digraph.edges(), self._nlist, self._mlist) self._M.set_immutable() self._mutation_type = None else: raise ValueError('not a total order on the vertices of the quiver or a list of edges to be oriented')
[ "def", "reorient", "(", "self", ",", "data", ")", ":", "if", "not", "data", ":", "raise", "ValueError", "(", "'empty input'", ")", "first", "=", "data", "[", "0", "]", "if", "set", "(", "data", ")", "==", "set", "(", "range", "(", "self", ".", "_...
https://github.com/sagemath/sage/blob/f9b2db94f675ff16963ccdefba4f1a3393b3fe0d/src/sage/combinat/cluster_algebra_quiver/quiver.py#L1531-L1602
rytilahti/python-miio
b6e53dd16fac77915426e7592e2528b78ef65190
miio/integrations/vacuum/roborock/vacuum.py
python
RoborockVacuum.consumable_reset
(self, consumable: Consumable)
return self.send("reset_consumable", [consumable.value])
Reset consumable information.
Reset consumable information.
[ "Reset", "consumable", "information", "." ]
def consumable_reset(self, consumable: Consumable): """Reset consumable information.""" return self.send("reset_consumable", [consumable.value])
[ "def", "consumable_reset", "(", "self", ",", "consumable", ":", "Consumable", ")", ":", "return", "self", ".", "send", "(", "\"reset_consumable\"", ",", "[", "consumable", ".", "value", "]", ")" ]
https://github.com/rytilahti/python-miio/blob/b6e53dd16fac77915426e7592e2528b78ef65190/miio/integrations/vacuum/roborock/vacuum.py#L389-L391
ohmyadd/wetland
76d296ec66dc438606e2455a848619d446f4a4b7
paramiko/message.py
python
Message.get_so_far
(self)
return self.packet.read(position)
Returns the `str` bytes of this message that have been parsed and returned. The string passed into a message's constructor can be regenerated by concatenating ``get_so_far`` and `get_remainder`.
Returns the `str` bytes of this message that have been parsed and returned. The string passed into a message's constructor can be regenerated by concatenating ``get_so_far`` and `get_remainder`.
[ "Returns", "the", "str", "bytes", "of", "this", "message", "that", "have", "been", "parsed", "and", "returned", ".", "The", "string", "passed", "into", "a", "message", "s", "constructor", "can", "be", "regenerated", "by", "concatenating", "get_so_far", "and", ...
def get_so_far(self): """ Returns the `str` bytes of this message that have been parsed and returned. The string passed into a message's constructor can be regenerated by concatenating ``get_so_far`` and `get_remainder`. """ position = self.packet.tell() self.rewind() return self.packet.read(position)
[ "def", "get_so_far", "(", "self", ")", ":", "position", "=", "self", ".", "packet", ".", "tell", "(", ")", "self", ".", "rewind", "(", ")", "return", "self", ".", "packet", ".", "read", "(", "position", ")" ]
https://github.com/ohmyadd/wetland/blob/76d296ec66dc438606e2455a848619d446f4a4b7/paramiko/message.py#L91-L99
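The tell/rewind/read pattern in `get_so_far` above can be mimicked with `io.BytesIO`; this sketch assumes `rewind()` seeks back to the start of the buffer, and omits the rest of the paramiko `Message` API:

```python
import io

class Message:
    def __init__(self, content=b""):
        self.packet = io.BytesIO(content)

    def rewind(self):
        self.packet.seek(0)

    def get_so_far(self):
        # Remember how far parsing has advanced, jump back to the start,
        # and read exactly the bytes that were already consumed.
        position = self.packet.tell()
        self.rewind()
        return self.packet.read(position)

m = Message(b"abcdef")
m.packet.read(4)           # simulate having parsed four bytes
consumed = m.get_so_far()  # b"abcd"
```

A side effect worth noting: after `get_so_far()` the stream position is restored to where parsing left off, so the remainder is still readable.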
hiroharu-kato/neural_renderer
85aa98d418bb18877baec4e987e5743c21ab5e99
neural_renderer/renderer.py
python
Renderer.render_depth
(self, vertices, faces)
return images
[]
def render_depth(self, vertices, faces): # fill back if self.fill_back: faces = cf.concat((faces, faces[:, :, ::-1]), axis=1).data # viewpoint transformation if self.camera_mode == 'look_at': vertices = neural_renderer.look_at(vertices, self.eye) elif self.camera_mode == 'look': vertices = neural_renderer.look(vertices, self.eye, self.camera_direction) # perspective transformation if self.perspective: vertices = neural_renderer.perspective(vertices, angle=self.viewing_angle) # rasterization faces = neural_renderer.vertices_to_faces(vertices, faces) images = neural_renderer.rasterize_depth(faces, self.image_size, self.anti_aliasing) return images
[ "def", "render_depth", "(", "self", ",", "vertices", ",", "faces", ")", ":", "# fill back", "if", "self", ".", "fill_back", ":", "faces", "=", "cf", ".", "concat", "(", "(", "faces", ",", "faces", "[", ":", ",", ":", ",", ":", ":", "-", "1", "]",...
https://github.com/hiroharu-kato/neural_renderer/blob/85aa98d418bb18877baec4e987e5743c21ab5e99/neural_renderer/renderer.py#L55-L73
zhl2008/awd-platform
0416b31abea29743387b10b3914581fbe8e7da5e
web_hxb2/lib/python3.5/site-packages/treebeard/templatetags/admin_tree_list.py
python
_line
(context, node, request)
return output + format_html( '<a href="{}/" {}>{}</a>', mark_safe(node.pk), mark_safe(raw_id_fields), mark_safe(str(node)))
[]
def _line(context, node, request): if TO_FIELD_VAR in request.GET and request.GET[TO_FIELD_VAR] == 'id': raw_id_fields = format_html(""" onclick="opener.dismissRelatedLookupPopup(window, '{}'); return false;" """, mark_safe(node.pk)) else: raw_id_fields = '' output = '' if needs_checkboxes(context): output += format_html(CHECKBOX_TMPL, mark_safe(node.pk)) return output + format_html( '<a href="{}/" {}>{}</a>', mark_safe(node.pk), mark_safe(raw_id_fields), mark_safe(str(node)))
[ "def", "_line", "(", "context", ",", "node", ",", "request", ")", ":", "if", "TO_FIELD_VAR", "in", "request", ".", "GET", "and", "request", ".", "GET", "[", "TO_FIELD_VAR", "]", "==", "'id'", ":", "raw_id_fields", "=", "format_html", "(", "\"\"\"\n ...
https://github.com/zhl2008/awd-platform/blob/0416b31abea29743387b10b3914581fbe8e7da5e/web_hxb2/lib/python3.5/site-packages/treebeard/templatetags/admin_tree_list.py#L15-L27
biopython/biopython
2dd97e71762af7b046d7f7f8a4f1e38db6b06c86
Bio/HMM/Trainer.py
python
AbstractTrainer.log_likelihood
(self, probabilities)
return total_likelihood
Calculate the log likelihood of the training seqs. Arguments: - probabilities -- A list of the probabilities of each training sequence under the current parameters, calculated using the forward algorithm.
Calculate the log likelihood of the training seqs.
[ "Calculate", "the", "log", "likelihood", "of", "the", "training", "seqs", "." ]
def log_likelihood(self, probabilities): """Calculate the log likelihood of the training seqs. Arguments: - probabilities -- A list of the probabilities of each training sequence under the current parameters, calculated using the forward algorithm. """ total_likelihood = 0 for probability in probabilities: total_likelihood += math.log(probability) return total_likelihood
[ "def", "log_likelihood", "(", "self", ",", "probabilities", ")", ":", "total_likelihood", "=", "0", "for", "probability", "in", "probabilities", ":", "total_likelihood", "+=", "math", ".", "log", "(", "probability", ")", "return", "total_likelihood" ]
https://github.com/biopython/biopython/blob/2dd97e71762af7b046d7f7f8a4f1e38db6b06c86/Bio/HMM/Trainer.py#L56-L69
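The `log_likelihood` record above sums logs rather than multiplying probabilities; a self-contained version of the same computation:

```python
import math

def log_likelihood(probabilities):
    # Summing logs avoids the numeric underflow that multiplying many
    # small probabilities would cause; log(a*b) == log(a) + log(b).
    total = 0.0
    for p in probabilities:
        total += math.log(p)
    return total

ll = log_likelihood([0.5, 0.25, 0.1])
```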
bikalims/bika.lims
35e4bbdb5a3912cae0b5eb13e51097c8b0486349
bika/lims/browser/worksheet/views/referencesamples.py
python
ReferenceSamplesView.__call__
(self)
return super(ReferenceSamplesView, self).contents_table()
[]
def __call__(self): self.service_uids = self.request.get('service_uids', '').split(",") self.control_type = self.request.get('control_type', '') if not self.control_type: return t(_("No control type specified")) return super(ReferenceSamplesView, self).contents_table()
[ "def", "__call__", "(", "self", ")", ":", "self", ".", "service_uids", "=", "self", ".", "request", ".", "get", "(", "'service_uids'", ",", "''", ")", ".", "split", "(", "\",\"", ")", "self", ".", "control_type", "=", "self", ".", "request", ".", "ge...
https://github.com/bikalims/bika.lims/blob/35e4bbdb5a3912cae0b5eb13e51097c8b0486349/bika/lims/browser/worksheet/views/referencesamples.py#L54-L59
rwth-i6/returnn
f2d718a197a280b0d5f0fd91a7fcb8658560dddb
returnn/config.py
python
Config.is_true
(self, key, default=False)
return self.bool(key, default=default)
:param str key: :param bool default: :return: bool(value) if it is set or default :rtype: bool
:param str key: :param bool default: :return: bool(value) if it is set or default :rtype: bool
[ ":", "param", "str", "key", ":", ":", "param", "bool", "default", ":", ":", "return", ":", "bool", "(", "value", ")", "if", "it", "is", "set", "or", "default", ":", "rtype", ":", "bool" ]
def is_true(self, key, default=False): """ :param str key: :param bool default: :return: bool(value) if it is set or default :rtype: bool """ if self.is_typed(key): return bool(self.typed_dict[key]) return self.bool(key, default=default)
[ "def", "is_true", "(", "self", ",", "key", ",", "default", "=", "False", ")", ":", "if", "self", ".", "is_typed", "(", "key", ")", ":", "return", "bool", "(", "self", ".", "typed_dict", "[", "key", "]", ")", "return", "self", ".", "bool", "(", "k...
https://github.com/rwth-i6/returnn/blob/f2d718a197a280b0d5f0fd91a7fcb8658560dddb/returnn/config.py#L231-L240
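The `is_true` record above coerces a config value to `bool` with a fallback default. The sketch below keeps only that core behavior on a plain dict; it deliberately collapses the real class's typed/untyped distinction:

```python
def is_true(config, key, default=False):
    # Present keys are coerced with bool(); missing keys take the default.
    if key in config:
        return bool(config[key])
    return default

cfg = {"use_gpu": 1, "debug": 0, "name": ""}
```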
pyparallel/pyparallel
11e8c6072d48c8f13641925d17b147bf36ee0ba3
Lib/decimal.py
python
Decimal.next_toward
(self, other, context=None)
return ans
Returns the number closest to self, in the direction towards other. The result is the closest representable number to self (excluding self) that is in the direction towards other, unless both have the same value. If the two operands are numerically equal, then the result is a copy of self with the sign set to be the same as the sign of other.
Returns the number closest to self, in the direction towards other.
[ "Returns", "the", "number", "closest", "to", "self", "in", "the", "direction", "towards", "other", "." ]
def next_toward(self, other, context=None): """Returns the number closest to self, in the direction towards other. The result is the closest representable number to self (excluding self) that is in the direction towards other, unless both have the same value. If the two operands are numerically equal, then the result is a copy of self with the sign set to be the same as the sign of other. """ other = _convert_other(other, raiseit=True) if context is None: context = getcontext() ans = self._check_nans(other, context) if ans: return ans comparison = self._cmp(other) if comparison == 0: return self.copy_sign(other) if comparison == -1: ans = self.next_plus(context) else: # comparison == 1 ans = self.next_minus(context) # decide which flags to raise using value of ans if ans._isinfinity(): context._raise_error(Overflow, 'Infinite result from next_toward', ans._sign) context._raise_error(Inexact) context._raise_error(Rounded) elif ans.adjusted() < context.Emin: context._raise_error(Underflow) context._raise_error(Subnormal) context._raise_error(Inexact) context._raise_error(Rounded) # if precision == 1 then we don't raise Clamped for a # result 0E-Etiny. if not ans: context._raise_error(Clamped) return ans
[ "def", "next_toward", "(", "self", ",", "other", ",", "context", "=", "None", ")", ":", "other", "=", "_convert_other", "(", "other", ",", "raiseit", "=", "True", ")", "if", "context", "is", "None", ":", "context", "=", "getcontext", "(", ")", "ans", ...
https://github.com/pyparallel/pyparallel/blob/11e8c6072d48c8f13641925d17b147bf36ee0ba3/Lib/decimal.py#L3547-L3591
openstack/zaqar
1726ac41b5369cc30e99fd652f29f5300b95d958
zaqar/storage/utils.py
python
load_storage_impl
(uri, control_mode=False, default_store=None)
Loads a storage driver implementation and returns it. :param uri: The connection uri to parse and load a driver for. :param control_mode: (Default False). Determines which driver type to load; if False, the data driver is loaded. If True, the control driver is loaded. :param default_store: The default store to load if no scheme is parsed.
Loads a storage driver implementation and returns it.
[ "Loads", "a", "storage", "driver", "implementation", "and", "returns", "it", "." ]
def load_storage_impl(uri, control_mode=False, default_store=None): """Loads a storage driver implementation and returns it. :param uri: The connection uri to parse and load a driver for. :param control_mode: (Default False). Determines which driver type to load; if False, the data driver is loaded. If True, the control driver is loaded. :param default_store: The default store to load if no scheme is parsed. """ mode = 'control' if control_mode else 'data' driver_type = 'zaqar.{0}.storage'.format(mode) storage_type = urllib_parse.urlparse(uri).scheme or default_store try: mgr = driver.DriverManager(driver_type, storage_type, invoke_on_load=False) return mgr.driver except Exception as exc: LOG.exception('Error loading storage driver') raise errors.InvalidDriver(exc)
[ "def", "load_storage_impl", "(", "uri", ",", "control_mode", "=", "False", ",", "default_store", "=", "None", ")", ":", "mode", "=", "'control'", "if", "control_mode", "else", "'data'", "driver_type", "=", "'zaqar.{0}.storage'", ".", "format", "(", "mode", ")"...
https://github.com/openstack/zaqar/blob/1726ac41b5369cc30e99fd652f29f5300b95d958/zaqar/storage/utils.py#L76-L99
microsoft/ProphetNet
32f806bf16598c59990fc932da3dbb3746716f7c
ProphetNet_En/cnndm/eval/bs_pyrouge.py
python
Rouge155.__set_rouge_dir
(self, home_dir=None)
Verfify presence of ROUGE-1.5.5.pl and data folder, and set those paths.
Verfify presence of ROUGE-1.5.5.pl and data folder, and set those paths.
[ "Verfify", "presence", "of", "ROUGE", "-", "1", ".", "5", ".", "5", ".", "pl", "and", "data", "folder", "and", "set", "those", "paths", "." ]
def __set_rouge_dir(self, home_dir=None): """ Verfify presence of ROUGE-1.5.5.pl and data folder, and set those paths. """ if not home_dir: self._home_dir = self.__get_rouge_home_dir_from_settings() else: self._home_dir = home_dir self.save_home_dir() self._bin_path = os.path.join(self._home_dir, 'ROUGE-1.5.5.pl') self.data_dir = os.path.join(self._home_dir, 'data') if not os.path.exists(self._bin_path): raise Exception( "ROUGE binary not found at {}. Please set the " "correct path by running pyrouge_set_rouge_path " "/path/to/rouge/home.".format(self._bin_path))
[ "def", "__set_rouge_dir", "(", "self", ",", "home_dir", "=", "None", ")", ":", "if", "not", "home_dir", ":", "self", ".", "_home_dir", "=", "self", ".", "__get_rouge_home_dir_from_settings", "(", ")", "else", ":", "self", ".", "_home_dir", "=", "home_dir", ...
https://github.com/microsoft/ProphetNet/blob/32f806bf16598c59990fc932da3dbb3746716f7c/ProphetNet_En/cnndm/eval/bs_pyrouge.py#L432-L449
scrapinghub/splash
802d8391984bae049ef95a3fe1a74feaee95a233
splash/lua.py
python
parse_error_message
(error_text)
return { 'source': m.group(1), 'line_number': int(m.group(2)), 'error': m.group(3) }
r""" Split Lua error message into 'source', 'line_number' and 'error'. If error message can't be parsed, an empty dict is returned. This function is not reliable because error text is ambiguous. Parse runtime error messages:: >>> info = parse_error_message('[string "function main(splash)\r..."]:2: /app/splash.lua:81: ValueError(\'could not convert string to float: sdf\'') >>> print(info['line_number']) 2 >>> print(repr(info['source'])) '[string "function main(splash)\r..."]' >>> print(info['error']) /app/splash.lua:81: ValueError('could not convert string to float: sdf' >>> parse_error_message('dfsadf') {} Parse syntax errors:: >>> info = parse_error_message("error loading code: [string \"<python>\"]:1: syntax error near 'ction'") >>> info['line_number'] 1 >>> print(info['error']) syntax error near 'ction'
r""" Split Lua error message into 'source', 'line_number' and 'error'. If error message can't be parsed, an empty dict is returned.
[ "r", "Split", "Lua", "error", "message", "into", "source", "line_number", "and", "error", ".", "If", "error", "message", "can", "t", "be", "parsed", "an", "empty", "dict", "is", "returned", "." ]
def parse_error_message(error_text): r""" Split Lua error message into 'source', 'line_number' and 'error'. If error message can't be parsed, an empty dict is returned. This function is not reliable because error text is ambiguous. Parse runtime error messages:: >>> info = parse_error_message('[string "function main(splash)\r..."]:2: /app/splash.lua:81: ValueError(\'could not convert string to float: sdf\'') >>> print(info['line_number']) 2 >>> print(repr(info['source'])) '[string "function main(splash)\r..."]' >>> print(info['error']) /app/splash.lua:81: ValueError('could not convert string to float: sdf' >>> parse_error_message('dfsadf') {} Parse syntax errors:: >>> info = parse_error_message("error loading code: [string \"<python>\"]:1: syntax error near 'ction'") >>> info['line_number'] 1 >>> print(info['error']) syntax error near 'ction' """ error_text = to_unicode(error_text) m = _LUA_ERROR_RE.match(error_text) if not m: m = _SYNTAX_ERROR_RE.match(error_text) if not m: return {} return { 'source': m.group(1), 'line_number': int(m.group(2)), 'error': m.group(3) }
[ "def", "parse_error_message", "(", "error_text", ")", ":", "error_text", "=", "to_unicode", "(", "error_text", ")", "m", "=", "_LUA_ERROR_RE", ".", "match", "(", "error_text", ")", "if", "not", "m", ":", "m", "=", "_SYNTAX_ERROR_RE", ".", "match", "(", "er...
https://github.com/scrapinghub/splash/blob/802d8391984bae049ef95a3fe1a74feaee95a233/splash/lua.py#L272-L313
Eniac-Xie/faster-rcnn-resnet
aba743e8404b47fc9bcccba4920846d1068c7e3c
lib/rpn/generate_anchors.py
python
generate_anchors
(base_size=16, ratios=[0.5, 1, 2], scales=2**np.arange(3, 6))
return anchors
Generate anchor (reference) windows by enumerating aspect ratios X scales wrt a reference (0, 0, 15, 15) window.
Generate anchor (reference) windows by enumerating aspect ratios X scales wrt a reference (0, 0, 15, 15) window.
[ "Generate", "anchor", "(", "reference", ")", "windows", "by", "enumerating", "aspect", "ratios", "X", "scales", "wrt", "a", "reference", "(", "0", "0", "15", "15", ")", "window", "." ]
def generate_anchors(base_size=16, ratios=[0.5, 1, 2], scales=2**np.arange(3, 6)): """ Generate anchor (reference) windows by enumerating aspect ratios X scales wrt a reference (0, 0, 15, 15) window. """ base_anchor = np.array([1, 1, base_size, base_size]) - 1 ratio_anchors = _ratio_enum(base_anchor, ratios) anchors = np.vstack([_scale_enum(ratio_anchors[i, :], scales) for i in xrange(ratio_anchors.shape[0])]) return anchors
[ "def", "generate_anchors", "(", "base_size", "=", "16", ",", "ratios", "=", "[", "0.5", ",", "1", ",", "2", "]", ",", "scales", "=", "2", "**", "np", ".", "arange", "(", "3", ",", "6", ")", ")", ":", "base_anchor", "=", "np", ".", "array", "(",...
https://github.com/Eniac-Xie/faster-rcnn-resnet/blob/aba743e8404b47fc9bcccba4920846d1068c7e3c/lib/rpn/generate_anchors.py#L37-L48
ihaveamac/ninfs
dee583ba3205c4e050e0b193a85a4177eb334159
ninfs/mount/romfs.py
python
RomFSMount.__del__
(self, *args)
[]
def __del__(self, *args): try: self.reader.close() except AttributeError: pass
[ "def", "__del__", "(", "self", ",", "*", "args", ")", ":", "try", ":", "self", ".", "reader", ".", "close", "(", ")", "except", "AttributeError", ":", "pass" ]
https://github.com/ihaveamac/ninfs/blob/dee583ba3205c4e050e0b193a85a4177eb334159/ninfs/mount/romfs.py#L33-L37
privacyidea/privacyidea
9490c12ddbf77a34ac935b082d09eb583dfafa2c
privacyidea/lib/auditmodules/containeraudit.py
python
Audit.add_policy
(self, policyname)
Call the add_policy method for all writeable modules
Call the add_policy method for all writeable modules
[ "Call", "the", "add_policy", "method", "for", "all", "writeable", "modules" ]
def add_policy(self, policyname): """ Call the add_policy method for all writeable modules """ for module in self.write_modules: module.add_policy(policyname)
[ "def", "add_policy", "(", "self", ",", "policyname", ")", ":", "for", "module", "in", "self", ".", "write_modules", ":", "module", ".", "add_policy", "(", "policyname", ")" ]
https://github.com/privacyidea/privacyidea/blob/9490c12ddbf77a34ac935b082d09eb583dfafa2c/privacyidea/lib/auditmodules/containeraudit.py#L79-L84
PyCQA/flake8
bff62cdf860a70d05ce84e2b5ee494e7f3fd31af
src/flake8/processor.py
python
FileProcessor.build_logical_line_tokens
(self)
return comments, logical, mapping
Build the mapping, comments, and logical line lists.
Build the mapping, comments, and logical line lists.
[ "Build", "the", "mapping", "comments", "and", "logical", "line", "lists", "." ]
def build_logical_line_tokens(self) -> _Logical: """Build the mapping, comments, and logical line lists.""" logical = [] comments = [] mapping: _LogicalMapping = [] length = 0 previous_row = previous_column = None for token_type, text, start, end, line in self.tokens: if token_type in SKIP_TOKENS: continue if not mapping: mapping = [(0, start)] if token_type == tokenize.COMMENT: comments.append(text) continue if token_type == tokenize.STRING: text = mutate_string(text) if previous_row: (start_row, start_column) = start if previous_row != start_row: row_index = previous_row - 1 column_index = previous_column - 1 previous_text = self.lines[row_index][column_index] if previous_text == "," or ( previous_text not in "{[(" and text not in "}])" ): text = f" {text}" elif previous_column != start_column: text = line[previous_column:start_column] + text logical.append(text) length += len(text) mapping.append((length, end)) (previous_row, previous_column) = end return comments, logical, mapping
[ "def", "build_logical_line_tokens", "(", "self", ")", "->", "_Logical", ":", "logical", "=", "[", "]", "comments", "=", "[", "]", "mapping", ":", "_LogicalMapping", "=", "[", "]", "length", "=", "0", "previous_row", "=", "previous_column", "=", "None", "fo...
https://github.com/PyCQA/flake8/blob/bff62cdf860a70d05ce84e2b5ee494e7f3fd31af/src/flake8/processor.py#L182-L215
xcmyz/FastSpeech
d8bd89790100542fda836b8f3f9342b64ad67e39
audio/stft.py
python
STFT.__init__
(self, filter_length=800, hop_length=200, win_length=800, window='hann')
[]
def __init__(self, filter_length=800, hop_length=200, win_length=800, window='hann'): super(STFT, self).__init__() self.filter_length = filter_length self.hop_length = hop_length self.win_length = win_length self.window = window self.forward_transform = None scale = self.filter_length / self.hop_length fourier_basis = np.fft.fft(np.eye(self.filter_length)) cutoff = int((self.filter_length / 2 + 1)) fourier_basis = np.vstack([np.real(fourier_basis[:cutoff, :]), np.imag(fourier_basis[:cutoff, :])]) forward_basis = torch.FloatTensor(fourier_basis[:, None, :]) inverse_basis = torch.FloatTensor( np.linalg.pinv(scale * fourier_basis).T[:, None, :]) if window is not None: assert(filter_length >= win_length) # get window and zero center pad it to filter_length fft_window = get_window(window, win_length, fftbins=True) fft_window = pad_center(fft_window, filter_length) fft_window = torch.from_numpy(fft_window).float() # window the bases forward_basis *= fft_window inverse_basis *= fft_window self.register_buffer('forward_basis', forward_basis.float()) self.register_buffer('inverse_basis', inverse_basis.float())
[ "def", "__init__", "(", "self", ",", "filter_length", "=", "800", ",", "hop_length", "=", "200", ",", "win_length", "=", "800", ",", "window", "=", "'hann'", ")", ":", "super", "(", "STFT", ",", "self", ")", ".", "__init__", "(", ")", "self", ".", ...
https://github.com/xcmyz/FastSpeech/blob/d8bd89790100542fda836b8f3f9342b64ad67e39/audio/stft.py#L20-L51
openhatch/oh-mainline
ce29352a034e1223141dcc2f317030bbc3359a51
vendor/packages/pytz/pytz/tzinfo.py
python
DstTzInfo.tzname
(self, dt, is_dst=None)
See datetime.tzinfo.tzname The is_dst parameter may be used to remove ambiguity during DST transitions. >>> from pytz import timezone >>> tz = timezone('America/St_Johns') >>> normal = datetime(2009, 9, 1) >>> tz.tzname(normal) 'NDT' >>> tz.tzname(normal, is_dst=False) 'NDT' >>> tz.tzname(normal, is_dst=True) 'NDT' >>> ambiguous = datetime(2009, 10, 31, 23, 30) >>> tz.tzname(ambiguous, is_dst=False) 'NST' >>> tz.tzname(ambiguous, is_dst=True) 'NDT' >>> try: ... tz.tzname(ambiguous) ... except AmbiguousTimeError: ... print('Ambiguous') Ambiguous
See datetime.tzinfo.tzname
[ "See", "datetime", ".", "tzinfo", ".", "tzname" ]
def tzname(self, dt, is_dst=None): '''See datetime.tzinfo.tzname The is_dst parameter may be used to remove ambiguity during DST transitions. >>> from pytz import timezone >>> tz = timezone('America/St_Johns') >>> normal = datetime(2009, 9, 1) >>> tz.tzname(normal) 'NDT' >>> tz.tzname(normal, is_dst=False) 'NDT' >>> tz.tzname(normal, is_dst=True) 'NDT' >>> ambiguous = datetime(2009, 10, 31, 23, 30) >>> tz.tzname(ambiguous, is_dst=False) 'NST' >>> tz.tzname(ambiguous, is_dst=True) 'NDT' >>> try: ... tz.tzname(ambiguous) ... except AmbiguousTimeError: ... print('Ambiguous') Ambiguous ''' if dt is None: return self.zone elif dt.tzinfo is not self: dt = self.localize(dt, is_dst) return dt.tzinfo._tzname else: return self._tzname
[ "def", "tzname", "(", "self", ",", "dt", ",", "is_dst", "=", "None", ")", ":", "if", "dt", "is", "None", ":", "return", "self", ".", "zone", "elif", "dt", ".", "tzinfo", "is", "not", "self", ":", "dt", "=", "self", ".", "localize", "(", "dt", "...
https://github.com/openhatch/oh-mainline/blob/ce29352a034e1223141dcc2f317030bbc3359a51/vendor/packages/pytz/pytz/tzinfo.py#L449-L485
pret/pokemon-reverse-engineering-tools
5e0715f2579adcfeb683448c9a7826cfd3afa57d
pokemontools/configuration.py
python
Config.__init__
(self, **kwargs)
Store all parameters.
Store all parameters.
[ "Store", "all", "parameters", "." ]
def __init__(self, **kwargs): """ Store all parameters. """ self._config = {} for (key, value) in kwargs.items(): if key not in self.__dict__: self._config[key] = value else: raise ConfigException( "Can't store \"{0}\" in configuration because the key conflicts with an existing property." .format(key) ) if "path" not in self._config: self._config["path"] = os.getcwd() # vba save states go into ./save-states/ if "save_state_path" not in self._config: self._config["save_state_path"] = os.path.join(self._config["path"], "save-states/") # assume rom is at ./baserom.gbc if "rom" not in self._config: self._config["rom_path"] = os.path.join(self._config["path"], "baserom.gbc")
[ "def", "__init__", "(", "self", ",", "*", "*", "kwargs", ")", ":", "self", ".", "_config", "=", "{", "}", "for", "(", "key", ",", "value", ")", "in", "kwargs", ".", "items", "(", ")", ":", "if", "key", "not", "in", "self", ".", "__dict__", ":",...
https://github.com/pret/pokemon-reverse-engineering-tools/blob/5e0715f2579adcfeb683448c9a7826cfd3afa57d/pokemontools/configuration.py#L19-L43
tomplus/kubernetes_asyncio
f028cc793e3a2c519be6a52a49fb77ff0b014c9b
kubernetes_asyncio/client/models/v2beta1_resource_metric_status.py
python
V2beta1ResourceMetricStatus.to_str
(self)
return pprint.pformat(self.to_dict())
Returns the string representation of the model
Returns the string representation of the model
[ "Returns", "the", "string", "representation", "of", "the", "model" ]
def to_str(self): """Returns the string representation of the model""" return pprint.pformat(self.to_dict())
[ "def", "to_str", "(", "self", ")", ":", "return", "pprint", ".", "pformat", "(", "self", ".", "to_dict", "(", ")", ")" ]
https://github.com/tomplus/kubernetes_asyncio/blob/f028cc793e3a2c519be6a52a49fb77ff0b014c9b/kubernetes_asyncio/client/models/v2beta1_resource_metric_status.py#L160-L162
omz/PythonistaAppTemplate
f560f93f8876d82a21d108977f90583df08d55af
PythonistaAppTemplate/PythonistaKit.framework/pylib_ext/mpmath/calculus/extrapolation.py
python
sumem
(ctx, f, interval, tol=None, reject=10, integral=None, adiffs=None, bdiffs=None, verbose=False, error=False, _fast_abort=False)
r""" Uses the Euler-Maclaurin formula to compute an approximation accurate to within ``tol`` (which defaults to the present epsilon) of the sum .. math :: S = \sum_{k=a}^b f(k) where `(a,b)` are given by ``interval`` and `a` or `b` may be infinite. The approximation is .. math :: S \sim \int_a^b f(x) \,dx + \frac{f(a)+f(b)}{2} + \sum_{k=1}^{\infty} \frac{B_{2k}}{(2k)!} \left(f^{(2k-1)}(b)-f^{(2k-1)}(a)\right). The last sum in the Euler-Maclaurin formula is not generally convergent (a notable exception is if `f` is a polynomial, in which case Euler-Maclaurin actually gives an exact result). The summation is stopped as soon as the quotient between two consecutive terms falls below *reject*. That is, by default (*reject* = 10), the summation is continued as long as each term adds at least one decimal. Although not convergent, convergence to a given tolerance can often be "forced" if `b = \infty` by summing up to `a+N` and then applying the Euler-Maclaurin formula to the sum over the range `(a+N+1, \ldots, \infty)`. This procedure is implemented by :func:`~mpmath.nsum`. By default numerical quadrature and differentiation is used. If the symbolic values of the integral and endpoint derivatives are known, it is more efficient to pass the value of the integral explicitly as ``integral`` and the derivatives explicitly as ``adiffs`` and ``bdiffs``. The derivatives should be given as iterables that yield `f(a), f'(a), f''(a), \ldots` (and the equivalent for `b`). **Examples** Summation of an infinite series, with automatic and symbolic integral and derivative values (the second should be much faster):: >>> from mpmath import * >>> mp.dps = 50; mp.pretty = True >>> sumem(lambda n: 1/n**2, [32, inf]) 0.03174336652030209012658168043874142714132886413417 >>> I = mpf(1)/32 >>> D = adiffs=((-1)**n*fac(n+1)*32**(-2-n) for n in range(999)) >>> sumem(lambda n: 1/n**2, [32, inf], integral=I, adiffs=D) 0.03174336652030209012658168043874142714132886413417 An exact evaluation of a finite polynomial sum:: >>> sumem(lambda n: n**5-12*n**2+3*n, [-100000, 200000]) 10500155000624963999742499550000.0 >>> print(sum(n**5-12*n**2+3*n for n in range(-100000, 200001))) 10500155000624963999742499550000
r""" Uses the Euler-Maclaurin formula to compute an approximation accurate to within ``tol`` (which defaults to the present epsilon) of the sum
[ "r", "Uses", "the", "Euler", "-", "Maclaurin", "formula", "to", "compute", "an", "approximation", "accurate", "to", "within", "tol", "(", "which", "defaults", "to", "the", "present", "epsilon", ")", "of", "the", "sum" ]
def sumem(ctx, f, interval, tol=None, reject=10, integral=None, adiffs=None, bdiffs=None, verbose=False, error=False, _fast_abort=False): r""" Uses the Euler-Maclaurin formula to compute an approximation accurate to within ``tol`` (which defaults to the present epsilon) of the sum .. math :: S = \sum_{k=a}^b f(k) where `(a,b)` are given by ``interval`` and `a` or `b` may be infinite. The approximation is .. math :: S \sim \int_a^b f(x) \,dx + \frac{f(a)+f(b)}{2} + \sum_{k=1}^{\infty} \frac{B_{2k}}{(2k)!} \left(f^{(2k-1)}(b)-f^{(2k-1)}(a)\right). The last sum in the Euler-Maclaurin formula is not generally convergent (a notable exception is if `f` is a polynomial, in which case Euler-Maclaurin actually gives an exact result). The summation is stopped as soon as the quotient between two consecutive terms falls below *reject*. That is, by default (*reject* = 10), the summation is continued as long as each term adds at least one decimal. Although not convergent, convergence to a given tolerance can often be "forced" if `b = \infty` by summing up to `a+N` and then applying the Euler-Maclaurin formula to the sum over the range `(a+N+1, \ldots, \infty)`. This procedure is implemented by :func:`~mpmath.nsum`. By default numerical quadrature and differentiation is used. If the symbolic values of the integral and endpoint derivatives are known, it is more efficient to pass the value of the integral explicitly as ``integral`` and the derivatives explicitly as ``adiffs`` and ``bdiffs``. The derivatives should be given as iterables that yield `f(a), f'(a), f''(a), \ldots` (and the equivalent for `b`). **Examples** Summation of an infinite series, with automatic and symbolic integral and derivative values (the second should be much faster):: >>> from mpmath import * >>> mp.dps = 50; mp.pretty = True >>> sumem(lambda n: 1/n**2, [32, inf]) 0.03174336652030209012658168043874142714132886413417 >>> I = mpf(1)/32 >>> D = adiffs=((-1)**n*fac(n+1)*32**(-2-n) for n in range(999)) >>> sumem(lambda n: 1/n**2, [32, inf], integral=I, adiffs=D) 0.03174336652030209012658168043874142714132886413417 An exact evaluation of a finite polynomial sum:: >>> sumem(lambda n: n**5-12*n**2+3*n, [-100000, 200000]) 10500155000624963999742499550000.0 >>> print(sum(n**5-12*n**2+3*n for n in range(-100000, 200001))) 10500155000624963999742499550000 """ tol = tol or +ctx.eps interval = ctx._as_points(interval) a = ctx.convert(interval[0]) b = ctx.convert(interval[-1]) err = ctx.zero prev = 0 M = 10000 if a == ctx.ninf: adiffs = (0 for n in xrange(M)) else: adiffs = adiffs or ctx.diffs(f, a) if b == ctx.inf: bdiffs = (0 for n in xrange(M)) else: bdiffs = bdiffs or ctx.diffs(f, b) orig = ctx.prec #verbose = 1 try: ctx.prec += 10 s = ctx.zero for k, (da, db) in enumerate(izip(adiffs, bdiffs)): if k & 1: term = (db-da) * ctx.bernoulli(k+1) / ctx.factorial(k+1) mag = abs(term) if verbose: print("term", k, "magnitude =", ctx.nstr(mag)) if k > 4 and mag < tol: s += term break elif k > 4 and abs(prev) / mag < reject: err += mag if _fast_abort: return [s, (s, err)][error] if verbose: print("Failed to converge") break else: s += term prev = term # Endpoint correction if a != ctx.ninf: s += f(a)/2 if b != ctx.inf: s += f(b)/2 # Tail integral if verbose: print("Integrating f(x) from x = %s to %s" % (ctx.nstr(a), ctx.nstr(b))) if integral: s += integral else: integral, ierr = ctx.quad(f, interval, error=True) if verbose: print("Integration error:", ierr) s += integral err += ierr finally: ctx.prec = orig if error: return s, err else: return s
[ "def", "sumem", "(", "ctx", ",", "f", ",", "interval", ",", "tol", "=", "None", ",", "reject", "=", "10", ",", "integral", "=", "None", ",", "adiffs", "=", "None", ",", "bdiffs", "=", "None", ",", "verbose", "=", "False", ",", "error", "=", "Fals...
https://github.com/omz/PythonistaAppTemplate/blob/f560f93f8876d82a21d108977f90583df08d55af/PythonistaAppTemplate/PythonistaKit.framework/pylib_ext/mpmath/calculus/extrapolation.py#L967-L1086
lmjohns3/theanets
79db9f878ef2071f2f576a1cf5d43a752a55894a
theanets/losses.py
python
Loss.__call__
(self, outputs)
Construct the computation graph for this loss function. Parameters ---------- outputs : dict of Theano expressions A dictionary mapping network output names to Theano expressions representing the outputs of a computation graph. Returns ------- loss : Theano expression The values of the loss given the network output.
Construct the computation graph for this loss function.
[ "Construct", "the", "computation", "graph", "for", "this", "loss", "function", "." ]
def __call__(self, outputs): '''Construct the computation graph for this loss function. Parameters ---------- outputs : dict of Theano expressions A dictionary mapping network output names to Theano expressions representing the outputs of a computation graph. Returns ------- loss : Theano expression The values of the loss given the network output. ''' raise NotImplementedError
[ "def", "__call__", "(", "self", ",", "outputs", ")", ":", "raise", "NotImplementedError" ]
https://github.com/lmjohns3/theanets/blob/79db9f878ef2071f2f576a1cf5d43a752a55894a/theanets/losses.py#L66-L80
caiiiac/Machine-Learning-with-Python
1a26c4467da41ca4ebc3d5bd789ea942ef79422f
MachineLearning/venv/lib/python3.5/site-packages/pip/vcs/__init__.py
python
VersionControl.controls_location
(cls, location)
return os.path.exists(path)
Check if a location is controlled by the vcs. It is meant to be overridden to implement smarter detection mechanisms for specific vcs.
Check if a location is controlled by the vcs. It is meant to be overridden to implement smarter detection mechanisms for specific vcs.
[ "Check", "if", "a", "location", "is", "controlled", "by", "the", "vcs", ".", "It", "is", "meant", "to", "be", "overridden", "to", "implement", "smarter", "detection", "mechanisms", "for", "specific", "vcs", "." ]
def controls_location(cls, location): """ Check if a location is controlled by the vcs. It is meant to be overridden to implement smarter detection mechanisms for specific vcs. """ logger.debug('Checking in %s for %s (%s)...', location, cls.dirname, cls.name) path = os.path.join(location, cls.dirname) return os.path.exists(path)
[ "def", "controls_location", "(", "cls", ",", "location", ")", ":", "logger", ".", "debug", "(", "'Checking in %s for %s (%s)...'", ",", "location", ",", "cls", ".", "dirname", ",", "cls", ".", "name", ")", "path", "=", "os", ".", "path", ".", "join", "("...
https://github.com/caiiiac/Machine-Learning-with-Python/blob/1a26c4467da41ca4ebc3d5bd789ea942ef79422f/MachineLearning/venv/lib/python3.5/site-packages/pip/vcs/__init__.py#L335-L344
jython/frozen-mirror
b8d7aa4cee50c0c0fe2f4b235dd62922dd0f3f99
lib-python/2.7/idlelib/TreeWidget.py
python
TreeItem.IsEditable
(self)
Return whether the item's text may be edited.
Return whether the item's text may be edited.
[ "Return", "whether", "the", "item", "s", "text", "may", "be", "edited", "." ]
def IsEditable(self): """Return whether the item's text may be edited."""
[ "def", "IsEditable", "(", "self", ")", ":" ]
https://github.com/jython/frozen-mirror/blob/b8d7aa4cee50c0c0fe2f4b235dd62922dd0f3f99/lib-python/2.7/idlelib/TreeWidget.py#L343-L344
dropbox/dropbox-sdk-python
015437429be224732990041164a21a0501235db1
dropbox/team_log.py
python
EventType.is_governance_policy_add_folder_failed
(self)
return self._tag == 'governance_policy_add_folder_failed'
Check if the union tag is ``governance_policy_add_folder_failed``. :rtype: bool
Check if the union tag is ``governance_policy_add_folder_failed``.
[ "Check", "if", "the", "union", "tag", "is", "governance_policy_add_folder_failed", "." ]
def is_governance_policy_add_folder_failed(self): """ Check if the union tag is ``governance_policy_add_folder_failed``. :rtype: bool """ return self._tag == 'governance_policy_add_folder_failed'
[ "def", "is_governance_policy_add_folder_failed", "(", "self", ")", ":", "return", "self", ".", "_tag", "==", "'governance_policy_add_folder_failed'" ]
https://github.com/dropbox/dropbox-sdk-python/blob/015437429be224732990041164a21a0501235db1/dropbox/team_log.py#L28553-L28559
tdryer/hangups
f0b37be1d46e71b30337227d7c46192cc726dcb4
hangups/conversation_event.py
python
ChatMessageEvent.attachments
(self)
return attachments
List of attachments in the message (:class:`list`).
List of attachments in the message (:class:`list`).
[ "List", "of", "attachments", "in", "the", "message", "(", ":", "class", ":", "list", ")", "." ]
def attachments(self): """List of attachments in the message (:class:`list`).""" raw_attachments = self._event.chat_message.message_content.attachment if raw_attachments is None: raw_attachments = [] attachments = [] for attachment in raw_attachments: for embed_item_type in attachment.embed_item.type: known_types = [ hangouts_pb2.ITEM_TYPE_PLUS_PHOTO, hangouts_pb2.ITEM_TYPE_PLACE_V2, hangouts_pb2.ITEM_TYPE_PLACE, hangouts_pb2.ITEM_TYPE_THING, ] if embed_item_type not in known_types: logger.warning('Received chat message attachment with ' 'unknown embed type: %r', embed_item_type) if attachment.embed_item.HasField('plus_photo'): attachments.append( attachment.embed_item.plus_photo.thumbnail.image_url ) return attachments
[ "def", "attachments", "(", "self", ")", ":", "raw_attachments", "=", "self", ".", "_event", ".", "chat_message", ".", "message_content", ".", "attachment", "if", "raw_attachments", "is", "None", ":", "raw_attachments", "=", "[", "]", "attachments", "=", "[", ...
https://github.com/tdryer/hangups/blob/f0b37be1d46e71b30337227d7c46192cc726dcb4/hangups/conversation_event.py#L174-L196
kubernetes-client/python
47b9da9de2d02b2b7a34fbe05afb44afd130d73a
kubernetes/client/models/v1beta1_pod_security_policy_spec.py
python
V1beta1PodSecurityPolicySpec.se_linux
(self)
return self._se_linux
Gets the se_linux of this V1beta1PodSecurityPolicySpec. # noqa: E501 :return: The se_linux of this V1beta1PodSecurityPolicySpec. # noqa: E501 :rtype: V1beta1SELinuxStrategyOptions
Gets the se_linux of this V1beta1PodSecurityPolicySpec. # noqa: E501
[ "Gets", "the", "se_linux", "of", "this", "V1beta1PodSecurityPolicySpec", ".", "#", "noqa", ":", "E501" ]
def se_linux(self): """Gets the se_linux of this V1beta1PodSecurityPolicySpec. # noqa: E501 :return: The se_linux of this V1beta1PodSecurityPolicySpec. # noqa: E501 :rtype: V1beta1SELinuxStrategyOptions """ return self._se_linux
[ "def", "se_linux", "(", "self", ")", ":", "return", "self", ".", "_se_linux" ]
https://github.com/kubernetes-client/python/blob/47b9da9de2d02b2b7a34fbe05afb44afd130d73a/kubernetes/client/models/v1beta1_pod_security_policy_spec.py#L646-L653
jgagneastro/coffeegrindsize
22661ebd21831dba4cf32bfc6ba59fe3d49f879c
App/dist/coffeegrindsize.app/Contents/Resources/lib/python3.7/scipy/signal/ltisys.py
python
lti.impulse
(self, X0=None, T=None, N=None)
return impulse(self, X0=X0, T=T, N=N)
Return the impulse response of a continuous-time system. See `impulse` for details.
Return the impulse response of a continuous-time system. See `impulse` for details.
[ "Return", "the", "impulse", "response", "of", "a", "continuous", "-", "time", "system", ".", "See", "impulse", "for", "details", "." ]
def impulse(self, X0=None, T=None, N=None): """ Return the impulse response of a continuous-time system. See `impulse` for details. """ return impulse(self, X0=X0, T=T, N=N)
[ "def", "impulse", "(", "self", ",", "X0", "=", "None", ",", "T", "=", "None", ",", "N", "=", "None", ")", ":", "return", "impulse", "(", "self", ",", "X0", "=", "X0", ",", "T", "=", "T", ",", "N", "=", "N", ")" ]
https://github.com/jgagneastro/coffeegrindsize/blob/22661ebd21831dba4cf32bfc6ba59fe3d49f879c/App/dist/coffeegrindsize.app/Contents/Resources/lib/python3.7/scipy/signal/ltisys.py#L231-L236
elastic/elasticsearch-py
6ef1adfa3c840a84afda7369cd8e43ae7dc45cdb
elasticsearch/_async/client/rollup.py
python
RollupClient.stop_job
( self, *, id: Any, error_trace: Optional[bool] = None, filter_path: Optional[Union[List[str], str]] = None, human: Optional[bool] = None, pretty: Optional[bool] = None, timeout: Optional[Any] = None, wait_for_completion: Optional[bool] = None, )
return await self._perform_request("POST", __target, headers=__headers)
Stops an existing, started rollup job. `<https://www.elastic.co/guide/en/elasticsearch/reference/master/rollup-stop-job.html>`_ :param id: The ID of the job to stop :param timeout: Block for (at maximum) the specified duration while waiting for the job to stop. Defaults to 30s. :param wait_for_completion: True if the API should block until the job has fully stopped, false if should be executed async. Defaults to false.
Stops an existing, started rollup job.
[ "Stops", "an", "existing", "started", "rollup", "job", "." ]
async def stop_job( self, *, id: Any, error_trace: Optional[bool] = None, filter_path: Optional[Union[List[str], str]] = None, human: Optional[bool] = None, pretty: Optional[bool] = None, timeout: Optional[Any] = None, wait_for_completion: Optional[bool] = None, ) -> ObjectApiResponse[Any]: """ Stops an existing, started rollup job. `<https://www.elastic.co/guide/en/elasticsearch/reference/master/rollup-stop-job.html>`_ :param id: The ID of the job to stop :param timeout: Block for (at maximum) the specified duration while waiting for the job to stop. Defaults to 30s. :param wait_for_completion: True if the API should block until the job has fully stopped, false if should be executed async. Defaults to false. """ if id in SKIP_IN_PATH: raise ValueError("Empty value passed for parameter 'id'") __path = f"/_rollup/job/{_quote(id)}/_stop" __query: Dict[str, Any] = {} if error_trace is not None: __query["error_trace"] = error_trace if filter_path is not None: __query["filter_path"] = filter_path if human is not None: __query["human"] = human if pretty is not None: __query["pretty"] = pretty if timeout is not None: __query["timeout"] = timeout if wait_for_completion is not None: __query["wait_for_completion"] = wait_for_completion if __query: __target = f"{__path}?{_quote_query(__query)}" else: __target = __path __headers = {"accept": "application/json"} return await self._perform_request("POST", __target, headers=__headers)
[ "async", "def", "stop_job", "(", "self", ",", "*", ",", "id", ":", "Any", ",", "error_trace", ":", "Optional", "[", "bool", "]", "=", "None", ",", "filter_path", ":", "Optional", "[", "Union", "[", "List", "[", "str", "]", ",", "str", "]", "]", "...
https://github.com/elastic/elasticsearch-py/blob/6ef1adfa3c840a84afda7369cd8e43ae7dc45cdb/elasticsearch/_async/client/rollup.py#L440-L483
JaniceWuo/MovieRecommend
4c86db64ca45598917d304f535413df3bc9fea65
movierecommend/venv1/Lib/site-packages/pip-9.0.1-py3.6.egg/pip/_vendor/colorama/ansitowin32.py
python
AnsiToWin32.convert_ansi
(self, paramstring, command)
[]
def convert_ansi(self, paramstring, command): if self.convert: params = self.extract_params(command, paramstring) self.call_win32(command, params)
[ "def", "convert_ansi", "(", "self", ",", "paramstring", ",", "command", ")", ":", "if", "self", ".", "convert", ":", "params", "=", "self", ".", "extract_params", "(", "command", ",", "paramstring", ")", "self", ".", "call_win32", "(", "command", ",", "p...
https://github.com/JaniceWuo/MovieRecommend/blob/4c86db64ca45598917d304f535413df3bc9fea65/movierecommend/venv1/Lib/site-packages/pip-9.0.1-py3.6.egg/pip/_vendor/colorama/ansitowin32.py#L178-L181
youyuge34/PI-REC
9f8f1adff169fda301f3134ac730991484a6b128
tool_draw.py
python
model_process
(color_domain, edge)
return result
Key function to reconstruct image from edge and color domain. :param color_domain: channel=3 :param edge: channel=1 :return: reconstruction
Key function to reconstruct image from edge and color domain. :param color_domain: channel=3 :param edge: channel=1 :return: reconstruction
[ "Key", "function", "to", "reconstruct", "image", "from", "edge", "and", "color", "domain", ".", ":", "param", "color_domain", ":", "channel", "=", "3", ":", "param", "edge", ":", "channel", "=", "1", ":", "return", ":", "reconstruction" ]
def model_process(color_domain, edge): """ Key function to reconstruct image from edge and color domain. :param color_domain: channel=3 :param edge: channel=1 :return: reconstruction """ # print(color_domain.shape, edge.shape) size_origin = color_domain.shape[:2] img = cv.cvtColor(color_domain, cv.COLOR_BGR2RGB) result = model_G.draw(img, edge) result = cv.resize(result, size_origin) result = cv.cvtColor(result, cv.COLOR_RGB2BGR) return result
[ "def", "model_process", "(", "color_domain", ",", "edge", ")", ":", "# print(color_domain.shape, edge.shape)", "size_origin", "=", "color_domain", ".", "shape", "[", ":", "2", "]", "img", "=", "cv", ".", "cvtColor", "(", "color_domain", ",", "cv", ".", "COLOR_...
https://github.com/youyuge34/PI-REC/blob/9f8f1adff169fda301f3134ac730991484a6b128/tool_draw.py#L215-L228
turicas/rows
27871dbfef3be445169eafe2c66f0333d3090579
rows/fields.py
python
make_header
(field_names, permit_not=False)
return result
Return unique and slugged field names.
Return unique and slugged field names.
[ "Return", "unique", "and", "slugged", "field", "names", "." ]
def make_header(field_names, permit_not=False): """Return unique and slugged field names.""" slug_chars = SLUG_CHARS if not permit_not else SLUG_CHARS + "^" header = [slug(field_name, permitted_chars=slug_chars) for field_name in field_names] result = [] for index, field_name in enumerate(header): if not field_name: field_name = "field_{}".format(index) elif field_name[0].isdigit(): field_name = "field_{}".format(field_name) if field_name in result: field_name = make_unique_name(name=field_name, existing_names=result, start=2) result.append(field_name) return result
[ "def", "make_header", "(", "field_names", ",", "permit_not", "=", "False", ")", ":", "slug_chars", "=", "SLUG_CHARS", "if", "not", "permit_not", "else", "SLUG_CHARS", "+", "\"^\"", "header", "=", "[", "slug", "(", "field_name", ",", "permitted_chars", "=", "...
https://github.com/turicas/rows/blob/27871dbfef3be445169eafe2c66f0333d3090579/rows/fields.py#L583-L599
heynemann/pyccuracy
0bbe3bcff4d13a6501bf77d5af9457f6a1491ab6
pyccuracy/airspeed.py
python
StopDirective.evaluate
(self, stream, namespace, loader)
[]
def evaluate(self, stream, namespace, loader): if hasattr(stream, 'stop'): stream.stop = True
[ "def", "evaluate", "(", "self", ",", "stream", ",", "namespace", ",", "loader", ")", ":", "if", "hasattr", "(", "stream", ",", "'stop'", ")", ":", "stream", ".", "stop", "=", "True" ]
https://github.com/heynemann/pyccuracy/blob/0bbe3bcff4d13a6501bf77d5af9457f6a1491ab6/pyccuracy/airspeed.py#L869-L871
seantis/suitable
047d8634c74498850911e2b14425fee6713adcca
suitable/module_runner.py
python
ModuleRunner.__init__
(self, module_name)
Runs any ansible module given the module's name and access to the api instance (done through the hookup method).
Runs any ansible module given the module's name and access to the api instance (done through the hookup method).
[ "Runs", "any", "ansible", "module", "given", "the", "module", "s", "name", "and", "access", "to", "the", "api", "instance", "(", "done", "through", "the", "hookup", "method", ")", "." ]
def __init__(self, module_name): """ Runs any ansible module given the module's name and access to the api instance (done through the hookup method). """ self.module_name = module_name self.api = None self.module_args = None
[ "def", "__init__", "(", "self", ",", "module_name", ")", ":", "self", ".", "module_name", "=", "module_name", "self", ".", "api", "=", "None", "self", ".", "module_args", "=", "None" ]
https://github.com/seantis/suitable/blob/047d8634c74498850911e2b14425fee6713adcca/suitable/module_runner.py#L96-L103
huawei-noah/Pretrained-Language-Model
d4694a134bdfacbaef8ff1d99735106bd3b3372b
DynaBERT/transformers/tokenization_bert.py
python
BasicTokenizer._tokenize_chinese_chars
(self, text)
return "".join(output)
Adds whitespace around any CJK character.
Adds whitespace around any CJK character.
[ "Adds", "whitespace", "around", "any", "CJK", "character", "." ]
def _tokenize_chinese_chars(self, text): """Adds whitespace around any CJK character.""" output = [] for char in text: cp = ord(char) if self._is_chinese_char(cp): output.append(" ") output.append(char) output.append(" ") else: output.append(char) return "".join(output)
[ "def", "_tokenize_chinese_chars", "(", "self", ",", "text", ")", ":", "output", "=", "[", "]", "for", "char", "in", "text", ":", "cp", "=", "ord", "(", "char", ")", "if", "self", ".", "_is_chinese_char", "(", "cp", ")", ":", "output", ".", "append", ...
https://github.com/huawei-noah/Pretrained-Language-Model/blob/d4694a134bdfacbaef8ff1d99735106bd3b3372b/DynaBERT/transformers/tokenization_bert.py#L356-L367
JiYou/openstack
8607dd488bde0905044b303eb6e52bdea6806923
packages/source/quantum/quantum/plugins/plumgrid/plumgrid_nos_plugin/plumgrid_plugin.py
python
QuantumPluginPLUMgridV2.create_port
(self, context, port)
return super(QuantumPluginPLUMgridV2, self).create_port(context, port)
Create port core Quantum API
Create port core Quantum API
[ "Create", "port", "core", "Quantum", "API" ]
def create_port(self, context, port): """ Create port core Quantum API """ LOG.debug(_("QuantumPluginPLUMgrid Status: create_port() called")) # Port operations on PLUMgrid NOS is an automatic operation from the # VIF driver operations in Nova. It requires admin_state_up to be True port["port"]["admin_state_up"] = True # Plugin DB - Port Create and Return port return super(QuantumPluginPLUMgridV2, self).create_port(context, port)
[ "def", "create_port", "(", "self", ",", "context", ",", "port", ")", ":", "LOG", ".", "debug", "(", "_", "(", "\"QuantumPluginPLUMgrid Status: create_port() called\"", ")", ")", "# Port operations on PLUMgrid NOS is an automatic operation from the", "# VIF driver operations i...
https://github.com/JiYou/openstack/blob/8607dd488bde0905044b303eb6e52bdea6806923/packages/source/quantum/quantum/plugins/plumgrid/plumgrid_nos_plugin/plumgrid_plugin.py#L186-L198
zhl2008/awd-platform
0416b31abea29743387b10b3914581fbe8e7da5e
web_flaskbb/Python-2.7.9/Lib/BaseHTTPServer.py
python
BaseHTTPRequestHandler.end_headers
(self)
Send the blank line ending the MIME headers.
Send the blank line ending the MIME headers.
[ "Send", "the", "blank", "line", "ending", "the", "MIME", "headers", "." ]
def end_headers(self): """Send the blank line ending the MIME headers.""" if self.request_version != 'HTTP/0.9': self.wfile.write("\r\n")
[ "def", "end_headers", "(", "self", ")", ":", "if", "self", ".", "request_version", "!=", "'HTTP/0.9'", ":", "self", ".", "wfile", ".", "write", "(", "\"\\r\\n\"", ")" ]
https://github.com/zhl2008/awd-platform/blob/0416b31abea29743387b10b3914581fbe8e7da5e/web_flaskbb/Python-2.7.9/Lib/BaseHTTPServer.py#L409-L412
theotherp/nzbhydra
4b03d7f769384b97dfc60dade4806c0fc987514e
libs/decimal.py
python
Context.logical_xor
(self, a, b)
return a.logical_xor(b, context=self)
Applies the logical operation 'xor' between each operand's digits. The operands must be both logical numbers. >>> ExtendedContext.logical_xor(Decimal('0'), Decimal('0')) Decimal('0') >>> ExtendedContext.logical_xor(Decimal('0'), Decimal('1')) Decimal('1') >>> ExtendedContext.logical_xor(Decimal('1'), Decimal('0')) Decimal('1') >>> ExtendedContext.logical_xor(Decimal('1'), Decimal('1')) Decimal('0') >>> ExtendedContext.logical_xor(Decimal('1100'), Decimal('1010')) Decimal('110') >>> ExtendedContext.logical_xor(Decimal('1111'), Decimal('10')) Decimal('1101') >>> ExtendedContext.logical_xor(110, 1101) Decimal('1011') >>> ExtendedContext.logical_xor(Decimal(110), 1101) Decimal('1011') >>> ExtendedContext.logical_xor(110, Decimal(1101)) Decimal('1011')
Applies the logical operation 'xor' between each operand's digits.
[ "Applies", "the", "logical", "operation", "xor", "between", "each", "operand", "s", "digits", "." ]
def logical_xor(self, a, b): """Applies the logical operation 'xor' between each operand's digits. The operands must be both logical numbers. >>> ExtendedContext.logical_xor(Decimal('0'), Decimal('0')) Decimal('0') >>> ExtendedContext.logical_xor(Decimal('0'), Decimal('1')) Decimal('1') >>> ExtendedContext.logical_xor(Decimal('1'), Decimal('0')) Decimal('1') >>> ExtendedContext.logical_xor(Decimal('1'), Decimal('1')) Decimal('0') >>> ExtendedContext.logical_xor(Decimal('1100'), Decimal('1010')) Decimal('110') >>> ExtendedContext.logical_xor(Decimal('1111'), Decimal('10')) Decimal('1101') >>> ExtendedContext.logical_xor(110, 1101) Decimal('1011') >>> ExtendedContext.logical_xor(Decimal(110), 1101) Decimal('1011') >>> ExtendedContext.logical_xor(110, Decimal(1101)) Decimal('1011') """ a = _convert_other(a, raiseit=True) return a.logical_xor(b, context=self)
[ "def", "logical_xor", "(", "self", ",", "a", ",", "b", ")", ":", "a", "=", "_convert_other", "(", "a", ",", "raiseit", "=", "True", ")", "return", "a", ".", "logical_xor", "(", "b", ",", "context", "=", "self", ")" ]
https://github.com/theotherp/nzbhydra/blob/4b03d7f769384b97dfc60dade4806c0fc987514e/libs/decimal.py#L4636-L4661
leoshine/Spherical_Regression
d19bc2f6f52982d4d58f5ddabe4231381d7facd7
S3.3D_Rotation/trainval_workdir.py
python
adjust_learning_rate
(optimizer, epoch)
return lr
Sets the learning rate to the initial LR decayed by 10 every _N_ epochs
Sets the learning rate to the initial LR decayed by 10 every _N_ epochs
[ "Sets", "the", "learning", "rate", "to", "the", "initial", "LR", "decayed", "by", "10", "every", "_N_", "epochs" ]
def adjust_learning_rate(optimizer, epoch): """Sets the learning rate to the initial LR decayed by 10 every _N_ epochs""" _N_ = int((40000*batch_size)/29786./2 /10)*10 # '2' mean at most decay 2 times. lr = opt.base_lr * (0.1 ** (epoch // _N_)) # 300)) for param_group in optimizer.param_groups: param_group['lr'] = lr return lr
[ "def", "adjust_learning_rate", "(", "optimizer", ",", "epoch", ")", ":", "_N_", "=", "int", "(", "(", "40000", "*", "batch_size", ")", "/", "29786.", "/", "2", "/", "10", ")", "*", "10", "# '2' mean at most decay 2 times.", "lr", "=", "opt", ".", "base_l...
https://github.com/leoshine/Spherical_Regression/blob/d19bc2f6f52982d4d58f5ddabe4231381d7facd7/S3.3D_Rotation/trainval_workdir.py#L256-L262
ajinabraham/OWASP-Xenotix-XSS-Exploit-Framework
cb692f527e4e819b6c228187c5702d990a180043
external/Scripting Engine/Xenotix Python Scripting Engine/packages/IronPython.StdLib.2.7.4/content/Lib/uuid.py
python
_ifconfig_getnode
()
return None
Get the hardware address on Unix by running ifconfig.
Get the hardware address on Unix by running ifconfig.
[ "Get", "the", "hardware", "address", "on", "Unix", "by", "running", "ifconfig", "." ]
def _ifconfig_getnode(): """Get the hardware address on Unix by running ifconfig.""" # This works on Linux ('' or '-a'), Tru64 ('-av'), but not all Unixes. for args in ('', '-a', '-av'): mac = _find_mac('ifconfig', args, ['hwaddr', 'ether'], lambda i: i+1) if mac: return mac import socket ip_addr = socket.gethostbyname(socket.gethostname()) # Try getting the MAC addr from arp based on our IP address (Solaris). mac = _find_mac('arp', '-an', [ip_addr], lambda i: -1) if mac: return mac # This might work on HP-UX. mac = _find_mac('lanscan', '-ai', ['lan0'], lambda i: 0) if mac: return mac return None
[ "def", "_ifconfig_getnode", "(", ")", ":", "# This works on Linux ('' or '-a'), Tru64 ('-av'), but not all Unixes.", "for", "args", "in", "(", "''", ",", "'-a'", ",", "'-av'", ")", ":", "mac", "=", "_find_mac", "(", "'ifconfig'", ",", "args", ",", "[", "'hwaddr'",...
https://github.com/ajinabraham/OWASP-Xenotix-XSS-Exploit-Framework/blob/cb692f527e4e819b6c228187c5702d990a180043/external/Scripting Engine/Xenotix Python Scripting Engine/packages/IronPython.StdLib.2.7.4/content/Lib/uuid.py#L316-L338
WZMIAOMIAO/deep-learning-for-image-processing
a4502c284958d4bf78fb77b089a90e7688ddc196
pytorch_object_detection/ssd/my_dataset.py
python
VOCDataSet.coco_index
(self, idx)
return target
该方法是专门为pycocotools统计标签信息准备,不对图像和标签作任何处理 由于不用去读取图片,可大幅缩减统计时间 Args: idx: 输入需要获取图像的索引
该方法是专门为pycocotools统计标签信息准备,不对图像和标签作任何处理 由于不用去读取图片,可大幅缩减统计时间
[ "该方法是专门为pycocotools统计标签信息准备,不对图像和标签作任何处理", "由于不用去读取图片,可大幅缩减统计时间" ]
def coco_index(self, idx): """ 该方法是专门为pycocotools统计标签信息准备,不对图像和标签作任何处理 由于不用去读取图片,可大幅缩减统计时间 Args: idx: 输入需要获取图像的索引 """ # read xml xml_path = self.xml_list[idx] with open(xml_path) as fid: xml_str = fid.read() xml = etree.fromstring(xml_str) data = self.parse_xml_to_dict(xml)["annotation"] data_height = int(data["size"]["height"]) data_width = int(data["size"]["width"]) height_width = [data_height, data_width] # img_path = os.path.join(self.img_root, data["filename"]) # image = Image.open(img_path) # if image.format != "JPEG": # raise ValueError("Image format not JPEG") boxes = [] labels = [] iscrowd = [] for obj in data["object"]: # 将所有的gt box信息转换成相对值0-1之间 xmin = float(obj["bndbox"]["xmin"]) / data_width xmax = float(obj["bndbox"]["xmax"]) / data_width ymin = float(obj["bndbox"]["ymin"]) / data_height ymax = float(obj["bndbox"]["ymax"]) / data_height boxes.append([xmin, ymin, xmax, ymax]) labels.append(self.class_dict[obj["name"]]) iscrowd.append(int(obj["difficult"])) # convert everything into a torch.Tensor boxes = torch.as_tensor(boxes, dtype=torch.float32) labels = torch.as_tensor(labels, dtype=torch.int64) iscrowd = torch.as_tensor(iscrowd, dtype=torch.int64) height_width = torch.as_tensor(height_width, dtype=torch.int64) image_id = torch.tensor([idx]) area = (boxes[:, 3] - boxes[:, 1]) * (boxes[:, 2] - boxes[:, 0]) target = {} target["boxes"] = boxes target["labels"] = labels target["image_id"] = image_id target["area"] = area target["iscrowd"] = iscrowd target["height_width"] = height_width return target
[ "def", "coco_index", "(", "self", ",", "idx", ")", ":", "# read xml", "xml_path", "=", "self", ".", "xml_list", "[", "idx", "]", "with", "open", "(", "xml_path", ")", "as", "fid", ":", "xml_str", "=", "fid", ".", "read", "(", ")", "xml", "=", "etre...
https://github.com/WZMIAOMIAO/deep-learning-for-image-processing/blob/a4502c284958d4bf78fb77b089a90e7688ddc196/pytorch_object_detection/ssd/my_dataset.py#L130-L180
AppScale/gts
46f909cf5dc5ba81faf9d81dc9af598dcf8a82a9
AppServer/lib/django-1.5/django/core/files/storage.py
python
Storage.get_valid_name
(self, name)
return get_valid_filename(name)
Returns a filename, based on the provided filename, that's suitable for use in the target storage system.
Returns a filename, based on the provided filename, that's suitable for use in the target storage system.
[ "Returns", "a", "filename", "based", "on", "the", "provided", "filename", "that", "s", "suitable", "for", "use", "in", "the", "target", "storage", "system", "." ]
def get_valid_name(self, name): """ Returns a filename, based on the provided filename, that's suitable for use in the target storage system. """ return get_valid_filename(name)
[ "def", "get_valid_name", "(", "self", ",", "name", ")", ":", "return", "get_valid_filename", "(", "name", ")" ]
https://github.com/AppScale/gts/blob/46f909cf5dc5ba81faf9d81dc9af598dcf8a82a9/AppServer/lib/django-1.5/django/core/files/storage.py#L55-L60
dpp/simply_lift
cf49f7dcce81c7f1557314dd0f0bb08aaedc73da
elyxer.py
python
Float.embed
(self)
return tagged
Embed the whole contents in a div.
Embed the whole contents in a div.
[ "Embed", "the", "whole", "contents", "in", "a", "div", "." ]
def embed(self): "Embed the whole contents in a div." embeddedtag = self.getembeddedtag() tagged = TaggedText().complete(self.contents, embeddedtag, True) self.contents = [tagged] return tagged
[ "def", "embed", "(", "self", ")", ":", "embeddedtag", "=", "self", ".", "getembeddedtag", "(", ")", "tagged", "=", "TaggedText", "(", ")", ".", "complete", "(", "self", ".", "contents", ",", "embeddedtag", ",", "True", ")", "self", ".", "contents", "="...
https://github.com/dpp/simply_lift/blob/cf49f7dcce81c7f1557314dd0f0bb08aaedc73da/elyxer.py#L6726-L6731
pwnieexpress/pwn_plug_sources
1a23324f5dc2c3de20f9c810269b6a29b2758cad
src/metagoofil/hachoir_core/tools.py
python
paddingSize
(value, align)
Compute size of a padding field. >>> paddingSize(31, 4) 1 >>> paddingSize(32, 4) 0 >>> paddingSize(33, 4) 3 Note: (value + paddingSize(value, align)) == alignValue(value, align)
Compute size of a padding field.
[ "Compute", "size", "of", "a", "padding", "field", "." ]
def paddingSize(value, align): """ Compute size of a padding field. >>> paddingSize(31, 4) 1 >>> paddingSize(32, 4) 0 >>> paddingSize(33, 4) 3 Note: (value + paddingSize(value, align)) == alignValue(value, align) """ if value % align != 0: return align - (value % align) else: return 0
[ "def", "paddingSize", "(", "value", ",", "align", ")", ":", "if", "value", "%", "align", "!=", "0", ":", "return", "align", "-", "(", "value", "%", "align", ")", "else", ":", "return", "0" ]
https://github.com/pwnieexpress/pwn_plug_sources/blob/1a23324f5dc2c3de20f9c810269b6a29b2758cad/src/metagoofil/hachoir_core/tools.py#L42-L58
Yelp/mrjob
091572e87bc24cc64be40278dd0f5c3617c98d4b
mrjob/cloud.py
python
HadoopInTheCloudJobRunner._cp_to_local_cmd
(self)
return 'hadoop fs -copyToLocal'
Command to copy files from the cloud to the local directory (usually via Hadoop). Redefine this as needed; for example, on EMR, we sometimes have to use ``aws s3 cp`` because ``hadoop`` isn't installed at bootstrap time.
Command to copy files from the cloud to the local directory (usually via Hadoop). Redefine this as needed; for example, on EMR, we sometimes have to use ``aws s3 cp`` because ``hadoop`` isn't installed at bootstrap time.
[ "Command", "to", "copy", "files", "from", "the", "cloud", "to", "the", "local", "directory", "(", "usually", "via", "Hadoop", ")", ".", "Redefine", "this", "as", "needed", ";", "for", "example", "on", "EMR", "we", "sometimes", "have", "to", "use", "aws",...
def _cp_to_local_cmd(self): """Command to copy files from the cloud to the local directory (usually via Hadoop). Redefine this as needed; for example, on EMR, we sometimes have to use ``aws s3 cp`` because ``hadoop`` isn't installed at bootstrap time.""" return 'hadoop fs -copyToLocal'
[ "def", "_cp_to_local_cmd", "(", "self", ")", ":", "return", "'hadoop fs -copyToLocal'" ]
https://github.com/Yelp/mrjob/blob/091572e87bc24cc64be40278dd0f5c3617c98d4b/mrjob/cloud.py#L196-L201
pm4py/pm4py-core
7807b09a088b02199cd0149d724d0e28793971bf
pm4py/objects/petri_net/utils/petri_utils.py
python
decorate_transitions_prepostset
(net)
Decorate transitions with sub and addition markings Parameters ------------- net Petri net
Decorate transitions with sub and addition markings
[ "Decorate", "transitions", "with", "sub", "and", "addition", "markings" ]
def decorate_transitions_prepostset(net): """ Decorate transitions with sub and addition markings Parameters ------------- net Petri net """ from pm4py.objects.petri_net.obj import Marking for trans in net.transitions: sub_marking = Marking() add_marking = Marking() for arc in trans.in_arcs: sub_marking[arc.source] = arc.weight add_marking[arc.source] = -arc.weight for arc in trans.out_arcs: if arc.target in add_marking: add_marking[arc.target] = arc.weight + add_marking[arc.target] else: add_marking[arc.target] = arc.weight trans.sub_marking = sub_marking trans.add_marking = add_marking
[ "def", "decorate_transitions_prepostset", "(", "net", ")", ":", "from", "pm4py", ".", "objects", ".", "petri_net", ".", "obj", "import", "Marking", "for", "trans", "in", "net", ".", "transitions", ":", "sub_marking", "=", "Marking", "(", ")", "add_marking", ...
https://github.com/pm4py/pm4py-core/blob/7807b09a088b02199cd0149d724d0e28793971bf/pm4py/objects/petri_net/utils/petri_utils.py#L395-L418
nginxinc/nginx-amplify-agent
81c4002c156809039933234abeb292edee3ac492
amplify/agent/objects/nginx/object.py
python
NginxObject.get_alive_plus_status_urls
(self)
return external_status_url, internal_status_url
Tries to get alive plus urls There are two types of plus status urls: internal and external - internal are for the agent and usually they have the localhost ip in address - external are for the browsers and usually they have a normal server name Returns a tuple of str or Nones - (external_url, internal_url) Even if external status url is not responding (cannot be accesible from the host) we should return it to show in our UI :return: (str or None, str or None)
Tries to get alive plus urls There are two types of plus status urls: internal and external - internal are for the agent and usually they have the localhost ip in address - external are for the browsers and usually they have a normal server name
[ "Tries", "to", "get", "alive", "plus", "urls", "There", "are", "two", "types", "of", "plus", "status", "urls", ":", "internal", "and", "external", "-", "internal", "are", "for", "the", "agent", "and", "usually", "they", "have", "the", "localhost", "ip", ...
def get_alive_plus_status_urls(self): """ Tries to get alive plus urls There are two types of plus status urls: internal and external - internal are for the agent and usually they have the localhost ip in address - external are for the browsers and usually they have a normal server name Returns a tuple of str or Nones - (external_url, internal_url) Even if external status url is not responding (cannot be accesible from the host) we should return it to show in our UI :return: (str or None, str or None) """ internal_urls = self.config.plus_status_internal_urls external_urls = self.config.plus_status_external_urls if 'plus_status' in context.app_config.get('nginx', {}): predefined_uri = context.app_config['nginx']['plus_status'] internal_urls.append(http.resolve_uri(predefined_uri)) internal_status_url = self.__get_alive_status(internal_urls, json=True, what='plus status internal') if internal_status_url: self.eventd.event( level=INFO, message='nginx internal plus_status detected, %s' % internal_status_url ) external_status_url = self.__get_alive_status(external_urls, json=True, what='plus status external') if len(self.config.plus_status_external_urls) > 0: if not external_status_url: external_status_url = self.config.plus_status_external_urls[0] self.eventd.event( level=INFO, message='nginx external plus_status detected, %s' % external_status_url ) return external_status_url, internal_status_url
[ "def", "get_alive_plus_status_urls", "(", "self", ")", ":", "internal_urls", "=", "self", ".", "config", ".", "plus_status_internal_urls", "external_urls", "=", "self", ".", "config", ".", "plus_status_external_urls", "if", "'plus_status'", "in", "context", ".", "ap...
https://github.com/nginxinc/nginx-amplify-agent/blob/81c4002c156809039933234abeb292edee3ac492/amplify/agent/objects/nginx/object.py#L147-L185
ayeowch/bitnodes
5fa4ac3fe8e748d450228ec1fea01b8e61978e5c
ping.py
python
set_reachable
(nodes)
return REDIS_CONN.scard('reachable')
Adds reachable nodes that are not already in the open set into the reachable set in Redis. New workers can be spawned separately to establish and maintain connection with these nodes.
Adds reachable nodes that are not already in the open set into the reachable set in Redis. New workers can be spawned separately to establish and maintain connection with these nodes.
[ "Adds", "reachable", "nodes", "that", "are", "not", "already", "in", "the", "open", "set", "into", "the", "reachable", "set", "in", "Redis", ".", "New", "workers", "can", "be", "spawned", "separately", "to", "establish", "and", "maintain", "connection", "wit...
def set_reachable(nodes): """ Adds reachable nodes that are not already in the open set into the reachable set in Redis. New workers can be spawned separately to establish and maintain connection with these nodes. """ for node in nodes: address = node[0] port = node[1] services = node[2] height = node[3] if not REDIS_CONN.sismember('open', (address, port)): REDIS_CONN.sadd('reachable', (address, port, services, height)) return REDIS_CONN.scard('reachable')
[ "def", "set_reachable", "(", "nodes", ")", ":", "for", "node", "in", "nodes", ":", "address", "=", "node", "[", "0", "]", "port", "=", "node", "[", "1", "]", "services", "=", "node", "[", "2", "]", "height", "=", "node", "[", "3", "]", "if", "n...
https://github.com/ayeowch/bitnodes/blob/5fa4ac3fe8e748d450228ec1fea01b8e61978e5c/ping.py#L276-L289
plivo/plivoframework
29fc41fb3c887d5d9022a941e87bbeb2269112ff
src/plivo/rest/freeswitch/api.py
python
PlivoRestApi.play
(self)
return self.send_response(Success=result, Message=msg)
Play something to a Call or bridged leg or both legs. Allow playing a sound to a Call via the REST API. To play sound, make an HTTP POST request to the resource URI. POST Parameters ---------------- Required Parameters - You must POST the following parameters: CallUUID: Unique Call ID to which the action should occur to. Sounds: Comma separated list of sound files to play. Optional Parameters: [Length]: number of seconds before terminating sounds. [Legs]: 'aleg'|'bleg'|'both'. On which leg(s) to play something. 'aleg' means only play on the Call. 'bleg' means only play on the bridged leg of the Call. 'both' means play on the Call and the bridged leg of the Call. Default is 'aleg' . [Loop]: 'true'|'false'. Play sound loop indefinitely (default 'false') [Mix]: 'true'|'false'. Mix with current audio stream (default 'true') [Delimiter]: The delimiter used in the sounds list (default: ',')
Play something to a Call or bridged leg or both legs. Allow playing a sound to a Call via the REST API. To play sound, make an HTTP POST request to the resource URI.
[ "Play", "something", "to", "a", "Call", "or", "bridged", "leg", "or", "both", "legs", ".", "Allow", "playing", "a", "sound", "to", "a", "Call", "via", "the", "REST", "API", ".", "To", "play", "sound", "make", "an", "HTTP", "POST", "request", "to", "t...
def play(self): """Play something to a Call or bridged leg or both legs. Allow playing a sound to a Call via the REST API. To play sound, make an HTTP POST request to the resource URI. POST Parameters ---------------- Required Parameters - You must POST the following parameters: CallUUID: Unique Call ID to which the action should occur to. Sounds: Comma separated list of sound files to play. Optional Parameters: [Length]: number of seconds before terminating sounds. [Legs]: 'aleg'|'bleg'|'both'. On which leg(s) to play something. 'aleg' means only play on the Call. 'bleg' means only play on the bridged leg of the Call. 'both' means play on the Call and the bridged leg of the Call. Default is 'aleg' . [Loop]: 'true'|'false'. Play sound loop indefinitely (default 'false') [Mix]: 'true'|'false'. Mix with current audio stream (default 'true') [Delimiter]: The delimiter used in the sounds list (default: ',') """ self._rest_inbound_socket.log.debug("RESTAPI Play with %s" \ % str(request.form.items())) msg = "" result = False calluuid = get_post_param(request, 'CallUUID') sounds = get_post_param(request, 'Sounds') legs = get_post_param(request, 'Legs') length = get_post_param(request, 'Length') loop = get_post_param(request, 'Loop') == 'true' mix = get_post_param(request, 'Mix') delimiter = get_post_param(request, 'Delimiter') if mix == 'false': mix = False else: mix = True if not calluuid: msg = "CallUUID Parameter Missing" return self.send_response(Success=result, Message=msg) if not sounds: msg = "Sounds Parameter Missing" return self.send_response(Success=result, Message=msg) if not legs: legs = 'aleg' if not length: length = 3600 else: try: length = int(length) except (ValueError, TypeError): msg = "Length Parameter must be a positive integer" return self.send_response(Success=result, Message=msg) if length < 1: msg = "Length Parameter must be a positive integer" return self.send_response(Success=result, Message=msg) if not delimiter: delimiter = ',' sounds_list = sounds.split(delimiter) if not sounds_list: msg = "Sounds Parameter is Invalid" return self.send_response(Success=result, Message=msg) # now do the job ! if self._rest_inbound_socket.play_on_call(calluuid, sounds_list, legs, length=length, schedule=0, mix=mix, loop=loop): msg = "Play Request Executed" result = True return self.send_response(Success=result, Message=msg) msg = "Play Request Failed" return self.send_response(Success=result, Message=msg)
[ "def", "play", "(", "self", ")", ":", "self", ".", "_rest_inbound_socket", ".", "log", ".", "debug", "(", "\"RESTAPI Play with %s\"", "%", "str", "(", "request", ".", "form", ".", "items", "(", ")", ")", ")", "msg", "=", "\"\"", "result", "=", "False",...
https://github.com/plivo/plivoframework/blob/29fc41fb3c887d5d9022a941e87bbeb2269112ff/src/plivo/rest/freeswitch/api.py#L1775-L1858
IronLanguages/ironpython3
7a7bb2a872eeab0d1009fc8a6e24dca43f65b693
Src/StdLib/Lib/asyncio/tasks.py
python
Task.get_stack
(self, *, limit=None)
return frames
Return the list of stack frames for this task's coroutine. If the coroutine is not done, this returns the stack where it is suspended. If the coroutine has completed successfully or was cancelled, this returns an empty list. If the coroutine was terminated by an exception, this returns the list of traceback frames. The frames are always ordered from oldest to newest. The optional limit gives the maximum number of frames to return; by default all available frames are returned. Its meaning differs depending on whether a stack or a traceback is returned: the newest frames of a stack are returned, but the oldest frames of a traceback are returned. (This matches the behavior of the traceback module.) For reasons beyond our control, only one stack frame is returned for a suspended coroutine.
Return the list of stack frames for this task's coroutine.
[ "Return", "the", "list", "of", "stack", "frames", "for", "this", "task", "s", "coroutine", "." ]
def get_stack(self, *, limit=None): """Return the list of stack frames for this task's coroutine. If the coroutine is not done, this returns the stack where it is suspended. If the coroutine has completed successfully or was cancelled, this returns an empty list. If the coroutine was terminated by an exception, this returns the list of traceback frames. The frames are always ordered from oldest to newest. The optional limit gives the maximum number of frames to return; by default all available frames are returned. Its meaning differs depending on whether a stack or a traceback is returned: the newest frames of a stack are returned, but the oldest frames of a traceback are returned. (This matches the behavior of the traceback module.) For reasons beyond our control, only one stack frame is returned for a suspended coroutine. """ frames = [] try: # 'async def' coroutines f = self._coro.cr_frame except AttributeError: f = self._coro.gi_frame if f is not None: while f is not None: if limit is not None: if limit <= 0: break limit -= 1 frames.append(f) f = f.f_back frames.reverse() elif self._exception is not None: tb = self._exception.__traceback__ while tb is not None: if limit is not None: if limit <= 0: break limit -= 1 frames.append(tb.tb_frame) tb = tb.tb_next return frames
[ "def", "get_stack", "(", "self", ",", "*", ",", "limit", "=", "None", ")", ":", "frames", "=", "[", "]", "try", ":", "# 'async def' coroutines", "f", "=", "self", ".", "_coro", ".", "cr_frame", "except", "AttributeError", ":", "f", "=", "self", ".", ...
https://github.com/IronLanguages/ironpython3/blob/7a7bb2a872eeab0d1009fc8a6e24dca43f65b693/Src/StdLib/Lib/asyncio/tasks.py#L110-L155