{"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreport on GitHub that the branch is failing to build at the specified commit.", "response": "def report(self, branch, commit, infourl=None):\n \"\"\"Report on GitHub that the specified branch is failing to build at\n the specified commit. The method will open an issue indicating that\n the branch is failing. If there is an issue already open, it will add a\n comment avoiding to report twice about the same failure.\n\n Args:\n branch (str): branch name to report about.\n commit (str): commit hash at which the build fails.\n infourl (str): URL with extra info about the failure such as the\n build logs.\n \"\"\"\n issue_number = self._get_report_issue_number()\n if issue_number:\n self._report_as_comment(issue_number, branch, commit, infourl)\n else:\n self._report_as_issue(branch, commit, infourl)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef iplot_state_paulivec(rho, figsize=None, slider=False, show_legend=False):\n\n # HTML\n html_template = Template(\"\"\"\n
\n    
\n    
\n \"\"\")\n\n # JavaScript\n javascript_template = Template(\"\"\"\n \n \"\"\")\n rho = _validate_input_state(rho)\n # set default figure size if none given\n if figsize is None:\n figsize = (7, 5)\n\n options = {'width': figsize[0], 'height': figsize[1],\n 'slider': int(slider), 'show_legend': int(show_legend)}\n\n # Process data and execute\n div_number = str(time.time())\n div_number = re.sub('[.]', '', div_number)\n\n data_to_plot = []\n rho_data = process_data(rho)\n data_to_plot.append(dict(\n data=rho_data\n ))\n\n html = html_template.substitute({\n 'divNumber': div_number\n })\n\n javascript = javascript_template.substitute({\n 'divNumber': div_number,\n 'executions': data_to_plot,\n 'options': options\n })\n\n display(HTML(html + javascript))", "response": "Create a paulivec representation of the state vector or density matrix."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nplot the state vector or density matrix of a quantum state.", "response": "def iplot_state(quantum_state, method='city', figsize=None):\n \"\"\"Plot the quantum state.\n\n Args:\n quantum_state (ndarray): statevector or density matrix\n representation of a quantum state.\n method (str): Plotting method to use.\n figsize (tuple): Figure size in pixels.\n\n Raises:\n VisualizationError: if the input is not a statevector or density\n matrix, or if the state is not an multi-qubit quantum state.\n \"\"\"\n warnings.warn(\"iplot_state is deprecated, and will be removed in \\\n the 0.9 release. 
Use the iplot_state_ * functions \\\n instead.\",\n DeprecationWarning)\n rho = _validate_input_state(quantum_state)\n if method == \"city\":\n iplot_state_city(rho, figsize=figsize)\n elif method == \"paulivec\":\n iplot_state_paulivec(rho, figsize=figsize)\n elif method == \"qsphere\":\n iplot_state_qsphere(rho, figsize=figsize)\n elif method == \"bloch\":\n iplot_bloch_multivector(rho, figsize=figsize)\n elif method == \"hinton\":\n iplot_state_hinton(rho, figsize=figsize)\n else:\n raise VisualizationError('Invalid plot state method.')"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\napplies RZZ to circuit.", "response": "def rzz(self, theta, qubit1, qubit2):\n \"\"\"Apply RZZ to circuit.\"\"\"\n return self.append(RZZGate(theta), [qubit1, qubit2], [])"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\napply Fredkin to circuit.", "response": "def cswap(self, ctl, tgt1, tgt2):\n \"\"\"Apply Fredkin to circuit.\"\"\"\n return self.append(FredkinGate(), [ctl, tgt1, tgt2], [])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ndefining the internal state of the object.", "response": "def _define(self):\n \"\"\"\n gate cswap a,b,c\n { cx c,b;\n ccx a,b,c;\n cx c,b;\n }\n \"\"\"\n definition = []\n q = QuantumRegister(3, \"q\")\n rule = [\n (CnotGate(), [q[2], q[1]], []),\n (ToffoliGate(), [q[0], q[1], q[2]], []),\n (CnotGate(), [q[2], q[1]], [])\n ]\n for inst in rule:\n definition.append(inst)\n self.definition = definition"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _initialize_backend_prop(self):\n backend_prop = self.backend_prop\n for ginfo in backend_prop.gates:\n if ginfo.gate == 'cx':\n for item in ginfo.parameters:\n if item.name == 'gate_error':\n g_reliab = 1.0 - item.value\n break\n else:\n g_reliab = 1.0\n swap_reliab = -math.log(pow(g_reliab, 3))\n 
self.swap_graph.add_edge(ginfo.qubits[0], ginfo.qubits[1], weight=swap_reliab)\n self.swap_graph.add_edge(ginfo.qubits[1], ginfo.qubits[0], weight=swap_reliab)\n self.cx_errors[(ginfo.qubits[0], ginfo.qubits[1])] = g_reliab\n self.gate_list.append((ginfo.qubits[0], ginfo.qubits[1]))\n idx = 0\n for q in backend_prop.qubits:\n for nduv in q:\n if nduv.name == 'readout_error':\n self.readout_errors[idx] = 1.0 - nduv.value\n self.available_hw_qubits.append(idx)\n idx += 1\n for edge in self.cx_errors:\n self.gate_cost[edge] = self.cx_errors[edge] * self.readout_errors[edge[0]] *\\\n self.readout_errors[edge[1]]\n self.swap_paths, swap_costs_temp = nx.algorithms.shortest_paths.dense.\\\n floyd_warshall_predecessor_and_distance(self.swap_graph, weight='weight')\n for i in swap_costs_temp:\n self.swap_costs[i] = {}\n for j in swap_costs_temp[i]:\n if (i, j) in self.cx_errors:\n self.swap_costs[i][j] = self.cx_errors[(i, j)]\n elif (j, i) in self.cx_errors:\n self.swap_costs[i][j] = self.cx_errors[(j, i)]\n else:\n best_reliab = 0.0\n for n in self.swap_graph.neighbors(j):\n if (n, j) in self.cx_errors:\n reliab = math.exp(-swap_costs_temp[i][n])*self.cx_errors[(n, j)]\n else:\n reliab = math.exp(-swap_costs_temp[i][n])*self.cx_errors[(j, n)]\n if reliab > best_reliab:\n best_reliab = reliab\n self.swap_costs[i][j] = best_reliab", "response": "Initialize the internal state of the backend properties."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _create_program_graph(self, dag):\n idx = 0\n for q in dag.qubits():\n self.qarg_to_id[q[0].name + str(q[1])] = idx\n idx += 1\n for gate in dag.twoQ_gates():\n qid1 = self._qarg_to_id(gate.qargs[0])\n qid2 = self._qarg_to_id(gate.qargs[1])\n min_q = min(qid1, qid2)\n max_q = max(qid1, qid2)\n edge_weight = 1\n if self.prog_graph.has_edge(min_q, max_q):\n edge_weight = self.prog_graph[min_q][max_q]['weight'] + 1\n self.prog_graph.add_edge(min_q, max_q, 
weight=edge_weight)\n return idx", "response": "Create the program graph for the given dag."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _select_next_edge(self):\n for edge in self.pending_program_edges:\n q1_mapped = edge[0] in self.prog2hw\n q2_mapped = edge[1] in self.prog2hw\n assert not (q1_mapped and q2_mapped)\n if q1_mapped or q2_mapped:\n return edge\n return self.pending_program_edges[0]", "response": "Select next edge in the list of pending program edges."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _select_best_remaining_cx(self):\n candidates = []\n for gate in self.gate_list:\n chk1 = gate[0] in self.available_hw_qubits\n chk2 = gate[1] in self.available_hw_qubits\n if chk1 and chk2:\n candidates.append(gate)\n best_reliab = 0\n best_item = None\n for item in candidates:\n if self.gate_cost[item] > best_reliab:\n best_reliab = self.gate_cost[item]\n best_item = item\n return best_item", "response": "Select the best remaining CNOT in the hardware for the next program edge."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nselect the best available hardware qubit for the next program qubit.", "response": "def _select_best_remaining_qubit(self, prog_qubit):\n \"\"\"\n Select the best remaining hardware qubit for the next program qubit.\n \"\"\"\n reliab_store = {}\n for hw_qubit in self.available_hw_qubits:\n reliab = 1\n for n in self.prog_graph.neighbors(prog_qubit):\n if n in self.prog2hw:\n reliab *= self.swap_costs[self.prog2hw[n]][hw_qubit]\n reliab *= self.readout_errors[hw_qubit]\n reliab_store[hw_qubit] = reliab\n max_reliab = 0\n best_hw_qubit = None\n for hw_qubit in reliab_store:\n if reliab_store[hw_qubit] > max_reliab:\n max_reliab = reliab_store[hw_qubit]\n best_hw_qubit = hw_qubit\n return best_hw_qubit"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef 
run(self, dag):\n self._initialize_backend_prop()\n num_qubits = self._create_program_graph(dag)\n if num_qubits > len(self.swap_graph):\n raise TranspilerError('Number of qubits greater than device.')\n for end1, end2, _ in sorted(self.prog_graph.edges(data=True),\n key=lambda x: x[2]['weight'], reverse=True):\n self.pending_program_edges.append((end1, end2))\n while self.pending_program_edges:\n edge = self._select_next_edge()\n q1_mapped = edge[0] in self.prog2hw\n q2_mapped = edge[1] in self.prog2hw\n if (not q1_mapped) and (not q2_mapped):\n best_hw_edge = self._select_best_remaining_cx()\n self.prog2hw[edge[0]] = best_hw_edge[0]\n self.prog2hw[edge[1]] = best_hw_edge[1]\n self.available_hw_qubits.remove(best_hw_edge[0])\n self.available_hw_qubits.remove(best_hw_edge[1])\n elif not q1_mapped:\n best_hw_qubit = self._select_best_remaining_qubit(edge[0])\n self.prog2hw[edge[0]] = best_hw_qubit\n self.available_hw_qubits.remove(best_hw_qubit)\n else:\n best_hw_qubit = self._select_best_remaining_qubit(edge[1])\n self.prog2hw[edge[1]] = best_hw_qubit\n self.available_hw_qubits.remove(best_hw_qubit)\n new_edges = [x for x in self.pending_program_edges\n if not (x[0] in self.prog2hw and x[1] in self.prog2hw)]\n self.pending_program_edges = new_edges\n for qid in self.qarg_to_id.values():\n if qid not in self.prog2hw:\n self.prog2hw[qid] = self.available_hw_qubits[0]\n self.available_hw_qubits.remove(self.prog2hw[qid])\n layout = Layout()\n for q in dag.qubits():\n pid = self._qarg_to_id(q)\n hwid = self.prog2hw[pid]\n layout[(q[0], q[1])] = hwid\n self.property_set['layout'] = layout", "response": "Main method for the noise adaptive layout."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef instruction_list(self):\n instruction_list = []\n for instruction in self.data:\n if isinstance(instruction, CompositeGate):\n instruction_list.extend(instruction.instruction_list())\n else:\n 
instruction_list.append(instruction)\n return instruction_list", "response": "Return a list of instructions for this CompositeGate."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding controls to this gate.", "response": "def q_if(self, *qregs):\n \"\"\"Add controls to this gate.\"\"\"\n self.data = [gate.q_if(qregs) for gate in self.data]\n return self"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef c_if(self, classical, val):\n self.data = [gate.c_if(classical, val) for gate in self.data]\n return self", "response": "Add classical control register."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef is_unitary(self, atol=None, rtol=None):\n if atol is None:\n atol = self._atol\n if rtol is None:\n rtol = self._rtol\n return is_unitary_matrix(self._data, rtol=rtol, atol=atol)", "response": "Return True if operator is a unitary matrix."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef conjugate(self):\n return Operator(\n np.conj(self.data), self.input_dims(), self.output_dims())", "response": "Return the conjugate of the operator."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the transpose of the operator.", "response": "def transpose(self):\n \"\"\"Return the transpose of the operator.\"\"\"\n return Operator(\n np.transpose(self.data), self.input_dims(), self.output_dims())"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef power(self, n):\n if not isinstance(n, int):\n raise QiskitError(\"Can only take integer powers of Operator.\")\n if self.input_dims() != self.output_dims():\n raise QiskitError(\"Can only power with input_dims = output_dims.\")\n # Override base class power so we can implement more efficiently\n # using Numpy.matrix_power\n 
return Operator(\n np.linalg.matrix_power(self.data, n), self.input_dims(),\n self.output_dims())", "response": "Return the matrix power of the operator."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef add(self, other):\n if not isinstance(other, Operator):\n other = Operator(other)\n if self.dim != other.dim:\n raise QiskitError(\"other operator has different dimensions.\")\n return Operator(self.data + other.data, self.input_dims(),\n self.output_dims())", "response": "Returns the operator self + other."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef multiply(self, other):\n if not isinstance(other, Number):\n raise QiskitError(\"other is not a number\")\n return Operator(other * self.data, self.input_dims(),\n self.output_dims())", "response": "Returns the operator self + other."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn the tensor shape of the matrix operator", "response": "def _shape(self):\n \"\"\"Return the tensor shape of the matrix operator\"\"\"\n return tuple(reversed(self.output_dims())) + tuple(\n reversed(self.input_dims()))"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nevolve a state by the operator.", "response": "def _evolve(self, state, qargs=None):\n \"\"\"Evolve a quantum state by the operator.\n\n Args:\n state (QuantumState): The input statevector or density matrix.\n qargs (list): a list of QuantumState subsystem positions to apply\n the operator on.\n\n Returns:\n QuantumState: the output quantum state.\n\n Raises:\n QiskitError: if the operator dimension does not match the\n specified QuantumState subsystem dimensions.\n \"\"\"\n state = self._format_state(state)\n if qargs is None:\n if state.shape[0] != self._input_dim:\n raise QiskitError(\n \"Operator input dimension is not equal to state dimension.\"\n )\n if 
state.ndim == 1:\n # Return evolved statevector\n return np.dot(self.data, state)\n # Return evolved density matrix\n return np.dot(\n np.dot(self.data, state), np.transpose(np.conj(self.data)))\n # Subsystem evolution\n return self._evolve_subsystem(state, qargs)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nevolve a state by the operator on the subsystem.", "response": "def _evolve_subsystem(self, state, qargs):\n \"\"\"Evolve a quantum state by the operator.\n\n Args:\n state (QuantumState): The input statevector or density matrix.\n qargs (list): a list of QuantumState subsystem positions to apply\n the operator on.\n\n Returns:\n QuantumState: the output quantum state.\n\n Raises:\n QiskitError: if the operator dimension does not match the\n specified QuantumState subsystem dimensions.\n \"\"\"\n mat = np.reshape(self.data, self._shape)\n # Hack to assume state is a N-qubit state until a proper class for states\n # is in place\n state_size = len(state)\n state_dims = self._automatic_dims(None, state_size)\n if self.input_dims() != len(qargs) * (2,):\n raise QiskitError(\n \"Operator input dimensions are not compatible with state subsystem dimensions.\"\n )\n if state.ndim == 1:\n # Return evolved statevector\n tensor = np.reshape(state, state_dims)\n indices = [len(state_dims) - 1 - qubit for qubit in qargs]\n tensor = self._einsum_matmul(tensor, mat, indices)\n return np.reshape(tensor, state_size)\n # Return evolved density matrix\n tensor = np.reshape(state, 2 * state_dims)\n indices = [len(state_dims) - 1 - qubit for qubit in qargs]\n right_shift = len(state_dims)\n # Left multiply by operator\n tensor = self._einsum_matmul(tensor, mat, indices)\n # Right multiply by adjoint operator\n # We implement the transpose by doing left multiplication instead of right\n # in the _einsum_matmul function\n tensor = self._einsum_matmul(\n tensor, np.conj(mat), indices, shift=right_shift)\n return np.reshape(tensor, [state_size, 
state_size])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nformatting input state so it is statevector or density matrix", "response": "def _format_state(self, state):\n \"\"\"Format input state so it is statevector or density matrix\"\"\"\n state = np.array(state)\n shape = state.shape\n ndim = state.ndim\n if ndim > 2:\n raise QiskitError('Input state is not a vector or matrix.')\n # Flatten column-vector to vector\n if ndim == 2:\n if shape[1] != 1 and shape[1] != shape[0]:\n raise QiskitError('Input state is not a vector or matrix.')\n if shape[1] == 1:\n # flatten colum-vector to vector\n state = np.reshape(state, shape[0])\n return state"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nconverts a QuantumCircuit or Instruction to an Operator.", "response": "def _instruction_to_operator(cls, instruction):\n \"\"\"Convert a QuantumCircuit or Instruction to an Operator.\"\"\"\n # Convert circuit to an instruction\n if isinstance(instruction, QuantumCircuit):\n instruction = instruction.to_instruction()\n # Initialize an identity operator of the correct size of the circuit\n op = Operator(np.eye(2 ** instruction.num_qubits))\n op._append_instruction(instruction)\n return op"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _append_instruction(self, obj, qargs=None):\n if isinstance(obj, Instruction):\n mat = None\n if hasattr(obj, 'to_matrix'):\n # If instruction is a gate first we see if it has a\n # `to_matrix` definition and if so use that.\n try:\n mat = obj.to_matrix()\n except QiskitError:\n pass\n if mat is not None:\n # Perform the composition and inplace update the current state\n # of the operator\n op = self.compose(mat, qargs=qargs)\n self._data = op.data\n else:\n # If the instruction doesn't have a matrix defined we use its\n # circuit decomposition definition if it exists, otherwise we\n # cannot compose this gate 
and raise an error.\n if obj.definition is None:\n raise QiskitError('Cannot apply Instruction: {}'.format(obj.name))\n for instr, qregs, cregs in obj.definition:\n if cregs:\n raise QiskitError(\n 'Cannot apply instruction with classical registers: {}'.format(\n instr.name))\n # Get the integer position of the flat register\n new_qargs = [tup[1] for tup in qregs]\n self._append_instruction(instr, qargs=new_qargs)\n else:\n raise QiskitError('Input is not an instruction.')", "response": "Update the current Operator by apply an instruction."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nmapping a DAGCircuit onto a CouplingGraph using swap gates.", "response": "def run(self, dag):\n \"\"\"Map a DAGCircuit onto a CouplingGraph using swap gates.\n\n Args:\n dag (DAGCircuit): input DAG circuit\n\n Returns:\n DAGCircuit: object containing a circuit equivalent to\n circuit_graph that respects couplings in coupling_map, and\n a layout dict mapping qubits of circuit_graph into qubits\n of coupling_map. 
The layout may differ from the initial_layout\n if the first layer of gates cannot be executed on the\n initial_layout.\n\n Raises:\n TranspilerError: if there was any error during the mapping or with the\n parameters.\n \"\"\"\n if dag.width() > self.coupling_map.size():\n raise TranspilerError(\"Not enough qubits in CouplingGraph\")\n\n # Schedule the input circuit\n layerlist = list(dag.layers())\n\n if self.initial_layout is None and self.property_set[\"layout\"]:\n self.initial_layout = self.property_set[\"layout\"]\n\n if self.initial_layout is not None:\n # update initial_layout from a user given dict{(regname,idx): (regname,idx)}\n # to an expected dict{(reg,idx): (reg,idx)}\n\n virtual_qubits = self.initial_layout.get_virtual_bits()\n self.initial_layout = {(v[0].name, v[1]): ('q', self.initial_layout[v]) for v in\n virtual_qubits}\n\n device_register = QuantumRegister(self.coupling_map.size(), 'q')\n initial_layout = {(dag.qregs[k[0]], k[1]): (device_register, v[1])\n for k, v in self.initial_layout.items()}\n # Check the input layout\n circ_qubits = dag.qubits()\n coup_qubits = [(QuantumRegister(self.coupling_map.size(), 'q'), wire) for wire in\n self.coupling_map.physical_qubits]\n qubit_subset = []\n for k, v in initial_layout.items():\n qubit_subset.append(v)\n if k not in circ_qubits:\n raise TranspilerError(\"initial_layout qubit %s[%d] not in input \"\n \"DAGCircuit\" % (k[0].name, k[1]))\n if v not in coup_qubits:\n raise TranspilerError(\"initial_layout qubit %s[%d] not in input \"\n \"CouplingGraph\" % (v[0].name, v[1]))\n else:\n # Supply a default layout\n qubit_subset = [(QuantumRegister(self.coupling_map.size(), 'q'), wire) for wire in\n self.coupling_map.physical_qubits]\n qubit_subset = qubit_subset[0:dag.width()]\n initial_layout = {a: b for a, b in zip(dag.qubits(), qubit_subset)}\n\n # Find swap circuit to preceed to each layer of input circuit\n layout = initial_layout.copy()\n\n # Construct an empty DAGCircuit with one qreg \"q\"\n # 
and the same set of cregs as the input circuit\n dagcircuit_output = DAGCircuit()\n dagcircuit_output.name = dag.name\n dagcircuit_output.add_qreg(QuantumRegister(self.coupling_map.size(), \"q\"))\n for creg in dag.cregs.values():\n dagcircuit_output.add_creg(creg)\n\n # Make a trivial wire mapping between the subcircuits\n # returned by swap_mapper_layer_update and the circuit\n # we are building\n identity_wire_map = {}\n q = QuantumRegister(self.coupling_map.size(), 'q')\n for j in range(self.coupling_map.size()):\n identity_wire_map[(q, j)] = (q, j)\n for creg in dag.cregs.values():\n for j in range(creg.size):\n identity_wire_map[(creg, j)] = (creg, j)\n\n first_layer = True # True until first layer is output\n\n # Iterate over layers\n for i, layer in enumerate(layerlist):\n\n # Attempt to find a permutation for this layer\n success_flag, best_circ, best_d, best_layout, trivial_flag \\\n = self.layer_permutation(layer[\"partition\"], layout, qubit_subset)\n\n # If this fails, try one gate at a time in this layer\n if not success_flag:\n serial_layerlist = list(layer[\"graph\"].serial_layers())\n\n # Go through each gate in the layer\n for j, serial_layer in enumerate(serial_layerlist):\n\n success_flag, best_circ, best_d, best_layout, trivial_flag \\\n = self.layer_permutation(serial_layer[\"partition\"], layout, qubit_subset)\n\n # Give up if we fail again\n if not success_flag:\n raise TranspilerError(\"swap_mapper failed: \" +\n \"layer %d, sublayer %d\" % (i, j))\n\n # If this layer is only single-qubit gates,\n # and we have yet to see multi-qubit gates,\n # continue to the next inner iteration\n if trivial_flag and first_layer:\n continue\n\n # Update the record of qubit positions for each inner iteration\n layout = best_layout\n # Update the QASM\n dagcircuit_output.compose_back(\n self.swap_mapper_layer_update(j,\n first_layer,\n best_layout,\n best_d,\n best_circ,\n serial_layerlist),\n identity_wire_map)\n # Update initial layout\n if first_layer:\n 
initial_layout = layout\n first_layer = False\n\n else:\n # Update the record of qubit positions for each iteration\n layout = best_layout\n\n # Update the QASM\n dagcircuit_output.compose_back(\n self.swap_mapper_layer_update(i,\n first_layer,\n best_layout,\n best_d,\n best_circ,\n layerlist),\n identity_wire_map)\n # Update initial layout\n if first_layer:\n initial_layout = layout\n first_layer = False\n\n # If first_layer is still set, the circuit only has single-qubit gates\n # so we can use the initial layout to output the entire circuit\n if first_layer:\n layout = initial_layout\n for i, layer in enumerate(layerlist):\n dagcircuit_output.compose_back(layer[\"graph\"], layout)\n\n return dagcircuit_output"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfinds a swap circuit that implements a permutation for this layer.", "response": "def layer_permutation(self, layer_partition, layout, qubit_subset):\n \"\"\"Find a swap circuit that implements a permutation for this layer.\n\n The goal is to swap qubits such that qubits in the same two-qubit gates\n are adjacent.\n\n Based on Sergey Bravyi's algorithm.\n\n The layer_partition is a list of (qu)bit lists and each qubit is a\n tuple (qreg, index).\n The layout is a dict mapping qubits in the circuit to qubits in the\n coupling graph and represents the current positions of the data.\n The qubit_subset is the subset of qubits in the coupling graph that\n we have chosen to map into.\n The coupling is a CouplingGraph.\n TRIALS is the number of attempts the randomized algorithm makes.\n\n Returns: success_flag, best_circ, best_d, best_layout, trivial_flag\n\n If success_flag is True, then best_circ contains a DAGCircuit with\n the swap circuit, best_d contains the depth of the swap circuit, and\n best_layout contains the new positions of the data qubits after the\n swap circuit has been applied. 
The trivial_flag is set if the layer\n has no multi-qubit gates.\n \"\"\"\n if self.seed is None:\n self.seed = np.random.randint(0, np.iinfo(np.int32).max)\n rng = np.random.RandomState(self.seed)\n rev_layout = {b: a for a, b in layout.items()}\n gates = []\n for layer in layer_partition:\n if len(layer) > 2:\n raise TranspilerError(\"Layer contains >2 qubit gates\")\n elif len(layer) == 2:\n gates.append(tuple(layer))\n\n # Can we already apply the gates?\n dist = sum([self.coupling_map.distance(layout[g[0]][1], layout[g[1]][1]) for g in gates])\n if dist == len(gates):\n circ = DAGCircuit()\n circ.add_qreg(QuantumRegister(self.coupling_map.size(), \"q\"))\n return True, circ, 0, layout, bool(gates)\n\n # Begin loop over trials of randomized algorithm\n n = self.coupling_map.size()\n best_d = sys.maxsize # initialize best depth\n best_circ = None # initialize best swap circuit\n best_layout = None # initialize best final layout\n QR = QuantumRegister(self.coupling_map.size(), \"q\")\n for _ in range(self.trials):\n\n trial_layout = layout.copy()\n rev_trial_layout = rev_layout.copy()\n # SWAP circuit constructed this trial\n trial_circ = DAGCircuit()\n trial_circ.add_qreg(QR)\n\n # Compute Sergey's randomized distance\n xi = {}\n for i in self.coupling_map.physical_qubits:\n xi[(QR, i)] = {}\n for i in self.coupling_map.physical_qubits:\n i = (QR, i)\n for j in self.coupling_map.physical_qubits:\n j = (QR, j)\n scale = 1 + rng.normal(0, 1 / n)\n xi[i][j] = scale * self.coupling_map.distance(i[1], j[1]) ** 2\n xi[j][i] = xi[i][j]\n\n # Loop over depths d up to a max depth of 2n+1\n d = 1\n # Circuit for this swap slice\n circ = DAGCircuit()\n circ.add_qreg(QR)\n\n # Identity wire-map for composing the circuits\n identity_wire_map = {(QR, j): (QR, j) for j in range(n)}\n\n while d < 2 * n + 1:\n # Set of available qubits\n qubit_set = set(qubit_subset)\n # While there are still qubits available\n while qubit_set:\n # Compute the objective function\n min_cost = 
sum([xi[trial_layout[g[0]]][trial_layout[g[1]]]\n for g in gates])\n # Try to decrease objective function\n progress_made = False\n # Loop over edges of coupling graph\n for e in self.coupling_map.get_edges():\n e = [(QR, edge) for edge in e]\n # Are the qubits available?\n if e[0] in qubit_set and e[1] in qubit_set:\n # Try this edge to reduce the cost\n new_layout = trial_layout.copy()\n new_layout[rev_trial_layout[e[0]]] = e[1]\n new_layout[rev_trial_layout[e[1]]] = e[0]\n rev_new_layout = rev_trial_layout.copy()\n rev_new_layout[e[0]] = rev_trial_layout[e[1]]\n rev_new_layout[e[1]] = rev_trial_layout[e[0]]\n # Compute the objective function\n new_cost = sum([xi[new_layout[g[0]]][new_layout[g[1]]]\n for g in gates])\n # Record progress if we succceed\n if new_cost < min_cost:\n progress_made = True\n min_cost = new_cost\n opt_layout = new_layout\n rev_opt_layout = rev_new_layout\n opt_edge = e\n\n # Were there any good choices?\n if progress_made:\n qubit_set.remove(opt_edge[0])\n qubit_set.remove(opt_edge[1])\n trial_layout = opt_layout\n rev_trial_layout = rev_opt_layout\n circ.apply_operation_back(\n SwapGate(),\n [(opt_edge[0][0], opt_edge[0][1]), (opt_edge[1][0], opt_edge[1][1])],\n [])\n else:\n break\n\n # We have either run out of qubits or failed to improve\n # Compute the coupling graph distance_qubits\n dist = sum([self.coupling_map.distance(trial_layout[g[0]][1],\n trial_layout[g[1]][1]) for g in gates])\n # If all gates can be applied now, we are finished\n # Otherwise we need to consider a deeper swap circuit\n if dist == len(gates):\n trial_circ.compose_back(circ, identity_wire_map)\n break\n\n # Increment the depth\n d += 1\n\n # Either we have succeeded at some depth d < dmax or failed\n dist = sum([self.coupling_map.distance(trial_layout[g[0]][1],\n trial_layout[g[1]][1]) for g in gates])\n if dist == len(gates):\n if d < best_d:\n best_circ = trial_circ\n best_layout = trial_layout\n best_d = min(best_d, d)\n\n if best_circ is None:\n return 
False, None, None, None, False\n\n return True, best_circ, best_d, best_layout, False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef swap_mapper_layer_update(self, i, first_layer, best_layout, best_d,\n best_circ, layer_list):\n \"\"\"Update the QASM string for an iteration of swap_mapper.\n\n i = layer number\n first_layer = True if this is the first layer with multi-qubit gates\n best_layout = layout returned from swap algorithm\n best_d = depth returned from swap algorithm\n best_circ = swap circuit returned from swap algorithm\n layer_list = list of circuit objects for each layer\n\n Return DAGCircuit object to append to the output DAGCircuit.\n \"\"\"\n layout = best_layout\n dagcircuit_output = DAGCircuit()\n QR = QuantumRegister(self.coupling_map.size(), 'q')\n dagcircuit_output.add_qreg(QR)\n # Identity wire-map for composing the circuits\n identity_wire_map = {(QR, j): (QR, j) for j in range(self.coupling_map.size())}\n\n # If this is the first layer with multi-qubit gates,\n # output all layers up to this point and ignore any\n # swap gates. 
Set the initial layout.\n if first_layer:\n # Output all layers up to this point\n for j in range(i + 1):\n dagcircuit_output.compose_back(layer_list[j][\"graph\"], layout)\n # Otherwise, we output the current layer and the associated swap gates.\n else:\n # Output any swaps\n if best_d > 0:\n dagcircuit_output.compose_back(best_circ, identity_wire_map)\n\n # Output this layer\n dagcircuit_output.compose_back(layer_list[i][\"graph\"], layout)\n return dagcircuit_output", "response": "Update the QASM string for an iteration of swap_mapper."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef real(self, nested_scope=None):\n operation = self.children[0].operation()\n expr = self.children[1].real(nested_scope)\n return operation(expr)", "response": "Return the correspond floating point number."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning the correspond symbolic number.", "response": "def sym(self, nested_scope=None):\n \"\"\"Return the correspond symbolic number.\"\"\"\n operation = self.children[0].operation()\n expr = self.children[1].sym(nested_scope)\n return operation(expr)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nseparating a bitstring according to the registers defined in the result header.", "response": "def _separate_bitstring(bitstring, creg_sizes):\n \"\"\"Separate a bitstring according to the registers defined in the result header.\"\"\"\n substrings = []\n running_index = 0\n for _, size in reversed(creg_sizes):\n substrings.append(bitstring[running_index: running_index + size])\n running_index += size\n return ' '.join(substrings)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef format_counts_memory(shot_memory, header=None):\n if shot_memory.startswith('0x'):\n shot_memory = _hex_to_bin(shot_memory)\n if header:\n creg_sizes = header.get('creg_sizes', None)\n 
memory_slots = header.get('memory_slots', None)\n if memory_slots:\n shot_memory = _pad_zeros(shot_memory, memory_slots)\n if creg_sizes and memory_slots:\n shot_memory = _separate_bitstring(shot_memory, creg_sizes)\n return shot_memory", "response": "Formats a single bitstring from a single shot experiment."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nconvert nested list of shape (..., 2) to complex numpy array with shape (...)", "response": "def _list_to_complex_array(complex_list):\n \"\"\"Convert nested list of shape (..., 2) to complex numpy array with shape (...)\n\n Args:\n complex_list (list): List to convert.\n\n Returns:\n np.ndarray: Complex numpy array\n\n Raises:\n QiskitError: If inner most array of input nested list is not of length 2.\n \"\"\"\n arr = np.asarray(complex_list, dtype=np.complex_)\n if not arr.shape[-1] == 2:\n raise QiskitError('Inner most nested list is not of length 2.')\n\n return arr[..., 0] + 1j*arr[..., 1]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef format_level_0_memory(memory):\n formatted_memory = _list_to_complex_array(memory)\n # infer meas_return from shape of returned data.\n if not 2 <= len(formatted_memory.shape) <= 3:\n raise QiskitError('Level zero memory is not of correct shape.')\n return formatted_memory", "response": "Formats an experiment result memory object for measurement level 0."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef format_level_1_memory(memory):\n formatted_memory = _list_to_complex_array(memory)\n # infer meas_return from shape of returned data.\n if not 1 <= len(formatted_memory.shape) <= 2:\n raise QiskitError('Level one memory is not of correct shape.')\n return formatted_memory", "response": "Formats an experiment result memory object for measurement level 1."} {"SOURCE": "codesearchnet", "instruction": 
"Write a Python 3 function that can\nformat an experiment result memory object for measurement level 2.", "response": "def format_level_2_memory(memory, header=None):\n \"\"\" Format an experiment result memory object for measurement level 2.\n\n Args:\n memory (list): Memory from experiment with `meas_level==2` and `memory==True`.\n header (dict): the experiment header dictionary containing\n useful information for postprocessing.\n\n Returns:\n list[str]: List of bitstrings\n \"\"\"\n memory_list = []\n for shot_memory in memory:\n memory_list.append(format_counts_memory(shot_memory, header))\n return memory_list"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nformatting a single experiment result coming from backend to present to the Qiskit user.", "response": "def format_counts(counts, header=None):\n \"\"\"Format a single experiment result coming from backend to present\n to the Qiskit user.\n\n Args:\n counts (dict): counts histogram of multiple shots\n header (dict): the experiment header dictionary containing\n useful information for postprocessing.\n\n Returns:\n dict: a formatted counts\n \"\"\"\n counts_dict = {}\n for key, val in counts.items():\n key = format_counts_memory(key, header)\n counts_dict[key] = val\n return counts_dict"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef format_statevector(vec, decimals=None):\n num_basis = len(vec)\n vec_complex = np.zeros(num_basis, dtype=complex)\n for i in range(num_basis):\n vec_complex[i] = vec[i][0] + 1j * vec[i][1]\n if decimals:\n vec_complex = np.around(vec_complex, decimals=decimals)\n return vec_complex", "response": "Formats the statevector coming from the backend to present to the Qiskit user."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef format_unitary(mat, decimals=None):\n num_basis = len(mat)\n mat_complex = 
np.zeros((num_basis, num_basis), dtype=complex)\n for i, vec in enumerate(mat):\n mat_complex[i] = format_statevector(vec, decimals)\n return mat_complex", "response": "Format the unitary coming from the backend to present to the Qiskit user."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsubmitting the job to the backend.", "response": "def submit(self):\n \"\"\"Submit the job to the backend for execution.\n\n Raises:\n QobjValidationError: if the JSON serialization of the Qobj passed\n during construction does not validate against the Qobj schema.\n\n JobError: if trying to re-submit the job.\n \"\"\"\n if self._future is not None:\n raise JobError(\"We have already submitted the job!\")\n\n validate_qobj_against_schema(self._qobj)\n self._future = self._executor.submit(self._fn, self._job_id, self._qobj)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the status of the current job by querying the Python's future.", "response": "def status(self):\n \"\"\"Gets the status of the job by querying the Python's future\n\n Returns:\n qiskit.providers.JobStatus: The current JobStatus\n\n Raises:\n JobError: If the future is in unexpected state\n concurrent.futures.TimeoutError: if timeout occurred.\n \"\"\"\n # The order is important here\n if self._future.running():\n _status = JobStatus.RUNNING\n elif self._future.cancelled():\n _status = JobStatus.CANCELLED\n elif self._future.done():\n _status = JobStatus.DONE if self._future.exception() is None else JobStatus.ERROR\n else:\n # Note: There is an undocumented Future state: PENDING, that seems to show up when\n # the job is enqueued, waiting for someone to pick it up. 
We need to deal with this\n # state but there's no public API for it, so we are assuming that if the job is not\n # in any of the previous states, is PENDING, ergo INITIALIZING for us.\n _status = JobStatus.INITIALIZING\n\n return _status"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef iplot_bloch_multivector(rho, figsize=None):\n\n # HTML\n html_template = Template(\"\"\"\n

\n

\n
\n
\n

\n \"\"\")\n\n # JavaScript\n javascript_template = Template(\"\"\"\n \n \"\"\")\n rho = _validate_input_state(rho)\n if figsize is None:\n options = {}\n else:\n options = {'width': figsize[0], 'height': figsize[1]}\n\n # Process data and execute\n num = int(np.log2(len(rho)))\n\n bloch_data = []\n for i in range(num):\n pauli_singles = [Pauli.pauli_single(num, i, 'X'), Pauli.pauli_single(num, i, 'Y'),\n Pauli.pauli_single(num, i, 'Z')]\n bloch_state = list(map(lambda x: np.real(np.trace(np.dot(x.to_matrix(), rho))),\n pauli_singles))\n bloch_data.append(bloch_state)\n\n div_number = str(time.time())\n div_number = re.sub('[.]', '', div_number)\n\n html = html_template.substitute({\n 'divNumber': div_number\n })\n\n javascript = javascript_template.substitute({\n 'data': bloch_data,\n 'divNumber': div_number,\n 'options': options\n })\n\n display(HTML(html + javascript))", "response": "Create a bloch sphere representation of the input array using as much bloch spheres as qubit."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nparallel execution of a mapping of `values` to the function `task`. This is functionally equivalent to:: result = [task(value, *task_args, **task_kwargs) for value in values] On Windows this function defaults to a serial implementation to avoid the overhead from spawning processes in Windows. Args: task (func): Function that is to be called for each value in ``values``. values (array_like): List or array of values for which the ``task`` function is to be evaluated. task_args (list): Optional additional arguments to the ``task`` function. task_kwargs (dict): Optional additional keyword argument to the ``task`` function. num_processes (int): Number of processes to spawn. Returns: result: The result list contains the value of ``task(value, *task_args, **task_kwargs)`` for each value in ``values``. Raises: QiskitError: If user interrupts via keyboard. 
Events: terra.parallel.start: The collection of parallel tasks is about to start. terra.parallel.update: One of the parallel tasks has finished. terra.parallel.finish: All the parallel tasks have finished.", "response": "def parallel_map(task, values, task_args=tuple(), task_kwargs={}, # pylint: disable=W0102\n num_processes=CPU_COUNT):\n \"\"\"\n Parallel execution of a mapping of `values` to the function `task`. This\n is functionally equivalent to::\n\n result = [task(value, *task_args, **task_kwargs) for value in values]\n\n On Windows this function defaults to a serial implementation to avoid the\n overhead from spawning processes in Windows.\n\n Args:\n task (func): Function that is to be called for each value in ``values``.\n values (array_like): List or array of values for which the ``task``\n function is to be evaluated.\n task_args (list): Optional additional arguments to the ``task`` function.\n task_kwargs (dict): Optional additional keyword arguments to the ``task`` function.\n num_processes (int): Number of processes to spawn.\n\n Returns:\n result: The result list contains the value of\n ``task(value, *task_args, **task_kwargs)`` for\n each value in ``values``.\n\n Raises:\n QiskitError: If user interrupts via keyboard.\n\n Events:\n terra.parallel.start: The collection of parallel tasks is about to start.\n terra.parallel.update: One of the parallel tasks has finished.\n terra.parallel.finish: All the parallel tasks have finished.\n \"\"\"\n if len(values) == 1:\n return [task(values[0], *task_args, **task_kwargs)]\n\n Publisher().publish(\"terra.parallel.start\", len(values))\n nfinished = [0]\n\n def _callback(_):\n nfinished[0] += 1\n Publisher().publish(\"terra.parallel.done\", nfinished[0])\n\n # Run in parallel if not Win and not in parallel already\n if platform.system() != 'Windows' and num_processes > 1 \\\n and os.getenv('QISKIT_IN_PARALLEL') == 'FALSE':\n os.environ['QISKIT_IN_PARALLEL'] = 'TRUE'\n try:\n pool = 
Pool(processes=num_processes)\n\n async_res = [pool.apply_async(task, (value,) + task_args, task_kwargs,\n _callback) for value in values]\n\n while not all([item.ready() for item in async_res]):\n for item in async_res:\n item.wait(timeout=0.1)\n\n pool.terminate()\n pool.join()\n\n except KeyboardInterrupt:\n pool.terminate()\n pool.join()\n Publisher().publish(\"terra.parallel.finish\")\n raise QiskitError('Keyboard interrupt in parallel_map.')\n\n Publisher().publish(\"terra.parallel.finish\")\n os.environ['QISKIT_IN_PARALLEL'] = 'FALSE'\n return [ar.get() for ar in async_res]\n\n # Cannot do parallel on Windows, if another parallel_map is running in parallel,\n # or len(values) == 1.\n results = []\n for _, value in enumerate(values):\n result = task(value, *task_args, **task_kwargs)\n results.append(result)\n _callback(0)\n Publisher().publish(\"terra.parallel.finish\")\n return results"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_qubit_los(self, user_lo_config):\n try:\n _q_los = self.default_qubit_los.copy()\n except KeyError:\n raise PulseError('Qubit default frequencies do not exist.')\n\n for channel, lo_freq in user_lo_config.qubit_lo_dict().items():\n _q_los[channel.index] = lo_freq\n\n if _q_los == self.default_qubit_los:\n return None\n return _q_los", "response": "Embed default qubit LO frequencies from backend and format them to a list object."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_meas_los(self, user_lo_config):\n try:\n _m_los = self.default_meas_los.copy()\n except KeyError:\n raise PulseError('Default measurement frequencies do not exist.')\n\n for channel, lo_freq in user_lo_config.meas_lo_dict().items():\n _m_los[channel.index] = lo_freq\n\n if _m_los == self.default_meas_los:\n return None\n return _m_los", "response": "Embed default meas LO frequencies from backend and format them to a list object."} {"SOURCE": 
"codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run(self, dag):\n # Walk through the DAG and expand each non-basis node\n for node in dag.op_nodes():\n basic_insts = ['measure', 'reset', 'barrier', 'snapshot']\n if node.name in basic_insts:\n # TODO: this is legacy behavior. Basis_insts should be removed so that these\n # instructions should be part of the device-reported basis. Currently, no\n # backend reports \"measure\", for example.\n continue\n if node.name in self.basis: # If already a base, ignore.\n continue\n\n # TODO: allow choosing other possible decompositions\n rule = node.op.definition\n if not rule:\n raise QiskitError(\"Cannot unroll the circuit to the given basis, %s. \"\n \"No rule to expand instruction %s.\" %\n (str(self.basis), node.op.name))\n\n # hacky way to build a dag on the same register as the rule is defined\n # TODO: need anonymous rules to address wires by index\n decomposition = DAGCircuit()\n decomposition.add_qreg(rule[0][1][0][0])\n for inst in rule:\n decomposition.apply_operation_back(*inst)\n\n unrolled_dag = self.run(decomposition) # recursively unroll ops\n dag.substitute_node_with_dag(node, unrolled_dag)\n return dag", "response": "Expand all op nodes to the given basis."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef iplot_state_qsphere(rho, figsize=None):\n\n # HTML\n html_template = Template(\"\"\"\n

\n

\n
\n
\n

\n \"\"\")\n\n # JavaScript\n javascript_template = Template(\"\"\"\n \n\n \"\"\")\n rho = _validate_input_state(rho)\n if figsize is None:\n options = {}\n else:\n options = {'width': figsize[0], 'height': figsize[1]}\n\n qspheres_data = []\n # Process data and execute\n num = int(np.log2(len(rho)))\n\n # get the eigenvectors and eigenvalues\n weig, stateall = linalg.eigh(rho)\n\n for _ in range(2**num):\n # start with the max\n probmix = weig.max()\n prob_location = weig.argmax()\n if probmix > 0.001:\n # print(\"The \" + str(k) + \"th eigenvalue = \" + str(probmix))\n # get the max eigenvalue\n state = stateall[:, prob_location]\n loc = np.absolute(state).argmax()\n # get the element location closes to lowest bin representation.\n for j in range(2**num):\n test = np.absolute(np.absolute(state[j]) -\n np.absolute(state[loc]))\n if test < 0.001:\n loc = j\n break\n # remove the global phase\n angles = (np.angle(state[loc]) + 2 * np.pi) % (2 * np.pi)\n angleset = np.exp(-1j*angles)\n state = angleset*state\n state.flatten()\n\n spherepoints = []\n for i in range(2**num):\n # get x,y,z points\n\n element = bin(i)[2:].zfill(num)\n weight = element.count(\"1\")\n\n number_of_divisions = n_choose_k(num, weight)\n weight_order = bit_string_index(element)\n\n angle = weight_order * 2 * np.pi / number_of_divisions\n\n zvalue = -2 * weight / num + 1\n xvalue = np.sqrt(1 - zvalue**2) * np.cos(angle)\n yvalue = np.sqrt(1 - zvalue**2) * np.sin(angle)\n\n # get prob and angle - prob will be shade and angle color\n prob = np.real(np.dot(state[i], state[i].conj()))\n angles = (np.angle(state[i]) + 2 * np.pi) % (2 * np.pi)\n qpoint = {\n 'x': xvalue,\n 'y': yvalue,\n 'z': zvalue,\n 'prob': prob,\n 'phase': angles\n }\n spherepoints.append(qpoint)\n\n # Associate all points to one sphere\n sphere = {\n 'points': spherepoints,\n 'eigenvalue': probmix\n }\n\n # Add sphere to the spheres array\n qspheres_data.append(sphere)\n weig[prob_location] = 0\n\n div_number = 
str(time.time())\n div_number = re.sub('[.]', '', div_number)\n\n html = html_template.substitute({\n 'divNumber': div_number\n })\n\n javascript = javascript_template.substitute({\n 'data': qspheres_data,\n 'divNumber': div_number,\n 'options': options\n })\n\n display(HTML(html + javascript))", "response": "Create a Q sphere representation of the input array using a Q sphere visualization."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the number of combinations for n choose k.", "response": "def n_choose_k(n, k):\n \"\"\"Return the number of combinations for n choose k.\n\n Args:\n n (int): the total number of options.\n k (int): The number of elements.\n\n Returns:\n int: returns the binomial coefficient\n \"\"\"\n if n == 0:\n return 0\n return reduce(lambda x, y: x * y[0] / y[1],\n zip(range(n - k + 1, n + 1),\n range(1, k + 1)), 1)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef bit_string_index(text):\n n = len(text)\n k = text.count(\"1\")\n if text.count(\"0\") != n - k:\n raise VisualizationError(\"s must be a string of 0 and 1\")\n ones = [pos for pos, char in enumerate(text) if char == \"1\"]\n return lex_index(n, k, ones)", "response": "Return the index of a string of 0s and 1s."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns the index of a combination of n and k.", "response": "def lex_index(n, k, lst):\n \"\"\"Return the lex index of a combination.\n\n Args:\n n (int): the total number of options.\n k (int): The number of elements.\n lst (list): list\n\n Returns:\n int: returns int index for lex order\n\n Raises:\n VisualizationError: if length of list is not equal to k\n \"\"\"\n if len(lst) != k:\n raise VisualizationError(\"list should have length k\")\n comb = list(map(lambda x: n - 1 - x, lst))\n dualm = sum([n_choose_k(comb[k - 1 - i], i + 1) for i in 
range(k)])\n return int(dualm)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef plot_state_hinton(rho, title='', figsize=None):\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n rho = _validate_input_state(rho)\n if figsize is None:\n figsize = (8, 5)\n num = int(np.log2(len(rho)))\n fig, (ax1, ax2) = plt.subplots(1, 2, figsize=figsize)\n max_weight = 2 ** np.ceil(np.log(np.abs(rho).max()) / np.log(2))\n datareal = np.real(rho)\n dataimag = np.imag(rho)\n column_names = [bin(i)[2:].zfill(num) for i in range(2**num)]\n row_names = [bin(i)[2:].zfill(num) for i in range(2**num)]\n lx = len(datareal[0]) # Work out matrix dimensions\n ly = len(datareal[:, 0])\n # Real\n ax1.patch.set_facecolor('gray')\n ax1.set_aspect('equal', 'box')\n ax1.xaxis.set_major_locator(plt.NullLocator())\n ax1.yaxis.set_major_locator(plt.NullLocator())\n\n for (x, y), w in np.ndenumerate(datareal):\n color = 'white' if w > 0 else 'black'\n size = np.sqrt(np.abs(w) / max_weight)\n rect = plt.Rectangle([x - size / 2, y - size / 2], size, size,\n facecolor=color, edgecolor=color)\n ax1.add_patch(rect)\n\n ax1.set_xticks(np.arange(0, lx+0.5, 1))\n ax1.set_yticks(np.arange(0, ly+0.5, 1))\n ax1.set_yticklabels(row_names, fontsize=14)\n ax1.set_xticklabels(column_names, fontsize=14, rotation=90)\n ax1.autoscale_view()\n ax1.invert_yaxis()\n ax1.set_title('Real[rho]', fontsize=14)\n # Imaginary\n ax2.patch.set_facecolor('gray')\n ax2.set_aspect('equal', 'box')\n ax2.xaxis.set_major_locator(plt.NullLocator())\n ax2.yaxis.set_major_locator(plt.NullLocator())\n\n for (x, y), w in np.ndenumerate(dataimag):\n color = 'white' if w > 0 else 'black'\n size = np.sqrt(np.abs(w) / max_weight)\n rect = plt.Rectangle([x - size / 2, y - size / 2], size, size,\n facecolor=color, edgecolor=color)\n ax2.add_patch(rect)\n if np.any(dataimag != 0):\n ax2.set_xticks(np.arange(0, lx+0.5, 1))\n ax2.set_yticks(np.arange(0, ly+0.5, 
1))\n ax2.set_yticklabels(row_names, fontsize=14)\n ax2.set_xticklabels(column_names, fontsize=14, rotation=90)\n ax2.autoscale_view()\n ax2.invert_yaxis()\n ax2.set_title('Imag[rho]', fontsize=14)\n if title:\n fig.suptitle(title, fontsize=16)\n plt.tight_layout()\n plt.close(fig)\n return fig", "response": "Plots a hinton diagram for the quantum state vector or density matrix."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nplots the Bloch vector.", "response": "def plot_bloch_vector(bloch, title=\"\", ax=None, figsize=None):\n \"\"\"Plot the Bloch sphere.\n\n Plot a sphere, axes, the Bloch vector, and its projections onto each axis.\n\n Args:\n bloch (list[double]): array of three elements where [x, y, z]\n title (str): a string that represents the plot title\n ax (matplotlib.Axes): An Axes to use for rendering the bloch sphere\n figsize (tuple): Figure size in inches. Has no effect if passing `ax`.\n\n Returns:\n Figure: A matplotlib figure instance if `ax = None`.\n\n Raises:\n ImportError: Requires matplotlib.\n \"\"\"\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n if figsize is None:\n figsize = (5, 5)\n B = Bloch(axes=ax)\n B.add_vectors(bloch)\n B.render(title=title)\n if ax is None:\n fig = B.fig\n fig.set_size_inches(figsize[0], figsize[1])\n plt.close(fig)\n return fig\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef plot_bloch_multivector(rho, title='', figsize=None):\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n rho = _validate_input_state(rho)\n num = int(np.log2(len(rho)))\n width, height = plt.figaspect(1/num)\n fig = plt.figure(figsize=(width, height))\n for i in range(num):\n ax = fig.add_subplot(1, num, i + 1, projection='3d')\n pauli_singles = [\n Pauli.pauli_single(num, i, 'X'),\n Pauli.pauli_single(num, i, 'Y'),\n Pauli.pauli_single(num, i, 'Z')\n ]\n bloch_state = 
list(\n map(lambda x: np.real(np.trace(np.dot(x.to_matrix(), rho))),\n pauli_singles))\n plot_bloch_vector(bloch_state, \"qubit \" + str(i), ax=ax,\n figsize=figsize)\n fig.suptitle(title, fontsize=16)\n plt.close(fig)\n return fig", "response": "Plot the Bloch sphere."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef plot_state_city(rho, title=\"\", figsize=None, color=None,\n alpha=1):\n \"\"\"Plot the cityscape of quantum state.\n\n Plot two 3d bar graphs (two dimensional) of the real and imaginary\n part of the density matrix rho.\n\n Args:\n rho (ndarray): Numpy array for state vector or density matrix.\n title (str): a string that represents the plot title\n figsize (tuple): Figure size in inches.\n color (list): A list of len=2 giving colors for real and\n imaginary components of matrix elements.\n alpha (float): Transparency value for bars\n Returns:\n matplotlib.Figure: The matplotlib.Figure of the visualization\n\n Raises:\n ImportError: Requires matplotlib.\n ValueError: When 'color' is not a list of len=2.\n \"\"\"\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n rho = _validate_input_state(rho)\n\n num = int(np.log2(len(rho)))\n # get the real and imag parts of rho\n datareal = np.real(rho)\n dataimag = np.imag(rho)\n\n # get the labels\n column_names = [bin(i)[2:].zfill(num) for i in range(2**num)]\n row_names = [bin(i)[2:].zfill(num) for i in range(2**num)]\n\n lx = len(datareal[0]) # Work out matrix dimensions\n ly = len(datareal[:, 0])\n xpos = np.arange(0, lx, 1) # Set up a mesh of positions\n ypos = np.arange(0, ly, 1)\n xpos, ypos = np.meshgrid(xpos+0.25, ypos+0.25)\n\n xpos = xpos.flatten()\n ypos = ypos.flatten()\n zpos = np.zeros(lx*ly)\n\n dx = 0.5 * np.ones_like(zpos) # width of bars\n dy = dx.copy()\n dzr = datareal.flatten()\n dzi = dataimag.flatten()\n\n if color is None:\n color = [\"#648fff\", \"#648fff\"]\n else:\n if len(color) != 
2:\n raise ValueError(\"'color' must be a list of len=2.\")\n if color[0] is None:\n color[0] = \"#648fff\"\n if color[1] is None:\n color[1] = \"#648fff\"\n\n # set default figure size\n if figsize is None:\n figsize = (15, 5)\n\n fig = plt.figure(figsize=figsize)\n ax1 = fig.add_subplot(1, 2, 1, projection='3d')\n\n x = [0, max(xpos)+0.5, max(xpos)+0.5, 0]\n y = [0, 0, max(ypos)+0.5, max(ypos)+0.5]\n z = [0, 0, 0, 0]\n verts = [list(zip(x, y, z))]\n\n fc1 = generate_facecolors(xpos, ypos, zpos, dx, dy, dzr, color[0])\n for idx, cur_zpos in enumerate(zpos):\n if dzr[idx] > 0:\n zorder = 2\n else:\n zorder = 0\n b1 = ax1.bar3d(xpos[idx], ypos[idx], cur_zpos,\n dx[idx], dy[idx], dzr[idx],\n alpha=alpha, zorder=zorder)\n b1.set_facecolors(fc1[6*idx:6*idx+6])\n\n pc1 = Poly3DCollection(verts, alpha=0.15, facecolor='k',\n linewidths=1, zorder=1)\n\n if min(dzr) < 0 < max(dzr):\n ax1.add_collection3d(pc1)\n\n ax2 = fig.add_subplot(1, 2, 2, projection='3d')\n fc2 = generate_facecolors(xpos, ypos, zpos, dx, dy, dzi, color[1])\n for idx, cur_zpos in enumerate(zpos):\n if dzi[idx] > 0:\n zorder = 2\n else:\n zorder = 0\n b2 = ax2.bar3d(xpos[idx], ypos[idx], cur_zpos,\n dx[idx], dy[idx], dzi[idx],\n alpha=alpha, zorder=zorder)\n b2.set_facecolors(fc2[6*idx:6*idx+6])\n\n pc2 = Poly3DCollection(verts, alpha=0.2, facecolor='k',\n linewidths=1, zorder=1)\n\n if min(dzi) < 0 < max(dzi):\n ax2.add_collection3d(pc2)\n ax1.set_xticks(np.arange(0.5, lx+0.5, 1))\n ax1.set_yticks(np.arange(0.5, ly+0.5, 1))\n max_dzr = max(dzr)\n min_dzr = min(dzr)\n if max_dzr != min_dzr:\n ax1.axes.set_zlim3d(np.min(dzr), np.max(dzr)+1e-9)\n else:\n if min_dzr == 0:\n ax1.axes.set_zlim3d(np.min(dzr), np.max(dzr)+1e-9)\n else:\n ax1.axes.set_zlim3d(auto=True)\n ax1.zaxis.set_major_locator(MaxNLocator(5))\n ax1.w_xaxis.set_ticklabels(row_names, fontsize=14, rotation=45)\n ax1.w_yaxis.set_ticklabels(column_names, fontsize=14, rotation=-22.5)\n ax1.set_zlabel(\"Real[rho]\", fontsize=14)\n for tick in 
ax1.zaxis.get_major_ticks():\n tick.label.set_fontsize(14)\n\n ax2.set_xticks(np.arange(0.5, lx+0.5, 1))\n ax2.set_yticks(np.arange(0.5, ly+0.5, 1))\n min_dzi = np.min(dzi)\n max_dzi = np.max(dzi)\n if min_dzi != max_dzi:\n eps = 0\n ax2.zaxis.set_major_locator(MaxNLocator(5))\n ax2.axes.set_zlim3d(np.min(dzi), np.max(dzi)+eps)\n else:\n if min_dzi == 0:\n ax2.set_zticks([0])\n eps = 1e-9\n ax2.axes.set_zlim3d(np.min(dzi), np.max(dzi)+eps)\n else:\n ax2.axes.set_zlim3d(auto=True)\n ax2.w_xaxis.set_ticklabels(row_names, fontsize=14, rotation=45)\n ax2.w_yaxis.set_ticklabels(column_names, fontsize=14, rotation=-22.5)\n ax2.set_zlabel(\"Imag[rho]\", fontsize=14)\n for tick in ax2.zaxis.get_major_ticks():\n tick.label.set_fontsize(14)\n plt.suptitle(title, fontsize=16)\n plt.tight_layout()\n plt.close(fig)\n return fig", "response": "Plots the cityscape of quantum state."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nplots the paulivec representation of a mixed state vector or density matrix.", "response": "def plot_state_paulivec(rho, title=\"\", figsize=None, color=None):\n \"\"\"Plot the paulivec representation of a quantum state.\n\n Plot a bargraph of the mixed state rho over the pauli matrices\n\n Args:\n rho (ndarray): Numpy array for state vector or density matrix\n title (str): a string that represents the plot title\n figsize (tuple): Figure size in inches.\n color (list or str): Color of the expectation value bars.\n Returns:\n matplotlib.Figure: The matplotlib.Figure of the visualization\n Raises:\n ImportError: Requires matplotlib.\n \"\"\"\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n rho = _validate_input_state(rho)\n if figsize is None:\n figsize = (7, 5)\n num = int(np.log2(len(rho)))\n labels = list(map(lambda x: x.to_label(), pauli_group(num)))\n values = list(map(lambda x: np.real(np.trace(np.dot(x.to_matrix(), rho))),\n pauli_group(num)))\n numelem = len(values)\n if 
color is None:\n color = \"#648fff\"\n\n ind = np.arange(numelem) # the x locations for the groups\n width = 0.5 # the width of the bars\n fig, ax = plt.subplots(figsize=figsize)\n ax.grid(zorder=0, linewidth=1, linestyle='--')\n ax.bar(ind, values, width, color=color, zorder=2)\n ax.axhline(linewidth=1, color='k')\n # add some text for labels, title, and axes ticks\n ax.set_ylabel('Expectation value', fontsize=14)\n ax.set_xticks(ind)\n ax.set_yticks([-1, -0.5, 0, 0.5, 1])\n ax.set_xticklabels(labels, fontsize=14, rotation=70)\n ax.set_xlabel('Pauli', fontsize=14)\n ax.set_ylim([-1, 1])\n ax.set_facecolor('#eeeeee')\n for tick in ax.xaxis.get_major_ticks()+ax.yaxis.get_major_ticks():\n tick.label.set_fontsize(14)\n ax.set_title(title, fontsize=16)\n plt.close(fig)\n return fig"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn the index of a string of 0s and 1s.", "response": "def bit_string_index(s):\n \"\"\"Return the index of a string of 0s and 1s.\"\"\"\n n = len(s)\n k = s.count(\"1\")\n if s.count(\"0\") != n - k:\n raise VisualizationError(\"s must be a string of 0 and 1\")\n ones = [pos for pos, char in enumerate(s) if char == \"1\"]\n return lex_index(n, k, ones)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef phase_to_color_wheel(complex_number):\n angles = np.angle(complex_number)\n angle_round = int(((angles + 2 * np.pi) % (2 * np.pi))/np.pi*6)\n color_map = {\n 0: (0, 0, 1), # blue,\n 1: (0.5, 0, 1), # blue-violet\n 2: (1, 0, 1), # violet\n 3: (1, 0, 0.5), # red-violet,\n 4: (1, 0, 0), # red\n 5: (1, 0.5, 0), # red-orange,\n 6: (1, 1, 0), # orange\n 7: (0.5, 1, 0), # orange-yellow\n 8: (0, 1, 0), # yellow,\n 9: (0, 1, 0.5), # yellow-green,\n 10: (0, 1, 1), # green,\n 11: (0, 0.5, 1) # green-blue,\n }\n return color_map[angle_round]", "response": "Map a phase of a complex number to a color wheel."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in 
Python 3 that\nplots the qsphere representation of a state vector or density matrix representation.", "response": "def plot_state_qsphere(rho, figsize=None):\n \"\"\"Plot the qsphere representation of a quantum state.\n\n Args:\n rho (ndarray): State vector or density matrix representation.\n of quantum state.\n figsize (tuple): Figure size in inches.\n\n Returns:\n Figure: A matplotlib figure instance.\n\n Raises:\n ImportError: Requires matplotlib.\n \"\"\"\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n rho = _validate_input_state(rho)\n if figsize is None:\n figsize = (7, 7)\n num = int(np.log2(len(rho)))\n # get the eigenvectors and eigenvalues\n we, stateall = linalg.eigh(rho)\n for _ in range(2**num):\n # start with the max\n probmix = we.max()\n prob_location = we.argmax()\n if probmix > 0.001:\n # get the max eigenvalue\n state = stateall[:, prob_location]\n loc = np.absolute(state).argmax()\n # get the element location closes to lowest bin representation.\n for j in range(2**num):\n test = np.absolute(np.absolute(state[j]) -\n np.absolute(state[loc]))\n if test < 0.001:\n loc = j\n break\n # remove the global phase\n angles = (np.angle(state[loc]) + 2 * np.pi) % (2 * np.pi)\n angleset = np.exp(-1j*angles)\n # print(state)\n # print(angles)\n state = angleset*state\n # print(state)\n state.flatten()\n # start the plotting\n fig = plt.figure(figsize=figsize)\n ax = fig.add_subplot(111, projection='3d')\n ax.axes.set_xlim3d(-1.0, 1.0)\n ax.axes.set_ylim3d(-1.0, 1.0)\n ax.axes.set_zlim3d(-1.0, 1.0)\n ax.set_aspect(\"equal\")\n ax.axes.grid(False)\n # Plot semi-transparent sphere\n u = np.linspace(0, 2 * np.pi, 25)\n v = np.linspace(0, np.pi, 25)\n x = np.outer(np.cos(u), np.sin(v))\n y = np.outer(np.sin(u), np.sin(v))\n z = np.outer(np.ones(np.size(u)), np.cos(v))\n ax.plot_surface(x, y, z, rstride=1, cstride=1, color='k',\n alpha=0.05, linewidth=0)\n # wireframe\n # Get rid of the panes\n ax.w_xaxis.set_pane_color((1.0, 1.0, 
1.0, 0.0))\n ax.w_yaxis.set_pane_color((1.0, 1.0, 1.0, 0.0))\n ax.w_zaxis.set_pane_color((1.0, 1.0, 1.0, 0.0))\n\n # Get rid of the spines\n ax.w_xaxis.line.set_color((1.0, 1.0, 1.0, 0.0))\n ax.w_yaxis.line.set_color((1.0, 1.0, 1.0, 0.0))\n ax.w_zaxis.line.set_color((1.0, 1.0, 1.0, 0.0))\n # Get rid of the ticks\n ax.set_xticks([])\n ax.set_yticks([])\n ax.set_zticks([])\n\n d = num\n for i in range(2**num):\n # get x,y,z points\n element = bin(i)[2:].zfill(num)\n weight = element.count(\"1\")\n zvalue = -2 * weight / d + 1\n number_of_divisions = n_choose_k(d, weight)\n weight_order = bit_string_index(element)\n # if weight_order >= number_of_divisions / 2:\n # com_key = compliment(element)\n # weight_order_temp = bit_string_index(com_key)\n # weight_order = np.floor(\n # number_of_divisions / 2) + weight_order_temp + 1\n angle = weight_order * 2 * np.pi / number_of_divisions\n xvalue = np.sqrt(1 - zvalue**2) * np.cos(angle)\n yvalue = np.sqrt(1 - zvalue**2) * np.sin(angle)\n ax.plot([xvalue], [yvalue], [zvalue],\n markerfacecolor=(.5, .5, .5),\n markeredgecolor=(.5, .5, .5),\n marker='o', markersize=10, alpha=1)\n # get prob and angle - prob will be shade and angle color\n prob = np.real(np.dot(state[i], state[i].conj()))\n colorstate = phase_to_color_wheel(state[i])\n a = Arrow3D([0, xvalue], [0, yvalue], [0, zvalue],\n mutation_scale=20, alpha=prob, arrowstyle=\"-\",\n color=colorstate, lw=10)\n ax.add_artist(a)\n # add weight lines\n for weight in range(d + 1):\n theta = np.linspace(-2 * np.pi, 2 * np.pi, 100)\n z = -2 * weight / d + 1\n r = np.sqrt(1 - z**2)\n x = r * np.cos(theta)\n y = r * np.sin(theta)\n ax.plot(x, y, z, color=(.5, .5, .5))\n # add center point\n ax.plot([0], [0], [0], markerfacecolor=(.5, .5, .5),\n markeredgecolor=(.5, .5, .5), marker='o', markersize=10,\n alpha=1)\n we[prob_location] = 0\n else:\n break\n plt.tight_layout()\n plt.close(fig)\n return fig"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that 
can\nplot the state of a multi - qubit state.", "response": "def plot_state(quantum_state, method='city', figsize=None):\n \"\"\"Plot the quantum state.\n\n Args:\n quantum_state (ndarray): statevector or density matrix\n representation of a quantum state.\n method (str): Plotting method to use.\n figsize (tuple): Figure size in inches,\n\n Returns:\n matplotlib.Figure: The matplotlib.Figure of the visualization\n Raises:\n ImportError: Requires matplotlib.\n VisualizationError: if the input is not a statevector or density\n matrix, or if the state is not an multi-qubit quantum state.\n \"\"\"\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n warnings.warn(\"plot_state is deprecated, and will be removed in \\\n the 0.9 release. Use the plot_state_ * functions \\\n instead.\",\n DeprecationWarning)\n # Check if input is a statevector, and convert to density matrix\n rho = _validate_input_state(quantum_state)\n fig = None\n if method == 'city':\n fig = plot_state_city(rho, figsize=figsize)\n elif method == \"paulivec\":\n fig = plot_state_paulivec(rho, figsize=figsize)\n elif method == \"qsphere\":\n fig = plot_state_qsphere(rho, figsize=figsize)\n elif method == \"bloch\":\n plot_bloch_multivector(rho, figsize=figsize)\n elif method == \"hinton\":\n fig = plot_state_hinton(rho, figsize=figsize)\n return fig"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngenerating a list of shaded facecolors for the given set of points.", "response": "def generate_facecolors(x, y, z, dx, dy, dz, color):\n \"\"\"Generates shaded facecolors for shaded bars.\n This is here to work around a Matplotlib bug\n where alpha does not work in Bar3D.\n Args:\n x (array_like): The x- coordinates of the anchor point of the bars.\n y (array_like): The y- coordinates of the anchor point of the bars.\n z (array_like): The z- coordinates of the anchor point of the bars.\n dx (array_like): Width of bars.\n dy (array_like): Depth of bars.\n dz 
(array_like): Height of bars.\n color (array_like): sequence of valid color specifications, optional\n Returns:\n list: Shaded colors for bars.\n \"\"\"\n cuboid = np.array([\n # -z\n (\n (0, 0, 0),\n (0, 1, 0),\n (1, 1, 0),\n (1, 0, 0),\n ),\n # +z\n (\n (0, 0, 1),\n (1, 0, 1),\n (1, 1, 1),\n (0, 1, 1),\n ),\n # -y\n (\n (0, 0, 0),\n (1, 0, 0),\n (1, 0, 1),\n (0, 0, 1),\n ),\n # +y\n (\n (0, 1, 0),\n (0, 1, 1),\n (1, 1, 1),\n (1, 1, 0),\n ),\n # -x\n (\n (0, 0, 0),\n (0, 0, 1),\n (0, 1, 1),\n (0, 1, 0),\n ),\n # +x\n (\n (1, 0, 0),\n (1, 1, 0),\n (1, 1, 1),\n (1, 0, 1),\n ),\n ])\n\n # indexed by [bar, face, vertex, coord]\n polys = np.empty(x.shape + cuboid.shape)\n # handle each coordinate separately\n for i, p, dp in [(0, x, dx), (1, y, dy), (2, z, dz)]:\n p = p[..., np.newaxis, np.newaxis]\n dp = dp[..., np.newaxis, np.newaxis]\n polys[..., i] = p + dp * cuboid[..., i]\n\n # collapse the first two axes\n polys = polys.reshape((-1,) + polys.shape[2:])\n\n facecolors = []\n if len(color) == len(x):\n # bar colors specified, need to expand to number of faces\n for c in color:\n facecolors.extend([c] * 6)\n else:\n # a single color specified, or face colors specified explicitly\n facecolors = list(mcolors.to_rgba_array(color))\n if len(facecolors) < len(x):\n facecolors *= (6 * len(x))\n\n normals = _generate_normals(polys)\n return _shade_colors(facecolors, normals)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ngenerate the array of the normals for the given list of polygons.", "response": "def _generate_normals(polygons):\n \"\"\"\n Takes a list of polygons and return an array of their normals.\n Normals point towards the viewer for a face with its vertices in\n counterclockwise order, following the right hand rule.\n Uses three points equally spaced around the polygon.\n This normal of course might not make sense for polygons with more than\n three points not lying in a plane, but it's a plausible and fast\n approximation.\n 
Args:\n polygons (list): list of (M_i, 3) array_like, or (..., M, 3) array_like\n A sequence of polygons to compute normals for, which can have\n varying numbers of vertices. If the polygons all have the same\n number of vertices and array is passed, then the operation will\n be vectorized.\n Returns:\n normals: (..., 3) array_like\n A normal vector estimated for the polygon.\n \"\"\"\n if isinstance(polygons, np.ndarray):\n # optimization: polygons all have the same number of points, so can\n # vectorize\n n = polygons.shape[-2]\n i1, i2, i3 = 0, n//3, 2*n//3\n v1 = polygons[..., i1, :] - polygons[..., i2, :]\n v2 = polygons[..., i2, :] - polygons[..., i3, :]\n else:\n # The subtraction doesn't vectorize because polygons is jagged.\n v1 = np.empty((len(polygons), 3))\n v2 = np.empty((len(polygons), 3))\n for poly_i, ps in enumerate(polygons):\n n = len(ps)\n i1, i2, i3 = 0, n//3, 2*n//3\n v1[poly_i, :] = ps[i1, :] - ps[i2, :]\n v2[poly_i, :] = ps[i2, :] - ps[i3, :]\n\n return np.cross(v1, v2)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _shade_colors(color, normals, lightsource=None):\n if lightsource is None:\n # chosen for backwards-compatibility\n lightsource = LightSource(azdeg=225, altdeg=19.4712)\n\n shade = np.array([np.dot(n / proj3d.mod(n), lightsource.direction)\n if proj3d.mod(n) else np.nan\n for n in normals])\n mask = ~np.isnan(shade)\n\n if mask.any():\n norm = Normalize(min(shade[mask]), max(shade[mask]))\n shade[~mask] = min(shade[mask])\n color = mcolors.to_rgba_array(color)\n # shape of color should be (M, 4) (where M is number of faces)\n # shape of shade should be (M,)\n # colors should have final shape of (M, 4)\n alpha = color[:, 3]\n colors = (0.5 + norm(shade)[:, np.newaxis] * 0.5) * color\n colors[:, 3] = alpha\n else:\n colors = np.asanyarray(color).copy()\n\n return colors", "response": "Shade colors using normal vectors given by normals."} {"SOURCE": "codesearchnet", "instruction": 
"Given the following Python 3 function, write the documentation\ndef get_unique_backends():\n backends = IBMQ.backends()\n unique_hardware_backends = []\n unique_names = []\n for back in backends:\n if back.name() not in unique_names and not back.configuration().simulator:\n unique_hardware_backends.append(back)\n unique_names.append(back.name())\n if not unique_hardware_backends:\n raise QiskitError('No backends available.')\n return unique_hardware_backends", "response": "Gets the unique backends that are available."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef backend_monitor(backend):\n if not isinstance(backend, IBMQBackend):\n raise QiskitError('Input variable is not of type IBMQBackend.')\n config = backend.configuration().to_dict()\n status = backend.status().to_dict()\n config_dict = {**status, **config}\n if not config['simulator']:\n props = backend.properties().to_dict()\n\n print(backend.name())\n print('='*len(backend.name()))\n print('Configuration')\n print('-'*13)\n offset = ' '\n\n upper_list = ['n_qubits', 'operational',\n 'status_msg', 'pending_jobs',\n 'basis_gates', 'local', 'simulator']\n\n lower_list = list(set(config_dict.keys()).difference(upper_list))\n # Remove gates because they are in a different tab\n lower_list.remove('gates')\n for item in upper_list+lower_list:\n print(offset+item+':', config_dict[item])\n\n # Stop here if simulator\n if config['simulator']:\n return\n\n print()\n qubit_header = 'Qubits [Name / Freq / T1 / T2 / U1 err / U2 err / U3 err / Readout err]'\n print(qubit_header)\n print('-'*len(qubit_header))\n\n sep = ' / '\n for qub in range(len(props['qubits'])):\n name = 'Q%s' % qub\n qubit_data = props['qubits'][qub]\n gate_data = props['gates'][3*qub:3*qub+3]\n t1_info = qubit_data[0]\n t2_info = qubit_data[1]\n freq_info = qubit_data[2]\n readout_info = qubit_data[3]\n\n freq = str(round(freq_info['value'], 5))+' '+freq_info['unit']\n T1 = 
str(round(t1_info['value'], # pylint: disable=invalid-name\n 5))+' ' + t1_info['unit']\n T2 = str(round(t2_info['value'], # pylint: disable=invalid-name\n 5))+' ' + t2_info['unit']\n # pylint: disable=invalid-name\n U1 = str(round(gate_data[0]['parameters'][0]['value'], 5))\n # pylint: disable=invalid-name\n U2 = str(round(gate_data[1]['parameters'][0]['value'], 5))\n # pylint: disable=invalid-name\n U3 = str(round(gate_data[2]['parameters'][0]['value'], 5))\n\n readout_error = str(round(readout_info['value'], 5))\n\n qstr = sep.join([name, freq, T1, T2, U1, U2, U3, readout_error])\n print(offset+qstr)\n\n print()\n multi_qubit_gates = props['gates'][3*config['n_qubits']:]\n multi_header = 'Multi-Qubit Gates [Name / Type / Gate Error]'\n print(multi_header)\n print('-'*len(multi_header))\n\n for gate in multi_qubit_gates:\n name = gate['name']\n ttype = gate['gate']\n error = str(round(gate['parameters'][0]['value'], 5))\n mstr = sep.join([name, ttype, error])\n print(offset+mstr)", "response": "Monitor a single IBMQ backend."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ngive overview information on all the IBMQ backends that are available.", "response": "def backend_overview():\n \"\"\"Gives overview information on all the IBMQ\n backends that are available.\n \"\"\"\n unique_hardware_backends = get_unique_backends()\n _backends = []\n # Sort backends by operational or not\n for idx, back in enumerate(unique_hardware_backends):\n if back.status().operational:\n _backends = [back] + _backends\n else:\n _backends = _backends + [back]\n\n stati = [back.status() for back in _backends]\n idx = list(range(len(_backends)))\n pending = [s.pending_jobs for s in stati]\n _, least_idx = zip(*sorted(zip(pending, idx)))\n\n # Make sure least pending is operational\n for ind in least_idx:\n if stati[ind].operational:\n least_pending_idx = ind\n break\n\n num_rows = math.ceil(len(_backends)/3)\n\n count = 0\n num_backends = 
len(_backends)\n for _ in range(num_rows):\n max_len = 0\n str_list = ['']*8\n for idx in range(3):\n offset = ' ' * 10 if idx else ''\n config = _backends[count].configuration().to_dict()\n props = _backends[count].properties().to_dict()\n n_qubits = config['n_qubits']\n str_list[0] += (' '*(max_len-len(str_list[0]))+offset)\n str_list[0] += _backends[count].name()\n\n str_list[1] += (' '*(max_len-len(str_list[1]))+offset)\n str_list[1] += '-'*len(_backends[count].name())\n\n str_list[2] += (' '*(max_len-len(str_list[2]))+offset)\n str_list[2] += 'Num. Qubits: %s' % config['n_qubits']\n\n str_list[3] += (' '*(max_len-len(str_list[3]))+offset)\n str_list[3] += 'Pending Jobs: %s' % stati[count].pending_jobs\n\n str_list[4] += (' '*(max_len-len(str_list[4]))+offset)\n str_list[4] += 'Least busy: %s' % (count == least_pending_idx)\n\n str_list[5] += (' '*(max_len-len(str_list[5]))+offset)\n str_list[5] += 'Operational: %s' % stati[count].operational\n\n str_list[6] += (' '*(max_len-len(str_list[6]))+offset)\n str_list[6] += 'Avg. T1: %s' % round(sum([q[0]['value']\n for q in props['qubits']])/n_qubits, 1)\n str_list[7] += (' '*(max_len-len(str_list[7]))+offset)\n str_list[7] += 'Avg. 
T2: %s' % round(sum([q[1]['value']\n for q in props['qubits']])/n_qubits, 1)\n count += 1\n if count == num_backends:\n break\n max_len = max([len(s) for s in str_list])\n\n print(\"\\n\".join(str_list))\n print('\\n'*2)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef op(self):\n if 'type' not in self.data_dict or self.data_dict['type'] != 'op':\n raise QiskitError(\"The node %s is not an op node\" % (str(self)))\n return self.data_dict.get('op')", "response": "Returns the Instruction object corresponding to the op for the node else None"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef wire(self):\n if self.data_dict['type'] not in ['in', 'out']:\n raise QiskitError('The node %s is not an input/output node' % str(self))\n return self.data_dict.get('wire')", "response": "Returns the wire register of the node."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking if two nodes are considered equivalent.", "response": "def semantic_eq(node1, node2):\n \"\"\"\n Check if DAG nodes are considered equivalent, e.g. 
as a node_match for nx.is_isomorphic.\n\n Args:\n node1 (DAGNode): A node to compare.\n node2 (DAGNode): The other node to compare.\n\n Return:\n Bool: If node1 == node2\n \"\"\"\n # For barriers, qarg order is not significant so compare as sets\n if 'barrier' == node1.name == node2.name:\n return set(node1.qargs) == set(node2.qargs)\n return node1.data_dict == node2.data_dict"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef constant(duration: int, amp: complex, name: str = None) -> SamplePulse:\n return _sampled_constant_pulse(duration, amp, name=name)", "response": "Generates a constant - sampled sample - pulse."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngenerates a zero - sampled version of the current object.", "response": "def zero(duration: int, name: str = None) -> SamplePulse:\n \"\"\"Generates zero-sampled `SamplePulse`.\n\n Args:\n duration: Duration of pulse. Must be greater than zero.\n name: Name of pulse.\n \"\"\"\n return _sampled_zero_pulse(duration, name=name)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngenerates a square wave.", "response": "def square(duration: int, amp: complex, period: float = None,\n phase: float = 0, name: str = None) -> SamplePulse:\n \"\"\"Generates square wave `SamplePulse`.\n\n Applies `left` sampling strategy to generate discrete pulse from continuous function.\n\n Args:\n duration: Duration of pulse. Must be greater than zero.\n amp: Pulse amplitude. Wave range is [-amp, amp].\n period: Pulse period, units of dt. 
If `None` defaults to single cycle.\n phase: Pulse phase.\n name: Name of pulse.\n \"\"\"\n if period is None:\n period = duration\n\n return _sampled_square_pulse(duration, amp, period, phase=phase, name=name)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef sawtooth(duration: int, amp: complex, period: float = None,\n phase: float = 0, name: str = None) -> SamplePulse:\n \"\"\"Generates sawtooth wave `SamplePulse`.\n\n Args:\n duration: Duration of pulse. Must be greater than zero.\n amp: Pulse amplitude. Wave range is [-amp, amp].\n period: Pulse period, units of dt. If `None` defaults to single cycle.\n phase: Pulse phase.\n name: Name of pulse.\n \"\"\"\n if period is None:\n period = duration\n\n return _sampled_sawtooth_pulse(duration, amp, period, phase=phase, name=name)", "response": "Generates a sawtooth wave."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngenerating a continuous triangle wave.", "response": "def triangle(duration: int, amp: complex, period: float = None,\n phase: float = 0, name: str = None) -> SamplePulse:\n \"\"\"Generates triangle wave `SamplePulse`.\n\n Applies `left` sampling strategy to generate discrete pulse from continuous function.\n\n Args:\n duration: Duration of pulse. Must be greater than zero.\n amp: Pulse amplitude. Wave range is [-amp, amp].\n period: Pulse period, units of dt. 
If `None` defaults to single cycle.\n phase: Pulse phase.\n name: Name of pulse.\n \"\"\"\n if period is None:\n period = duration\n\n return _sampled_triangle_pulse(duration, amp, period, phase=phase, name=name)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef cos(duration: int, amp: complex, freq: float = None,\n phase: float = 0, name: str = None) -> SamplePulse:\n \"\"\"Generates cosine wave `SamplePulse`.\n\n Applies `left` sampling strategy to generate discrete pulse from continuous function.\n\n Args:\n duration: Duration of pulse. Must be greater than zero.\n amp: Pulse amplitude.\n freq: Pulse frequency, units of 1/dt. If `None` defaults to single cycle.\n phase: Pulse phase.\n name: Name of pulse.\n \"\"\"\n if freq is None:\n freq = 1/duration\n\n return _sampled_cos_pulse(duration, amp, freq, phase=phase, name=name)", "response": "Generates a cosine wave from continuous function."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef sin(duration: int, amp: complex, freq: float = None,\n phase: float = 0, name: str = None) -> SamplePulse:\n \"\"\"Generates sine wave `SamplePulse`.\n\n Args:\n duration: Duration of pulse. Must be greater than zero.\n amp: Pulse amplitude.\n freq: Pulse frequency, units of 1/dt. 
If `None` defaults to single cycle.\n phase: Pulse phase.\n name: Name of pulse.\n \"\"\"\n if freq is None:\n freq = 1/duration\n\n return _sampled_sin_pulse(duration, amp, freq, phase=phase, name=name)", "response": "Generates a sine wave."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngenerating a gaussian square pulse.", "response": "def gaussian_square(duration: int, amp: complex, sigma: float,\n risefall: int, name: str = None) -> SamplePulse:\n \"\"\"Generates gaussian square `SamplePulse`.\n\n Centered at `duration/2` and zeroed at `t=-1` and `t=duration+1` to prevent\n large initial/final discontinuities.\n\n Applies `left` sampling strategy to generate discrete pulse from continuous function.\n\n Args:\n duration: Duration of pulse. Must be greater than zero.\n amp: Pulse amplitude.\n sigma: Width (standard deviation) of gaussian rise/fall portion of the pulse.\n risefall: Number of samples over which pulse rise and fall happen. Width of\n square portion of pulse will be `duration-2*risefall`.\n name: Name of pulse.\n \"\"\"\n center = duration/2\n width = duration-2*risefall\n zeroed_width = duration + 2\n return _sampled_gaussian_square_pulse(duration, amp, center, width, sigma,\n zeroed_width=zeroed_width, name=name)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef run(self, dag):\n diagonal_1q_gates = (RZGate, ZGate, TGate, SGate, TdgGate, SdgGate, U1Gate)\n diagonal_2q_gates = (CzGate, CrzGate, Cu1Gate, RZZGate)\n\n nodes_to_remove = set()\n for measure in dag.op_nodes(Measure):\n predecessor = dag.quantum_predecessors(measure)[0]\n\n if predecessor.type == 'op' and isinstance(predecessor.op, diagonal_1q_gates):\n nodes_to_remove.add(predecessor)\n\n if predecessor.type == 'op' and isinstance(predecessor.op, diagonal_2q_gates):\n successors = dag.quantum_successors(predecessor)\n if all([s.type == 'op' and isinstance(s.op, Measure) for s in successors]):\n 
nodes_to_remove.add(predecessor)\n\n for node_to_remove in nodes_to_remove:\n dag.remove_op_node(node_to_remove)\n\n return dag", "response": "Return a new circuit that has been optimized."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nplotting the gate map of a device.", "response": "def plot_gate_map(backend, figsize=None,\n plot_directed=False,\n label_qubits=True,\n qubit_size=24,\n line_width=4,\n font_size=12,\n qubit_color=None,\n line_color=None,\n font_color='w'):\n \"\"\"Plots the gate map of a device.\n\n Args:\n backend (BaseBackend): A backend instance,\n figsize (tuple): Output figure size (wxh) in inches.\n plot_directed (bool): Plot directed coupling map.\n label_qubits (bool): Label the qubits.\n qubit_size (float): Size of qubit marker.\n line_width (float): Width of lines.\n font_size (int): Font size of qubit labels.\n qubit_color (list): A list of colors for the qubits\n line_color (list): A list of colors for each line from coupling_map.\n font_color (str): The font color for the qubit labels.\n\n Returns:\n Figure: A Matplotlib figure instance.\n\n Raises:\n QiskitError: if tried to pass a simulator.\n ImportError: if matplotlib not installed.\n \"\"\"\n if not HAS_MATPLOTLIB:\n raise ImportError('Must have Matplotlib installed.')\n\n if backend.configuration().simulator:\n raise QiskitError('Requires a device backend, not simulator.')\n\n mpl_data = {}\n\n mpl_data['ibmq_20_tokyo'] = [[0, 0], [0, 1], [0, 2], [0, 3], [0, 4],\n [1, 0], [1, 1], [1, 2], [1, 3], [1, 4],\n [2, 0], [2, 1], [2, 2], [2, 3], [2, 4],\n [3, 0], [3, 1], [3, 2], [3, 3], [3, 4]]\n\n mpl_data['ibmq_poughkeepsie'] = mpl_data['ibmq_20_tokyo']\n\n mpl_data['ibmq_16_melbourne'] = [[0, 0], [0, 1], [0, 2], [0, 3], [0, 4],\n [0, 5], [0, 6], [1, 7], [1, 6], [1, 5],\n [1, 4], [1, 3], [1, 2], [1, 1]]\n\n mpl_data['ibmq_16_rueschlikon'] = [[1, 0], [0, 0], [0, 1], [0, 2], [0, 3],\n [0, 4], [0, 5], [0, 6], [0, 7], [1, 7],\n [1, 6], [1, 5], [1, 4], [1, 3], 
[1, 2], [1, 1]]\n\n mpl_data['ibmq_5_tenerife'] = [[1, 0], [0, 1], [1, 1], [1, 2], [2, 1]]\n\n mpl_data['ibmq_5_yorktown'] = mpl_data['ibmq_5_tenerife']\n\n config = backend.configuration()\n name = config.backend_name\n cmap = config.coupling_map\n\n dep_names = {'ibmqx5': 'ibmq_16_rueschlikon',\n 'ibmqx4': 'ibmq_5_tenerife',\n 'ibmqx2': 'ibmq_5_yorktown'}\n\n if name in dep_names.keys():\n name = dep_names[name]\n\n if name in mpl_data.keys():\n grid_data = mpl_data[name]\n else:\n fig, ax = plt.subplots(figsize=(5, 5)) # pylint: disable=invalid-name\n ax.axis('off')\n return fig\n\n x_max = max([d[1] for d in grid_data])\n y_max = max([d[0] for d in grid_data])\n max_dim = max(x_max, y_max)\n\n if figsize is None:\n if x_max/max_dim > 0.33 and y_max/max_dim > 0.33:\n figsize = (5, 5)\n else:\n figsize = (9, 3)\n\n fig, ax = plt.subplots(figsize=figsize) # pylint: disable=invalid-name\n ax.axis('off')\n fig.tight_layout()\n\n # set coloring\n if qubit_color is None:\n qubit_color = ['#648fff']*config.n_qubits\n if line_color is None:\n line_color = ['#648fff']*len(cmap)\n\n # Add lines for couplings\n for ind, edge in enumerate(cmap):\n is_symmetric = False\n if edge[::-1] in cmap:\n is_symmetric = True\n y_start = grid_data[edge[0]][0]\n x_start = grid_data[edge[0]][1]\n y_end = grid_data[edge[1]][0]\n x_end = grid_data[edge[1]][1]\n\n if is_symmetric:\n if y_start == y_end:\n x_end = (x_end - x_start)/2+x_start\n\n elif x_start == x_end:\n y_end = (y_end - y_start)/2+y_start\n\n else:\n x_end = (x_end - x_start)/2+x_start\n y_end = (y_end - y_start)/2+y_start\n ax.add_artist(plt.Line2D([x_start, x_end], [-y_start, -y_end],\n color=line_color[ind], linewidth=line_width,\n zorder=0))\n if plot_directed:\n dx = x_end-x_start # pylint: disable=invalid-name\n dy = y_end-y_start # pylint: disable=invalid-name\n if is_symmetric:\n x_arrow = x_start+dx*0.95\n y_arrow = -y_start-dy*0.95\n dx_arrow = dx*0.01\n dy_arrow = -dy*0.01\n head_width = 0.15\n else:\n x_arrow = 
x_start+dx*0.5\n y_arrow = -y_start-dy*0.5\n dx_arrow = dx*0.2\n dy_arrow = -dy*0.2\n head_width = 0.2\n ax.add_patch(mpatches.FancyArrow(x_arrow,\n y_arrow,\n dx_arrow,\n dy_arrow,\n head_width=head_width,\n length_includes_head=True,\n edgecolor=None,\n linewidth=0,\n facecolor=line_color[ind],\n zorder=1))\n\n # Add circles for qubits\n for var, idx in enumerate(grid_data):\n _idx = [idx[1], -idx[0]]\n width = _GraphDist(qubit_size, ax, True)\n height = _GraphDist(qubit_size, ax, False)\n ax.add_artist(mpatches.Ellipse(\n _idx, width, height, color=qubit_color[var], zorder=1))\n if label_qubits:\n ax.text(*_idx, s=str(var),\n horizontalalignment='center',\n verticalalignment='center',\n color=font_color, size=font_size, weight='bold')\n ax.set_xlim([-1, x_max+1])\n ax.set_ylim([-(y_max+1), 1])\n plt.close(fig)\n return fig"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncomputes the absolute distance spanned by the current axis bounds.", "response": "def dist_abs(self):\n \"\"\"Distance abs\n \"\"\"\n bounds = self.ax.get_xlim() if self.x else self.ax.get_ylim()\n return bounds[0] - bounds[1]"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nprints the node data with indent.", "response": "def to_string(self, indent):\n \"\"\"Print the node data, with indent.\"\"\"\n ind = indent * ' '\n print(ind, 'qreg')\n self.children[0].to_string(indent + 3)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _verify_backends(self):\n ret = OrderedDict()\n for backend_cls in SIMULATORS:\n try:\n backend_instance = self._get_backend_instance(backend_cls)\n backend_name = backend_instance.name()\n ret[backend_name] = backend_instance\n except QiskitError as err:\n # Ignore backends that could not be initialized.\n logger.info('Basic Aer backend %s is not available: %s',\n backend_cls, str(err))\n return ret", "response": "Verify that all Basic Aer backends 
are available."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning an instance of a backend from its class.", "response": "def _get_backend_instance(self, backend_cls):\n \"\"\"\n Return an instance of a backend from its class.\n\n Args:\n backend_cls (class): Backend class.\n Returns:\n BaseBackend: a backend instance.\n Raises:\n QiskitError: if the backend could not be instantiated.\n \"\"\"\n # Verify that the backend can be instantiated.\n try:\n backend_instance = backend_cls(provider=self)\n except Exception as err:\n raise QiskitError('Backend %s could not be instantiated: %s' %\n (backend_cls, err))\n\n return backend_instance"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a list of qubits as ( QuantumRegister index ) pairs.", "response": "def qubits(self):\n \"\"\"Return a list of qubits as (QuantumRegister, index) pairs.\"\"\"\n return [(v, i) for k, v in self.qregs.items() for i in range(v.size)]"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns a list of bits as ( ClassicalRegister index ) pairs.", "response": "def clbits(self):\n \"\"\"Return a list of bits as (ClassicalRegister, index) pairs.\"\"\"\n return [(v, i) for k, v in self.cregs.items() for i in range(v.size)]"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nrenaming a classical or quantum register throughout the circuit.", "response": "def rename_register(self, regname, newname):\n \"\"\"Rename a classical or quantum register throughout the circuit.\n\n regname = existing register name string\n newname = replacement register name string\n \"\"\"\n if regname == newname:\n return\n if newname in self.qregs or newname in self.cregs:\n raise DAGCircuitError(\"duplicate register name %s\" % newname)\n if regname not in self.qregs and regname not in self.cregs:\n raise DAGCircuitError(\"no register named %s\" % regname)\n if regname in 
self.qregs:\n reg = self.qregs[regname]\n reg.name = newname\n self.qregs[newname] = reg\n self.qregs.pop(regname, None)\n if regname in self.cregs:\n reg = self.cregs[regname]\n reg.name = newname\n self.qregs[newname] = reg\n self.qregs.pop(regname, None)\n\n for node in self._multi_graph.nodes():\n if node.type == \"in\" or node.type == \"out\":\n if node.name and regname in node.name:\n node.name = newname\n elif node.type == \"op\":\n qa = []\n for a in node.qargs:\n if a[0] == regname:\n a = (newname, a[1])\n qa.append(a)\n node.qargs = qa\n ca = []\n for a in node.cargs:\n if a[0] == regname:\n a = (newname, a[1])\n ca.append(a)\n node.cargs = ca\n if node.condition is not None:\n if node.condition[0] == regname:\n node.condition = (newname, node.condition[1])\n # eX = edge, d= data\n for _, _, edge_data in self._multi_graph.edges(data=True):\n if regname in edge_data['name']:\n edge_data['name'] = re.sub(regname, newname, edge_data['name'])"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nremove all operations with the given name.", "response": "def remove_all_ops_named(self, opname):\n \"\"\"Remove all operation nodes with the given name.\"\"\"\n for n in self.named_nodes(opname):\n self.remove_op_node(n)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nadds all wires in a quantum register.", "response": "def add_qreg(self, qreg):\n \"\"\"Add all wires in a quantum register.\"\"\"\n if not isinstance(qreg, QuantumRegister):\n raise DAGCircuitError(\"not a QuantumRegister instance.\")\n if qreg.name in self.qregs:\n raise DAGCircuitError(\"duplicate register %s\" % qreg.name)\n self.qregs[qreg.name] = qreg\n for j in range(qreg.size):\n self._add_wire((qreg, j))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding all wires in a classical register.", "response": "def add_creg(self, creg):\n \"\"\"Add all wires in a classical register.\"\"\"\n if not 
isinstance(creg, ClassicalRegister):\n raise DAGCircuitError(\"not a ClassicalRegister instance.\")\n if creg.name in self.cregs:\n raise DAGCircuitError(\"duplicate register %s\" % creg.name)\n self.cregs[creg.name] = creg\n for j in range(creg.size):\n self._add_wire((creg, j))"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nadd a wire to the circuit.", "response": "def _add_wire(self, wire):\n \"\"\"Add a qubit or bit to the circuit.\n\n Args:\n wire (tuple): (Register,int) containing a register instance and index\n This adds a pair of in and out nodes connected by an edge.\n\n Raises:\n DAGCircuitError: if trying to add duplicate wire\n \"\"\"\n if wire not in self.wires:\n self.wires.append(wire)\n self._max_node_id += 1\n input_map_wire = self.input_map[wire] = self._max_node_id\n\n self._max_node_id += 1\n output_map_wire = self._max_node_id\n\n wire_name = \"%s[%s]\" % (wire[0].name, wire[1])\n\n inp_node = DAGNode(data_dict={'type': 'in', 'name': wire_name, 'wire': wire},\n nid=input_map_wire)\n outp_node = DAGNode(data_dict={'type': 'out', 'name': wire_name, 'wire': wire},\n nid=output_map_wire)\n self._id_to_node[input_map_wire] = inp_node\n self._id_to_node[output_map_wire] = outp_node\n\n self.input_map[wire] = inp_node\n self.output_map[wire] = outp_node\n\n self._multi_graph.add_node(inp_node)\n self._multi_graph.add_node(outp_node)\n\n self._multi_graph.add_edge(inp_node,\n outp_node)\n\n self._multi_graph.adj[inp_node][outp_node][0][\"name\"] \\\n = \"%s[%s]\" % (wire[0].name, wire[1])\n self._multi_graph.adj[inp_node][outp_node][0][\"wire\"] \\\n = wire\n else:\n raise DAGCircuitError(\"duplicate wire %s\" % (wire,))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _check_condition(self, name, condition):\n # Verify creg exists\n if condition is not None and condition[0].name not in self.cregs:\n raise DAGCircuitError(\"invalid creg in condition for 
%s\" % name)", "response": "Verify that the condition is valid."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _check_bits(self, args, amap):\n # Check for each wire\n for wire in args:\n if wire not in amap:\n raise DAGCircuitError(\"(qu)bit %s[%d] not found\" %\n (wire[0].name, wire[1]))", "response": "Check the values of a list of ( qu ) bit arguments."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a list of bits in the given condition.", "response": "def _bits_in_condition(self, cond):\n \"\"\"Return a list of bits in the given condition.\n\n Args:\n cond (tuple or None): optional condition (ClassicalRegister, int)\n\n Returns:\n list[(ClassicalRegister, idx)]: list of bits\n \"\"\"\n all_bits = []\n if cond is not None:\n all_bits.extend([(cond[0], j) for j in range(self.cregs[cond[0].name].size)])\n return all_bits"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding a new operation node to the multi - graph and assign properties.", "response": "def _add_op_node(self, op, qargs, cargs, condition=None):\n \"\"\"Add a new operation node to the graph and assign properties.\n\n Args:\n op (Instruction): the operation associated with the DAG node\n qargs (list): list of quantum wires to attach to.\n cargs (list): list of classical wires to attach to.\n condition (tuple or None): optional condition (ClassicalRegister, int)\n \"\"\"\n node_properties = {\n \"type\": \"op\",\n \"op\": op,\n \"name\": op.name,\n \"qargs\": qargs,\n \"cargs\": cargs,\n \"condition\": condition\n }\n\n # Add a new operation node to the graph\n self._max_node_id += 1\n new_node = DAGNode(data_dict=node_properties, nid=self._max_node_id)\n self._multi_graph.add_node(new_node)\n self._id_to_node[self._max_node_id] = new_node"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\napply an operation to the output of the circuit.", "response": 
"def apply_operation_back(self, op, qargs=None, cargs=None, condition=None):\n \"\"\"Apply an operation to the output of the circuit.\n\n Args:\n op (Instruction): the operation associated with the DAG node\n qargs (list[tuple]): qubits that op will be applied to\n cargs (list[tuple]): cbits that op will be applied to\n condition (tuple or None): optional condition (ClassicalRegister, int)\n\n Returns:\n DAGNode: the current max node\n\n Raises:\n DAGCircuitError: if a leaf node is connected to multiple outputs\n\n \"\"\"\n qargs = qargs or []\n cargs = cargs or []\n\n all_cbits = self._bits_in_condition(condition)\n all_cbits.extend(cargs)\n\n self._check_condition(op.name, condition)\n self._check_bits(qargs, self.output_map)\n self._check_bits(all_cbits, self.output_map)\n\n self._add_op_node(op, qargs, cargs, condition)\n\n # Add new in-edges from predecessors of the output nodes to the\n # operation node while deleting the old in-edges of the output nodes\n # and adding new edges from the operation node to each output node\n al = [qargs, all_cbits]\n for q in itertools.chain(*al):\n ie = list(self._multi_graph.predecessors(self.output_map[q]))\n\n if len(ie) != 1:\n raise DAGCircuitError(\"output node has multiple in-edges\")\n\n self._multi_graph.add_edge(ie[0], self._id_to_node[self._max_node_id],\n name=\"%s[%s]\" % (q[0].name, q[1]), wire=q)\n self._multi_graph.remove_edge(ie[0], self.output_map[q])\n self._multi_graph.add_edge(self._id_to_node[self._max_node_id], self.output_map[q],\n name=\"%s[%s]\" % (q[0].name, q[1]), wire=q)\n\n return self._id_to_node[self._max_node_id]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _check_edgemap_registers(self, edge_map, keyregs, valregs, valreg=True):\n # FIXME: some mixing of objects and strings here are awkward (due to\n # self.qregs/self.cregs still keying on string.\n add_regs = set()\n reg_frag_chk = {}\n for v in keyregs.values():\n 
reg_frag_chk[v] = {j: False for j in range(len(v))}\n for k in edge_map.keys():\n if k[0].name in keyregs:\n reg_frag_chk[k[0]][k[1]] = True\n for k, v in reg_frag_chk.items():\n s = set(v.values())\n if len(s) == 2:\n raise DAGCircuitError(\"edge_map fragments reg %s\" % k)\n elif s == set([False]):\n if k in self.qregs.values() or k in self.cregs.values():\n raise DAGCircuitError(\"unmapped duplicate reg %s\" % k)\n else:\n # Add registers that appear only in keyregs\n add_regs.add(k)\n else:\n if valreg:\n # If mapping to a register not in valregs, add it.\n # (k,0) exists in edge_map because edge_map doesn't\n # fragment k\n if not edge_map[(k, 0)][0].name in valregs:\n size = max(map(lambda x: x[1],\n filter(lambda x: x[0] == edge_map[(k, 0)][0],\n edge_map.values())))\n qreg = QuantumRegister(size + 1, edge_map[(k, 0)][0].name)\n add_regs.add(qreg)\n return add_regs", "response": "Check that the wiremap does not leave duplicate registers."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks that the wire_map is consistent.", "response": "def _check_wiremap_validity(self, wire_map, keymap, valmap):\n \"\"\"Check that the wiremap is consistent.\n\n Check that the wiremap refers to valid wires and that\n those wires have consistent types.\n\n Args:\n wire_map (dict): map from (register,idx) in keymap to\n (register,idx) in valmap\n keymap (dict): a map whose keys are wire_map keys\n valmap (dict): a map whose keys are wire_map values\n\n Raises:\n DAGCircuitError: if wire_map not valid\n \"\"\"\n for k, v in wire_map.items():\n kname = \"%s[%d]\" % (k[0].name, k[1])\n vname = \"%s[%d]\" % (v[0].name, v[1])\n if k not in keymap:\n raise DAGCircuitError(\"invalid wire mapping key %s\" % kname)\n if v not in valmap:\n raise DAGCircuitError(\"invalid wire mapping value %s\" % vname)\n if type(k) is not type(v):\n raise DAGCircuitError(\"inconsistent wire_map at (%s,%s)\" %\n (kname, vname))"} {"SOURCE": "codesearchnet", 
"instruction": "Can you write a function in Python 3 where it\nmaps the condition tuple to the new condition tuple.", "response": "def _map_condition(self, wire_map, condition):\n \"\"\"Use the wire_map dict to change the condition tuple's creg name.\n\n Args:\n wire_map (dict): a map from wires to wires\n condition (tuple): (ClassicalRegister,int)\n Returns:\n tuple(ClassicalRegister,int): new condition\n \"\"\"\n if condition is None:\n new_condition = None\n else:\n # Map the register name, using fact that registers must not be\n # fragmented by the wire_map (this must have been checked\n # elsewhere)\n bit0 = (condition[0], 0)\n new_condition = (wire_map.get(bit0, bit0)[0], condition[1])\n return new_condition"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef extend_back(self, dag, edge_map=None):\n edge_map = edge_map or {}\n for qreg in dag.qregs.values():\n if qreg.name not in self.qregs:\n self.add_qreg(QuantumRegister(qreg.size, qreg.name))\n edge_map.update([(qbit, qbit) for qbit in qreg if qbit not in edge_map])\n\n for creg in dag.cregs.values():\n if creg.name not in self.cregs:\n self.add_creg(ClassicalRegister(creg.size, creg.name))\n edge_map.update([(cbit, cbit) for cbit in creg if cbit not in edge_map])\n\n self.compose_back(dag, edge_map)", "response": "Add dag at the end of self using edge_map."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef compose_back(self, input_circuit, edge_map=None):\n edge_map = edge_map or {}\n\n # Check the wire map for duplicate values\n if len(set(edge_map.values())) != len(edge_map):\n raise DAGCircuitError(\"duplicates in wire_map\")\n\n add_qregs = self._check_edgemap_registers(edge_map,\n input_circuit.qregs,\n self.qregs)\n for qreg in add_qregs:\n self.add_qreg(qreg)\n\n add_cregs = self._check_edgemap_registers(edge_map,\n input_circuit.cregs,\n self.cregs)\n for creg in add_cregs:\n self.add_creg(creg)\n\n 
self._check_wiremap_validity(edge_map, input_circuit.input_map,\n self.output_map)\n\n # Compose\n for nd in input_circuit.topological_nodes():\n if nd.type == \"in\":\n # if in wire_map, get new name, else use existing name\n m_wire = edge_map.get(nd.wire, nd.wire)\n # the mapped wire should already exist\n if m_wire not in self.output_map:\n raise DAGCircuitError(\"wire %s[%d] not in self\" % (m_wire[0].name, m_wire[1]))\n\n if nd.wire not in input_circuit.wires:\n raise DAGCircuitError(\"inconsistent wire type for %s[%d] in input_circuit\"\n % (nd.wire[0].name, nd.wire[1]))\n\n elif nd.type == \"out\":\n # ignore output nodes\n pass\n elif nd.type == \"op\":\n condition = self._map_condition(edge_map, nd.condition)\n self._check_condition(nd.name, condition)\n m_qargs = list(map(lambda x: edge_map.get(x, x), nd.qargs))\n m_cargs = list(map(lambda x: edge_map.get(x, x), nd.cargs))\n self.apply_operation_back(nd.op, m_qargs, m_cargs, condition)\n else:\n raise DAGCircuitError(\"bad node type %s\" % nd.type)", "response": "Apply the input circuit to the output of this circuit."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef depth(self):\n if not nx.is_directed_acyclic_graph(self._multi_graph):\n raise DAGCircuitError(\"not a DAG\")\n\n depth = nx.dag_longest_path_length(self._multi_graph) - 1\n return depth if depth != -1 else 0", "response": "Return the circuit depth (the number of layers of operations in the DAG)."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck that a list of wires is compatible with a node.", "response": "def _check_wires_list(self, wires, node):\n \"\"\"Check that a list of wires is compatible with a node to be replaced.\n\n - no duplicate names\n - correct length for operation\n Raise an exception otherwise.\n\n Args:\n wires (list[register, index]): gives an order for (qu)bits\n in the input circuit that is replacing the node.\n node (DAGNode): a node in the dag\n\n 
Raises:\n DAGCircuitError: if check doesn't pass.\n \"\"\"\n if len(set(wires)) != len(wires):\n raise DAGCircuitError(\"duplicate wires\")\n\n wire_tot = len(node.qargs) + len(node.cargs)\n if node.condition is not None:\n wire_tot += node.condition[0].size\n\n if len(wires) != wire_tot:\n raise DAGCircuitError(\"expected %d wires, got %d\"\n % (wire_tot, len(wires)))"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn predecessor and successor dictionaries.", "response": "def _make_pred_succ_maps(self, node):\n \"\"\"Return predecessor and successor dictionaries.\n\n Args:\n node (DAGNode): reference to multi_graph node\n\n Returns:\n tuple(dict): tuple(predecessor_map, successor_map)\n These map from wire (Register, int) to predecessor (successor)\n nodes of n.\n \"\"\"\n\n pred_map = {e[2]['wire']: e[0] for e in\n self._multi_graph.in_edges(nbunch=node, data=True)}\n succ_map = {e[2]['wire']: e[1] for e in\n self._multi_graph.out_edges(nbunch=node, data=True)}\n return pred_map, succ_map"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _full_pred_succ_maps(self, pred_map, succ_map, input_circuit,\n wire_map):\n \"\"\"Map all wires of the input circuit.\n\n Map all wires of the input circuit to predecessor and\n successor nodes in self, keyed on wires in self.\n\n Args:\n pred_map (dict): comes from _make_pred_succ_maps\n succ_map (dict): comes from _make_pred_succ_maps\n input_circuit (DAGCircuit): the input circuit\n wire_map (dict): the map from wires of input_circuit to wires of self\n\n Returns:\n tuple: full_pred_map, full_succ_map (dict, dict)\n\n Raises:\n DAGCircuitError: if more than one predecessor for output nodes\n \"\"\"\n full_pred_map = {}\n full_succ_map = {}\n for w in input_circuit.input_map:\n # If w is wire mapped, find the corresponding predecessor\n # of the node\n if w in wire_map:\n full_pred_map[wire_map[w]] = pred_map[wire_map[w]]\n 
full_succ_map[wire_map[w]] = succ_map[wire_map[w]]\n else:\n # Otherwise, use the corresponding output nodes of self\n # and compute the predecessor.\n full_succ_map[w] = self.output_map[w]\n full_pred_map[w] = list(self._multi_graph.predecessors(\n self.output_map[w]))[0]\n if len(list(self._multi_graph.predecessors(self.output_map[w]))) != 1:\n raise DAGCircuitError(\"too many predecessors for %s[%d] \"\n \"output node\" % (w[0], w[1]))\n\n return full_pred_map, full_succ_map", "response": "Map all wires of the input circuit to predecessor and successor nodes in self and compute the full predecessors of the output nodes."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef topological_nodes(self):\n return nx.lexicographical_topological_sort(self._multi_graph,\n key=lambda x: str(x.qargs))", "response": "Yield nodes in topological order."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsubstitutes one node with a dag.", "response": "def substitute_node_with_dag(self, node, input_dag, wires=None):\n \"\"\"Replace one node with dag.\n\n Args:\n node (DAGNode): node to substitute\n input_dag (DAGCircuit): circuit that will substitute the node\n wires (list[(Register, index)]): gives an order for (qu)bits\n in the input circuit. This order gets matched to the node wires\n by qargs first, then cargs, then conditions.\n\n Raises:\n DAGCircuitError: if met with unexpected predecessor/successors\n \"\"\"\n if isinstance(node, int):\n warnings.warn('Calling substitute_node_with_dag() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n\n node = self._id_to_node[node]\n\n condition = node.condition\n # the dag must be amended if used in a\n # conditional context. 
delete the op nodes and replay\n # them with the condition.\n if condition:\n input_dag.add_creg(condition[0])\n to_replay = []\n for sorted_node in input_dag.topological_nodes():\n if sorted_node.type == \"op\":\n sorted_node.op.control = condition\n to_replay.append(sorted_node)\n for input_node in input_dag.op_nodes():\n input_dag.remove_op_node(input_node)\n for replay_node in to_replay:\n input_dag.apply_operation_back(replay_node.op, replay_node.qargs,\n replay_node.cargs, condition=condition)\n\n if wires is None:\n qwires = [w for w in input_dag.wires if isinstance(w[0], QuantumRegister)]\n cwires = [w for w in input_dag.wires if isinstance(w[0], ClassicalRegister)]\n wires = qwires + cwires\n\n self._check_wires_list(wires, node)\n\n # Create a proxy wire_map to identify fragments and duplicates\n # and determine what registers need to be added to self\n proxy_map = {w: QuantumRegister(1, 'proxy') for w in wires}\n add_qregs = self._check_edgemap_registers(proxy_map,\n input_dag.qregs,\n {}, False)\n for qreg in add_qregs:\n self.add_qreg(qreg)\n\n add_cregs = self._check_edgemap_registers(proxy_map,\n input_dag.cregs,\n {}, False)\n for creg in add_cregs:\n self.add_creg(creg)\n\n # Replace the node by iterating through the input_circuit.\n # Constructing and checking the validity of the wire_map.\n # If a gate is conditioned, we expect the replacement subcircuit\n # to depend on those control bits as well.\n if node.type != \"op\":\n raise DAGCircuitError(\"expected node type \\\"op\\\", got %s\"\n % node.type)\n\n condition_bit_list = self._bits_in_condition(node.condition)\n\n wire_map = {k: v for k, v in zip(wires,\n [i for s in [node.qargs,\n node.cargs,\n condition_bit_list]\n for i in s])}\n self._check_wiremap_validity(wire_map, wires, self.input_map)\n pred_map, succ_map = self._make_pred_succ_maps(node)\n full_pred_map, full_succ_map = self._full_pred_succ_maps(pred_map, succ_map,\n input_dag, wire_map)\n # Now that we know the connections, 
delete node\n self._multi_graph.remove_node(node)\n\n # Iterate over nodes of input_circuit\n for sorted_node in input_dag.topological_op_nodes():\n # Insert a new node\n condition = self._map_condition(wire_map, sorted_node.condition)\n m_qargs = list(map(lambda x: wire_map.get(x, x),\n sorted_node.qargs))\n m_cargs = list(map(lambda x: wire_map.get(x, x),\n sorted_node.cargs))\n self._add_op_node(sorted_node.op, m_qargs, m_cargs, condition)\n # Add edges from predecessor nodes to new node\n # and update predecessor nodes that change\n all_cbits = self._bits_in_condition(condition)\n all_cbits.extend(m_cargs)\n al = [m_qargs, all_cbits]\n for q in itertools.chain(*al):\n self._multi_graph.add_edge(full_pred_map[q],\n self._id_to_node[self._max_node_id],\n name=\"%s[%s]\" % (q[0].name, q[1]),\n wire=q)\n full_pred_map[q] = self._id_to_node[self._max_node_id]\n\n # Connect all predecessors and successors, and remove\n # residual edges between input and output nodes\n for w in full_pred_map:\n self._multi_graph.add_edge(full_pred_map[w],\n full_succ_map[w],\n name=\"%s[%s]\" % (w[0].name, w[1]),\n wire=w)\n o_pred = list(self._multi_graph.predecessors(self.output_map[w]))\n if len(o_pred) > 1:\n if len(o_pred) != 2:\n raise DAGCircuitError(\"expected 2 predecessors here\")\n\n p = [x for x in o_pred if x != full_pred_map[w]]\n if len(p) != 1:\n raise DAGCircuitError(\"expected 1 predecessor to pass filter\")\n\n self._multi_graph.remove_edge(p[0], self.output_map[w])"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef edges(self, nodes=None):\n for source_node, dest_node, edge_data in self._multi_graph.edges(nodes, data=True):\n yield source_node, dest_node, edge_data", "response": "Iterate over edges of the node values."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef op_nodes(self, op=None):\n nodes = []\n for node in self._multi_graph.nodes():\n if 
node.type == \"op\":\n if op is None or isinstance(node.op, op):\n nodes.append(node)\n return nodes", "response": "Returns the list of op nodes in the dag."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets the list of gate nodes in the dag.", "response": "def gate_nodes(self):\n \"\"\"Get the list of gate nodes in the dag.\n\n Returns:\n list: the list of node ids that represent gates.\n \"\"\"\n nodes = []\n for node in self.op_nodes():\n if isinstance(node.op, Gate):\n nodes.append(node)\n return nodes"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_named_nodes(self, *names):\n warnings.warn('The method get_named_nodes() is being replaced by named_nodes(). '\n 'Returning a list of node_ids is also deprecated, named_nodes() '\n 'returns a list of DAGNodes.',\n DeprecationWarning, 2)\n\n named_nodes = []\n for node in self._multi_graph.nodes():\n if node.type == 'op' and node.op.name in names:\n named_nodes.append(node._node_id)\n return named_nodes", "response": "Deprecated. 
Use named_nodes instead."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets the set of op nodes with the given name.", "response": "def named_nodes(self, *names):\n \"\"\"Get the set of \"op\" nodes with the given name.\"\"\"\n named_nodes = []\n for node in self._multi_graph.nodes():\n if node.type == 'op' and node.op.name in names:\n named_nodes.append(node)\n return named_nodes"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_2q_nodes(self):\n warnings.warn('The method get_2q_nodes() is being replaced by twoQ_gates(). '\n 'Returning a list of data_dicts is also deprecated, twoQ_gates() '\n 'returns a list of DAGNodes.',\n DeprecationWarning, 2)\n\n two_q_nodes = []\n for node in self._multi_graph.nodes():\n if node.type == 'op' and len(node.qargs) == 2:\n two_q_nodes.append(node.data_dict)\n\n return two_q_nodes", "response": "Deprecated. Use twoQ_gates() to get a list of data_dicts."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef twoQ_gates(self):\n two_q_gates = []\n for node in self.gate_nodes():\n if len(node.qargs) == 2:\n two_q_gates.append(node)\n return two_q_gates", "response": "Get list of 2-qubit gates. 
Ignore snapshot barriers and the like."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget list of 3-or-more-qubit gates.", "response": "def threeQ_or_more_gates(self):\n \"\"\"Get list of 3-or-more-qubit gates: (id, data).\"\"\"\n three_q_gates = []\n for node in self.gate_nodes():\n if len(node.qargs) >= 3:\n three_q_gates.append(node)\n return three_q_gates"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef predecessors(self, node):\n if isinstance(node, int):\n warnings.warn('Calling predecessors() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n return self._multi_graph.predecessors(node)", "response": "Returns the list of predecessors of a node as DAGNodes."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef quantum_predecessors(self, node):\n\n predecessors = []\n for predecessor in self.predecessors(node):\n if isinstance(self._multi_graph.get_edge_data(predecessor, node, key=0)['wire'][0],\n QuantumRegister):\n predecessors.append(predecessor)\n return predecessors", "response": "Returns the list of predecessors of a node that are connected by a quantum edge as DAGNodes."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef ancestors(self, node):\n if isinstance(node, int):\n warnings.warn('Calling ancestors() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n return nx.ancestors(self._multi_graph, node)", "response": "Returns the set of ancestors of a node as DAGNodes."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns the list of successors of a node that are connected by a quantum edge as DAGNodes.", "response": "def quantum_successors(self, node):\n \"\"\"Returns list 
of the successors of a node that are\n connected by a quantum edge as DAGNodes.\"\"\"\n if isinstance(node, int):\n warnings.warn('Calling quantum_successors() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n successors = []\n for successor in self.successors(node):\n if isinstance(self._multi_graph.get_edge_data(\n node, successor, key=0)['wire'][0],\n QuantumRegister):\n successors.append(successor)\n return successors"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef remove_op_node(self, node):\n if isinstance(node, int):\n warnings.warn('Calling remove_op_node() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n if node.type != 'op':\n raise DAGCircuitError('The method remove_op_node only works on op node types. An \"%s\" '\n 'node type was wrongly provided.' % node.type)\n\n pred_map, succ_map = self._make_pred_succ_maps(node)\n\n # remove from graph and map\n self._multi_graph.remove_node(node)\n\n for w in pred_map.keys():\n self._multi_graph.add_edge(pred_map[w], succ_map[w],\n name=\"%s[%s]\" % (w[0].name, w[1]), wire=w)", "response": "Remove an operation node n."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef remove_ancestors_of(self, node):\n if isinstance(node, int):\n warnings.warn('Calling remove_ancestors_of() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n anc = nx.ancestors(self._multi_graph, node)\n # TODO: probably better to do all at once using\n # multi_graph.remove_nodes_from; same for related functions ...\n for anc_node in anc:\n if anc_node.type == \"op\":\n self.remove_op_node(anc_node)", "response": "Remove all of the ancestor operation nodes of node."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the 
documentation for the following Python 3 function\ndef remove_descendants_of(self, node):\n if isinstance(node, int):\n warnings.warn('Calling remove_descendants_of() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n desc = nx.descendants(self._multi_graph, node)\n for desc_node in desc:\n if desc_node.type == \"op\":\n self.remove_op_node(desc_node)", "response": "Remove all of the descendant operation nodes of node."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef remove_nonancestors_of(self, node):\n if isinstance(node, int):\n warnings.warn('Calling remove_nonancestors_of() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n anc = nx.ancestors(self._multi_graph, node)\n comp = list(set(self._multi_graph.nodes()) - set(anc))\n for n in comp:\n if n.type == \"op\":\n self.remove_op_node(n)", "response": "Remove all of the non-ancestor operation nodes of node."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nremoving all of the non-descendant operation nodes of node.", "response": "def remove_nondescendants_of(self, node):\n \"\"\"Remove all of the non-descendants operation nodes of node.\"\"\"\n if isinstance(node, int):\n warnings.warn('Calling remove_nondescendants_of() with a node id is deprecated,'\n ' use a DAGNode instead',\n DeprecationWarning, 2)\n node = self._id_to_node[node]\n\n dec = nx.descendants(self._multi_graph, node)\n comp = list(set(self._multi_graph.nodes()) - set(dec))\n for n in comp:\n if n.type == \"op\":\n self.remove_op_node(n)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef layers(self):\n graph_layers = self.multigraph_layers()\n try:\n next(graph_layers) # Remove input nodes\n except StopIteration:\n return\n\n def add_nodes_from(layer, nodes):\n \"\"\" 
Convert DAGNodes into a format that can be added to a\n multigraph and then add to graph\"\"\"\n layer._multi_graph.add_nodes_from(nodes)\n\n for graph_layer in graph_layers:\n\n # Get the op nodes from the layer, removing any input and output nodes.\n op_nodes = [node for node in graph_layer if node.type == \"op\"]\n\n # Stop yielding once there are no more op_nodes in a layer.\n if not op_nodes:\n return\n\n # Construct a shallow copy of self\n new_layer = DAGCircuit()\n new_layer.name = self.name\n\n for creg in self.cregs.values():\n new_layer.add_creg(creg)\n for qreg in self.qregs.values():\n new_layer.add_qreg(qreg)\n\n add_nodes_from(new_layer, self.input_map.values())\n add_nodes_from(new_layer, self.output_map.values())\n add_nodes_from(new_layer, op_nodes)\n\n # The quantum registers that have an operation in this layer.\n support_list = [\n op_node.qargs\n for op_node in op_nodes\n if op_node.name not in {\"barrier\", \"snapshot\", \"save\", \"load\", \"noise\"}\n ]\n\n # Now add the edges to the multi_graph\n # By default we just wire inputs to the outputs.\n wires = {self.input_map[wire]: self.output_map[wire]\n for wire in self.wires}\n # Wire inputs to op nodes, and op nodes to outputs.\n for op_node in op_nodes:\n args = self._bits_in_condition(op_node.condition) \\\n + op_node.cargs + op_node.qargs\n arg_ids = (self.input_map[(arg[0], arg[1])] for arg in args)\n for arg_id in arg_ids:\n wires[arg_id], wires[op_node] = op_node, wires[arg_id]\n\n # Add wiring to/from the operations and between unused inputs & outputs.\n new_layer._multi_graph.add_edges_from(wires.items())\n yield {\"graph\": new_layer, \"partition\": support_list}", "response": "Yields a shallow view on all d layers of this circuit."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef serial_layers(self):\n for next_node in self.topological_op_nodes():\n new_layer = DAGCircuit()\n for qreg in self.qregs.values():\n 
new_layer.add_qreg(qreg)\n for creg in self.cregs.values():\n new_layer.add_creg(creg)\n # Save the support of the operation we add to the layer\n support_list = []\n # Operation data\n op = copy.copy(next_node.op)\n qa = copy.copy(next_node.qargs)\n ca = copy.copy(next_node.cargs)\n co = copy.copy(next_node.condition)\n _ = self._bits_in_condition(co)\n\n # Add node to new_layer\n new_layer.apply_operation_back(op, qa, ca, co)\n # Add operation to partition\n if next_node.name not in [\"barrier\",\n \"snapshot\", \"save\", \"load\", \"noise\"]:\n support_list.append(list(qa))\n l_dict = {\"graph\": new_layer, \"partition\": support_list}\n yield l_dict", "response": "Yields a list of all serial layers for all gates of this circuit."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef multigraph_layers(self):\n predecessor_count = dict() # Dict[node, predecessors not visited]\n cur_layer = [node for node in self.input_map.values()]\n yield cur_layer\n next_layer = []\n while cur_layer:\n for node in cur_layer:\n # Count multiedges with multiplicity.\n for successor in self._multi_graph.successors(node):\n multiplicity = self._multi_graph.number_of_edges(node, successor)\n if successor in predecessor_count:\n predecessor_count[successor] -= multiplicity\n else:\n predecessor_count[successor] = \\\n self._multi_graph.in_degree(successor) - multiplicity\n\n if predecessor_count[successor] == 0:\n next_layer.append(successor)\n del predecessor_count[successor]\n\n yield next_layer\n cur_layer = next_layer\n next_layer = []", "response": "Yields all layers of the multigraph."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef collect_runs(self, namelist):\n group_list = []\n\n # Iterate through the nodes of self in topological order\n # and form tuples containing sequences of gates\n # on the same qubit(s).\n topo_ops = list(self.topological_op_nodes())\n nodes_seen = 
dict(zip(topo_ops, [False] * len(topo_ops)))\n for node in topo_ops:\n if node.name in namelist and node.condition is None \\\n and not nodes_seen[node]:\n group = [node]\n nodes_seen[node] = True\n s = list(self._multi_graph.successors(node))\n while len(s) == 1 and \\\n s[0].type == \"op\" and \\\n s[0].name in namelist:\n group.append(s[0])\n nodes_seen[s[0]] = True\n s = list(self._multi_graph.successors(s[0]))\n if len(group) >= 1:\n group_list.append(tuple(group))\n return set(group_list)", "response": "Return a set of non-conditional runs of op nodes with the given names."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef nodes_on_wire(self, wire, only_ops=False):\n current_node = self.input_map.get(wire, None)\n\n if not current_node:\n raise DAGCircuitError('The given wire %s is not present in the circuit'\n % str(wire))\n\n more_nodes = True\n while more_nodes:\n more_nodes = False\n # allow user to just get ops on the wire - not the input/output nodes\n if current_node.type == 'op' or not only_ops:\n yield current_node\n\n # find the adjacent node that takes the wire being looked at as input\n for node, edges in self._multi_graph.adj[current_node].items():\n if any(wire == edge['wire'] for edge in edges.values()):\n current_node = node\n more_nodes = True\n break", "response": "Iterator for nodes that affect a given wire."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef count_ops(self):\n op_dict = {}\n for node in self.topological_op_nodes():\n name = node.name\n if name not in op_dict:\n op_dict[name] = 1\n else:\n op_dict[name] += 1\n return op_dict", "response": "Count the occurrences of operation names."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef properties(self):\n summary = {\"size\": self.size(),\n \"depth\": self.depth(),\n \"width\": 
self.width(),\n \"bits\": self.num_cbits(),\n \"factors\": self.num_tensor_factors(),\n \"operations\": self.count_ops()}\n return summary", "response": "Return a dictionary of circuit properties."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef tomography_basis(basis, prep_fun=None, meas_fun=None):\n ret = TomographyBasis(basis)\n ret.prep_fun = prep_fun\n ret.meas_fun = meas_fun\n return ret", "response": "Generates a TomographyBasis object from a list of functions."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef __pauli_prep_gates(circuit, qreg, op):\n bas, proj = op\n if bas not in ['X', 'Y', 'Z']:\n raise QiskitError(\"There's no X, Y or Z basis for this Pauli \"\n \"preparation\")\n\n if bas == \"X\":\n if proj == 1:\n circuit.u2(np.pi, np.pi, qreg) # H.X\n else:\n circuit.u2(0., np.pi, qreg) # H\n elif bas == \"Y\":\n if proj == 1:\n circuit.u2(-0.5 * np.pi, np.pi, qreg) # S.H.X\n else:\n circuit.u2(0.5 * np.pi, np.pi, qreg) # S.H\n elif bas == \"Z\" and proj == 1:\n circuit.u3(np.pi, 0., np.pi, qreg)", "response": "Add state preparation gates to a circuit."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __pauli_meas_gates(circuit, qreg, op):\n if op not in ['X', 'Y', 'Z']:\n raise QiskitError(\"There's no X, Y or Z basis for this Pauli \"\n \"measurement\")\n\n if op == \"X\":\n circuit.u2(0., np.pi, qreg) # H\n elif op == \"Y\":\n circuit.u2(0., 0.5 * np.pi, qreg)", "response": "Add state measurement gates to a circuit."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nadd state preparation gates to a SIC.", "response": "def __sic_prep_gates(circuit, qreg, op):\n \"\"\"\n Add state preparation gates to a circuit.\n \"\"\"\n bas, proj = op\n\n if bas != 'S':\n raise QiskitError('Not in SIC basis!')\n\n theta = -2 * np.arctan(np.sqrt(2))\n if proj 
== 1:\n circuit.u3(theta, np.pi, 0.0, qreg)\n elif proj == 2:\n circuit.u3(theta, np.pi / 3, 0.0, qreg)\n elif proj == 3:\n circuit.u3(theta, -np.pi / 3, 0.0, qreg)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngenerates a dictionary of process tomography experiment configurations. This returns a data structure that is used by other tomography functions to generate state and process tomography circuits, and extract tomography data from results after execution on a backend. A quantum process tomography set is created by specifying a preparation basis along with a measurement basis. The preparation basis may be a user defined `tomography_basis`, or one of the two built in basis 'SIC' or 'Pauli'. - SIC: Is a minimal symmetric informationally complete preparation basis for 4 states for each qubit (4 ^ number of qubits total preparation states). These correspond to the |0> state and the 3 other vertices of a tetrahedron on the Bloch-sphere. - Pauli: Is a tomographically overcomplete preparation basis of the six eigenstates of the 3 Pauli operators (6 ^ number of qubits total preparation states). Args: meas_qubits (list): The qubits being measured. meas_basis (tomography_basis or str): The qubit measurement basis. The default value is 'Pauli'. prep_qubits (list or None): The qubits being prepared. If None then meas_qubits will be used for process tomography experiments. prep_basis (tomography_basis or str): The qubit preparation basis. The default value is 'SIC'. Returns: dict: A dict of tomography configurations that can be parsed by `create_tomography_circuits` and `tomography_data` functions for implementing quantum tomography experiments. This output contains fields \"qubits\", \"meas_basis\", \"prep_basus\", circuits\". 
``` { 'qubits': qubits (list[ints]), 'meas_basis': meas_basis (tomography_basis), 'prep_basis': prep_basis (tomography_basis), 'circuit_labels': (list[string]), 'circuits': (list[dict]) # prep and meas configurations } ```", "response": "def process_tomography_set(meas_qubits, meas_basis='Pauli',\n prep_qubits=None, prep_basis='SIC'):\n \"\"\"\n Generate a dictionary of process tomography experiment configurations.\n\n This returns a data structure that is used by other tomography functions\n to generate state and process tomography circuits, and extract tomography\n data from results after execution on a backend.\n\n A quantum process tomography set is created by specifying a preparation\n basis along with a measurement basis. The preparation basis may be a\n user defined `tomography_basis`, or one of the two built in basis 'SIC'\n or 'Pauli'.\n - SIC: Is a minimal symmetric informationally complete preparation\n basis for 4 states for each qubit (4 ^ number of qubits total\n preparation states). These correspond to the |0> state and the 3\n other vertices of a tetrahedron on the Bloch-sphere.\n - Pauli: Is a tomographically overcomplete preparation basis of the six\n eigenstates of the 3 Pauli operators (6 ^ number of qubits\n total preparation states).\n\n Args:\n meas_qubits (list): The qubits being measured.\n meas_basis (tomography_basis or str): The qubit measurement basis.\n The default value is 'Pauli'.\n prep_qubits (list or None): The qubits being prepared. If None then\n meas_qubits will be used for process tomography experiments.\n prep_basis (tomography_basis or str): The qubit preparation basis.\n The default value is 'SIC'.\n\n Returns:\n dict: A dict of tomography configurations that can be parsed by\n `create_tomography_circuits` and `tomography_data` functions\n for implementing quantum tomography experiments. 
This output contains\n fields \"qubits\", \"meas_basis\", \"prep_basis\", \"circuits\".\n ```\n {\n 'qubits': qubits (list[ints]),\n 'meas_basis': meas_basis (tomography_basis),\n 'prep_basis': prep_basis (tomography_basis),\n 'circuit_labels': (list[string]),\n 'circuits': (list[dict]) # prep and meas configurations\n }\n ```\n \"\"\"\n return tomography_set(meas_qubits, meas_basis=meas_basis,\n prep_qubits=prep_qubits, prep_basis=prep_basis)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nadd tomography measurement circuits to a QuantumProgram. The quantum program must contain a circuit 'name', which is treated as a state preparation circuit for state tomography, or as the circuit being measured for process tomography. This function then appends the circuit with a set of measurements specified by the input `tomography_set`, optionally it also prepends the circuit with state preparation circuits if they are specified in the `tomography_set`. For n-qubit tomography with a tomographically complete set of preparations and measurements this results in $4^n 3^n$ circuits being added to the quantum program. Args: circuit (QuantumCircuit): The circuit to be appended with tomography state preparation and/or measurements. qreg (QuantumRegister): the quantum register containing qubits to be measured. creg (ClassicalRegister): the classical register containing bits to store measurement outcomes. tomoset (tomography_set): the dict of tomography configurations. Returns: list: A list of quantum tomography circuits for the input circuit. 
Raises: QiskitError: if circuit is not a valid QuantumCircuit Example: For a tomography set specifying state tomography of qubit-0 prepared by a circuit 'circ' this would return: ``` ['circ_meas_X(0)', 'circ_meas_Y(0)', 'circ_meas_Z(0)'] ``` For process tomography of the same circuit with preparation in the SIC-POVM basis it would return: ``` [ 'circ_prep_S0(0)_meas_X(0)', 'circ_prep_S0(0)_meas_Y(0)', 'circ_prep_S0(0)_meas_Z(0)', 'circ_prep_S1(0)_meas_X(0)', 'circ_prep_S1(0)_meas_Y(0)', 'circ_prep_S1(0)_meas_Z(0)', 'circ_prep_S2(0)_meas_X(0)', 'circ_prep_S2(0)_meas_Y(0)', 'circ_prep_S2(0)_meas_Z(0)', 'circ_prep_S3(0)_meas_X(0)', 'circ_prep_S3(0)_meas_Y(0)', 'circ_prep_S3(0)_meas_Z(0)' ] ```", "response": "def create_tomography_circuits(circuit, qreg, creg, tomoset):\n \"\"\"\n Add tomography measurement circuits to a QuantumProgram.\n\n The quantum program must contain a circuit 'name', which is treated as a\n state preparation circuit for state tomography, or as the circuit being\n measured for process tomography. 
This function then appends the circuit\n with a set of measurements specified by the input `tomography_set`,\n optionally it also prepends the circuit with state preparation circuits if\n they are specified in the `tomography_set`.\n\n For n-qubit tomography with a tomographically complete set of preparations\n and measurements this results in $4^n 3^n$ circuits being added to the\n quantum program.\n\n Args:\n circuit (QuantumCircuit): The circuit to be appended with tomography\n state preparation and/or measurements.\n qreg (QuantumRegister): the quantum register containing qubits to be\n measured.\n creg (ClassicalRegister): the classical register containing bits to\n store measurement outcomes.\n tomoset (tomography_set): the dict of tomography configurations.\n\n Returns:\n list: A list of quantum tomography circuits for the input circuit.\n\n Raises:\n QiskitError: if circuit is not a valid QuantumCircuit\n\n Example:\n For a tomography set specifying state tomography of qubit-0 prepared\n by a circuit 'circ' this would return:\n ```\n ['circ_meas_X(0)', 'circ_meas_Y(0)', 'circ_meas_Z(0)']\n ```\n For process tomography of the same circuit with preparation in the\n SIC-POVM basis it would return:\n ```\n [\n 'circ_prep_S0(0)_meas_X(0)', 'circ_prep_S0(0)_meas_Y(0)',\n 'circ_prep_S0(0)_meas_Z(0)', 'circ_prep_S1(0)_meas_X(0)',\n 'circ_prep_S1(0)_meas_Y(0)', 'circ_prep_S1(0)_meas_Z(0)',\n 'circ_prep_S2(0)_meas_X(0)', 'circ_prep_S2(0)_meas_Y(0)',\n 'circ_prep_S2(0)_meas_Z(0)', 'circ_prep_S3(0)_meas_X(0)',\n 'circ_prep_S3(0)_meas_Y(0)', 'circ_prep_S3(0)_meas_Z(0)'\n ]\n ```\n \"\"\"\n\n if not isinstance(circuit, QuantumCircuit):\n raise QiskitError('Input circuit must be a QuantumCircuit object')\n\n dics = tomoset['circuits']\n labels = tomography_circuit_names(tomoset, circuit.name)\n tomography_circuits = []\n for label, conf in zip(labels, dics):\n tmp = circuit\n # Add prep circuits\n if 'prep' in conf:\n prep = QuantumCircuit(qreg, creg, name='tmp_prep')\n 
for qubit, op in conf['prep'].items():\n tomoset['prep_basis'].prep_gate(prep, qreg[qubit], op)\n prep.barrier(qreg[qubit])\n tmp = prep + tmp\n # Add measurement circuits\n meas = QuantumCircuit(qreg, creg, name='tmp_meas')\n for qubit, op in conf['meas'].items():\n meas.barrier(qreg[qubit])\n tomoset['meas_basis'].meas_gate(meas, qreg[qubit], op)\n meas.measure(qreg[qubit], creg[qubit])\n tmp = tmp + meas\n # Add label to the circuit\n tmp.name = label\n tomography_circuits.append(tmp)\n\n logger.info('>> created tomography circuits for \"%s\"', circuit.name)\n return tomography_circuits"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns a dictionary for a state or process tomography experiment.", "response": "def tomography_data(results, name, tomoset):\n \"\"\"\n Return a results dict for a state or process tomography experiment.\n\n Args:\n results (Result): Results from execution of a process tomography\n circuits on a backend.\n name (string): The name of the circuit being reconstructed.\n tomoset (tomography_set): the dict of tomography configurations.\n\n Returns:\n list: A list of dicts for the outcome of each process tomography\n measurement circuit.\n \"\"\"\n\n labels = tomography_circuit_names(tomoset, name)\n circuits = tomoset['circuits']\n data = []\n prep = None\n for j, _ in enumerate(labels):\n counts = marginal_counts(results.get_counts(labels[j]),\n tomoset['qubits'])\n shots = sum(counts.values())\n meas = circuits[j]['meas']\n prep = circuits[j].get('prep', None)\n meas_qubits = sorted(meas.keys())\n if prep:\n prep_qubits = sorted(prep.keys())\n circuit = {}\n for c in counts.keys():\n circuit[c] = {}\n circuit[c]['meas'] = [(meas[meas_qubits[k]], int(c[-1 - k]))\n for k in range(len(meas_qubits))]\n if prep:\n circuit[c]['prep'] = [prep[prep_qubits[k]]\n for k in range(len(prep_qubits))]\n data.append({'counts': counts, 'shots': shots, 'circuit': circuit})\n\n ret = {'data': data, 
'meas_basis': tomoset['meas_basis']}\n if prep:\n ret['prep_basis'] = tomoset['prep_basis']\n return ret"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncomputing the marginal counts for a subset of measured qubits.", "response": "def marginal_counts(counts, meas_qubits):\n \"\"\"\n Compute the marginal counts for a subset of measured qubits.\n\n Args:\n counts (dict): the counts returned from a backend ({str: int}).\n meas_qubits (list[int]): the qubits to return the marginal\n counts distribution for.\n\n Returns:\n dict: A counts dict for the meas_qubits.\n Example: if `counts = {'00': 10, '01': 5}`\n `marginal_counts(counts, [1])` returns `{'0': 15, '1': 0}`.\n `marginal_counts(counts, [0])` returns `{'0': 10, '1': 5}`.\n \"\"\"\n # pylint: disable=cell-var-from-loop\n # Extract total number of qubits from count keys\n num_of_qubits = len(list(counts.keys())[0])\n\n # keys for measured qubits only\n qs = sorted(meas_qubits, reverse=True)\n\n meas_keys = count_keys(len(qs))\n\n # get regex match strings for summing outcomes of other qubits\n rgx = [\n reduce(lambda x, y: (key[qs.index(y)] if y in qs else '\\\\d') + x,\n range(num_of_qubits), '') for key in meas_keys\n ]\n\n # build the return list\n meas_counts = []\n for m in rgx:\n c = 0\n for key, val in counts.items():\n if match(m, key):\n c += val\n meas_counts.append(c)\n\n # return as counts dict on measured qubits only\n return dict(zip(meas_keys, meas_counts))"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfits a Choi - matrix or process - matrix from tomography measurement data.", "response": "def fit_tomography_data(tomo_data, method='wizard', options=None):\n \"\"\"\n Reconstruct a density matrix or process-matrix from tomography data.\n\n If the input data is state_tomography_data the returned operator will\n be a density matrix. 
If the input data is process_tomography_data the\n returned operator will be a Choi-matrix in the column-vectorization\n convention.\n\n Args:\n tomo_data (dict): process tomography measurement data.\n method (str): the fitting method to use.\n Available methods:\n - 'wizard' (default)\n - 'leastsq'\n options (dict or None): additional options for fitting method.\n\n Returns:\n numpy.array: The fitted operator.\n\n Available methods:\n - 'wizard' (Default): The returned operator will be constrained to be\n positive-semidefinite.\n Options:\n - 'trace': the trace of the returned operator.\n The default value is 1.\n - 'beta': hedging parameter for computing frequencies from\n zero-count data. The default value is 0.50922.\n - 'epsilon': threshold for truncating small eigenvalues to zero.\n The default value is 0.\n - 'leastsq': Fitting without positive-semidefinite constraint.\n Options:\n - 'trace': Same as for 'wizard' method.\n - 'beta': Same as for 'wizard' method.\n Raises:\n Exception: if the `method` parameter is not valid.\n \"\"\"\n\n if isinstance(method, str) and method.lower() in ['wizard', 'leastsq']:\n # get options\n trace = __get_option('trace', options)\n beta = __get_option('beta', options)\n # fit state\n rho = __leastsq_fit(tomo_data, trace=trace, beta=beta)\n if method == 'wizard':\n # Use wizard method to constrain positivity\n epsilon = __get_option('epsilon', options)\n rho = __wizard(rho, epsilon=epsilon)\n return rho\n else:\n raise Exception('Invalid reconstruction method \"%s\"' % method)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef __leastsq_fit(tomo_data, weights=None, trace=None, beta=None):\n if trace is None:\n trace = 1. 
# default to unit trace\n\n data = tomo_data['data']\n keys = data[0]['circuit'].keys()\n\n # Get counts and shots\n counts = []\n shots = []\n ops = []\n for dat in data:\n for key in keys:\n counts.append(dat['counts'][key])\n shots.append(dat['shots'])\n projectors = dat['circuit'][key]\n op = __projector(projectors['meas'], tomo_data['meas_basis'])\n if 'prep' in projectors:\n op_prep = __projector(projectors['prep'],\n tomo_data['prep_basis'])\n op = np.kron(op_prep.conj(), op)\n ops.append(op)\n\n # Convert counts to frequencies\n counts = np.array(counts)\n shots = np.array(shots)\n freqs = counts / shots\n\n # Use hedged frequencies to calculate least squares fitting weights\n if weights is None:\n if beta is None:\n beta = 0.50922\n K = len(keys)\n freqs_hedged = (counts + beta) / (shots + K * beta)\n weights = np.sqrt(shots / (freqs_hedged * (1 - freqs_hedged)))\n\n return __tomo_linear_inv(freqs, ops, weights, trace=trace)", "response": "Reconstruct a state from unconstrained least - squares fitting."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __tomo_linear_inv(freqs, ops, weights=None, trace=None):\n # get weights matrix\n if weights is not None:\n W = np.array(weights)\n if W.ndim == 1:\n W = np.diag(W)\n\n # Get basis S matrix\n S = np.array([vectorize(m).conj()\n for m in ops]).reshape(len(ops), ops[0].size)\n if weights is not None:\n S = np.dot(W, S) # W.S\n\n # get frequencies vec\n v = np.array(freqs) # |f>\n if weights is not None:\n v = np.dot(W, freqs) # W.|f>\n Sdg = S.T.conj() # S^*.W^*\n inv = np.linalg.pinv(np.dot(Sdg, S)) # (S^*.W^*.W.S)^-1\n\n # linear inversion of freqs\n ret = devectorize(np.dot(inv, np.dot(Sdg, v)))\n # renormalize to input trace value\n if trace is not None:\n ret = trace * ret / np.trace(ret)\n return ret", "response": "Reconstruct a matrix through linear inversion of the given freqs and ops."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function 
for\nreturning the nearest positive semidefinite operator to an operator. This method is based on reference [1]. It constrains positivity by setting negative eigenvalues to zero and rescaling the positive eigenvalues. Args: rho (array_like): the input operator. epsilon (float or None): threshold (>=0) for truncating small eigenvalues to zero. Returns: numpy.array: A positive semidefinite numpy array.", "response": "def __wizard(rho, epsilon=None):\n \"\"\"\n Returns the nearest positive semidefinite operator to an operator.\n\n This method is based on reference [1]. It constrains positivity\n by setting negative eigenvalues to zero and rescaling the positive\n eigenvalues.\n\n Args:\n rho (array_like): the input operator.\n epsilon (float or None): threshold (>=0) for truncating small\n eigenvalues to zero.\n\n Returns:\n numpy.array: A positive semidefinite numpy array.\n \"\"\"\n if epsilon is None:\n epsilon = 0. # default value\n\n dim = len(rho)\n rho_wizard = np.zeros([dim, dim])\n v, w = np.linalg.eigh(rho) # v eigenvalues, v[0] < v[1] < ...\n for j in range(dim):\n if v[j] < epsilon:\n tmp = v[j]\n v[j] = 0.\n # redistribute loop\n x = 0.\n for k in range(j + 1, dim):\n x += tmp / (dim - (j + 1))\n v[k] = v[k] + tmp / (dim - (j + 1))\n for j in range(dim):\n rho_wizard = rho_wizard + v[j] * outer(w[:, j])\n return rho_wizard"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nbuild the list of circuits to rotate to points in phase space.", "response": "def build_wigner_circuits(circuit, phis, thetas, qubits,\n qreg, creg):\n \"\"\"Create the circuits to rotate to points in phase space\n Args:\n circuit (QuantumCircuit): The circuit to be appended with tomography\n state preparation and/or measurements.\n phis (np.matrix[[complex]]): phis\n thetas (np.matrix[[complex]]): thetas\n qubits (list[int]): a list of the qubit indexes of qreg to be measured.\n qreg (QuantumRegister): the quantum register containing qubits to be\n 
measured.\n creg (ClassicalRegister): the classical register containing bits to\n store measurement outcomes.\n\n Returns:\n list: A list of the added Wigner function circuits.\n\n Raises:\n QiskitError: if circuit is not a valid QuantumCircuit.\n \"\"\"\n\n if not isinstance(circuit, QuantumCircuit):\n raise QiskitError('Input circuit must be a QuantumCircuit object')\n\n tomography_circuits = []\n points = len(phis[0])\n for point in range(points):\n label = '_wigner_phase_point'\n label += str(point)\n tmp_circ = QuantumCircuit(qreg, creg, name=label)\n for qubit, _ in enumerate(qubits):\n tmp_circ.u3(thetas[qubit][point], 0,\n phis[qubit][point], qreg[qubits[qubit]])\n tmp_circ.measure(qreg[qubits[qubit]], creg[qubits[qubit]])\n # Add to original circuit\n tmp_circ = circuit + tmp_circ\n tmp_circ.name = circuit.name + label\n tomography_circuits.append(tmp_circ)\n\n logger.info('>> Created Wigner function circuits for \"%s\"', circuit.name)\n return tomography_circuits"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the value of the Wigner function at measured points in the state tomography.", "response": "def wigner_data(q_result, meas_qubits, labels, shots=None):\n \"\"\"Get the value of the Wigner function from measurement results.\n\n Args:\n q_result (Result): Results from execution of state tomography\n circuits on a backend.\n meas_qubits (list[int]): a list of the qubit indexes measured.\n labels (list[str]): a list of names of the circuits\n shots (int): number of shots\n\n Returns:\n list: The values of the Wigner function at measured points in\n phase space\n \"\"\"\n num = len(meas_qubits)\n\n dim = 2**num\n p = [0.5 + 0.5 * np.sqrt(3), 0.5 - 0.5 * np.sqrt(3)]\n parity = 1\n\n for i in range(num):\n parity = np.kron(parity, p)\n\n w = [0] * len(labels)\n wpt = 0\n counts = [marginal_counts(q_result.get_counts(circ), meas_qubits)\n for circ in labels]\n for entry in counts:\n x = [0] * dim\n\n for i in 
range(dim):\n if bin(i)[2:].zfill(num) in entry:\n x[i] = float(entry[bin(i)[2:].zfill(num)])\n\n if shots is None:\n shots = np.sum(x)\n\n for i in range(dim):\n w[wpt] = w[wpt] + (x[i] / shots) * parity[i]\n wpt += 1\n\n return w"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nadd state preparation gates to a circuit.", "response": "def prep_gate(self, circuit, qreg, op):\n \"\"\"\n Add state preparation gates to a circuit.\n\n Args:\n circuit (QuantumCircuit): circuit to add a preparation to.\n qreg (tuple(QuantumRegister,int)): quantum register to apply\n preparation to.\n op (tuple(str, int)): the basis label and index for the\n preparation op.\n \"\"\"\n if self.prep_fun is None:\n pass\n else:\n self.prep_fun(circuit, qreg, op)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef meas_gate(self, circuit, qreg, op):\n if self.meas_fun is None:\n pass\n else:\n self.meas_fun(circuit, qreg, op)", "response": "Add measurement gates to a circuit."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _text_checker(job, interval, _interval_set=False, quiet=False, output=sys.stdout):\n status = job.status()\n msg = status.value\n prev_msg = msg\n msg_len = len(msg)\n\n if not quiet:\n print('\\r%s: %s' % ('Job Status', msg), end='', file=output)\n while status.name not in ['DONE', 'CANCELLED', 'ERROR']:\n time.sleep(interval)\n status = job.status()\n msg = status.value\n\n if status.name == 'QUEUED':\n msg += ' (%s)' % job.queue_position()\n if not _interval_set:\n interval = max(job.queue_position(), 2)\n else:\n if not _interval_set:\n interval = 2\n\n # Adjust length of message so there are no artifacts\n if len(msg) < msg_len:\n msg += ' ' * (msg_len - len(msg))\n elif len(msg) > msg_len:\n msg_len = len(msg)\n\n if msg != prev_msg and not quiet:\n print('\\r%s: %s' % ('Job Status', msg), end='', file=output)\n 
prev_msg = msg\n if not quiet:\n print('', file=output)", "response": "A text - based job status checker."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nmonitor the status of a Jupyter notebook.", "response": "def job_monitor(job, interval=None, monitor_async=False, quiet=False, output=sys.stdout):\n \"\"\"Monitor the status of a IBMQJob instance.\n\n Args:\n job (BaseJob): Job to monitor.\n interval (int): Time interval between status queries.\n monitor_async (bool): Monitor asyncronously (in Jupyter only).\n quiet (bool): If True, do not print status messages.\n output (file): The file like object to write status messages to.\n By default this is sys.stdout.\n\n Raises:\n QiskitError: When trying to run async outside of Jupyter\n ImportError: ipywidgets not available for notebook.\n \"\"\"\n if interval is None:\n _interval_set = False\n interval = 2\n else:\n _interval_set = True\n if _NOTEBOOK_ENV:\n if monitor_async:\n try:\n import ipywidgets as widgets # pylint: disable=import-error\n except ImportError:\n raise ImportError('These functions need ipywidgets. '\n 'Run \"pip install ipywidgets\" before.')\n from qiskit.tools.jupyter.jupyter_magics import _html_checker # pylint: disable=C0412\n\n style = \"font-size:16px;\"\n header = \"
<p style='{style}'>Job Status: %s </p>
\".format(\n style=style)\n status = widgets.HTML(value=header % job.status().value)\n display(status)\n\n thread = threading.Thread(target=_html_checker, args=(job, interval,\n status, header))\n thread.start()\n else:\n _text_checker(job, interval, _interval_set,\n quiet=quiet, output=output)\n\n else:\n if monitor_async:\n raise QiskitError(\n 'monitor_async only available in Jupyter notebooks.')\n _text_checker(job, interval, _interval_set, quiet=quiet, output=output)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncompute the Euler angles for a single - qubit gate.", "response": "def euler_angles_1q(unitary_matrix):\n \"\"\"Compute Euler angles for a single-qubit gate.\n\n Find angles (theta, phi, lambda) such that\n unitary_matrix = phase * Rz(phi) * Ry(theta) * Rz(lambda)\n\n Args:\n unitary_matrix (ndarray): 2x2 unitary matrix\n\n Returns:\n tuple: (theta, phi, lambda) Euler angles of SU(2)\n\n Raises:\n QiskitError: if unitary_matrix not 2x2, or failure\n \"\"\"\n if unitary_matrix.shape != (2, 2):\n raise QiskitError(\"euler_angles_1q: expected 2x2 matrix\")\n phase = la.det(unitary_matrix)**(-1.0/2.0)\n U = phase * unitary_matrix # U in SU(2)\n # OpenQASM SU(2) parameterization:\n # U[0, 0] = exp(-i(phi+lambda)/2) * cos(theta/2)\n # U[0, 1] = -exp(-i(phi-lambda)/2) * sin(theta/2)\n # U[1, 0] = exp(i(phi-lambda)/2) * sin(theta/2)\n # U[1, 1] = exp(i(phi+lambda)/2) * cos(theta/2)\n # Find theta\n if abs(U[0, 0]) > _CUTOFF_PRECISION:\n theta = 2 * math.acos(abs(U[0, 0]))\n else:\n theta = 2 * math.asin(abs(U[1, 0]))\n # Find phi and lambda\n phase11 = 0.0\n phase10 = 0.0\n if abs(math.cos(theta/2.0)) > _CUTOFF_PRECISION:\n phase11 = U[1, 1] / math.cos(theta/2.0)\n if abs(math.sin(theta/2.0)) > _CUTOFF_PRECISION:\n phase10 = U[1, 0] / math.sin(theta/2.0)\n phiplambda = 2 * math.atan2(np.imag(phase11), np.real(phase11))\n phimlambda = 2 * math.atan2(np.imag(phase10), np.real(phase10))\n phi = 0.0\n if abs(U[0, 0]) > 
_CUTOFF_PRECISION and abs(U[1, 0]) > _CUTOFF_PRECISION:\n phi = (phiplambda + phimlambda) / 2.0\n lamb = (phiplambda - phimlambda) / 2.0\n else:\n if abs(U[0, 0]) < _CUTOFF_PRECISION:\n lamb = -phimlambda\n else:\n lamb = phiplambda\n # Check the solution\n Rzphi = np.array([[np.exp(-1j*phi/2.0), 0],\n [0, np.exp(1j*phi/2.0)]], dtype=complex)\n Rytheta = np.array([[np.cos(theta/2.0), -np.sin(theta/2.0)],\n [np.sin(theta/2.0), np.cos(theta/2.0)]], dtype=complex)\n Rzlambda = np.array([[np.exp(-1j*lamb/2.0), 0],\n [0, np.exp(1j*lamb/2.0)]], dtype=complex)\n V = np.dot(Rzphi, np.dot(Rytheta, Rzlambda))\n if la.norm(V - U) > _CUTOFF_PRECISION:\n raise QiskitError(\"euler_angles_1q: incorrect result\")\n return theta, phi, lamb"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsimplify a general U gate.", "response": "def simplify_U(theta, phi, lam):\n \"\"\"Return the gate u1, u2, or u3 implementing U with the fewest pulses.\n\n The returned gate implements U exactly, not up to a global phase.\n\n Args:\n theta, phi, lam: input Euler rotation angles for a general U gate\n\n Returns:\n Gate: one of IdGate, U1Gate, U2Gate, U3Gate.\n \"\"\"\n gate = U3Gate(theta, phi, lam)\n # Y rotation is 0 mod 2*pi, so the gate is a u1\n if abs(gate.params[0] % (2.0 * math.pi)) < _CUTOFF_PRECISION:\n gate = U1Gate(gate.params[0] + gate.params[1] + gate.params[2])\n # Y rotation is pi/2 or -pi/2 mod 2*pi, so the gate is a u2\n if isinstance(gate, U3Gate):\n # theta = pi/2 + 2*k*pi\n if abs((gate.params[0] - math.pi / 2) % (2.0 * math.pi)) < _CUTOFF_PRECISION:\n gate = U2Gate(gate.params[1],\n gate.params[2] + (gate.params[0] - math.pi / 2))\n # theta = -pi/2 + 2*k*pi\n if abs((gate.params[0] + math.pi / 2) % (2.0 * math.pi)) < _CUTOFF_PRECISION:\n gate = U2Gate(gate.params[1] + math.pi,\n gate.params[2] - math.pi + (gate.params[0] + math.pi / 2))\n # u1 and lambda is 0 mod 4*pi so gate is nop\n if isinstance(gate, U1Gate) and abs(gate.params[0] % (4.0 * math.pi)) 
< _CUTOFF_PRECISION:\n gate = IdGate()\n return gate"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef two_qubit_kak(unitary):\n if hasattr(unitary, 'to_operator'):\n # If input is a BaseOperator subclass this attempts to convert\n # the object to an Operator so that we can extract the underlying\n # numpy matrix from `Operator.data`.\n unitary = unitary.to_operator().data\n if hasattr(unitary, 'to_matrix'):\n # If input is Gate subclass or some other class object that has\n # a to_matrix method this will call that method.\n unitary = unitary.to_matrix()\n # Convert to numpy array incase not already an array\n unitary_matrix = np.array(unitary, dtype=complex)\n # Check input is a 2-qubit unitary\n if unitary_matrix.shape != (4, 4):\n raise QiskitError(\"two_qubit_kak: Expected 4x4 matrix\")\n if not is_unitary_matrix(unitary_matrix):\n raise QiskitError(\"Input matrix is not unitary.\")\n phase = la.det(unitary_matrix)**(-1.0/4.0)\n # Make it in SU(4), correct phase at the end\n U = phase * unitary_matrix\n # B changes to the Bell basis\n B = (1.0/math.sqrt(2)) * np.array([[1, 1j, 0, 0],\n [0, 0, 1j, 1],\n [0, 0, 1j, -1],\n [1, -1j, 0, 0]], dtype=complex)\n\n # We also need B.conj().T below\n Bdag = B.conj().T\n # U' = Bdag . U . B\n Uprime = Bdag.dot(U.dot(B))\n # M^2 = trans(U') . 
U'\n M2 = Uprime.T.dot(Uprime)\n\n # Diagonalize M2\n # Must use diagonalization routine which finds a real orthogonal matrix P\n # when M2 is real.\n D, P = la.eig(M2)\n D = np.diag(D)\n # If det(P) == -1 then in O(4), apply a swap to make P in SO(4)\n if abs(la.det(P)+1) < 1e-5:\n swap = np.array([[1, 0, 0, 0],\n [0, 0, 1, 0],\n [0, 1, 0, 0],\n [0, 0, 0, 1]], dtype=complex)\n P = P.dot(swap)\n D = swap.dot(D.dot(swap))\n\n Q = np.sqrt(D) # array from elementwise sqrt\n # Want to take square root so that Q has determinant 1\n if abs(la.det(Q)+1) < 1e-5:\n Q[0, 0] = -Q[0, 0]\n\n # Q^-1*P.T = P' -> QP' = P.T (solve for P' using Ax=b)\n Pprime = la.solve(Q, P.T)\n # K' now just U' * P * P'\n Kprime = Uprime.dot(P.dot(Pprime))\n\n K1 = B.dot(Kprime.dot(P.dot(Bdag)))\n A = B.dot(Q.dot(Bdag))\n K2 = B.dot(P.T.dot(Bdag))\n # KAK = K1 * A * K2\n KAK = K1.dot(A.dot(K2))\n\n # Verify decomp matches input unitary.\n if la.norm(KAK - U) > 1e-6:\n raise QiskitError(\"two_qubit_kak: KAK decomposition \" +\n \"does not return input unitary.\")\n\n # Compute parameters alpha, beta, gamma so that\n # A = exp(i * (alpha * XX + beta * YY + gamma * ZZ))\n xx = np.array([[0, 0, 0, 1],\n [0, 0, 1, 0],\n [0, 1, 0, 0],\n [1, 0, 0, 0]], dtype=complex)\n\n yy = np.array([[0, 0, 0, -1],\n [0, 0, 1, 0],\n [0, 1, 0, 0],\n [-1, 0, 0, 0]], dtype=complex)\n\n zz = np.array([[1, 0, 0, 0],\n [0, -1, 0, 0],\n [0, 0, -1, 0],\n [0, 0, 0, 1]], dtype=complex)\n\n A_real_tr = A.real.trace()\n alpha = math.atan2(A.dot(xx).imag.trace(), A_real_tr)\n beta = math.atan2(A.dot(yy).imag.trace(), A_real_tr)\n gamma = math.atan2(A.dot(zz).imag.trace(), A_real_tr)\n\n # K1 = kron(U1, U2) and K2 = kron(V1, V2)\n # Find the matrices U1, U2, V1, V2\n\n # Find a block in K1 where U1_ij * [U2] is not zero\n L = K1[0:2, 0:2]\n if la.norm(L) < 1e-9:\n L = K1[0:2, 2:4]\n if la.norm(L) < 1e-9:\n L = K1[2:4, 2:4]\n # Remove the U1_ij prefactor\n Q = L.dot(L.conj().T)\n U2 = L / math.sqrt(Q[0, 0].real)\n\n # Now grab U1 
given we know U2\n R = K1.dot(np.kron(np.identity(2), U2.conj().T))\n U1 = np.zeros((2, 2), dtype=complex)\n U1[0, 0] = R[0, 0]\n U1[0, 1] = R[0, 2]\n U1[1, 0] = R[2, 0]\n U1[1, 1] = R[2, 2]\n\n # Repeat K1 routine for K2\n L = K2[0:2, 0:2]\n if la.norm(L) < 1e-9:\n L = K2[0:2, 2:4]\n if la.norm(L) < 1e-9:\n L = K2[2:4, 2:4]\n Q = np.dot(L, np.transpose(L.conjugate()))\n V2 = L / np.sqrt(Q[0, 0])\n R = np.dot(K2, np.kron(np.identity(2), np.transpose(V2.conjugate())))\n\n V1 = np.zeros_like(U1)\n V1[0, 0] = R[0, 0]\n V1[0, 1] = R[0, 2]\n V1[1, 0] = R[2, 0]\n V1[1, 1] = R[2, 2]\n\n if la.norm(np.kron(U1, U2) - K1) > 1e-4:\n raise QiskitError(\"two_qubit_kak: K1 != U1 x U2\")\n if la.norm(np.kron(V1, V2) - K2) > 1e-4:\n raise QiskitError(\"two_qubit_kak: K2 != V1 x V2\")\n\n test = la.expm(1j*(alpha * xx + beta * yy + gamma * zz))\n if la.norm(A - test) > 1e-4:\n raise QiskitError(\"two_qubit_kak: \" +\n \"Matrix A does not match xx,yy,zz decomposition.\")\n\n # Circuit that implements K1 * A * K2 (up to phase), using\n # Vatan and Williams Fig. 
6 of quant-ph/0308006v3\n # Include prefix and suffix single-qubit gates into U2, V1 respectively.\n\n V2 = np.array([[np.exp(1j*np.pi/4), 0],\n [0, np.exp(-1j*np.pi/4)]], dtype=complex).dot(V2)\n U1 = U1.dot(np.array([[np.exp(-1j*np.pi/4), 0],\n [0, np.exp(1j*np.pi/4)]], dtype=complex))\n\n # Corrects global phase: exp(ipi/4)*phase'\n U1 = U1.dot(np.array([[np.exp(1j*np.pi/4), 0],\n [0, np.exp(1j*np.pi/4)]], dtype=complex))\n U1 = phase.conjugate() * U1\n\n # Test\n g1 = np.kron(V1, V2)\n g2 = np.array([[1, 0, 0, 0],\n [0, 0, 0, 1],\n [0, 0, 1, 0],\n [0, 1, 0, 0]], dtype=complex)\n\n theta = 2*gamma - np.pi/2\n\n Ztheta = np.array([[np.exp(1j*theta/2), 0],\n [0, np.exp(-1j*theta/2)]], dtype=complex)\n\n kappa = np.pi/2 - 2*alpha\n Ykappa = np.array([[math.cos(kappa/2), math.sin(kappa/2)],\n [-math.sin(kappa/2), math.cos(kappa/2)]], dtype=complex)\n g3 = np.kron(Ztheta, Ykappa)\n g4 = np.array([[1, 0, 0, 0],\n [0, 1, 0, 0],\n [0, 0, 0, 1],\n [0, 0, 1, 0]], dtype=complex)\n\n zeta = 2*beta - np.pi/2\n Yzeta = np.array([[math.cos(zeta/2), math.sin(zeta/2)],\n [-math.sin(zeta/2), math.cos(zeta/2)]], dtype=complex)\n g5 = np.kron(np.identity(2), Yzeta)\n g6 = g2\n g7 = np.kron(U1, U2)\n\n V = g2.dot(g1)\n V = g3.dot(V)\n V = g4.dot(V)\n V = g5.dot(V)\n V = g6.dot(V)\n V = g7.dot(V)\n\n if la.norm(V - U*phase.conjugate()) > 1e-6:\n raise QiskitError(\"two_qubit_kak: \" +\n \"sequence incorrect, unknown error\")\n\n v1_param = euler_angles_1q(V1)\n v2_param = euler_angles_1q(V2)\n u1_param = euler_angles_1q(U1)\n u2_param = euler_angles_1q(U2)\n\n v1_gate = U3Gate(v1_param[0], v1_param[1], v1_param[2])\n v2_gate = U3Gate(v2_param[0], v2_param[1], v2_param[2])\n u1_gate = U3Gate(u1_param[0], u1_param[1], u1_param[2])\n u2_gate = U3Gate(u2_param[0], u2_param[1], u2_param[2])\n\n q = QuantumRegister(2)\n return_circuit = QuantumCircuit(q)\n\n return_circuit.append(v1_gate, [q[1]])\n\n return_circuit.append(v2_gate, [q[0]])\n\n return_circuit.append(CnotGate(), [q[0], 
q[1]])\n\n gate = U3Gate(0.0, 0.0, -2.0*gamma + np.pi/2.0)\n return_circuit.append(gate, [q[1]])\n\n gate = U3Gate(-np.pi/2.0 + 2.0*alpha, 0.0, 0.0)\n return_circuit.append(gate, [q[0]])\n\n return_circuit.append(CnotGate(), [q[1], q[0]])\n\n gate = U3Gate(-2.0*beta + np.pi/2.0, 0.0, 0.0)\n return_circuit.append(gate, [q[0]])\n\n return_circuit.append(CnotGate(), [q[0], q[1]])\n\n return_circuit.append(u1_gate, [q[1]])\n\n return_circuit.append(u2_gate, [q[0]])\n\n return return_circuit", "response": "Decompose a two-qubit gate into SU(2) + CNOT using the KAK decomposition."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nextend a DAG with virtual qubits that are not in the circuit yet.", "response": "def run(self, dag):\n \"\"\"\n Extends dag with virtual qubits that are in layout but not in the circuit yet.\n\n Args:\n dag (DAGCircuit): DAG to extend.\n\n Returns:\n DAGCircuit: An extended DAG.\n\n Raises:\n TranspilerError: If there is no layout in the property set and none was set at init time.\n \"\"\"\n self.layout = self.layout or self.property_set['layout']\n\n if self.layout is None:\n raise TranspilerError(\"EnlargeWithAncilla requires property_set[\\\"layout\\\"] or\"\n \" \\\"layout\\\" parameter to run\")\n\n layout_virtual_qubits = self.layout.get_virtual_bits().keys()\n new_qregs = set(virtual_qubit[0] for virtual_qubit in layout_virtual_qubits\n if virtual_qubit not in dag.wires)\n\n for qreg in new_qregs:\n dag.add_qreg(qreg)\n\n return dag"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndefining the ch gate's decomposition rule.", "response": "def _define(self):\n \"\"\"\n gate ch a,b {\n h b;\n sdg b;\n cx a,b;\n h b;\n t b;\n cx a,b;\n t b;\n h b;\n s b;\n x b;\n s a;}\n \"\"\"\n definition = []\n q = QuantumRegister(2, \"q\")\n rule = [\n (HGate(), [q[1]], []),\n (SdgGate(), [q[1]], []),\n (CnotGate(), [q[0], q[1]], []),\n (HGate(), [q[1]], []),\n (TGate(), [q[1]], []),\n (CnotGate(), 
[q[0], q[1]], []),\n (TGate(), [q[1]], []),\n (HGate(), [q[1]], []),\n (SGate(), [q[1]], []),\n (XGate(), [q[1]], []),\n (SGate(), [q[0]], [])\n ]\n for inst in rule:\n definition.append(inst)\n self.definition = definition"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef config_tab(backend):\n status = backend.status().to_dict()\n config = backend.configuration().to_dict()\n\n config_dict = {**status, **config}\n\n upper_list = ['n_qubits', 'operational',\n 'status_msg', 'pending_jobs',\n 'basis_gates', 'local', 'simulator']\n\n lower_list = list(set(config_dict.keys()).difference(upper_list))\n # Remove gates because they are in a different tab\n lower_list.remove('gates')\n upper_str = \"\"\n upper_str += \"\"\"\"\"\"\n\n footer = \"
\"\n\n # Upper HBox widget data\n\n upper_str += \"PropertyValue\"\n for key in upper_list:\n upper_str += \"%s%s\" % (\n key, config_dict[key])\n upper_str += footer\n\n upper_table = widgets.HTML(\n value=upper_str, layout=widgets.Layout(width='100%', grid_area='left'))\n\n image_widget = widgets.Output(\n layout=widgets.Layout(display='flex-inline', grid_area='right',\n padding='10px 10px 10px 10px',\n width='auto', max_height='300px',\n align_items='center'))\n\n if not config['simulator']:\n with image_widget:\n gate_map = plot_gate_map(backend)\n display(gate_map)\n plt.close(gate_map)\n\n lower_str = \"\"\n lower_str += \"\"\"\"\"\"\n lower_str += \"\"\n for key in lower_list:\n if key != 'name':\n lower_str += \"\" % (\n key, config_dict[key])\n lower_str += footer\n\n lower_table = widgets.HTML(value=lower_str,\n layout=widgets.Layout(\n width='auto',\n grid_area='bottom'))\n\n grid = widgets.GridBox(children=[upper_table, image_widget, lower_table],\n layout=widgets.Layout(\n grid_template_rows='auto auto',\n grid_template_columns='25% 25% 25% 25%',\n grid_template_areas='''\n \"left right right right\"\n \"bottom bottom bottom bottom\"\n ''',\n grid_gap='0px 0px'))\n\n return grid", "response": "A function that creates a new configuration tab."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ndisplay a detailed noise map for a particular backend.", "response": "def detailed_map(backend):\n \"\"\"Widget for displaying detailed noise map.\n\n Args:\n backend (IBMQbackend): The backend.\n\n Returns:\n GridBox: Widget holding noise map images.\n \"\"\"\n props = backend.properties().to_dict()\n config = backend.configuration().to_dict()\n single_gate_errors = [q['parameters'][0]['value']\n for q in props['gates'][2:3*config['n_qubits']:3]]\n single_norm = matplotlib.colors.Normalize(\n vmin=min(single_gate_errors), vmax=max(single_gate_errors))\n q_colors = [cm.viridis(single_norm(err)) for err in single_gate_errors]\n\n cmap = 
config['coupling_map']\n\n cx_errors = []\n for line in cmap:\n for item in props['gates'][3*config['n_qubits']:]:\n if item['qubits'] == line:\n cx_errors.append(item['parameters'][0]['value'])\n break\n else:\n continue\n\n cx_norm = matplotlib.colors.Normalize(\n vmin=min(cx_errors), vmax=max(cx_errors))\n line_colors = [cm.viridis(cx_norm(err)) for err in cx_errors]\n\n single_widget = widgets.Output(layout=widgets.Layout(display='flex-inline', grid_area='left',\n align_items='center'))\n\n cmap_widget = widgets.Output(layout=widgets.Layout(display='flex-inline', grid_area='top',\n width='auto', height='auto',\n align_items='center'))\n\n cx_widget = widgets.Output(layout=widgets.Layout(display='flex-inline', grid_area='right',\n align_items='center'))\n\n tick_locator = mpl.ticker.MaxNLocator(nbins=5)\n with cmap_widget:\n noise_map = plot_gate_map(backend, qubit_color=q_colors,\n line_color=line_colors,\n qubit_size=28,\n plot_directed=True)\n width, height = noise_map.get_size_inches()\n\n noise_map.set_size_inches(1.25*width, 1.25*height)\n\n display(noise_map)\n plt.close(noise_map)\n\n with single_widget:\n cbl_fig = plt.figure(figsize=(3, 1))\n ax1 = cbl_fig.add_axes([0.05, 0.80, 0.9, 0.15])\n single_cb = mpl.colorbar.ColorbarBase(ax1, cmap=cm.viridis,\n norm=single_norm,\n orientation='horizontal')\n single_cb.locator = tick_locator\n single_cb.update_ticks()\n ax1.set_title('Single-qubit U3 error rate')\n display(cbl_fig)\n plt.close(cbl_fig)\n\n with cx_widget:\n cx_fig = plt.figure(figsize=(3, 1))\n ax2 = cx_fig.add_axes([0.05, 0.80, 0.9, 0.15])\n cx_cb = mpl.colorbar.ColorbarBase(ax2, cmap=cm.viridis,\n norm=cx_norm,\n orientation='horizontal')\n cx_cb.locator = tick_locator\n cx_cb.update_ticks()\n ax2.set_title('CNOT error rate')\n display(cx_fig)\n plt.close(cx_fig)\n\n out_box = widgets.GridBox([single_widget, cmap_widget, cx_widget],\n layout=widgets.Layout(\n grid_template_rows='auto auto',\n grid_template_columns='33% 33% 33%',\n 
grid_template_areas='''\n \"top top top\"\n \"left . right\"\n ''',\n grid_gap='0px 0px'))\n return out_box"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef job_history(backend):\n year = widgets.Output(layout=widgets.Layout(display='flex-inline',\n align_items='center',\n min_height='400px'))\n\n month = widgets.Output(layout=widgets.Layout(display='flex-inline',\n align_items='center',\n min_height='400px'))\n\n week = widgets.Output(layout=widgets.Layout(display='flex-inline',\n align_items='center',\n min_height='400px'))\n\n tabs = widgets.Tab(layout=widgets.Layout(max_height='620px'))\n tabs.children = [year, month, week]\n tabs.set_title(0, 'Year')\n tabs.set_title(1, 'Month')\n tabs.set_title(2, 'Week')\n tabs.selected_index = 1\n\n _build_job_history(tabs, backend)\n return tabs", "response": "A tab widget for displaying job history images."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nplot the job history of the user from the given list of jobs.", "response": "def plot_job_history(jobs, interval='year'):\n \"\"\"Plots the job history of the user from the given list of jobs.\n\n Args:\n jobs (list): A list of jobs with type IBMQjob.\n interval (str): Interval over which to examine.\n\n Returns:\n fig: A Matplotlib figure instance.\n \"\"\"\n def get_date(job):\n \"\"\"Returns a datetime object from a IBMQJob instance.\n\n Args:\n job (IBMQJob): A job.\n\n Returns:\n dt: A datetime object.\n \"\"\"\n return datetime.datetime.strptime(job.creation_date(),\n '%Y-%m-%dT%H:%M:%S.%fZ')\n\n current_time = datetime.datetime.now()\n\n if interval == 'year':\n bins = [(current_time - datetime.timedelta(days=k*365/12))\n for k in range(12)]\n elif interval == 'month':\n bins = [(current_time - datetime.timedelta(days=k)) for k in range(30)]\n elif interval == 'week':\n bins = [(current_time - datetime.timedelta(days=k)) for k in range(7)]\n\n binned_jobs = 
[0]*len(bins)\n\n if interval == 'year':\n for job in jobs:\n for ind, dat in enumerate(bins):\n date = get_date(job)\n if date.month == dat.month:\n binned_jobs[ind] += 1\n break\n else:\n continue\n else:\n for job in jobs:\n for ind, dat in enumerate(bins):\n date = get_date(job)\n if date.day == dat.day and date.month == dat.month:\n binned_jobs[ind] += 1\n break\n else:\n continue\n\n nz_bins = []\n nz_idx = []\n for ind, val in enumerate(binned_jobs):\n if val != 0:\n nz_idx.append(ind)\n nz_bins.append(val)\n\n total_jobs = sum(binned_jobs)\n\n colors = ['#003f5c', '#ffa600', '#374c80', '#ff764a',\n '#7a5195', '#ef5675', '#bc5090']\n\n if interval == 'year':\n labels = ['{}-{}'.format(str(bins[b].year)[2:], bins[b].month) for b in nz_idx]\n else:\n labels = ['{}-{}'.format(bins[b].month, bins[b].day) for b in nz_idx]\n fig, ax = plt.subplots(1, 1, figsize=(5, 5)) # pylint: disable=invalid-name\n ax.pie(nz_bins[::-1], labels=labels, colors=colors, textprops={'fontsize': 14},\n rotatelabels=True, counterclock=False)\n ax.add_artist(Circle((0, 0), 0.7, color='white', zorder=1))\n ax.text(0, 0, total_jobs, horizontalalignment='center',\n verticalalignment='center', fontsize=26)\n fig.tight_layout()\n return fig"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run(self, dag):\n resets = dag.op_nodes(Reset)\n for reset in resets:\n predecessor = next(dag.predecessors(reset))\n if predecessor.type == 'in':\n dag.remove_op_node(reset)\n return dag", "response": "Remove reset operations acting on qubits that are still in their initial zero state, and return the optimized DAG."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nplot the interpolated envelope of pulse.", "response": "def draw(self, **kwargs):\n \"\"\"Plot the interpolated envelope of pulse.\n\n Keyword Args:\n dt (float): Time interval of samples.\n interp_method (str): Method of interpolation\n (set `None` to turn off the interpolation).\n filename (str): Name required to save 
pulse image.\n interactive (bool): When set true show the circuit in a new window\n (this depends on the matplotlib backend being used supporting this).\n dpi (int): Resolution of saved image.\n nop (int): Data points for interpolation.\n size (tuple): Size of figure.\n \"\"\"\n from qiskit.tools.visualization import pulse_drawer\n\n return pulse_drawer(self._samples, self.duration, **kwargs)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef cu3(self, theta, phi, lam, ctl, tgt):\n return self.append(Cu3Gate(theta, phi, lam), [ctl, tgt], [])", "response": "Apply cu3 to tgt with angle theta phi lam."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _define(self):\n definition = []\n q = QuantumRegister(2, \"q\")\n rule = [\n (U1Gate((self.params[2] - self.params[1]) / 2), [q[1]], []),\n (CnotGate(), [q[0], q[1]], []),\n (U3Gate(-self.params[0] / 2, 0, -(self.params[1] + self.params[2]) / 2), [q[1]], []),\n (CnotGate(), [q[0], q[1]], []),\n (U3Gate(self.params[0] / 2, self.params[1], 0), [q[1]], [])\n ]\n for inst in rule:\n definition.append(inst)\n self.definition = definition", "response": "Define the related class."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef build_bell_circuit():\n q = QuantumRegister(2)\n c = ClassicalRegister(2)\n qc = QuantumCircuit(q, c)\n qc.h(q[0])\n qc.cx(q[0], q[1])\n qc.measure(q, c)\n return qc", "response": "Builds a Bell circuit."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _transpile_circuit(circuit_config_tuple):\n circuit, transpile_config = circuit_config_tuple\n\n # if the pass manager is not already selected, choose an appropriate one.\n if transpile_config.pass_manager:\n pass_manager = transpile_config.pass_manager\n\n elif transpile_config.coupling_map:\n pass_manager = 
default_pass_manager(transpile_config.basis_gates,\n transpile_config.coupling_map,\n transpile_config.initial_layout,\n transpile_config.seed_transpiler)\n else:\n pass_manager = default_pass_manager_simulator(transpile_config.basis_gates)\n\n return pass_manager.run(circuit)", "response": "Select a PassManager and run a single circuit through it."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns a list of TranspileConfig objects for each argument in the list of circuits.", "response": "def _parse_transpile_args(circuits, backend,\n basis_gates, coupling_map, backend_properties,\n initial_layout, seed_transpiler, optimization_level,\n pass_manager):\n \"\"\"Resolve the various types of args allowed to the transpile() function through\n duck typing, overriding args, etc. Refer to the transpile() docstring for details on\n what types of inputs are allowed.\n\n Here the args are resolved by converting them to standard instances, and prioritizing\n them in case a transpile option is passed through multiple args (explicitly setting an\n arg has more priority than the arg set by backend)\n\n Returns:\n list[TranspileConfig]: a transpile config for each circuit, which is a standardized\n object that configures the transpiler and determines the pass manager to use.\n \"\"\"\n # Each arg could be single or a list. If list, it must be the same size as\n # number of circuits. 
If single, duplicate to create a list of that size.\n num_circuits = len(circuits)\n\n basis_gates = _parse_basis_gates(basis_gates, backend, circuits)\n\n coupling_map = _parse_coupling_map(coupling_map, backend, num_circuits)\n\n backend_properties = _parse_backend_properties(backend_properties, backend, num_circuits)\n\n initial_layout = _parse_initial_layout(initial_layout, circuits)\n\n seed_transpiler = _parse_seed_transpiler(seed_transpiler, num_circuits)\n\n optimization_level = _parse_optimization_level(optimization_level, num_circuits)\n\n pass_manager = _parse_pass_manager(pass_manager, num_circuits)\n\n transpile_configs = []\n for args in zip(basis_gates, coupling_map, backend_properties, initial_layout,\n seed_transpiler, optimization_level, pass_manager):\n transpile_config = TranspileConfig(basis_gates=args[0],\n coupling_map=args[1],\n backend_properties=args[2],\n initial_layout=args[3],\n seed_transpiler=args[4],\n optimization_level=args[5],\n pass_manager=args[6])\n transpile_configs.append(transpile_config)\n\n return transpile_configs"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef execute(experiments, backend,\n basis_gates=None, coupling_map=None, # circuit transpile options\n backend_properties=None, initial_layout=None,\n seed_transpiler=None, optimization_level=None, pass_manager=None,\n qobj_id=None, qobj_header=None, shots=1024, # common run options\n memory=False, max_credits=10, seed_simulator=None,\n default_qubit_los=None, default_meas_los=None, # schedule run options\n schedule_los=None, meas_level=2, meas_return='avg',\n memory_slots=None, memory_slot_size=100, rep_time=None, parameter_binds=None,\n seed=None, seed_mapper=None, # deprecated\n config=None, circuits=None,\n **run_config):\n \"\"\"Execute a list of circuits or pulse schedules on a backend.\n\n The execution is asynchronous, and a handle to a job instance is returned.\n\n Args:\n experiments (QuantumCircuit or 
list[QuantumCircuit] or Schedule or list[Schedule]):\n Circuit(s) or pulse schedule(s) to execute\n\n backend (BaseBackend):\n Backend to execute circuits on.\n Transpiler options are automatically grabbed from\n backend.configuration() and backend.properties().\n If any other option is explicitly set (e.g. coupling_map), it\n will override the backend's.\n\n basis_gates (list[str]):\n List of basis gate names to unroll to.\n e.g:\n ['u1', 'u2', 'u3', 'cx']\n If None, do not unroll.\n\n coupling_map (CouplingMap or list):\n Coupling map (perhaps custom) to target in mapping.\n Multiple formats are supported:\n a. CouplingMap instance\n\n b. list\n Must be given as an adjacency matrix, where each entry\n specifies all two-qubit interactions supported by backend\n e.g:\n [[0, 1], [0, 3], [1, 2], [1, 5], [2, 5], [4, 1], [5, 3]]\n\n backend_properties (BackendProperties):\n Properties returned by a backend, including information on gate\n errors, readout errors, qubit coherence times, etc. For a backend\n that provides this information, it can be obtained with:\n ``backend.properties()``\n\n initial_layout (Layout or dict or list):\n Initial position of virtual qubits on physical qubits.\n If this layout makes the circuit compatible with the coupling_map\n constraints, it will be used.\n The final layout is not guaranteed to be the same, as the transpiler\n may permute qubits through swaps or other means.\n\n Multiple formats are supported:\n a. Layout instance\n\n b. dict\n virtual to physical:\n {qr[0]: 0,\n qr[1]: 3,\n qr[2]: 5}\n\n physical to virtual:\n {0: qr[0],\n 3: qr[1],\n 5: qr[2]}\n\n c. 
list\n virtual to physical:\n [0, 3, 5] # virtual qubits are ordered (in addition to named)\n\n physical to virtual:\n [qr[0], None, None, qr[1], None, qr[2]]\n\n seed_transpiler (int):\n Sets random seed for the stochastic parts of the transpiler\n\n optimization_level (int):\n How much optimization to perform on the circuits.\n Higher levels generate more optimized circuits,\n at the expense of longer transpilation time.\n 0: no optimization\n 1: light optimization\n 2: heavy optimization\n\n pass_manager (PassManager):\n The pass manager to use during transpilation. If this arg is present,\n auto-selection of pass manager based on the transpile options will be\n turned off and this pass manager will be used directly.\n\n qobj_id (str):\n String identifier to annotate the Qobj\n\n qobj_header (QobjHeader or dict):\n User input that will be inserted in Qobj header, and will also be\n copied to the corresponding Result header. Headers do not affect the run.\n\n shots (int):\n Number of repetitions of each circuit, for sampling. Default: 1024\n\n memory (bool):\n If True, per-shot measurement bitstrings are returned as well\n (provided the backend supports it). For OpenPulse jobs, only\n measurement level 2 supports this option. Default: False\n\n max_credits (int):\n Maximum credits to spend on job. 
Default: 10\n\n seed_simulator (int):\n Random seed to control sampling, for when backend is a simulator\n\n default_qubit_los (list):\n List of default qubit lo frequencies\n\n default_meas_los (list):\n List of default meas lo frequencies\n\n schedule_los (None or list[Union[Dict[PulseChannel, float], LoConfig]] or\n Union[Dict[PulseChannel, float], LoConfig]):\n Experiment LO configurations\n\n meas_level (int):\n Set the appropriate level of the measurement output for pulse experiments.\n\n meas_return (str):\n Level of measurement data for the backend to return\n For `meas_level` 0 and 1:\n \"single\" returns information from every shot.\n \"avg\" returns average measurement output (averaged over number of shots).\n\n memory_slots (int):\n Number of classical memory slots used in this job.\n\n memory_slot_size (int):\n Size of each memory slot if the output is Level 0.\n\n rep_time (int): repetition time of the experiment in \u03bcs.\n The delay between experiments will be rep_time.\n Must be from the list provided by the device.\n\n parameter_binds (list[dict{Parameter: Value}]):\n List of Parameter bindings over which the set of experiments will be\n executed. Each list element (bind) should be of the form\n {Parameter1: value1, Parameter2: value2, ...}. All binds will be\n executed across all experiments, e.g. if parameter_binds is a\n length-n list, and there are m experiments, a total of m x n\n experiments will be run (one for each experiment/bind pair).\n\n seed (int):\n DEPRECATED in 0.8: use ``seed_simulator`` kwarg instead\n\n seed_mapper (int):\n DEPRECATED in 0.8: use ``seed_transpiler`` kwarg instead\n\n config (dict):\n DEPRECATED in 0.8: use run_config instead\n\n circuits (QuantumCircuit or list[QuantumCircuit]):\n DEPRECATED in 0.8: use ``experiments`` kwarg instead.\n\n run_config (dict):\n Extra arguments used to configure the run (e.g. 
for Aer configurable backends)\n Refer to the backend documentation for details on these arguments\n Note: for now, these keyword arguments will both be copied to the\n Qobj config, and passed to backend.run()\n\n Returns:\n BaseJob: returns job instance derived from BaseJob\n\n Raises:\n QiskitError: if the execution cannot be interpreted as either circuits or schedules\n \"\"\"\n if circuits is not None:\n experiments = circuits\n warnings.warn(\"the `circuits` arg in `execute()` has been deprecated. \"\n \"please use `experiments`, which can handle both circuit \"\n \"and pulse Schedules\", DeprecationWarning)\n\n # transpiling the circuits using given transpile options\n experiments = transpile(experiments,\n basis_gates=basis_gates,\n coupling_map=coupling_map,\n backend_properties=backend_properties,\n initial_layout=initial_layout,\n seed_transpiler=seed_transpiler,\n optimization_level=optimization_level,\n backend=backend,\n pass_manager=pass_manager,\n seed_mapper=seed_mapper, # deprecated\n )\n\n # assembling the circuits into a qobj to be run on the backend\n qobj = assemble(experiments,\n qobj_id=qobj_id,\n qobj_header=qobj_header,\n shots=shots,\n memory=memory,\n max_credits=max_credits,\n seed_simulator=seed_simulator,\n default_qubit_los=default_qubit_los,\n default_meas_los=default_meas_los,\n schedule_los=schedule_los,\n meas_level=meas_level,\n meas_return=meas_return,\n memory_slots=memory_slots,\n memory_slot_size=memory_slot_size,\n rep_time=rep_time,\n parameter_binds=parameter_binds,\n backend=backend,\n config=config, # deprecated\n seed=seed, # deprecated\n run_config=run_config\n )\n\n # executing the circuits on the backend and returning the job\n return backend.run(qobj, **run_config)", "response": "Execute a list of circuits or pulse schedules on a backend."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the primary drive channel of this qubit.", "response": "def drive(self) -> 
DriveChannel:\n \"\"\"Return the primary drive channel of this qubit.\"\"\"\n if self._drives:\n return self._drives[0]\n else:\n raise PulseError(\"No drive channels in q[%d]\" % self._index)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the primary control channel of this qubit.", "response": "def control(self) -> ControlChannel:\n \"\"\"Return the primary control channel of this qubit.\"\"\"\n if self._controls:\n return self._controls[0]\n else:\n raise PulseError(\"No control channels in q[%d]\" % self._index)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the primary measure channel of this qubit.", "response": "def measure(self) -> MeasureChannel:\n \"\"\"Return the primary measure channel of this qubit.\"\"\"\n if self._measures:\n return self._measures[0]\n else:\n raise PulseError(\"No measurement channels in q[%d]\" % self._index)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the primary acquire channel of this qubit.", "response": "def acquire(self) -> AcquireChannel:\n \"\"\"Return the primary acquire channel of this qubit.\"\"\"\n if self._acquires:\n return self._acquires[0]\n else:\n raise PulseError(\"No acquire channels in q[%d]\" % self._index)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef assemble_circuits(circuits, qobj_id=None, qobj_header=None, run_config=None):\n qobj_config = QasmQobjConfig()\n if run_config:\n qobj_config = QasmQobjConfig(**run_config.to_dict())\n\n # Pack everything into the Qobj\n experiments = []\n max_n_qubits = 0\n max_memory_slots = 0\n for circuit in circuits:\n # header stuff\n n_qubits = 0\n memory_slots = 0\n qubit_labels = []\n clbit_labels = []\n\n qreg_sizes = []\n creg_sizes = []\n for qreg in circuit.qregs:\n qreg_sizes.append([qreg.name, qreg.size])\n for j in range(qreg.size):\n qubit_labels.append([qreg.name, 
j])\n n_qubits += qreg.size\n for creg in circuit.cregs:\n creg_sizes.append([creg.name, creg.size])\n for j in range(creg.size):\n clbit_labels.append([creg.name, j])\n memory_slots += creg.size\n\n # TODO: why do we need creq_sizes and qreg_sizes in header\n # TODO: we need to rethink memory_slots as they are tied to classical bit\n experimentheader = QobjExperimentHeader(qubit_labels=qubit_labels,\n n_qubits=n_qubits,\n qreg_sizes=qreg_sizes,\n clbit_labels=clbit_labels,\n memory_slots=memory_slots,\n creg_sizes=creg_sizes,\n name=circuit.name)\n # TODO: why do we need n_qubits and memory_slots in both the header and the config\n experimentconfig = QasmQobjExperimentConfig(n_qubits=n_qubits, memory_slots=memory_slots)\n\n # Convert conditionals from QASM-style (creg ?= int) to qobj-style\n # (register_bit ?= 1), by assuming device has unlimited register slots\n # (supported only for simulators). Map all measures to a register matching\n # their clbit_index, create a new register slot for every conditional gate\n # and add a bfunc to map the creg=val mask onto the gating register bit.\n\n is_conditional_experiment = any(op.control for (op, qargs, cargs) in circuit.data)\n max_conditional_idx = 0\n\n instructions = []\n for op_context in circuit.data:\n instruction = op_context[0].assemble()\n\n # Add register attributes to the instruction\n qargs = op_context[1]\n cargs = op_context[2]\n if qargs:\n qubit_indices = [qubit_labels.index([qubit[0].name, qubit[1]])\n for qubit in qargs]\n instruction.qubits = qubit_indices\n if cargs:\n clbit_indices = [clbit_labels.index([clbit[0].name, clbit[1]])\n for clbit in cargs]\n instruction.memory = clbit_indices\n # If the experiment has conditional instructions, assume every\n # measurement result may be needed for a conditional gate.\n if instruction.name == \"measure\" and is_conditional_experiment:\n instruction.register = clbit_indices\n\n # To convert to a qobj-style conditional, insert a bfunc prior\n # to the 
conditional instruction to map the creg ?= val condition\n # onto a gating register bit.\n if hasattr(instruction, '_control'):\n ctrl_reg, ctrl_val = instruction._control\n mask = 0\n val = 0\n for clbit in clbit_labels:\n if clbit[0] == ctrl_reg.name:\n mask |= (1 << clbit_labels.index(clbit))\n val |= (((ctrl_val >> clbit[1]) & 1) << clbit_labels.index(clbit))\n\n conditional_reg_idx = memory_slots + max_conditional_idx\n conversion_bfunc = QasmQobjInstruction(name='bfunc',\n mask=\"0x%X\" % mask,\n relation='==',\n val=\"0x%X\" % val,\n register=conditional_reg_idx)\n instructions.append(conversion_bfunc)\n instruction.conditional = conditional_reg_idx\n max_conditional_idx += 1\n # Delete control attribute now that we have replaced it with\n # the conditional and bfunc\n del instruction._control\n\n instructions.append(instruction)\n\n experiments.append(QasmQobjExperiment(instructions=instructions, header=experimentheader,\n config=experimentconfig))\n if n_qubits > max_n_qubits:\n max_n_qubits = n_qubits\n if memory_slots > max_memory_slots:\n max_memory_slots = memory_slots\n\n qobj_config.memory_slots = max_memory_slots\n qobj_config.n_qubits = max_n_qubits\n\n return QasmQobj(qobj_id=qobj_id,\n config=qobj_config,\n experiments=experiments,\n header=qobj_header)", "response": "Assembles a list of circuits into a single Qobj which can be run on the backend."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nassembles a list of schedules into a single Qobj which can be run on the backend.", "response": "def assemble_schedules(schedules, qobj_id=None, qobj_header=None, run_config=None):\n \"\"\"Assembles a list of schedules into a qobj which can be run on the backend.\n Args:\n schedules (list[Schedule]): schedules to assemble\n qobj_id (int): identifier for the generated qobj\n qobj_header (QobjHeader): header to pass to the results\n run_config (RunConfig): configuration of the runtime environment\n Returns:\n 
PulseQobj: the Qobj to be run on the backends\n Raises:\n QiskitError: when invalid schedules or configs are provided\n \"\"\"\n qobj_config = QasmQobjConfig()\n if run_config:\n qobj_config = QasmQobjConfig(**run_config.to_dict())\n\n # Get appropriate convertors\n instruction_converter = PulseQobjConverter\n instruction_converter = instruction_converter(PulseQobjInstruction, **run_config.to_dict())\n lo_converter = LoConfigConverter(PulseQobjExperimentConfig, run_config.qubit_lo_freq,\n run_config.meas_lo_freq, **run_config.to_dict())\n\n # Pack everything into the Qobj\n qobj_schedules = []\n user_pulselib = set()\n for idx, schedule in enumerate(schedules):\n # instructions\n qobj_instructions = []\n # Instructions are returned as tuple of shifted time and instruction\n for shift, instruction in schedule.instructions:\n # TODO: support conditional gate\n qobj_instructions.append(instruction_converter(shift, instruction))\n if isinstance(instruction, PulseInstruction):\n # add samples to pulse library\n user_pulselib.add(instruction.command)\n # experiment header\n qobj_experiment_header = QobjExperimentHeader(\n name=schedule.name or 'Experiment-%d' % idx\n )\n\n qobj_schedules.append({\n 'header': qobj_experiment_header,\n 'instructions': qobj_instructions\n })\n\n # setup pulse_library\n run_config.pulse_library = [QobjPulseLibrary(name=pulse.name, samples=pulse.samples)\n for pulse in user_pulselib]\n\n # create qob experiment field\n experiments = []\n if len(run_config.schedule_los) == 1:\n lo_dict = run_config.schedule_los.pop()\n # update global config\n q_los = lo_converter.get_qubit_los(lo_dict)\n if q_los:\n run_config.qubit_lo_freq = q_los\n m_los = lo_converter.get_meas_los(lo_dict)\n if m_los:\n run_config.meas_lo_freq = m_los\n\n if run_config.schedule_los:\n # multiple frequency setups\n if len(qobj_schedules) == 1:\n # frequency sweep\n for lo_dict in run_config.schedule_los:\n experiments.append(PulseQobjExperiment(\n 
instructions=qobj_schedules[0]['instructions'],\n experimentheader=qobj_schedules[0]['header'],\n experimentconfig=lo_converter(lo_dict)\n ))\n elif len(qobj_schedules) == len(run_config.schedule_los):\n # n:n setup\n for lo_dict, schedule in zip(run_config.schedule_los, qobj_schedules):\n experiments.append(PulseQobjExperiment(\n instructions=schedule['instructions'],\n experimentheader=schedule['header'],\n experimentconfig=lo_converter(lo_dict)\n ))\n else:\n raise QiskitError('Invalid LO setting is specified. '\n 'The LO should be configured for each schedule, or '\n 'single setup for all schedules (unique), or '\n 'multiple setups for a single schedule (frequency sweep),'\n 'or no LO configured at all.')\n else:\n # unique frequency setup\n for schedule in qobj_schedules:\n experiments.append(PulseQobjExperiment(\n instructions=schedule['instructions'],\n experimentheader=schedule['header'],\n ))\n\n qobj_config = PulseQobjConfig(**run_config.to_dict())\n\n return PulseQobj(qobj_id=qobj_id,\n config=qobj_config,\n experiments=experiments,\n header=qobj_header)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef assemble(experiments,\n backend=None,\n qobj_id=None, qobj_header=None, # common run options\n shots=1024, memory=False, max_credits=None, seed_simulator=None,\n default_qubit_los=None, default_meas_los=None, # schedule run options\n schedule_los=None, meas_level=2, meas_return='avg',\n memory_slots=None, memory_slot_size=100, rep_time=None, parameter_binds=None,\n config=None, seed=None, # deprecated\n **run_config):\n \"\"\"Assemble a list of circuits or pulse schedules into a Qobj.\n\n This function serializes the payloads, which could be either circuits or schedules,\n to create Qobj \"experiments\". 
It further annotates the experiment payload with\n header and configurations.\n\n Args:\n experiments (QuantumCircuit or list[QuantumCircuit] or Schedule or list[Schedule]):\n Circuit(s) or pulse schedule(s) to execute\n\n backend (BaseBackend):\n If set, some runtime options are automatically grabbed from\n backend.configuration() and backend.defaults().\n If any other option is explicitly set (e.g. rep_rate), it\n will override the backend's.\n If any other option is set in the run_config, it will\n also override the backend's.\n\n qobj_id (str):\n String identifier to annotate the Qobj\n\n qobj_header (QobjHeader or dict):\n User input that will be inserted in Qobj header, and will also be\n copied to the corresponding Result header. Headers do not affect the run.\n\n shots (int):\n Number of repetitions of each circuit, for sampling. Default: 1024\n\n memory (bool):\n If True, per-shot measurement bitstrings are returned as well\n (provided the backend supports it). For OpenPulse jobs, only\n measurement level 2 supports this option. Default: False\n\n max_credits (int):\n Maximum credits to spend on job. 
Default: 10\n\n seed_simulator (int):\n Random seed to control sampling, for when backend is a simulator\n\n default_qubit_los (list):\n List of default qubit lo frequencies\n\n default_meas_los (list):\n List of default meas lo frequencies\n\n schedule_los (None or list[Union[Dict[PulseChannel, float], LoConfig]] or\n Union[Dict[PulseChannel, float], LoConfig]):\n Experiment LO configurations\n\n meas_level (int):\n Set the appropriate level of the measurement output for pulse experiments.\n\n meas_return (str):\n Level of measurement data for the backend to return\n For `meas_level` 0 and 1:\n \"single\" returns information from every shot.\n \"avg\" returns average measurement output (averaged over number of shots).\n\n memory_slots (int):\n Number of classical memory slots used in this job.\n\n memory_slot_size (int):\n Size of each memory slot if the output is Level 0.\n\n rep_time (int): repetition time of the experiment in \u03bcs.\n The delay between experiments will be rep_time.\n Must be from the list provided by the device.\n\n parameter_binds (list[dict{Parameter: Value}]):\n List of Parameter bindings over which the set of experiments will be\n executed. Each list element (bind) should be of the form\n {Parameter1: value1, Parameter2: value2, ...}. All binds will be\n executed across all experiments, e.g. if parameter_binds is a\n length-n list, and there are m experiments, a total of m x n\n experiments will be run (one for each experiment/bind pair).\n\n seed (int):\n DEPRECATED in 0.8: use ``seed_simulator`` kwarg instead\n\n config (dict):\n DEPRECATED in 0.8: use run_config instead\n\n run_config (dict):\n extra arguments used to configure the run (e.g. for Aer configurable backends)\n Refer to the backend documentation for details on these arguments\n\n Returns:\n Qobj: a qobj which can be run on a backend. 
Depending on the type of input,\n this will be either a QasmQobj or a PulseQobj.\n\n Raises:\n QiskitError: if the input cannot be interpreted as either circuits or schedules\n \"\"\"\n # deprecation matter\n if config:\n warnings.warn('config is not used anymore. Set all configs in '\n 'run_config.', DeprecationWarning)\n run_config = run_config or config\n if seed:\n warnings.warn('seed is deprecated in favor of seed_simulator.', DeprecationWarning)\n seed_simulator = seed_simulator or seed\n\n # Get RunConfig(s) that will be inserted in Qobj to configure the run\n experiments = experiments if isinstance(experiments, list) else [experiments]\n qobj_id, qobj_header, run_config = _parse_run_args(backend, qobj_id, qobj_header,\n shots, memory, max_credits, seed_simulator,\n default_qubit_los, default_meas_los,\n schedule_los, meas_level, meas_return,\n memory_slots, memory_slot_size, rep_time,\n parameter_binds, **run_config)\n\n # assemble either circuits or schedules\n if all(isinstance(exp, QuantumCircuit) for exp in experiments):\n # If circuits are parameterized, bind parameters and remove from run_config\n bound_experiments, run_config = _expand_parameters(circuits=experiments,\n run_config=run_config)\n return assemble_circuits(circuits=bound_experiments, qobj_id=qobj_id,\n qobj_header=qobj_header, run_config=run_config)\n\n elif all(isinstance(exp, Schedule) for exp in experiments):\n return assemble_schedules(schedules=experiments, qobj_id=qobj_id,\n qobj_header=qobj_header, run_config=run_config)\n\n else:\n raise QiskitError(\"bad input to assemble() function; \"\n \"must be either circuits or schedules\")", "response": "Assemble a list of circuits or schedules into a single Qobj."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nparse the args for the assemble function.", "response": "def _parse_run_args(backend, qobj_id, qobj_header,\n shots, memory, max_credits, seed_simulator,\n default_qubit_los, 
default_meas_los,\n schedule_los, meas_level, meas_return,\n memory_slots, memory_slot_size, rep_time,\n parameter_binds, **run_config):\n \"\"\"Resolve the various types of args allowed to the assemble() function through\n duck typing, overriding args, etc. Refer to the assemble() docstring for details on\n what types of inputs are allowed.\n\n Here the args are resolved by converting them to standard instances, and prioritizing\n them in case a run option is passed through multiple args (explicitly setting an arg\n has more priority than the arg set by backend)\n\n Returns:\n RunConfig: a run config, which is a standardized object that configures the qobj\n and determines the runtime environment.\n \"\"\"\n # grab relevant info from backend if it exists\n backend_config = None\n backend_default = None\n if backend:\n backend_config = backend.configuration()\n # TODO : Remove usage of config.defaults when backend.defaults() is updated.\n try:\n backend_default = backend.defaults()\n except (ModelValidationError, AttributeError):\n from collections import namedtuple\n backend_config_defaults = getattr(backend_config, 'defaults', {})\n BackendDefault = namedtuple('BackendDefault', ('qubit_freq_est', 'meas_freq_est'))\n backend_default = BackendDefault(\n qubit_freq_est=backend_config_defaults.get('qubit_freq_est'),\n meas_freq_est=backend_config_defaults.get('meas_freq_est')\n )\n\n memory_slots = memory_slots or getattr(backend_config, 'memory_slots', None)\n rep_time = rep_time or getattr(backend_config, 'rep_times', None)\n if isinstance(rep_time, list):\n rep_time = rep_time[-1]\n\n parameter_binds = parameter_binds or []\n\n # add default empty lo config\n schedule_los = schedule_los or []\n if isinstance(schedule_los, (LoConfig, dict)):\n schedule_los = [schedule_los]\n\n # Convert to LoConfig if lo configuration supplied as dictionary\n schedule_los = [lo_config if isinstance(lo_config, LoConfig) else LoConfig(lo_config)\n for lo_config in schedule_los]\n\n 
qubit_lo_freq = default_qubit_los or getattr(backend_default, 'qubit_freq_est', [])\n meas_lo_freq = default_meas_los or getattr(backend_default, 'meas_freq_est', [])\n\n # an identifier for the Qobj\n qobj_id = qobj_id or str(uuid.uuid4())\n\n # The header that goes at the top of the Qobj (and later Result)\n # we process it as dict, then write entries that are not None to a QobjHeader object\n qobj_header = qobj_header or {}\n if isinstance(qobj_header, QobjHeader):\n qobj_header = qobj_header.to_dict()\n backend_name = getattr(backend_config, 'backend_name', None)\n backend_version = getattr(backend_config, 'backend_version', None)\n qobj_header = {**dict(backend_name=backend_name, backend_version=backend_version),\n **qobj_header}\n qobj_header = QobjHeader(**{k: v for k, v in qobj_header.items() if v is not None})\n\n # create run configuration and populate\n run_config_dict = dict(shots=shots,\n memory=memory,\n max_credits=max_credits,\n seed_simulator=seed_simulator,\n seed=seed_simulator, # deprecated\n qubit_lo_freq=qubit_lo_freq,\n meas_lo_freq=meas_lo_freq,\n schedule_los=schedule_los,\n meas_level=meas_level,\n meas_return=meas_return,\n memory_slots=memory_slots,\n memory_slot_size=memory_slot_size,\n rep_time=rep_time,\n parameter_binds=parameter_binds,\n **run_config)\n run_config = RunConfig(**{k: v for k, v in run_config_dict.items() if v is not None})\n\n return qobj_id, qobj_header, run_config"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nexpand all parameters in the input circuits and run_config.", "response": "def _expand_parameters(circuits, run_config):\n \"\"\"Verifies that there is a single common set of parameters shared between\n all circuits and all parameter binds in the run_config. 
Returns an expanded\n list of circuits (if parameterized) with all parameters bound, and a copy of\n the run_config with parameter_binds cleared.\n\n If neither the circuits nor the run_config specify parameters, the two are\n returned unmodified.\n\n Raises:\n QiskitError: if run_config parameters are not compatible with circuit parameters\n\n Returns:\n Tuple(List[QuantumCircuit], RunConfig):\n - List of input circuits expanded and with parameters bound\n - RunConfig with parameter_binds removed\n \"\"\"\n\n parameter_binds = run_config.parameter_binds\n if parameter_binds or \\\n any(circuit.parameters for circuit in circuits):\n\n all_bind_parameters = [bind.keys()\n for bind in parameter_binds]\n all_circuit_parameters = [circuit.parameters for circuit in circuits]\n\n # Collect set of all unique parameters across all circuits and binds\n unique_parameters = set(param\n for param_list in all_bind_parameters + all_circuit_parameters\n for param in param_list)\n\n # Check that all parameters are common to all circuits and binds\n if not all_bind_parameters \\\n or not all_circuit_parameters \\\n or any(unique_parameters != bind_params for bind_params in all_bind_parameters) \\\n or any(unique_parameters != parameters for parameters in all_circuit_parameters):\n raise QiskitError(\n ('Mismatch between run_config.parameter_binds and all circuit parameters. 
' +\n 'Parameter binds: {} ' +\n 'Circuit parameters: {}').format(all_bind_parameters, all_circuit_parameters))\n\n circuits = [circuit.bind_parameters(binds)\n for circuit in circuits\n for binds in parameter_binds]\n\n # All parameters have been expanded and bound, so remove from run_config\n run_config = copy.deepcopy(run_config)\n run_config.parameter_binds = []\n\n return circuits, run_config"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef unset_qiskit_logger():\n qiskit_logger = logging.getLogger('qiskit')\n for handler in qiskit_logger.handlers:\n qiskit_logger.removeHandler(handler)", "response": "Remove the handlers for the qiskit logger."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef iplot_state_hinton(rho, figsize=None):\n\n # HTML\n html_template = Template(\"\"\"\n

\n

\n

\n \"\"\")\n\n # JavaScript\n javascript_template = Template(\"\"\"\n \n \"\"\")\n rho = _validate_input_state(rho)\n if figsize is None:\n options = {}\n else:\n options = {'width': figsize[0], 'height': figsize[1]}\n\n # Process data and execute\n div_number = str(time.time())\n div_number = re.sub('[.]', '', div_number)\n\n # Process data and execute\n real = []\n imag = []\n for xvalue in rho:\n row_real = []\n col_imag = []\n\n for value_real in xvalue.real:\n row_real.append(float(value_real))\n real.append(row_real)\n\n for value_imag in xvalue.imag:\n col_imag.append(float(value_imag))\n imag.append(col_imag)\n\n html = html_template.substitute({\n 'divNumber': div_number\n })\n\n javascript = javascript_template.substitute({\n 'divNumber': div_number,\n 'executions': [{'data': real}, {'data': imag}],\n 'options': options\n })\n\n display(HTML(html + javascript))", "response": "Creates a 2D city style hinton representation of the input array."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef process_fidelity(channel1, channel2, require_cptp=True):\n # First we must determine if input is to be interpreted as a unitary matrix\n # or as a channel.\n # If input is a raw numpy array we will interpret it as a unitary matrix.\n is_cptp1 = None\n is_cptp2 = None\n if isinstance(channel1, (list, np.ndarray)):\n channel1 = Operator(channel1)\n if require_cptp:\n is_cptp1 = channel1.is_unitary()\n if isinstance(channel2, (list, np.ndarray)):\n channel2 = Operator(channel2)\n if require_cptp:\n is_cptp2 = channel2.is_unitary()\n\n # Next we convert inputs SuperOp objects\n # This works for objects that also have a `to_operator` or `to_channel` method\n s1 = SuperOp(channel1)\n s2 = SuperOp(channel2)\n\n # Check inputs are CPTP\n if require_cptp:\n # Only check SuperOp if we didn't already check unitary inputs\n if is_cptp1 is None:\n is_cptp1 = s1.is_cptp()\n if not is_cptp1:\n raise QiskitError('channel1 
is not CPTP')\n if is_cptp2 is None:\n is_cptp2 = s2.is_cptp()\n if not is_cptp2:\n raise QiskitError('channel2 is not CPTP')\n\n # Check dimensions match\n input_dim1, output_dim1 = s1.dim\n input_dim2, output_dim2 = s2.dim\n if input_dim1 != output_dim1 or input_dim2 != output_dim2:\n raise QiskitError('Input channels must have same size input and output dimensions.')\n if input_dim1 != input_dim2:\n raise QiskitError('Input channels have different dimensions.')\n\n # Compute process fidelity\n fidelity = np.trace(s1.compose(s2.adjoint()).data) / (input_dim1 ** 2)\n return fidelity", "response": "This function returns the process fidelity between two quantum channels."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef input(self, data):\n self.data = data\n self.lexer.input(data)", "response": "Set the input text data."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef pop(self):\n self.lexer = self.stack.pop()\n self.filename = self.lexer.qasm_file\n self.lineno = self.lexer.qasm_line", "response": "Pop a PLY lexer off the stack."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\npushing a PLY lexer on the stack to parse filename.", "response": "def push(self, filename):\n \"\"\"Push a PLY lexer on the stack to parse filename.\"\"\"\n self.lexer.qasm_file = self.filename\n self.lexer.qasm_line = self.lineno\n self.stack.append(self.lexer)\n self.__mklexer__(filename)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nparsing include file and return the next token.", "response": "def t_INCLUDE(self, t):\n 'include'\n #\n # Now eat up the next two tokens which must be\n # 1 - the name of the include file, and\n # 2 - a terminating semicolon\n #\n # Then push the current lexer onto the stack, create a new one from\n # the include file, and push it onto the stack.\n #\n # When we 
hit eof (the t_eof) rule, we pop.\n next_token = self.lexer.token()\n lineno = next_token.lineno\n # print('NEXT', next, \"next.value\", next.value, type(next))\n if isinstance(next_token.value, str):\n incfile = next_token.value.strip('\"')\n else:\n raise QasmError(\"Invalid include: must be a quoted string.\")\n\n if incfile in CORE_LIBS:\n incfile = os.path.join(CORE_LIBS_PATH, incfile)\n\n next_token = self.lexer.token()\n if next_token is None or next_token.value != ';':\n raise QasmError('Invalid syntax, missing \";\" at line', str(lineno))\n\n if not os.path.exists(incfile):\n raise QasmError(\n 'Include file %s cannot be found, line %s, file %s' %\n (incfile, str(next_token.lineno), self.filename))\n self.push(incfile)\n return self.lexer.token()"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef t_ID(self, t):\n r'[a-z][a-zA-Z0-9_]*'\n\n t.type = self.reserved.get(t.value, 'ID')\n if t.type == 'ID':\n t.value = node.Id(t.value, self.lineno, self.filename)\n return t", "response": "r A type can be used to set the ID attribute of a resource."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef create_from(cls, backend):\n backend_config = backend.configuration()\n\n # TODO : Remove usage of config.defaults when backend.defaults() is updated.\n try:\n backend_default = backend.defaults()\n except ModelValidationError:\n from collections import namedtuple\n BackendDefault = namedtuple('BackendDefault', ('qubit_freq_est', 'meas_freq_est'))\n\n backend_default = BackendDefault(\n qubit_freq_est=backend_config.defaults['qubit_freq_est'],\n meas_freq_est=backend_config.defaults['meas_freq_est']\n )\n\n # system size\n n_qubits = backend_config.n_qubits\n n_registers = backend_config.n_registers\n n_uchannels = backend_config.n_uchannels\n\n if n_uchannels > 0 and n_uchannels != n_qubits:\n raise PulseError(\"This version assumes no U-channels 
or #U-channels==#qubits.\")\n\n # frequency information\n qubit_lo_freqs = backend_default.qubit_freq_est\n qubit_lo_ranges = backend_config.qubit_lo_range\n meas_lo_freqs = backend_default.meas_freq_est\n meas_lo_ranges = backend_config.meas_lo_range\n\n # generate channels with assuming their numberings are aligned with qubits\n drives = [\n DriveChannel(i, qubit_lo_freqs[i], tuple(qubit_lo_ranges[i]))\n for i in range(n_qubits)\n ]\n measures = [\n MeasureChannel(i, meas_lo_freqs[i], tuple(meas_lo_ranges[i]))\n for i in range(n_qubits)\n ]\n acquires = [AcquireChannel(i) for i in range(n_qubits)]\n controls = [ControlChannel(i) for i in range(n_uchannels)]\n\n qubits = []\n for i in range(n_qubits):\n # TODO: get qubits <-> channels relationship from backend\n qubit = Qubit(i,\n drive_channels=[drives[i]],\n control_channels=None if n_uchannels == 0 else controls[i],\n measure_channels=[measures[i]],\n acquire_channels=[acquires[i]])\n qubits.append(qubit)\n\n registers = [RegisterSlot(i) for i in range(n_registers)]\n # TODO: get #mem_slots from backend\n mem_slots = [MemorySlot(i) for i in range(len(qubits))]\n\n return DeviceSpecification(qubits, registers, mem_slots)", "response": "Create a new instance of the object based on the given backend."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef run(self, dag):\n new_dag = DAGCircuit()\n for qreg in dag.qregs.values():\n new_dag.add_qreg(qreg)\n for creg in dag.cregs.values():\n new_dag.add_creg(creg)\n\n # compute ordered indices for the global circuit wires\n global_index_map = {}\n for wire in dag.wires:\n if not isinstance(wire[0], QuantumRegister):\n continue\n global_qregs = list(dag.qregs.values())\n global_index_map[wire] = global_qregs.index(wire[0]) + wire[1]\n\n blocks = self.property_set['block_list']\n nodes_seen = set()\n\n for node in dag.topological_op_nodes():\n # skip already-visited nodes or input/output nodes\n if node 
in nodes_seen or node.type == 'in' or node.type == 'out':\n continue\n # check if the node belongs to the next block\n if blocks and node in blocks[0]:\n block = blocks[0]\n # find the qubits involved in this block\n block_qargs = set()\n for nd in block:\n block_qargs |= set(nd.qargs)\n # convert block to a sub-circuit, then simulate unitary and add\n block_width = len(block_qargs)\n q = QuantumRegister(block_width)\n subcirc = QuantumCircuit(q)\n block_index_map = self._block_qargs_to_indices(block_qargs,\n global_index_map)\n for nd in block:\n nodes_seen.add(nd)\n subcirc.append(nd.op, [q[block_index_map[i]] for i in nd.qargs])\n unitary = UnitaryGate(Operator(subcirc)) # simulates the circuit\n new_dag.apply_operation_back(\n unitary, sorted(block_qargs, key=lambda x: block_index_map[x]))\n del blocks[0]\n else:\n # the node could belong to some future block, but in that case\n # we simply skip it. It is guaranteed that we will revisit that\n # future block, via its other nodes\n for block in blocks[1:]:\n if node in block:\n break\n # freestanding nodes can just be added\n else:\n nodes_seen.add(node)\n new_dag.apply_operation_back(node.op, node.qargs, node.cargs)\n\n return new_dag", "response": "iterate over each block and replace it with an equivalent UnitaryGate on the same wires."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nmap each qubit in block_qargs to its wire position among the block s wires.", "response": "def _block_qargs_to_indices(self, block_qargs, global_index_map):\n \"\"\"\n Map each qubit in block_qargs to its wire position among the block's wires.\n Args:\n block_qargs (list): list of qubits that a block acts on\n global_index_map (dict): mapping from each qubit in the\n circuit to its wire position within that circuit\n Returns:\n dict: mapping from qarg to position in block\n \"\"\"\n block_indices = [global_index_map[q] for q in block_qargs]\n ordered_block_indices = sorted(block_indices)\n block_positions 
= {q: ordered_block_indices.index(global_index_map[q])\n for q in block_qargs}\n return block_positions"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_bound_method(self, instruction):\n try:\n return self._bound_instructions[type(instruction)]\n except KeyError:\n raise PulseError('Qobj conversion method for %s is not found.' % instruction)", "response": "Get conversion method for instruction."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nconverting an acquire instruction to a new version of the QOBJ model.", "response": "def convert_acquire(self, shift, instruction):\n \"\"\"Return converted `AcquireInstruction`.\n\n Args:\n shift(int): Offset time.\n instruction (AcquireInstruction): acquire instruction.\n Returns:\n dict: Dictionary of required parameters.\n \"\"\"\n meas_level = self._run_config.get('meas_level', 2)\n\n command_dict = {\n 'name': 'acquire',\n 't0': shift+instruction.start_time,\n 'duration': instruction.duration,\n 'qubits': [q.index for q in instruction.acquires],\n 'memory_slot': [m.index for m in instruction.mem_slots]\n }\n if meas_level == 2:\n # setup discriminators\n if instruction.command.discriminator:\n command_dict.update({\n 'discriminators': [\n QobjMeasurementOption(\n name=instruction.command.discriminator.name,\n params=instruction.command.discriminator.params)\n ]\n })\n # setup register_slots\n command_dict.update({\n 'register_slot': [regs.index for regs in instruction.reg_slots]\n })\n if meas_level >= 1:\n # setup kernels\n if instruction.command.kernel:\n command_dict.update({\n 'kernels': [\n QobjMeasurementOption(\n name=instruction.command.kernel.name,\n params=instruction.command.kernel.params)\n ]\n })\n return self._qobj_model(**command_dict)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn converted frame change instruction.", "response": "def convert_frame_change(self, shift, instruction):\n 
\"\"\"Return converted `FrameChangeInstruction`.\n\n Args:\n shift(int): Offset time.\n instruction (FrameChangeInstruction): frame change instruction.\n Returns:\n dict: Dictionary of required parameters.\n \"\"\"\n command_dict = {\n 'name': 'fc',\n 't0': shift+instruction.start_time,\n 'ch': instruction.channels[0].name,\n 'phase': instruction.command.phase\n }\n return self._qobj_model(**command_dict)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn converted persistent value instruction.", "response": "def convert_persistent_value(self, shift, instruction):\n \"\"\"Return converted `PersistentValueInstruction`.\n\n Args:\n shift(int): Offset time.\n instruction (PersistentValueInstruction): persistent value instruction.\n Returns:\n dict: Dictionary of required parameters.\n \"\"\"\n command_dict = {\n 'name': 'pv',\n 't0': shift+instruction.start_time,\n 'ch': instruction.channels[0].name,\n 'val': instruction.command.value\n }\n return self._qobj_model(**command_dict)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn converted drive command.", "response": "def convert_drive(self, shift, instruction):\n \"\"\"Return converted `PulseInstruction`.\n\n Args:\n shift(int): Offset time.\n instruction (PulseInstruction): drive instruction.\n Returns:\n dict: Dictionary of required parameters.\n \"\"\"\n command_dict = {\n 'name': instruction.command.name,\n 't0': shift+instruction.start_time,\n 'ch': instruction.channels[0].name\n }\n return self._qobj_model(**command_dict)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef convert_snapshot(self, shift, instruction):\n command_dict = {\n 'name': 'snapshot',\n 't0': shift+instruction.start_time,\n 'label': instruction.name,\n 'type': instruction.type\n }\n return self._qobj_model(**command_dict)", "response": "Return converted `Snapshot`.\n\n Args:\n shift(int): Offset time.\n 
instruction (Snapshot): snapshot instruction.\n Returns:\n dict: Dictionary of required parameters."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _update_annotations(discretized_pulse: Callable) -> Callable:\n undecorated_annotations = list(discretized_pulse.__annotations__.items())\n decorated_annotations = undecorated_annotations[1:]\n decorated_annotations.insert(0, ('duration', int))\n discretized_pulse.__annotations__ = dict(decorated_annotations)\n return discretized_pulse", "response": "Update annotations of discretized continuous pulse function with duration."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _update_docstring(discretized_pulse: Callable, sampler_inst: Callable) -> Callable:\n wrapped_docstring = pydoc.render_doc(discretized_pulse, '%s')\n header, body = wrapped_docstring.split('\\n', 1)\n body = textwrap.indent(body, ' ')\n wrapped_docstring = header+body\n updated_ds = \"\"\"\n Discretized continuous pulse function: `{continuous_name}` using\n sampler: `{sampler_name}`.\n\n The first argument (time) of the continuous pulse function has been replaced with\n a discretized `duration` of type (int).\n\n Args:\n duration (int)\n *args: Remaining arguments of continuous pulse function.\n See continuous pulse function documentation below.\n **kwargs: Remaining kwargs of continuous pulse function.\n See continuous pulse function documentation below.\n\n Sampled continuous function:\n\n {continuous_doc}\n \"\"\".format(continuous_name=discretized_pulse.__name__,\n sampler_name=sampler_inst.__name__,\n continuous_doc=wrapped_docstring)\n\n discretized_pulse.__doc__ = updated_ds\n return discretized_pulse", "response": "Update the docstring of a continuous pulse function."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef sampler(sample_function: 
Callable) -> Callable:\n\n def generate_sampler(continuous_pulse: Callable) -> Callable:\n \"\"\"Return a decorated sampler function.\"\"\"\n\n @functools.wraps(continuous_pulse)\n def call_sampler(duration: int, *args, **kwargs) -> commands.SamplePulse:\n \"\"\"Replace the call to the continuous function with a call to the sampler applied\n to the analytic pulse function.\"\"\"\n sampled_pulse = sample_function(continuous_pulse, duration, *args, **kwargs)\n return np.asarray(sampled_pulse, dtype=np.complex_)\n\n # Update type annotations for wrapped continuous function to be discrete\n call_sampler = _update_annotations(call_sampler)\n # Update docstring with that of the sampler and include sampled function documentation.\n call_sampler = _update_docstring(call_sampler, sample_function)\n # Unset wrapped to return base sampler signature\n # but still get rest of benefits of wraps\n # such as __name__, __qualname__\n call_sampler.__dict__.pop('__wrapped__')\n # wrap with functional pulse\n return commands.functional_pulse(call_sampler)\n\n return generate_sampler", "response": "Decorator for generating a sampler function."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nfiltering the list of backends by their configuration or status attributes or from a boolean callable.", "response": "def filter_backends(backends, filters=None, **kwargs):\n \"\"\"Return the backends matching the specified filtering.\n\n Filter the `backends` list by their `configuration` or `status`\n attributes, or from a boolean callable. 
The criteria for filtering can\n be specified via `**kwargs` or as a callable via `filters`, and the\n backends must fulfill all specified conditions.\n\n Args:\n backends (list[BaseBackend]): list of backends.\n filters (callable): filtering conditions as a callable.\n **kwargs (dict): dict of criteria.\n\n Returns:\n list[BaseBackend]: a list of backend instances matching the\n conditions.\n \"\"\"\n def _match_all(obj, criteria):\n \"\"\"Return True if all items in criteria matches items in obj.\"\"\"\n return all(getattr(obj, key_, None) == value_ for\n key_, value_ in criteria.items())\n\n # Inspect the backends to decide which filters belong to\n # backend.configuration and which ones to backend.status, as it does\n # not involve querying the API.\n configuration_filters = {}\n status_filters = {}\n for key, value in kwargs.items():\n if all(key in backend.configuration() for backend in backends):\n configuration_filters[key] = value\n else:\n status_filters[key] = value\n\n # 1. Apply backend.configuration filtering.\n if configuration_filters:\n backends = [b for b in backends if\n _match_all(b.configuration(), configuration_filters)]\n\n # 2. Apply backend.status filtering (it involves one API call for\n # each backend).\n if status_filters:\n backends = [b for b in backends if\n _match_all(b.status(), status_filters)]\n\n # 3. 
Apply acceptor filter.\n backends = list(filter(filters, backends))\n\n return backends"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef resolve_backend_name(name, backends, deprecated, aliased):\n available = [backend.name() for backend in backends]\n\n resolved_name = deprecated.get(name, aliased.get(name, name))\n if isinstance(resolved_name, list):\n resolved_name = next((b for b in resolved_name if b in available), \"\")\n\n if resolved_name not in available:\n raise LookupError(\"backend '{}' not found.\".format(name))\n\n if name in deprecated:\n logger.warning(\"WARNING: '%s' is deprecated. Use '%s'.\", name, resolved_name)\n\n return resolved_name", "response": "Resolve backend name from a deprecated name or an aliased name."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef dag_to_circuit(dag):\n qregs = collections.OrderedDict()\n for qreg in dag.qregs.values():\n qreg_tmp = QuantumRegister(qreg.size, name=qreg.name)\n qregs[qreg.name] = qreg_tmp\n cregs = collections.OrderedDict()\n for creg in dag.cregs.values():\n creg_tmp = ClassicalRegister(creg.size, name=creg.name)\n cregs[creg.name] = creg_tmp\n\n name = dag.name or None\n circuit = QuantumCircuit(*qregs.values(), *cregs.values(), name=name)\n\n for node in dag.topological_op_nodes():\n qubits = []\n for qubit in node.qargs:\n qubits.append(qregs[qubit[0].name][qubit[1]])\n\n clbits = []\n for clbit in node.cargs:\n clbits.append(cregs[clbit[0].name][clbit[1]])\n\n # Get arguments for classical control (if any)\n if node.condition is None:\n control = None\n else:\n control = (node.condition[0], node.condition[1])\n\n inst = node.op.copy()\n inst.control = control\n circuit.append(inst, qubits, clbits)\n return circuit", "response": "Return a QuantumCircuit object from a DAGCircuit."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconvert an 
observable in matrix form to dictionary form.", "response": "def make_dict_observable(matrix_observable):\n \"\"\"Convert an observable in matrix form to dictionary form.\n\n Takes in a diagonal observable as a matrix and converts it to a dictionary\n form. Can also handle a list sorted of the diagonal elements.\n\n Args:\n matrix_observable (list): The observable to be converted to dictionary\n form. Can be a matrix or just an ordered list of observed values\n\n Returns:\n Dict: A dictionary with all observable states as keys, and corresponding\n values being the observed value for that state\n \"\"\"\n dict_observable = {}\n observable = np.array(matrix_observable)\n observable_size = len(observable)\n observable_bits = int(np.ceil(np.log2(observable_size)))\n binary_formater = '0{}b'.format(observable_bits)\n if observable.ndim == 2:\n observable = observable.diagonal()\n for state_no in range(observable_size):\n state_str = format(state_no, binary_formater)\n dict_observable[state_str] = observable[state_no]\n return dict_observable"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nupdate a node in the symbol table.", "response": "def update_symtab(self, obj):\n \"\"\"Update a node in the symbol table.\n\n Everything in the symtab must be a node with these attributes:\n name - the string name of the object\n type - the string type of the object\n line - the source line where the type was first found\n file - the source file where the type was first found\n \"\"\"\n if obj.name in self.current_symtab:\n prev = self.current_symtab[obj.name]\n raise QasmError(\"Duplicate declaration for\", obj.type + \" '\"\n + obj.name + \"' at line\", str(obj.line)\n + ', file', obj.file\n + '.\\nPrevious occurrence at line',\n str(prev.line) + ', file', prev.file)\n self.current_symtab[obj.name] = obj"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef verify_declared_bit(self, obj):\n # 
We are verifying gate args against the formal parameters of a\n # gate prototype.\n if obj.name not in self.current_symtab:\n raise QasmError(\"Cannot find symbol '\" + obj.name\n + \"' in argument list for gate, line\",\n str(obj.line), 'file', obj.file)\n\n # This insures the thing is from the bitlist and not from the\n # argument list.\n sym = self.current_symtab[obj.name]\n if not (sym.type == 'id' and sym.is_bit):\n raise QasmError(\"Bit\", obj.name,\n 'is not declared as a bit in the gate.')", "response": "Verify a qubit id against the gate prototype."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef verify_exp_list(self, obj):\n # A tad harder. This is a list of expressions each of which could be\n # the head of a tree. We need to recursively walk each of these and\n # ensure that any Id elements resolve to the current stack.\n #\n # I believe we only have to look at the current symtab.\n if obj.children is not None:\n for children in obj.children:\n if isinstance(children, node.Id):\n if children.name in self.external_functions:\n continue\n\n if children.name not in self.current_symtab:\n raise QasmError(\"Argument '\" + children.name\n + \"' in expression cannot be \"\n + \"found, line\", str(children.line),\n \"file\", children.file)\n else:\n if hasattr(children, \"children\"):\n self.verify_exp_list(children)", "response": "Verify each expression in a list."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nverifies a user defined gate call.", "response": "def verify_as_gate(self, obj, bitlist, arglist=None):\n \"\"\"Verify a user defined gate call.\"\"\"\n if obj.name not in self.global_symtab:\n raise QasmError(\"Cannot find gate definition for '\" + obj.name\n + \"', line\", str(obj.line), 'file', obj.file)\n g_sym = self.global_symtab[obj.name]\n if not (g_sym.type == 'gate' or g_sym.type == 'opaque'):\n raise QasmError(\"'\" + 
obj.name + \"' is used as a gate \"\n + \"or opaque call but the symbol is neither;\"\n + \" it is a '\" + g_sym.type + \"' line\",\n str(obj.line), 'file', obj.file)\n\n if g_sym.n_bits() != bitlist.size():\n raise QasmError(\"Gate or opaque call to '\" + obj.name\n + \"' uses\", str(bitlist.size()),\n \"qubits but is declared for\",\n str(g_sym.n_bits()), \"qubits\", \"line\",\n str(obj.line), 'file', obj.file)\n\n if arglist:\n if g_sym.n_args() != arglist.size():\n raise QasmError(\"Gate or opaque call to '\" + obj.name\n + \"' uses\", str(arglist.size()),\n \"qubits but is declared for\",\n str(g_sym.n_args()), \"qubits\", \"line\",\n str(obj.line), 'file', obj.file)\n else:\n if g_sym.n_args() > 0:\n raise QasmError(\"Gate or opaque call to '\" + obj.name\n + \"' has no arguments but is declared for\",\n str(g_sym.n_args()), \"qubits\", \"line\",\n str(obj.line), 'file', obj.file)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef verify_reg_list(self, obj, object_type):\n # We expect the object to be a bitlist or an idlist, we don't care.\n # We will iterate it and ensure everything in it is declared as a bit,\n # and throw if not.\n for children in obj.children:\n self.verify_reg(children, object_type)", "response": "Verify a list of registers."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning a list of tuples for this id node.", "response": "def id_tuple_list(self, id_node):\n \"\"\"Return a list of (name, index) tuples for this id node.\"\"\"\n if id_node.type != \"id\":\n raise QasmError(\"internal error, id_tuple_list\")\n bit_list = []\n try:\n g_sym = self.current_symtab[id_node.name]\n except KeyError:\n g_sym = self.global_symtab[id_node.name]\n if g_sym.type == \"qreg\" or g_sym.type == \"creg\":\n # Return list of (name, idx) for reg ids\n for idx in range(g_sym.index):\n bit_list.append((id_node.name, idx))\n else:\n # Return (name, -1) for other 
ids\n bit_list.append((id_node.name, -1))\n return bit_list"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef verify_distinct(self, list_of_nodes):\n bit_list = []\n line_number = -1\n filename = \"\"\n for node_ in list_of_nodes:\n # id node: add all bits in register or (name, -1) for id\n if node_.type == \"id\":\n bit_list.extend(self.id_tuple_list(node_))\n line_number = node_.line\n filename = node_.file\n # indexed_id: add the bit\n elif node_.type == \"indexed_id\":\n bit_list.append((node_.name, node_.index))\n line_number = node_.line\n filename = node_.file\n # primary_list: for each id or indexed_id child, add\n elif node_.type == \"primary_list\":\n for child in node_.children:\n if child.type == \"id\":\n bit_list.extend(self.id_tuple_list(child))\n else:\n bit_list.append((child.name, child.index))\n line_number = child.line\n filename = child.file\n # id_list: for each id, add\n elif node_.type == \"id_list\":\n for child in node_.children:\n bit_list.extend(self.id_tuple_list(child))\n line_number = child.line\n filename = child.file\n else:\n raise QasmError(\"internal error, verify_distinct\")\n if len(bit_list) != len(set(bit_list)):\n raise QasmError(\"duplicate identifiers at line %d file %s\"\n % (line_number, filename))", "response": "Verify that the objects in list_of_nodes represent distinct bits."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nprocessing a single statement.", "response": "def p_statement(self, program):\n \"\"\"\n statement : decl\n | quantum_op ';'\n | format ';'\n | ignore\n | quantum_op error\n | format error\n \"\"\"\n if len(program) > 2:\n if program[2] != ';':\n raise QasmError(\"Missing ';' at end of statement; \"\n + \"received\", str(program[2].value))\n program[0] = program[1]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nparse indexed ID program.", "response": "def p_indexed_id(self, 
program):\n \"\"\"\n indexed_id : id '[' NNINTEGER ']'\n | id '[' NNINTEGER error\n | id '[' error\n \"\"\"\n if len(program) == 4:\n raise QasmError(\"Expecting an integer index; received\",\n str(program[3].value))\n if program[4] != ']':\n raise QasmError(\"Missing ']' in indexed ID; received\",\n str(program[4].value))\n program[0] = node.IndexedId([program[1], node.Int(program[3])])"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef p_gate_id_list_0(self, program):\n program[0] = node.IdList([program[1]])\n self.update_symtab(program[1])", "response": "Update the symtab with the list of gate IDs."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef p_bit_list_0(self, program):\n program[0] = node.IdList([program[1]])\n program[1].is_bit = True\n self.update_symtab(program[1])", "response": "Set the bit list to 0"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nprocessing bit list 1.", "response": "def p_bit_list_1(self, program):\n \"\"\"\n bit_list : bit_list ',' id\n \"\"\"\n program[0] = program[1]\n program[0].add_child(program[3])\n program[3].is_bit = True\n self.update_symtab(program[3])"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef p_decl(self, program):\n if len(program) > 2:\n if program[2] != ';':\n raise QasmError(\"Missing ';' in qreg or creg declaration.\"\n \" Instead received '\" + program[2].value + \"'\")\n program[0] = program[1]", "response": "Process the declaration of a set of keys."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nprocessing a QREG declaration.", "response": "def p_qreg_decl(self, program):\n \"\"\"\n qreg_decl : QREG indexed_id\n \"\"\"\n program[0] = node.Qreg([program[2]])\n if program[2].name in self.external_functions:\n raise QasmError(\"QREG names cannot be reserved words. 
\"\n + \"Received '\" + program[2].name + \"'\")\n if program[2].index == 0:\n raise QasmError(\"QREG size must be positive\")\n self.update_symtab(program[0])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nprocessing a CREG declaration program.", "response": "def p_creg_decl(self, program):\n \"\"\"\n creg_decl : CREG indexed_id\n \"\"\"\n program[0] = node.Creg([program[2]])\n if program[2].name in self.external_functions:\n raise QasmError(\"CREG names cannot be reserved words. \"\n + \"Received '\" + program[2].name + \"'\")\n if program[2].index == 0:\n raise QasmError(\"CREG size must be positive\")\n self.update_symtab(program[0])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing the gate body.", "response": "def p_gate_body_0(self, program):\n \"\"\"\n gate_body : '{' '}'\n \"\"\"\n if program[2] != '}':\n raise QasmError(\"Missing '}' in gate definition; received'\"\n + str(program[2].value) + \"'\")\n program[0] = node.GateBody(None)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef p_unitary_op_0(self, program):\n program[0] = node.UniversalUnitary([program[3], program[5]])\n self.verify_reg(program[5], 'qreg')\n self.verify_exp_list(program[3])", "response": "Process a builtin U (universal unitary) operation, verifying its qreg target and parameter expressions."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef p_unitary_op_1(self, program):\n program[0] = node.Cnot([program[2], program[4]])\n self.verify_reg(program[2], 'qreg')\n self.verify_reg(program[4], 'qreg')\n self.verify_distinct([program[2], program[4]])", "response": "Process a CX (CNOT) operation between two distinct quantum registers."}
 {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef p_unitary_op_2(self, program):\n program[0] = node.CustomUnitary([program[1], program[2]])\n self.verify_as_gate(program[1], program[2])\n self.verify_reg_list(program[2], 'qreg')\n self.verify_distinct([program[2]])", "response": "Process a custom gate call applied to a list of distinct qubits."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck that the unitary op is 3.", "response": "def p_unitary_op_3(self, program):\n \"\"\"\n unitary_op : id '(' ')' primary_list\n \"\"\"\n program[0] = node.CustomUnitary([program[1], program[4]])\n self.verify_as_gate(program[1], program[4])\n self.verify_reg_list(program[4], 'qreg')\n self.verify_distinct([program[4]])"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nhandles the 4 - level unitary operation.", "response": "def p_unitary_op_4(self, program):\n \"\"\"\n unitary_op : id '(' exp_list ')' primary_list\n \"\"\"\n program[0] = node.CustomUnitary([program[1], program[3], program[5]])\n self.verify_as_gate(program[1], program[5], arglist=program[3])\n self.verify_reg_list(program[5], 'qreg')\n self.verify_exp_list(program[3])\n self.verify_distinct([program[5]])"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef p_gate_op_1(self, program):\n program[0] = node.Cnot([program[2], program[4]])\n self.verify_declared_bit(program[2])\n self.verify_declared_bit(program[4])\n self.verify_distinct([program[2], program[4]])", "response": "Process a CX gate operation inside a gate body, verifying both bits are declared and distinct."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef p_gate_op_2(self, program):\n program[0] = node.CustomUnitary([program[1], program[2]])\n # To verify:\n # 1. id is declared as a gate in global scope\n # 2. 
everything in the id_list is declared as a bit in local scope\n self.verify_as_gate(program[1], program[2])\n self.verify_bit_list(program[2])\n self.verify_distinct([program[2]])", "response": "Process a custom gate call inside a gate body, verifying the gate and its bit arguments."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nhandle gate op 3.", "response": "def p_gate_op_3(self, program):\n \"\"\"\n gate_op : id '(' ')' id_list ';'\n \"\"\"\n program[0] = node.CustomUnitary([program[1], program[4]])\n self.verify_as_gate(program[1], program[4])\n self.verify_bit_list(program[4])\n self.verify_distinct([program[4]])"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef p_gate_op_4(self, program):\n program[0] = node.CustomUnitary([program[1], program[3], program[5]])\n self.verify_as_gate(program[1], program[5], arglist=program[3])\n self.verify_bit_list(program[5])\n self.verify_exp_list(program[3])\n self.verify_distinct([program[5]])", "response": "Process a parameterized custom gate call inside a gate body."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nhandles the gate operation 5.", "response": "def p_gate_op_5(self, program):\n \"\"\"\n gate_op : BARRIER id_list ';'\n \"\"\"\n program[0] = node.Barrier([program[2]])\n self.verify_bit_list(program[2])\n self.verify_distinct([program[2]])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nopaque : OPAQUE id gate_scope bit_list", "response": "def p_opaque_0(self, program):\n \"\"\"\n opaque : OPAQUE id gate_scope bit_list\n \"\"\"\n # TODO: Review Opaque function\n program[0] = node.Opaque([program[2], program[4]])\n if program[2].name in self.external_functions:\n raise QasmError(\"OPAQUE names cannot be reserved words. 
\"\n + \"Received '\" + program[2].name + \"'\")\n self.pop_scope()\n self.update_symtab(program[0])"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nopaquing : OPAQUE id gate_scope '(' ')' bit_list", "response": "def p_opaque_1(self, program):\n \"\"\"\n opaque : OPAQUE id gate_scope '(' ')' bit_list\n \"\"\"\n program[0] = node.Opaque([program[2], program[6]])\n self.pop_scope()\n self.update_symtab(program[0])"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef p_measure(self, program):\n program[0] = node.Measure([program[2], program[4]])\n self.verify_reg(program[2], 'qreg')\n self.verify_reg(program[4], 'creg')", "response": "Process a measure statement, verifying the qreg source and creg target."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nprocessing a reset statement.", "response": "def p_reset(self, program):\n \"\"\"\n reset : RESET primary\n \"\"\"\n program[0] = node.Reset([program[2]])\n self.verify_reg(program[2], 'qreg')"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef p_if(self, program):\n if len(program) == 3:\n raise QasmError(\"Ill-formed IF statement. Perhaps a\"\n + \" missing '('?\")\n if len(program) == 5:\n raise QasmError(\"Ill-formed IF statement. Expected '==', \"\n + \"received '\" + str(program[4].value))\n if len(program) == 6:\n raise QasmError(\"Ill-formed IF statement. 
Expected a number, \"\n + \"received '\" + str(program[5].value))\n if len(program) == 7:\n raise QasmError(\"Ill-formed IF statement, unmatched '('\")\n\n if program[7].type == 'if':\n raise QasmError(\"Nested IF statements not allowed\")\n if program[7].type == 'barrier':\n raise QasmError(\"barrier not permitted in IF statement\")\n\n program[0] = node.If([program[3], node.Int(program[5]), program[7]])", "response": "Process an if statement, validating its syntax and rejecting nested ifs and barriers."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef p_expression_1(self, program):\n program[0] = node.Prefix([node.UnaryOperator(program[1]), program[2]])", "response": "expression : '-' expression %prec negative\n | '+' expression %prec positive"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef p_expression_0(self, program):\n program[0] = node.BinaryOp([node.BinaryOperator(program[2]),\n program[1], program[3]])", "response": "expression : expression '*' expression\n | expression '/' expression\n | expression '+' expression\n | expression '-' expression\n | expression '^' expression"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncompute the column in the input text string.", "response": "def find_column(self, input_, token):\n \"\"\"Compute the column.\n\n Input is the input text string.\n token is a token instance.\n \"\"\"\n if token is None:\n return 0\n last_cr = input_.rfind('\\n', 0, token.lexpos)\n if last_cr < 0:\n last_cr = 0\n column = (token.lexpos - last_cr) + 1\n return column"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_tokens(self):\n try:\n while True:\n token = self.lexer.token()\n\n if not token:\n break\n\n yield token\n except QasmError as e:\n print('Exception tokenizing qasm file:', e.msg)", "response": "Returns a generator of the tokens."} {"SOURCE": "codesearchnet", 
"instruction": "Can you write a function in Python 3 where it\nsets the parse_deb field.", "response": "def parse_debug(self, val):\n \"\"\"Set the parse_deb field.\"\"\"\n if val is True:\n self.parse_deb = True\n elif val is False:\n self.parse_deb = False\n else:\n raise QasmError(\"Illegal debug value '\" + str(val)\n + \"' must be True or False.\")"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nparsing the data and return the result.", "response": "def run(self, data):\n \"\"\"Parser runner.\n\n To use this module stand-alone.\n \"\"\"\n ast = self.parser.parse(data, debug=True)\n ast.to_string(0)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a generator of the tokens.", "response": "def get_tokens(self):\n \"\"\"Returns a generator of the tokens.\"\"\"\n if self._filename:\n with open(self._filename) as ifile:\n self._data = ifile.read()\n\n with QasmParser(self._filename) as qasm_p:\n return qasm_p.get_tokens()"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\napplies crz from ctl to tgt with angle theta.", "response": "def crz(self, theta, ctl, tgt):\n \"\"\"Apply crz from ctl to tgt with angle theta.\"\"\"\n return self.append(CrzGate(theta), [ctl, tgt], [])"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef basis_state(str_state, num):\n n = int(str_state, 2)\n if num >= len(str_state):\n state = np.zeros(1 << num, dtype=complex)\n state[n] = 1\n return state\n else:\n raise QiskitError('size of bitstring is greater than num.')", "response": "Return a basis state ndarray."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef projector(state, flatten=False):\n density_matrix = np.outer(state.conjugate(), state)\n if flatten:\n return density_matrix.flatten(order='F')\n return 
density_matrix", "response": "Map a pure state vector to its density matrix (projector), optionally flattened to a vector in column-major order."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncalculates the purity of a quantum state.", "response": "def purity(state):\n \"\"\"Calculate the purity of a quantum state.\n\n Args:\n state (ndarray): a quantum state\n Returns:\n float: purity.\n \"\"\"\n rho = np.array(state)\n if rho.ndim == 1:\n return 1.0\n return np.real(np.trace(rho.dot(rho)))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef qasm(self, prec=15):\n string = \"gate \" + self.name\n if self.arguments is not None:\n string += \"(\" + self.arguments.qasm(prec) + \")\"\n string += \" \" + self.bitlist.qasm(prec) + \"\\n\"\n string += \"{\\n\" + self.body.qasm(prec) + \"}\"\n return string", "response": "Return the corresponding OPENQASM string."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nrunning the pass on the DAG and write the discovered commutation relations to the property_set.", "response": "def run(self, dag):\n \"\"\"\n Run the pass on the DAG, and write the discovered commutation relations\n into the property_set.\n \"\"\"\n # Initiate the commutation set\n self.property_set['commutation_set'] = defaultdict(list)\n\n # Build a dictionary to keep track of the gates on each qubit\n for wire in dag.wires:\n wire_name = \"{0}[{1}]\".format(str(wire[0].name), str(wire[1]))\n self.property_set['commutation_set'][wire_name] = []\n\n # Add edges to the dictionary for each qubit\n for node in dag.topological_op_nodes():\n for (_, _, edge_data) in dag.edges(node):\n\n edge_name = edge_data['name']\n self.property_set['commutation_set'][(node, edge_name)] = -1\n\n for wire in dag.wires:\n wire_name = \"{0}[{1}]\".format(str(wire[0].name), str(wire[1]))\n\n for current_gate in dag.nodes_on_wire(wire):\n\n current_comm_set = 
self.property_set['commutation_set'][wire_name]\n if not current_comm_set:\n current_comm_set.append([current_gate])\n\n if current_gate not in current_comm_set[-1]:\n prev_gate = current_comm_set[-1][-1]\n if _commute(current_gate, prev_gate):\n current_comm_set[-1].append(current_gate)\n\n else:\n current_comm_set.append([current_gate])\n\n temp_len = len(current_comm_set)\n self.property_set['commutation_set'][(current_gate, wire_name)] = temp_len - 1"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a backend widget.", "response": "def backend_widget(backend):\n \"\"\"Creates a backend widget.\n \"\"\"\n config = backend.configuration().to_dict()\n props = backend.properties().to_dict()\n\n name = widgets.HTML(value=\"

{name}

\".format(name=backend.name()),\n layout=widgets.Layout())\n\n n_qubits = config['n_qubits']\n\n qubit_count = widgets.HTML(value=\"
{qubits}
\".format(qubits=n_qubits),\n layout=widgets.Layout(justify_content='center'))\n\n cmap = widgets.Output(layout=widgets.Layout(min_width='250px', max_width='250px',\n max_height='250px',\n min_height='250px',\n justify_content='center',\n align_items='center',\n margin='0px 0px 0px 0px'))\n\n with cmap:\n _cmap_fig = plot_gate_map(backend,\n plot_directed=False,\n label_qubits=False)\n if _cmap_fig is not None:\n display(_cmap_fig)\n # Prevents plot from showing up twice.\n plt.close(_cmap_fig)\n\n pending = generate_jobs_pending_widget()\n\n is_oper = widgets.HTML(value=\"
\",\n layout=widgets.Layout(justify_content='center'))\n\n least_busy = widgets.HTML(value=\"
\",\n layout=widgets.Layout(justify_content='center'))\n\n t1_units = props['qubits'][0][0]['unit']\n avg_t1 = round(sum([q[0]['value'] for q in props['qubits']])/n_qubits, 1)\n t1_widget = widgets.HTML(value=\"
{t1} {units}
\".format(t1=avg_t1, units=t1_units),\n layout=widgets.Layout())\n\n t2_units = props['qubits'][0][1]['unit']\n avg_t2 = round(sum([q[1]['value'] for q in props['qubits']])/n_qubits, 1)\n t2_widget = widgets.HTML(value=\"
{t2} {units}
\".format(t2=avg_t2, units=t2_units),\n layout=widgets.Layout())\n\n out = widgets.VBox([name, cmap, qubit_count, pending,\n least_busy, is_oper, t1_widget, t2_widget],\n layout=widgets.Layout(display='inline-flex',\n flex_flow='column',\n align_items='center'))\n\n out._is_alive = True\n return out"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef update_backend_info(self, interval=60):\n my_thread = threading.currentThread()\n current_interval = 0\n started = False\n all_dead = False\n stati = [None]*len(self._backends)\n while getattr(my_thread, \"do_run\", True) and not all_dead:\n if current_interval == interval or started is False:\n for ind, back in enumerate(self._backends):\n _value = self.children[ind].children[2].value\n _head = _value.split('')[0]\n try:\n _status = back.status()\n stati[ind] = _status\n except Exception: # pylint: disable=W0703\n self.children[ind].children[2].value = _value.replace(\n _head, \"
\")\n self.children[ind]._is_alive = False\n else:\n self.children[ind]._is_alive = True\n self.children[ind].children[2].value = _value.replace(\n _head, \"
\")\n\n idx = list(range(len(self._backends)))\n pending = [s.pending_jobs for s in stati]\n _, least_idx = zip(*sorted(zip(pending, idx)))\n\n # Make sure least pending is operational\n for ind in least_idx:\n if stati[ind].operational:\n least_pending_idx = ind\n break\n\n for var in idx:\n if var == least_pending_idx:\n self.children[var].children[4].value = \"
True
\"\n else:\n self.children[var].children[4].value = \"
False
\"\n\n self.children[var].children[3].children[1].value = pending[var]\n self.children[var].children[3].children[1].max = max(\n self.children[var].children[3].children[1].max, pending[var]+10)\n if stati[var].operational:\n self.children[var].children[5].value = \"
True
\"\n else:\n self.children[var].children[5].value = \"
False
\"\n\n started = True\n current_interval = 0\n time.sleep(1)\n all_dead = not any([wid._is_alive for wid in self.children])\n current_interval += 1", "response": "Updates the monitor info of the current thread. Called from another thread."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngenerates a jobs_pending progress bar widget.", "response": "def generate_jobs_pending_widget():\n \"\"\"Generates a jobs_pending progress bar widget.\n \"\"\"\n pbar = widgets.IntProgress(\n value=0,\n min=0,\n max=50,\n description='',\n orientation='horizontal', layout=widgets.Layout(max_width='180px'))\n pbar.style.bar_color = '#71cddd'\n\n pbar_current = widgets.Label(\n value=str(pbar.value), layout=widgets.Layout(min_width='auto'))\n pbar_max = widgets.Label(\n value=str(pbar.max), layout=widgets.Layout(min_width='auto'))\n\n def _on_max_change(change):\n pbar_max.value = str(change['new'])\n\n def _on_val_change(change):\n pbar_current.value = str(change['new'])\n\n pbar.observe(_on_max_change, names='max')\n pbar.observe(_on_val_change, names='value')\n\n jobs_widget = widgets.HBox([pbar_current, pbar, pbar_max],\n layout=widgets.Layout(max_width='250px',\n min_width='250px',\n justify_content='center'))\n\n return jobs_widget"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef run(self, dag):\n new_dag = DAGCircuit()\n\n if self.layout is None:\n # LegacySwap renames the register in the DAG and does not match the property set\n self.layout = Layout.generate_trivial_layout(*dag.qregs.values())\n\n for layer in dag.serial_layers():\n subdag = layer['graph']\n\n for cnot_node in subdag.named_nodes('cx', 'CX'):\n control = cnot_node.qargs[0]\n target = cnot_node.qargs[1]\n\n physical_q0 = self.layout[control]\n physical_q1 = self.layout[target]\n if self.coupling_map.distance(physical_q0, physical_q1) != 1:\n raise TranspilerError('The circuit requires a connection between physical '\n 'qubits 
%s and %s' % (physical_q0, physical_q1))\n\n if (physical_q0, physical_q1) not in self.coupling_map.get_edges():\n # A flip needs to be done\n\n # Create the involved registers\n if control[0] not in subdag.qregs.values():\n subdag.add_qreg(control[0])\n if target[0] not in subdag.qregs.values():\n subdag.add_qreg(target[0])\n\n # Add H gates around\n subdag.apply_operation_back(HGate(), [target], [])\n subdag.apply_operation_back(HGate(), [control], [])\n subdag.apply_operation_front(HGate(), [target], [])\n subdag.apply_operation_front(HGate(), [control], [])\n\n # Flips the CX\n cnot_node.qargs[0], cnot_node.qargs[1] = target, control\n\n new_dag.extend_back(subdag)\n\n return new_dag", "response": "Flips the direction of CX gates, conjugating with H gates, to match the directed coupling map."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nrun one pass of cx cancellation on the circuit", "response": "def run(self, dag):\n \"\"\"\n Run one pass of cx cancellation on the circuit\n\n Args:\n dag (DAGCircuit): the directed acyclic graph to run on.\n Returns:\n DAGCircuit: Transformed DAG.\n \"\"\"\n cx_runs = dag.collect_runs([\"cx\"])\n for cx_run in cx_runs:\n # Partition the cx_run into chunks with equal gate arguments\n partition = []\n chunk = []\n for i in range(len(cx_run) - 1):\n chunk.append(cx_run[i])\n\n qargs0 = cx_run[i].qargs\n qargs1 = cx_run[i + 1].qargs\n\n if qargs0 != qargs1:\n partition.append(chunk)\n chunk = []\n chunk.append(cx_run[-1])\n partition.append(chunk)\n # Simplify each chunk in the partition\n for chunk in partition:\n if len(chunk) % 2 == 0:\n for n in chunk:\n dag.remove_op_node(n)\n else:\n for n in chunk[1:]:\n dag.remove_op_node(n)\n return dag"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a single backend matching the specified criteria.", "response": "def get_backend(self, name=None, **kwargs):\n \"\"\"Return a single backend matching the specified filtering.\n\n Args:\n name 
(str): name of the backend.\n **kwargs (dict): dict used for filtering.\n\n Returns:\n BaseBackend: a backend matching the filtering.\n\n Raises:\n QiskitBackendNotFoundError: if no backend could be found or\n more than one backend matches.\n \"\"\"\n backends = self.backends(name, **kwargs)\n if len(backends) > 1:\n raise QiskitBackendNotFoundError('More than one backend matches the criteria')\n elif not backends:\n raise QiskitBackendNotFoundError('No backend matches the criteria')\n\n return backends[0]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the shape for bipartite matrix", "response": "def _bipartite_shape(self):\n \"\"\"Return the shape for bipartite matrix\"\"\"\n return (self._input_dim, self._output_dim, self._input_dim,\n self._output_dim)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the conjugate of the QuantumChannel.", "response": "def conjugate(self):\n \"\"\"Return the conjugate of the QuantumChannel.\"\"\"\n return Choi(np.conj(self._data), self.input_dims(), self.output_dims())"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the transpose of the QuantumChannel.", "response": "def transpose(self):\n \"\"\"Return the transpose of the QuantumChannel.\"\"\"\n # Make bipartite matrix\n d_in, d_out = self.dim\n data = np.reshape(self._data, (d_in, d_out, d_in, d_out))\n # Swap input and output indices on bipartite matrix\n data = np.transpose(data, (1, 0, 3, 2))\n # Transpose channel has input and output dimensions swapped\n data = np.reshape(data, (d_in * d_out, d_in * d_out))\n return Choi(\n data, input_dims=self.output_dims(), output_dims=self.input_dims())"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef compose(self, other, qargs=None, front=False):\n if qargs is not None:\n return Choi(\n SuperOp(self).compose(other, qargs=qargs, front=front))\n\n # 
Convert to Choi matrix\n if not isinstance(other, Choi):\n other = Choi(other)\n # Check dimensions match up\n if front and self._input_dim != other._output_dim:\n raise QiskitError(\n 'input_dim of self must match output_dim of other')\n if not front and self._output_dim != other._input_dim:\n raise QiskitError(\n 'input_dim of other must match output_dim of self')\n\n if front:\n first = np.reshape(other._data, other._bipartite_shape)\n second = np.reshape(self._data, self._bipartite_shape)\n input_dim = other._input_dim\n input_dims = other.input_dims()\n output_dim = self._output_dim\n output_dims = self.output_dims()\n else:\n first = np.reshape(self._data, self._bipartite_shape)\n second = np.reshape(other._data, other._bipartite_shape)\n input_dim = self._input_dim\n input_dims = self.input_dims()\n output_dim = other._output_dim\n output_dims = other.output_dims()\n\n # Contract Choi matrices for composition\n data = np.reshape(\n np.einsum('iAjB,AkBl->ikjl', first, second),\n (input_dim * output_dim, input_dim * output_dim))\n return Choi(data, input_dims, output_dims)", "response": "Returns the composition of self and other."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef power(self, n):\n if n > 0:\n return super().power(n)\n return Choi(SuperOp(self).power(n))", "response": "The matrix power of the superoperator matrix."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _evolve(self, state, qargs=None):\n # If subsystem evolution we use the SuperOp representation\n if qargs is not None:\n return SuperOp(self)._evolve(state, qargs)\n # Otherwise we compute full evolution directly\n state = self._format_state(state, density_matrix=True)\n if state.shape[0] != self._input_dim:\n raise QiskitError(\n \"QuantumChannel input dimension is not equal to state dimension.\"\n )\n return np.einsum('AB,AiBj->ij', state,\n np.reshape(self._data, 
self._bipartite_shape))", "response": "Evolve a state by the QuantumChannel."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _tensor_product(self, other, reverse=False):\n # Convert other to Choi\n if not isinstance(other, Choi):\n other = Choi(other)\n\n if reverse:\n input_dims = self.input_dims() + other.input_dims()\n output_dims = self.output_dims() + other.output_dims()\n data = _bipartite_tensor(\n other.data,\n self._data,\n shape1=other._bipartite_shape,\n shape2=self._bipartite_shape)\n else:\n input_dims = other.input_dims() + self.input_dims()\n output_dims = other.output_dims() + self.output_dims()\n data = _bipartite_tensor(\n self._data,\n other.data,\n shape1=self._bipartite_shape,\n shape2=other._bipartite_shape)\n return Choi(data, input_dims, output_dims)", "response": "Returns the tensor product of two quantum channels."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the number and size of unique registers from bit_labels list.", "response": "def _get_register_specs(bit_labels):\n \"\"\"Get the number and size of unique registers from bit_labels list.\n\n Args:\n bit_labels (list): this list is of the form::\n\n [['reg1', 0], ['reg1', 1], ['reg2', 0]]\n\n which indicates a register named \"reg1\" of size 2\n and a register named \"reg2\" of size 1. 
This is the\n format of classic and quantum bit labels in qobj\n header.\n\n Yields:\n tuple: iterator of register_name:size pairs.\n \"\"\"\n it = itertools.groupby(bit_labels, operator.itemgetter(0))\n for register_name, sub_it in it:\n yield register_name, max(ind[1] for ind in sub_it) + 1"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _truncate_float(matchobj, format_str='0.2g'):\n if matchobj.group(0):\n return format(float(matchobj.group(0)), format_str)\n return ''", "response": "Truncate long floats in the log file"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef latex(self, aliases=None):\n self._initialize_latex_array(aliases)\n self._build_latex_array(aliases)\n header_1 = r\"\"\"% \\documentclass[preview]{standalone}\n% If the image is too large to fit on this documentclass use\n\\documentclass[draft]{beamer}\n\"\"\"\n beamer_line = \"\\\\usepackage[size=custom,height=%d,width=%d,scale=%.1f]{beamerposter}\\n\"\n header_2 = r\"\"\"% instead and customize the height and width (in cm) to fit.\n% Large images may run out of memory quickly.\n% To fix this use the LuaLaTeX compiler, which dynamically\n% allocates memory.\n\\usepackage[braket, qm]{qcircuit}\n\\usepackage{amsmath}\n\\pdfmapfile{+sansmathaccent.map}\n% \\usepackage[landscape]{geometry}\n% Comment out the above line if using the beamer documentclass.\n\\begin{document}\n\\begin{equation*}\"\"\"\n qcircuit_line = r\"\"\"\n \\Qcircuit @C=%.1fem @R=%.1fem @!R {\n\"\"\"\n output = io.StringIO()\n output.write(header_1)\n output.write('%% img_width = %d, img_depth = %d\\n' % (self.img_width, self.img_depth))\n output.write(beamer_line % self._get_beamer_page())\n output.write(header_2)\n output.write(qcircuit_line %\n (self.column_separation, self.row_separation))\n for i in range(self.img_width):\n output.write(\"\\t \\t\")\n for j in range(self.img_depth + 1):\n cell_str = 
self._latex[i][j]\n # Don't truncate offset float if drawing a barrier\n if 'barrier' in cell_str:\n output.write(cell_str)\n else:\n # floats can cause \"Dimension too large\" latex error in\n # xymatrix this truncates floats to avoid issue.\n cell_str = re.sub(r'[-+]?\\d*\\.\\d{2,}|\\d{2,}',\n _truncate_float,\n cell_str)\n output.write(cell_str)\n if j != self.img_depth:\n output.write(\" & \")\n else:\n output.write(r'\\\\' + '\\n')\n output.write('\\t }\\n')\n output.write('\\\\end{equation*}\\n\\n')\n output.write('\\\\end{document}')\n contents = output.getvalue()\n output.close()\n return contents", "response": "Return LaTeX representation of the circuit."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the number of columns in the circuit.", "response": "def _get_image_depth(self):\n \"\"\"Get depth information for the circuit.\n\n Returns:\n int: number of columns in the circuit\n int: total size of columns in the circuit\n \"\"\"\n\n max_column_widths = []\n\n for layer in self.ops:\n\n # store the max width for the layer\n current_max = 0\n\n for op in layer:\n\n # update current op width\n arg_str_len = 0\n\n # the wide gates\n for arg in op.op.params:\n arg_str = re.sub(r'[-+]?\\d*\\.\\d{2,}|\\d{2,}',\n _truncate_float, str(arg))\n arg_str_len += len(arg_str)\n\n # the width of the column is the max of all the gates in the column\n current_max = max(arg_str_len, current_max)\n\n max_column_widths.append(current_max)\n\n # wires in the beginning and end\n columns = 2\n # each layer is one column\n columns += len(self.ops)\n\n # every 3 characters is roughly one extra 'unit' of width in the cell\n # the gate name is 1 extra 'unit'\n # the qubit/cbit labels plus initial states is 2 more\n # the wires poking out at the ends is 2 more\n sum_column_widths = sum(1 + v / 3 for v in max_column_widths)\n\n # could be a fraction so ceil\n return columns, math.ceil(sum_column_widths) + 4"} {"SOURCE": "codesearchnet", 
"instruction": "How would you explain what the following Python 3 function does\ndef _get_beamer_page(self):\n # PIL python package limits image size to around a quarter gigabyte\n # this means the beamer image should be limited to < 50000\n # if you want to avoid a \"warning\" too, set it to < 25000\n PIL_limit = 40000\n\n # the beamer latex template limits each dimension to < 19 feet\n # (i.e. 575cm)\n beamer_limit = 550\n\n # columns are roughly twice as big as rows\n aspect_ratio = self.sum_row_heights / self.sum_column_widths\n\n # choose a page margin so circuit is not cropped\n margin_factor = 1.5\n height = min(self.sum_row_heights * margin_factor, beamer_limit)\n width = min(self.sum_column_widths * margin_factor, beamer_limit)\n\n # if too large, make it fit\n if height * width > PIL_limit:\n height = min(np.sqrt(PIL_limit * aspect_ratio), beamer_limit)\n width = min(np.sqrt(PIL_limit / aspect_ratio), beamer_limit)\n\n # if too small, give it a minimum size\n height = max(height, 10)\n width = max(width, 10)\n\n return (height, width, self.scale)", "response": "Get height width and scale attributes for the beamer page."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nbuild a LaTeX string for the current qubit names.", "response": "def _build_latex_array(self, aliases=None):\n \"\"\"Returns an array of strings containing \\\\LaTeX for this circuit.\n\n If aliases is not None, aliases contains a dict mapping\n the current qubits in the circuit to new qubit names.\n We will deduce the register names and sizes from aliases.\n \"\"\"\n columns = 1\n\n # Rename qregs if necessary\n if aliases:\n qregdata = {}\n for q in aliases.values():\n if q[0] not in qregdata:\n qregdata[q[0]] = q[1] + 1\n elif qregdata[q[0]] < q[1] + 1:\n qregdata[q[0]] = q[1] + 1\n else:\n qregdata = self.qregs\n\n for column, layer in enumerate(self.ops, 1):\n for op in layer:\n if op.condition:\n mask = self._get_mask(op.condition[0])\n cl_reg = 
self.clbit_list[self._ffs(mask)]\n if_reg = cl_reg[0]\n pos_2 = self.img_regs[cl_reg]\n if_value = format(op.condition[1],\n 'b').zfill(self.cregs[if_reg])[::-1]\n if op.name not in ['measure', 'barrier', 'snapshot', 'load',\n 'save', 'noise']:\n nm = op.name\n qarglist = op.qargs\n if aliases is not None:\n qarglist = map(lambda x: aliases[x], qarglist)\n if len(qarglist) == 1:\n pos_1 = self.img_regs[(qarglist[0][0],\n qarglist[0][1])]\n\n if op.condition:\n mask = self._get_mask(op.condition[0])\n cl_reg = self.clbit_list[self._ffs(mask)]\n if_reg = cl_reg[0]\n pos_2 = self.img_regs[cl_reg]\n\n if nm == \"x\":\n self._latex[pos_1][column] = \"\\\\gate{X}\"\n elif nm == \"y\":\n self._latex[pos_1][column] = \"\\\\gate{Y}\"\n elif nm == \"z\":\n self._latex[pos_1][column] = \"\\\\gate{Z}\"\n elif nm == \"h\":\n self._latex[pos_1][column] = \"\\\\gate{H}\"\n elif nm == \"s\":\n self._latex[pos_1][column] = \"\\\\gate{S}\"\n elif nm == \"sdg\":\n self._latex[pos_1][column] = \"\\\\gate{S^\\\\dag}\"\n elif nm == \"t\":\n self._latex[pos_1][column] = \"\\\\gate{T}\"\n elif nm == \"tdg\":\n self._latex[pos_1][column] = \"\\\\gate{T^\\\\dag}\"\n elif nm == \"u0\":\n self._latex[pos_1][column] = \"\\\\gate{U_0(%s)}\" % (\n op.op.params[0])\n elif nm == \"u1\":\n self._latex[pos_1][column] = \"\\\\gate{U_1(%s)}\" % (\n op.op.params[0])\n elif nm == \"u2\":\n self._latex[pos_1][column] = \\\n \"\\\\gate{U_2\\\\left(%s,%s\\\\right)}\" % (\n op.op.params[0], op.op.params[1])\n elif nm == \"u3\":\n self._latex[pos_1][column] = (\"\\\\gate{U_3(%s,%s,%s)}\" % (\n op.op.params[0],\n op.op.params[1],\n op.op.params[2]))\n elif nm == \"rx\":\n self._latex[pos_1][column] = \"\\\\gate{R_x(%s)}\" % (\n op.op.params[0])\n elif nm == \"ry\":\n self._latex[pos_1][column] = \"\\\\gate{R_y(%s)}\" % (\n op.op.params[0])\n elif nm == \"rz\":\n self._latex[pos_1][column] = \"\\\\gate{R_z(%s)}\" % (\n op.op.params[0])\n else:\n self._latex[pos_1][columns] = \"\\\\gate{%s}\" % nm\n\n gap = 
pos_2 - pos_1\n for i in range(self.cregs[if_reg]):\n if if_value[i] == '1':\n self._latex[pos_2 + i][column] = \\\n \"\\\\control \\\\cw \\\\cwx[-\" + str(gap) + \"]\"\n gap = 1\n else:\n self._latex[pos_2 + i][column] = \\\n \"\\\\controlo \\\\cw \\\\cwx[-\" + str(gap) + \"]\"\n gap = 1\n\n else:\n if nm == \"x\":\n self._latex[pos_1][column] = \"\\\\gate{X}\"\n elif nm == \"y\":\n self._latex[pos_1][column] = \"\\\\gate{Y}\"\n elif nm == \"z\":\n self._latex[pos_1][column] = \"\\\\gate{Z}\"\n elif nm == \"h\":\n self._latex[pos_1][column] = \"\\\\gate{H}\"\n elif nm == \"s\":\n self._latex[pos_1][column] = \"\\\\gate{S}\"\n elif nm == \"sdg\":\n self._latex[pos_1][column] = \"\\\\gate{S^\\\\dag}\"\n elif nm == \"t\":\n self._latex[pos_1][column] = \"\\\\gate{T}\"\n elif nm == \"tdg\":\n self._latex[pos_1][column] = \"\\\\gate{T^\\\\dag}\"\n elif nm == \"u0\":\n self._latex[pos_1][column] = \"\\\\gate{U_0(%s)}\" % (\n op.op.params[0])\n elif nm == \"u1\":\n self._latex[pos_1][column] = \"\\\\gate{U_1(%s)}\" % (\n op.op.params[0])\n elif nm == \"u2\":\n self._latex[pos_1][column] = \\\n \"\\\\gate{U_2\\\\left(%s,%s\\\\right)}\" % (\n op.op.params[0], op.op.params[1])\n elif nm == \"u3\":\n self._latex[pos_1][column] = (\"\\\\gate{U_3(%s,%s,%s)}\" % (\n op.op.params[0],\n op.op.params[1],\n op.op.params[2]))\n elif nm == \"rx\":\n self._latex[pos_1][column] = \"\\\\gate{R_x(%s)}\" % (\n op.op.params[0])\n elif nm == \"ry\":\n self._latex[pos_1][column] = \"\\\\gate{R_y(%s)}\" % (\n op.op.params[0])\n elif nm == \"rz\":\n self._latex[pos_1][column] = \"\\\\gate{R_z(%s)}\" % (\n op.op.params[0])\n elif nm == \"reset\":\n self._latex[pos_1][column] = (\n \"\\\\push{\\\\rule{.6em}{0em}\\\\ket{0}\\\\\"\n \"rule{.2em}{0em}} \\\\qw\")\n else:\n self._latex[pos_1][columns] = \"\\\\gate{%s}\" % nm\n\n elif len(qarglist) == 2:\n pos_1 = self.img_regs[(qarglist[0][0], qarglist[0][1])]\n pos_2 = self.img_regs[(qarglist[1][0], qarglist[1][1])]\n\n if op.condition:\n pos_3 = 
self.img_regs[(if_reg, 0)]\n temp = [pos_1, pos_2, pos_3]\n temp.sort(key=int)\n bottom = temp[1]\n\n gap = pos_3 - bottom\n for i in range(self.cregs[if_reg]):\n if if_value[i] == '1':\n self._latex[pos_3 + i][column] = \\\n \"\\\\control \\\\cw \\\\cwx[-\" + str(gap) + \"]\"\n gap = 1\n else:\n self._latex[pos_3 + i][column] = \\\n \"\\\\controlo \\\\cw \\\\cwx[-\" + str(gap) + \"]\"\n gap = 1\n\n if nm == \"cx\":\n self._latex[pos_1][column] = \\\n \"\\\\ctrl{\" + str(pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\targ\"\n elif nm == \"cz\":\n self._latex[pos_1][column] = \\\n \"\\\\ctrl{\" + str(pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\control\\\\qw\"\n elif nm == \"cy\":\n self._latex[pos_1][column] = \\\n \"\\\\ctrl{\" + str(pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\gate{Y}\"\n elif nm == \"ch\":\n self._latex[pos_1][column] = \\\n \"\\\\ctrl{\" + str(pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\gate{H}\"\n elif nm == \"swap\":\n self._latex[pos_1][column] = \"\\\\qswap\"\n self._latex[pos_2][column] = \\\n \"\\\\qswap \\\\qwx[\" + str(pos_1 - pos_2) + \"]\"\n elif nm == \"crz\":\n self._latex[pos_1][column] = \\\n \"\\\\ctrl{\" + str(pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \\\n \"\\\\gate{R_z(%s)}\" % (op.op.params[0])\n elif nm == \"cu1\":\n self._latex[pos_1][column - 1] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column - 1] = \"\\\\control\\\\qw\"\n self._latex[min(pos_1, pos_2)][column] = \\\n \"\\\\dstick{%s}\\\\qw\" % (op.op.params[0])\n self._latex[max(pos_1, pos_2)][column] = \"\\\\qw\"\n elif nm == \"cu3\":\n self._latex[pos_1][column] = \\\n \"\\\\ctrl{\" + str(pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \\\n \"\\\\gate{U_3(%s,%s,%s)}\" % (op.op.params[0],\n op.op.params[1],\n op.op.params[2])\n else:\n temp = [pos_1, pos_2]\n temp.sort(key=int)\n\n if nm == \"cx\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + 
\"}\"\n self._latex[pos_2][column] = \"\\\\targ\"\n elif nm == \"cz\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\control\\\\qw\"\n elif nm == \"cy\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\gate{Y}\"\n elif nm == \"ch\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\gate{H}\"\n elif nm == \"swap\":\n self._latex[pos_1][column] = \"\\\\qswap\"\n self._latex[pos_2][column] = \\\n \"\\\\qswap \\\\qwx[\" + str(pos_1 - pos_2) + \"]\"\n elif nm == \"crz\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \\\n \"\\\\gate{R_z(%s)}\" % (op.op.params[0])\n elif nm == \"cu1\":\n self._latex[pos_1][column - 1] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column - 1] = \"\\\\control\\\\qw\"\n self._latex[min(pos_1, pos_2)][column] = \\\n \"\\\\dstick{%s}\\\\qw\" % (op.op.params[0])\n self._latex[max(pos_1, pos_2)][column] = \"\\\\qw\"\n elif nm == \"cu3\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = (\"\\\\gate{U_3(%s,%s,%s)}\" % (\n op.op.params[0],\n op.op.params[1],\n op.op.params[2]))\n else:\n start_pos = min([pos_1, pos_2])\n stop_pos = max([pos_1, pos_2])\n if stop_pos - start_pos >= 2:\n delta = stop_pos - start_pos\n self._latex[start_pos][columns] = (\n \"\\\\multigate{%s}{%s}\" % (delta, nm))\n for i_pos in range(start_pos + 1, stop_pos + 1):\n self._latex[i_pos][columns] = \"\\\\ghost{%s}\" % nm\n else:\n self._latex[start_pos][columns] = (\n \"\\\\multigate{1}{%s}\" % nm)\n self._latex[stop_pos][columns] = \"\\\\ghost{%s}\" % nm\n\n elif len(qarglist) == 3:\n pos_1 = self.img_regs[(qarglist[0][0], qarglist[0][1])]\n pos_2 = self.img_regs[(qarglist[1][0], qarglist[1][1])]\n pos_3 = self.img_regs[(qarglist[2][0], 
qarglist[2][1])]\n\n if op.condition:\n pos_4 = self.img_regs[(if_reg, 0)]\n\n temp = [pos_1, pos_2, pos_3, pos_4]\n temp.sort(key=int)\n bottom = temp[2]\n\n prev_column = [x[column - 1] for x in self._latex]\n for item, prev_entry in enumerate(prev_column):\n if 'barrier' in prev_entry:\n span = re.search('barrier{(.*)}', prev_entry)\n if span and any(i in temp for i in range(\n item, int(span.group(1)))):\n self._latex[item][column - 1] = \\\n prev_entry.replace(\n '\\\\barrier{',\n '\\\\barrier[-0.65em]{')\n\n gap = pos_4 - bottom\n for i in range(self.cregs[if_reg]):\n if if_value[i] == '1':\n self._latex[pos_4 + i][column] = \\\n \"\\\\control \\\\cw \\\\cwx[-\" + str(gap) + \"]\"\n gap = 1\n else:\n self._latex[pos_4 + i][column] = \\\n \"\\\\controlo \\\\cw \\\\cwx[-\" + str(gap) + \"]\"\n gap = 1\n\n if nm == \"ccx\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\ctrl{\" + str(\n pos_3 - pos_2) + \"}\"\n self._latex[pos_3][column] = \"\\\\targ\"\n\n if nm == \"cswap\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\qswap\"\n self._latex[pos_3][column] = \\\n \"\\\\qswap \\\\qwx[\" + str(pos_2 - pos_3) + \"]\"\n else:\n temp = [pos_1, pos_2, pos_3]\n temp.sort(key=int)\n\n prev_column = [x[column - 1] for x in self._latex]\n for item, prev_entry in enumerate(prev_column):\n if 'barrier' in prev_entry:\n span = re.search('barrier{(.*)}', prev_entry)\n if span and any(i in temp for i in range(\n item, int(span.group(1)))):\n self._latex[item][column - 1] = \\\n prev_entry.replace(\n '\\\\barrier{',\n '\\\\barrier[-0.65em]{')\n\n if nm == \"ccx\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\ctrl{\" + str(\n pos_3 - pos_2) + \"}\"\n self._latex[pos_3][column] = \"\\\\targ\"\n\n elif nm == \"cswap\":\n self._latex[pos_1][column] = \"\\\\ctrl{\" + str(\n pos_2 - 
pos_1) + \"}\"\n self._latex[pos_2][column] = \"\\\\qswap\"\n self._latex[pos_3][column] = \\\n \"\\\\qswap \\\\qwx[\" + str(pos_2 - pos_3) + \"]\"\n else:\n start_pos = min([pos_1, pos_2, pos_3])\n stop_pos = max([pos_1, pos_2, pos_3])\n if stop_pos - start_pos >= 3:\n delta = stop_pos - start_pos\n self._latex[start_pos][columns] = (\n \"\\\\multigate{%s}{%s}\" % (delta, nm))\n for i_pos in range(start_pos + 1, stop_pos + 1):\n self._latex[i_pos][columns] = \"\\\\ghost{%s}\" % nm\n else:\n self._latex[pos_1][columns] = (\n \"\\\\multigate{2}{%s}\" % nm)\n self._latex[pos_2][columns] = \"\\\\ghost{%s}\" % nm\n self._latex[pos_3][columns] = \"\\\\ghost{%s}\" % nm\n\n elif len(qarglist) > 3:\n nbits = len(qarglist)\n pos_array = [self.img_regs[(qarglist[0][0],\n qarglist[0][1])]]\n for i in range(1, nbits):\n pos_array.append(self.img_regs[(qarglist[i][0],\n qarglist[i][1])])\n pos_start = min(pos_array)\n pos_stop = max(pos_array)\n delta = pos_stop - pos_start\n self._latex[pos_start][columns] = (\n \"\\\\multigate{%s}{%s}\" % (nbits - 1, nm))\n for pos in range(pos_start + 1, pos_stop + 1):\n self._latex[pos][columns] = \"\\\\ghost{%s}\" % nm\n\n elif op.name == \"measure\":\n if (len(op.cargs) != 1\n or len(op.qargs) != 1\n or op.op.params):\n raise exceptions.VisualizationError(\"bad operation record\")\n\n if op.condition:\n raise exceptions.VisualizationError(\n \"If controlled measures currently not supported.\")\n\n qname, qindex = op.qargs[0]\n cname, cindex = op.cargs[0]\n if aliases:\n newq = aliases[(qname, qindex)]\n qname = newq[0]\n qindex = newq[1]\n\n pos_1 = self.img_regs[(qname, qindex)]\n pos_2 = self.img_regs[(cname, cindex)]\n\n try:\n self._latex[pos_1][column] = \"\\\\meter\"\n prev_column = [x[column - 1] for x in self._latex]\n for item, prev_entry in enumerate(prev_column):\n if 'barrier' in prev_entry:\n span = re.search('barrier{(.*)}', prev_entry)\n if span and (\n item + int(span.group(1))) - pos_1 >= 0:\n self._latex[item][column - 
1] = \\\n prev_entry.replace(\n '\\\\barrier{',\n '\\\\barrier[-1.15em]{')\n\n self._latex[pos_2][column] = \\\n \"\\\\cw \\\\cwx[-\" + str(pos_2 - pos_1) + \"]\"\n except Exception as e:\n raise exceptions.VisualizationError(\n 'Error during Latex building: %s' % str(e))\n\n elif op.name in ['barrier', 'snapshot', 'load', 'save',\n 'noise']:\n if self.plot_barriers:\n qarglist = op.qargs\n indexes = [self._get_qubit_index(x) for x in qarglist]\n start_bit = self.qubit_list[min(indexes)]\n if aliases is not None:\n qarglist = map(lambda x: aliases[x], qarglist)\n start = self.img_regs[start_bit]\n span = len(op.qargs) - 1\n\n self._latex[start][column] = \"\\\\qw \\\\barrier{\" + str(\n span) + \"}\"\n else:\n raise exceptions.VisualizationError(\"bad node data\")"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _get_qubit_index(self, qubit):\n for i, bit in enumerate(self.qubit_list):\n if qubit == bit:\n qindex = i\n break\n else:\n raise exceptions.VisualizationError(\"unable to find bit for operation\")\n return qindex", "response": "Get the index number for a qubit in the bit list"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _load_schema(file_path, name=None):\n if name is None:\n # filename without extension\n name = os.path.splitext(os.path.basename(file_path))[0]\n if name not in _SCHEMAS:\n with open(file_path, 'r') as schema_file:\n _SCHEMAS[name] = json.load(schema_file)\n\n return _SCHEMAS[name]", "response": "Loads the schema for use in future validations."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngenerates and store a validator for the given name and schema.", "response": "def _get_validator(name, schema=None, check_schema=True,\n validator_class=None, **validator_kwargs):\n \"\"\"Generate validator for JSON schema.\n\n Args:\n name (str): Name for validator. 
Will be validator key in\n `_VALIDATORS` dict.\n schema (dict): JSON schema `dict`. If not provided searches for schema\n in `_SCHEMAS`.\n check_schema (bool): Verify schema is valid.\n validator_class (jsonschema.IValidator): jsonschema IValidator instance.\n Default behavior is to determine this from the schema `$schema`\n field.\n **validator_kwargs (dict): Additional keyword arguments for validator.\n\n Return:\n jsonschema.IValidator: Validator for JSON schema.\n\n Raises:\n SchemaValidationError: Raised if validation fails.\n \"\"\"\n if schema is None:\n try:\n schema = _SCHEMAS[name]\n except KeyError:\n raise SchemaValidationError(\"Valid schema name or schema must \"\n \"be provided.\")\n\n if name not in _VALIDATORS:\n\n # Resolve JSON spec from schema if needed\n if validator_class is None:\n validator_class = jsonschema.validators.validator_for(schema)\n\n # Generate and store validator in _VALIDATORS\n _VALIDATORS[name] = validator_class(schema, **validator_kwargs)\n\n validator = _VALIDATORS[name]\n\n if check_schema:\n validator.check_schema(schema)\n\n return validator"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nloads all default schemas into _SCHEMAS and _VALIDATORS.", "response": "def _load_schemas_and_validators():\n \"\"\"Load all default schemas into `_SCHEMAS`.\"\"\"\n schema_base_path = os.path.join(os.path.dirname(__file__), '../..')\n for name, path in _DEFAULT_SCHEMA_PATHS.items():\n _load_schema(os.path.join(schema_base_path, path), name)\n _get_validator(name)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef validate_json_against_schema(json_dict, schema,\n err_msg=None):\n \"\"\"Validates JSON dict against a schema.\n\n Args:\n json_dict (dict): JSON to be validated.\n schema (dict or str): JSON schema dictionary or the name of one of the\n standards schemas in Qiskit to validate against it. 
The list of\n standard schemas is: ``backend_configuration``,\n ``backend_properties``, ``backend_status``,\n ``default_pulse_configuration``, ``job_status``, ``qobj``,\n ``result``.\n err_msg (str): Optional error message.\n\n Raises:\n SchemaValidationError: Raised if validation fails.\n \"\"\"\n\n try:\n if isinstance(schema, str):\n schema_name = schema\n schema = _SCHEMAS[schema_name]\n validator = _get_validator(schema_name)\n validator.validate(json_dict)\n else:\n jsonschema.validate(json_dict, schema)\n except jsonschema.ValidationError as err:\n if err_msg is None:\n err_msg = \"JSON failed validation. Set Qiskit log level to DEBUG \" \\\n \"for further information.\"\n newerr = SchemaValidationError(err_msg)\n newerr.__cause__ = _SummaryValidationError(err)\n logger.debug('%s', _format_causes(err))\n raise newerr", "response": "Validates a JSON dict against a given schema."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning a string representation of the validation error.", "response": "def _format_causes(err, level=0):\n \"\"\"Return a cascading explanation of the validation error.\n\n Returns a cascading explanation of the validation error in the form of::\n\n failed @ because of:\n failed @ because of:\n ...\n failed @ because of:\n ...\n ...\n\n For example::\n\n 'oneOf' failed @ '' because of:\n 'required' failed @ '.config' because of:\n 'meas_level' is a required property\n\n Meaning the validator 'oneOf' failed while validating the whole object\n because of the validator 'required' failing while validating the property\n 'config' because its 'meas_level' field is missing.\n\n The cascade repeats the format \" failed @ because of\"\n until there are no deeper causes. 
In this case, the string representation\n of the error is shown.\n\n Args:\n err (jsonschema.ValidationError): the instance to explain.\n level (int): starting level of indentation for the cascade of\n explanations.\n\n Return:\n str: a formatted string with the explanation of the error.\n\n \"\"\"\n lines = []\n\n def _print(string, offset=0):\n lines.append(_pad(string, offset=offset))\n\n def _pad(string, offset=0):\n padding = ' ' * (level + offset)\n padded_lines = [padding + line for line in string.split('\\n')]\n return '\\n'.join(padded_lines)\n\n def _format_path(path):\n def _format(item):\n if isinstance(item, str):\n return '.{}'.format(item)\n\n return '[{}]'.format(item)\n\n return ''.join([''] + list(map(_format, path)))\n\n _print('\\'{}\\' failed @ \\'{}\\' because of:'.format(\n err.validator, _format_path(err.absolute_path)))\n\n if not err.context:\n _print(str(err.message), offset=1)\n else:\n for suberr in err.context:\n lines.append(_format_causes(suberr, level+1))\n\n return '\\n'.join(lines)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the corresponding OPENQASM string.", "response": "def qasm(self, prec=15):\n \"\"\"Return the corresponding OPENQASM string.\"\"\"\n return \",\".join([self.children[j].qasm(prec)\n for j in range(self.size())])"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndraws a quantum circuit to a single page of text.", "response": "def circuit_drawer(circuit,\n scale=0.7,\n filename=None,\n style=None,\n output='text',\n interactive=False,\n line_length=None,\n plot_barriers=True,\n reverse_bits=False,\n justify=None):\n \"\"\"Draw a quantum circuit to different formats (set by output parameter):\n 0. text: ASCII art TextDrawing that can be printed in the console.\n 1. latex: high-quality images, but heavy external software dependencies\n 2. 
matplotlib: purely in Python with no external dependencies\n\n Args:\n circuit (QuantumCircuit): the quantum circuit to draw\n scale (float): scale of image to draw (shrink if < 1)\n filename (str): file path to save image to\n style (dict or str): dictionary of style or file name of style file.\n This option is only used by the `mpl`, `latex`, and `latex_source`\n output types. If a str is passed in, it is treated as the path to a json\n file containing a style dict; the file will be opened, parsed, and\n used as the input dict.\n output (str): Select the output method to use for drawing the circuit.\n Valid choices are `text`, `latex`, `latex_source`, `mpl`. Note that if\n one is not specified it will use latex, and if that fails it will fall back\n to mpl. However this behavior is deprecated and in a future release\n the default will change.\n interactive (bool): when set true show the circuit in a new window\n (for `mpl` this depends on the matplotlib backend being used\n supporting this). Note when used with either the `text` or the\n `latex_source` output type this has no effect and will be silently\n ignored.\n line_length (int): Sets the length of the lines generated by `text`\n output type. This is useful when the drawing does not fit in the\n console. If None (default), it will try to guess the console width\n using shutil.get_terminal_size(). However, if you're running in\n jupyter the default line length is set to 80 characters. If you\n don't want pagination at all, set `line_length=-1`.\n reverse_bits (bool): When set to True reverse the bit order inside\n registers for the output visualization.\n plot_barriers (bool): Enable/disable drawing barriers in the output\n circuit. Defaults to True.\n justify (string): Options are `left`, `right` or `none`; if anything\n else is supplied it defaults to left justified. It refers to where\n gates should be placed in the output circuit if there is an option.\n `none` results in each gate being placed in its own column. 
Currently\n only supported by text drawer.\n\n Returns:\n PIL.Image: (output `latex`) an in-memory representation of the image\n of the circuit diagram.\n matplotlib.figure: (output `mpl`) a matplotlib figure object for the\n circuit diagram.\n String: (output `latex_source`). The LaTeX source code.\n TextDrawing: (output `text`). A drawing that can be printed as ascii art\n Raises:\n VisualizationError: when an invalid output method is selected\n ImportError: when the output method requires non-installed libraries.\n\n .. _style-dict-doc:\n\n The style dict kwarg contains numerous options that define the style of the\n output circuit visualization. While the style dict is used by the `mpl`,\n `latex`, and `latex_source` outputs, some of its options are only used\n by the `mpl` output. These options are defined below; if an option is only used by\n the `mpl` output it is marked as such:\n\n textcolor (str): The color code to use for text. Defaults to\n `'#000000'` (`mpl` only)\n subtextcolor (str): The color code to use for subtext. Defaults to\n `'#000000'` (`mpl` only)\n linecolor (str): The color code to use for lines. Defaults to\n `'#000000'` (`mpl` only)\n creglinecolor (str): The color code to use for classical register lines\n `'#778899'`(`mpl` only)\n gatetextcolor (str): The color code to use for gate text `'#000000'`\n (`mpl` only)\n gatefacecolor (str): The color code to use for gates. Defaults to\n `'#ffffff'` (`mpl` only)\n barrierfacecolor (str): The color code to use for barriers. Defaults to\n `'#bdbdbd'` (`mpl` only)\n backgroundcolor (str): The color code to use for the background.\n Defaults to `'#ffffff'` (`mpl` only)\n fontsize (int): The font size to use for text. Defaults to 13 (`mpl`\n only)\n subfontsize (int): The font size to use for subtext. Defaults to 8\n (`mpl` only)\n displaytext (dict): A dictionary of the text to use for each element\n type in the output visualization. 
The default values are:\n {\n 'id': 'id',\n 'u0': 'U_0',\n 'u1': 'U_1',\n 'u2': 'U_2',\n 'u3': 'U_3',\n 'x': 'X',\n 'y': 'Y',\n 'z': 'Z',\n 'h': 'H',\n 's': 'S',\n 'sdg': 'S^\\\\dagger',\n 't': 'T',\n 'tdg': 'T^\\\\dagger',\n 'rx': 'R_x',\n 'ry': 'R_y',\n 'rz': 'R_z',\n 'reset': '\\\\left|0\\\\right\\\\rangle'\n }\n You must specify all the necessary values if using this. There is\n no provision for passing an incomplete dict in. (`mpl` only)\n displaycolor (dict): The color codes to use for each circuit element.\n By default all values default to the value of `gatefacecolor` and\n the keys are the same as `displaytext`. Also, just like\n `displaytext` there is no provision for an incomplete dict passed\n in. (`mpl` only)\n latexdrawerstyle (bool): When set to True enable latex mode which will\n draw gates like the `latex` output modes. (`mpl` only)\n usepiformat (bool): When set to True use radians for output (`mpl`\n only)\n fold (int): The number of circuit elements to fold the circuit at.\n Defaults to 20 (`mpl` only)\n cregbundle (bool): If set True bundle classical registers (`mpl` only)\n showindex (bool): If set True draw an index. (`mpl` only)\n compress (bool): If set True draw a compressed circuit (`mpl` only)\n figwidth (int): The maximum width (in inches) for the output figure.\n (`mpl` only)\n dpi (int): The DPI to use for the output image. Defaults to 150 (`mpl`\n only)\n margin (list): `mpl` only\n creglinestyle (str): The style of line to use for classical registers.\n Choices are `'solid'`, `'doublet'`, or any valid matplotlib\n `linestyle` kwarg value. 
Defaults to `doublet` (`mpl` only)\n \"\"\"\n image = None\n\n if output == 'text':\n return _text_circuit_drawer(circuit, filename=filename,\n line_length=line_length,\n reverse_bits=reverse_bits,\n plotbarriers=plot_barriers,\n justify=justify)\n elif output == 'latex':\n image = _latex_circuit_drawer(circuit, scale=scale,\n filename=filename, style=style,\n plot_barriers=plot_barriers,\n reverse_bits=reverse_bits,\n justify=justify)\n elif output == 'latex_source':\n return _generate_latex_source(circuit,\n filename=filename, scale=scale,\n style=style,\n plot_barriers=plot_barriers,\n reverse_bits=reverse_bits,\n justify=justify)\n elif output == 'mpl':\n image = _matplotlib_circuit_drawer(circuit, scale=scale,\n filename=filename, style=style,\n plot_barriers=plot_barriers,\n reverse_bits=reverse_bits,\n justify=justify)\n else:\n raise exceptions.VisualizationError(\n 'Invalid output type %s selected. The only valid choices '\n 'are latex, latex_source, text, and mpl' % output)\n if image and interactive:\n image.show()\n return image"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a TextDrawing instance that draws a circuit in ascii art.", "response": "def _text_circuit_drawer(circuit, filename=None, line_length=None, reverse_bits=False,\n plotbarriers=True, justify=None, vertically_compressed=True):\n \"\"\"\n Draws a circuit using ascii art.\n Args:\n circuit (QuantumCircuit): Input circuit\n filename (str): optional filename to write the result\n line_length (int): Optional. Breaks the circuit drawing to this length. This is\n useful when the drawing does not fit in the console. If\n None (default), it will try to guess the console width using\n shutil.get_terminal_size(). If you don't want pagination\n at all, set line_length=-1.\n reverse_bits (bool): Rearrange the bits in reverse order.\n plotbarriers (bool): Draws the barriers when they are there.\n justify (str) : `left`, `right` or `none`. 
Defaults to `left`. Says how\n the circuit should be justified.\n vertically_compressed (bool): Default is `True`. It merges the lines so the\n drawing will take less vertical room.\n Returns:\n TextDrawing: An instance that, when printed, draws the circuit in ascii art.\n \"\"\"\n qregs, cregs, ops = utils._get_layered_instructions(circuit,\n reverse_bits=reverse_bits,\n justify=justify)\n text_drawing = _text.TextDrawing(qregs, cregs, ops)\n text_drawing.plotbarriers = plotbarriers\n text_drawing.line_length = line_length\n text_drawing.vertically_compressed = vertically_compressed\n\n if filename:\n text_drawing.dump(filename)\n return text_drawing"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _latex_circuit_drawer(circuit,\n scale=0.7,\n filename=None,\n style=None,\n plot_barriers=True,\n reverse_bits=False,\n justify=None):\n \"\"\"Draw a quantum circuit based on latex (Qcircuit package)\n\n Requires version >=2.6.0 of the qcircuit LaTeX package.\n\n Args:\n circuit (QuantumCircuit): a quantum circuit\n scale (float): scaling factor\n filename (str): file path to save image to\n style (dict or str): dictionary of style or file name of style file\n reverse_bits (bool): When set to True reverse the bit order inside\n registers for the output visualization.\n plot_barriers (bool): Enable/disable drawing barriers in the output\n circuit. Defaults to True.\n justify (str) : `left`, `right` or `none`. Defaults to `left`. 
Says how\n the circuit should be justified.\n\n Returns:\n PIL.Image: an in-memory representation of the circuit diagram\n\n Raises:\n OSError: usually indicates that ```pdflatex``` or ```pdftocairo``` is\n missing.\n CalledProcessError: usually points to errors during diagram creation.\n \"\"\"\n tmpfilename = 'circuit'\n with tempfile.TemporaryDirectory() as tmpdirname:\n tmppath = os.path.join(tmpdirname, tmpfilename + '.tex')\n _generate_latex_source(circuit, filename=tmppath,\n scale=scale, style=style,\n plot_barriers=plot_barriers,\n reverse_bits=reverse_bits, justify=justify)\n image = None\n try:\n\n subprocess.run([\"pdflatex\", \"-halt-on-error\",\n \"-output-directory={}\".format(tmpdirname),\n \"{}\".format(tmpfilename + '.tex')],\n stdout=subprocess.PIPE, stderr=subprocess.DEVNULL,\n check=True)\n except OSError as ex:\n if ex.errno == errno.ENOENT:\n logger.warning('WARNING: Unable to compile latex. '\n 'Is `pdflatex` installed? '\n 'Skipping latex circuit drawing...')\n raise\n except subprocess.CalledProcessError as ex:\n with open('latex_error.log', 'wb') as error_file:\n error_file.write(ex.stdout)\n logger.warning('WARNING: Unable to compile latex. '\n 'The output from the pdflatex command can '\n 'be found in latex_error.log')\n raise\n else:\n try:\n base = os.path.join(tmpdirname, tmpfilename)\n subprocess.run([\"pdftocairo\", \"-singlefile\", \"-png\", \"-q\",\n base + '.pdf', base])\n image = Image.open(base + '.png')\n image = utils._trim(image)\n os.remove(base + '.png')\n if filename:\n image.save(filename, 'PNG')\n except OSError as ex:\n if ex.errno == errno.ENOENT:\n logger.warning('WARNING: Unable to convert pdf to image. '\n 'Is `poppler` installed? 
'\n 'Skipping circuit drawing...')\n raise\n return image", "response": "Draw a quantum circuit using LaTeX."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nconvert QuantumCircuit to LaTeX string.", "response": "def _generate_latex_source(circuit, filename=None,\n scale=0.7, style=None, reverse_bits=False,\n plot_barriers=True, justify=None):\n \"\"\"Convert QuantumCircuit to LaTeX string.\n\n Args:\n circuit (QuantumCircuit): input circuit\n scale (float): image scaling\n filename (str): optional filename to write latex\n style (dict or str): dictionary of style or file name of style file\n reverse_bits (bool): When set to True reverse the bit order inside\n registers for the output visualization.\n plot_barriers (bool): Enable/disable drawing barriers in the output\n circuit. Defaults to True.\n justify (str) : `left`, `right` or `none`. Defaults to `left`. Says how\n the circuit should be justified.\n\n Returns:\n str: Latex string appropriate for writing to file.\n \"\"\"\n qregs, cregs, ops = utils._get_layered_instructions(circuit,\n reverse_bits=reverse_bits,\n justify=justify)\n\n qcimg = _latex.QCircuitImage(qregs, cregs, ops, scale, style=style,\n plot_barriers=plot_barriers,\n reverse_bits=reverse_bits)\n latex = qcimg.latex()\n if filename:\n with open(filename, 'w') as latex_file:\n latex_file.write(latex)\n\n return latex"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _matplotlib_circuit_drawer(circuit,\n scale=0.7,\n filename=None,\n style=None,\n plot_barriers=True,\n reverse_bits=False,\n justify=None):\n \"\"\"Draw a quantum circuit based on matplotlib.\n If `%matplotlib inline` is invoked in a Jupyter notebook, it visualizes a circuit inline.\n We recommend `%config InlineBackend.figure_format = 'svg'` for the inline visualization.\n\n Args:\n circuit (QuantumCircuit): a quantum circuit\n scale (float): scaling factor\n filename (str): file path to 
save image to\n style (dict or str): dictionary of style or file name of style file\n reverse_bits (bool): When set to True reverse the bit order inside\n registers for the output visualization.\n plot_barriers (bool): Enable/disable drawing barriers in the output\n circuit. Defaults to True.\n justify (str) : `left`, `right` or `none`. Defaults to `left`. Says how\n the circuit should be justified.\n\n\n Returns:\n matplotlib.figure: a matplotlib figure object for the circuit diagram\n \"\"\"\n\n qregs, cregs, ops = utils._get_layered_instructions(circuit,\n reverse_bits=reverse_bits,\n justify=justify)\n qcd = _matplotlib.MatplotlibDrawer(qregs, cregs, ops, scale=scale, style=style,\n plot_barriers=plot_barriers,\n reverse_bits=reverse_bits)\n return qcd.draw(filename)", "response": "Draw a quantum circuit using matplotlib."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef random_state(dim, seed=None):\n if seed is None:\n seed = np.random.randint(0, np.iinfo(np.int32).max)\n rng = np.random.RandomState(seed)\n # Random array over interval (0, 1]\n x = rng.rand(dim)\n x += x == 0\n x = -np.log(x)\n sumx = sum(x)\n phases = rng.rand(dim)*2.0*np.pi\n return np.sqrt(x/sumx)*np.exp(1j*phases)", "response": "Returns a random quantum state from the uniform Haar measure on\n state space."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef random_unitary(dim, seed=None):\n if dim == 0 or not math.log2(dim).is_integer():\n raise QiskitError(\"Desired unitary dimension not a positive power of 2.\")\n matrix = np.zeros([dim, dim], dtype=complex)\n for j in range(dim):\n if j == 0:\n a = random_state(dim, seed)\n else:\n a = random_state(dim)\n matrix[:, j] = np.copy(a)\n # Gram-Schmidt orthogonalize\n i = j-1\n while i >= 0:\n dc = np.vdot(matrix[:, i], a)\n matrix[:, j] = matrix[:, j]-dc*matrix[:, i]\n i = i - 1\n # 
normalize\n matrix[:, j] = matrix[:, j] * (1.0 / np.sqrt(np.vdot(matrix[:, j], matrix[:, j])))\n return Operator(matrix)", "response": "Returns a random dim x dim unitary operator from the Haar measure."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ngenerate a random density matrix rho.", "response": "def random_density_matrix(length, rank=None, method='Hilbert-Schmidt', seed=None):\n \"\"\"\n Generate a random density matrix rho.\n\n Args:\n length (int): the length of the density matrix.\n rank (int or None): the rank of the density matrix. The default\n value is full-rank.\n method (string): the method to use.\n 'Hilbert-Schmidt': sample rho from the Hilbert-Schmidt metric.\n 'Bures': sample rho from the Bures metric.\n seed (int): Optional. To set a random seed.\n Returns:\n ndarray: rho (length, length) a density matrix.\n Raises:\n QiskitError: if the method is not valid.\n \"\"\"\n if method == 'Hilbert-Schmidt':\n return __random_density_hs(length, rank, seed)\n elif method == 'Bures':\n return __random_density_bures(length, rank, seed)\n else:\n raise QiskitError('Error: unrecognized method {}'.format(method))"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning a normally distributed complex random matrix.", "response": "def __ginibre_matrix(nrow, ncol=None, seed=None):\n \"\"\"\n Return a normally distributed complex random matrix.\n\n Args:\n nrow (int): number of rows in output matrix.\n ncol (int): number of columns in output matrix.\n seed (int): Optional. 
To set a random seed.\n Returns:\n ndarray: A complex rectangular matrix where each real and imaginary\n entry is sampled from the normal distribution.\n \"\"\"\n if ncol is None:\n ncol = nrow\n if seed is not None:\n np.random.seed(seed)\n G = np.random.normal(size=(nrow, ncol)) + \\\n np.random.normal(size=(nrow, ncol)) * 1j\n return G"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngenerates a random density matrix from the Hilbert-Schmidt metric.", "response": "def __random_density_hs(N, rank=None, seed=None):\n \"\"\"\n Generate a random density matrix from the Hilbert-Schmidt metric.\n\n Args:\n N (int): the length of the density matrix.\n rank (int or None): the rank of the density matrix. The default\n value is full-rank.\n seed (int): Optional. To set a random seed.\n Returns:\n ndarray: rho (N, N), a density matrix.\n \"\"\"\n G = __ginibre_matrix(N, rank, seed)\n G = G.dot(G.conj().T)\n return G / np.trace(G)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __random_density_bures(N, rank=None, seed=None):\n P = np.eye(N) + random_unitary(N).data\n G = P.dot(__ginibre_matrix(N, rank, seed))\n G = G.dot(G.conj().T)\n return G / np.trace(G)", "response": "Generate a random density matrix from the Bures metric."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef qasm(self, prec=15):\n string = \"\"\n for children in self.children:\n string += \" \" + children.qasm(prec) + \"\\n\"\n return string", "response": "Return the corresponding OPENQASM string."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef calls(self):\n lst = []\n for children in self.children:\n if children.type == \"custom_unitary\":\n lst.append(children.name)\n return lst", "response": "Return a list of custom gate names in this gate body."} {"SOURCE": "codesearchnet", "instruction": "Can you 
implement a function in Python 3 that\nreturns the corresponding OPENQASM string.", "response": "def qasm(self, prec=15):\n \"\"\"Return the corresponding OPENQASM string.\"\"\"\n if self.value == pi:\n return \"pi\"\n\n return ccode(self.value, precision=prec)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _define(self):\n definition = []\n q = QuantumRegister(1, \"q\")\n rule = [\n (U1Gate(-pi/2), [q[0]], [])\n ]\n for inst in rule:\n definition.append(inst)\n self.definition = definition", "response": "Define the internal state of the class."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef compose(self, other, qargs=None, front=False):\n if qargs is not None:\n return Chi(\n SuperOp(self).compose(other, qargs=qargs, front=front))\n\n # Convert other to Choi since we convert via Choi\n if not isinstance(other, Choi):\n other = Choi(other)\n # Check dimensions match up\n if front and self._input_dim != other._output_dim:\n raise QiskitError(\n 'input_dim of self must match output_dim of other')\n if not front and self._output_dim != other._input_dim:\n raise QiskitError(\n 'input_dim of other must match output_dim of self')\n # Since we cannot directly add two channels in the Chi\n # representation we convert to the Choi representation\n return Chi(Choi(self).compose(other, front=front))", "response": "Return the composition of self and other."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef power(self, n):\n if n > 0:\n return super().power(n)\n return Chi(SuperOp(self).power(n))", "response": "The matrix power of the superoperator matrix."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef add(self, other):\n if not isinstance(other, Chi):\n other = Chi(other)\n if self.dim != other.dim:\n raise 
QiskitError(\"other QuantumChannel dimensions are not equal\")\n return Chi(self._data + other.data, self._input_dims,\n self._output_dims)", "response": "Returns the linear addition of self and other."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the QuantumChannel self + other.", "response": "def multiply(self, other):\n \"\"\"Return the QuantumChannel self + other.\n\n Args:\n other (complex): a complex number.\n\n Returns:\n Chi: the scalar multiplication other * self as a Chi object.\n\n Raises:\n QiskitError: if other is not a valid scalar.\n \"\"\"\n if not isinstance(other, Number):\n raise QiskitError(\"other is not a number\")\n return Chi(other * self._data, self._input_dims, self._output_dims)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _tensor_product(self, other, reverse=False):\n if not isinstance(other, Chi):\n other = Chi(other)\n if reverse:\n input_dims = self.input_dims() + other.input_dims()\n output_dims = self.output_dims() + other.output_dims()\n data = np.kron(other.data, self._data)\n else:\n input_dims = other.input_dims() + self.input_dims()\n output_dims = other.output_dims() + self.output_dims()\n data = np.kron(self._data, other.data)\n return Chi(data, input_dims, output_dims)", "response": "Returns the tensor product of two quantum channels."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef conjugate(self):\n return SuperOp(\n np.conj(self._data), self.input_dims(), self.output_dims())", "response": "Return the conjugate of the QuantumChannel."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef transpose(self):\n return SuperOp(\n np.transpose(self._data),\n input_dims=self.output_dims(),\n output_dims=self.input_dims())", "response": "Return the transpose of the QuantumChannel."} {"SOURCE": "codesearchnet", "instruction": 
"Implement a function in Python 3 to\nreturn the composition of self and other on the same subsystem.", "response": "def compose(self, other, qargs=None, front=False):\n \"\"\"Return the composition channel self\u2218other.\n\n Args:\n other (QuantumChannel): a quantum channel.\n qargs (list): a list of subsystem positions to compose other on.\n front (bool): If False compose in standard order other(self(input))\n otherwise compose in reverse order self(other(input))\n [default: False]\n\n Returns:\n SuperOp: The composition channel as a SuperOp object.\n\n Raises:\n QiskitError: if other is not a QuantumChannel subclass, or\n has incompatible dimensions.\n \"\"\"\n # Convert other to SuperOp\n if not isinstance(other, SuperOp):\n other = SuperOp(other)\n # Check dimensions are compatible\n if front and self.input_dims(qargs=qargs) != other.output_dims():\n raise QiskitError(\n 'output_dims of other must match subsystem input_dims')\n if not front and self.output_dims(qargs=qargs) != other.input_dims():\n raise QiskitError(\n 'input_dims of other must match subsystem output_dims')\n\n # Full composition of superoperators\n if qargs is None:\n if front:\n # Composition A(B(input))\n return SuperOp(\n np.dot(self._data, other.data),\n input_dims=other.input_dims(),\n output_dims=self.output_dims())\n # Composition B(A(input))\n return SuperOp(\n np.dot(other.data, self._data),\n input_dims=self.input_dims(),\n output_dims=other.output_dims())\n # Composition on subsystem\n return self._compose_subsystem(other, qargs, front)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef power(self, n):\n if not isinstance(n, (int, np.integer)):\n raise QiskitError(\"Can only power with integer powers.\")\n if self._input_dim != self._output_dim:\n raise QiskitError(\"Can only power with input_dim = output_dim.\")\n # Override base class power so we can implement more efficiently\n # using Numpy.matrix_power\n return 
SuperOp(\n np.linalg.matrix_power(self._data, n), self.input_dims(),\n self.output_dims())", "response": "Return the composition of a QuantumChannel with itself n times."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add(self, other):\n # Convert other to SuperOp\n if not isinstance(other, SuperOp):\n other = SuperOp(other)\n if self.dim != other.dim:\n raise QiskitError(\"other QuantumChannel dimensions are not equal\")\n return SuperOp(self._data + other.data, self.input_dims(),\n self.output_dims())", "response": "Return the QuantumChannel self + other."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef multiply(self, other):\n if not isinstance(other, Number):\n raise QiskitError(\"other is not a number\")\n return SuperOp(other * self._data, self.input_dims(),\n self.output_dims())", "response": "Return the QuantumChannel other * self."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nevolve a state vector or density matrix by the QuantumChannel.", "response": "def _evolve(self, state, qargs=None):\n \"\"\"Evolve a quantum state by the QuantumChannel.\n\n Args:\n state (QuantumState): The input statevector or density matrix.\n qargs (list): a list of QuantumState subsystem positions to apply\n the operator on.\n\n Returns:\n DensityMatrix: the output quantum state as a density matrix.\n\n Raises:\n QiskitError: if the operator dimension does not match the\n specified QuantumState subsystem dimensions.\n \"\"\"\n state = self._format_state(state, density_matrix=True)\n if qargs is None:\n if state.shape[0] != self._input_dim:\n raise QiskitError(\n \"QuantumChannel input dimension is not equal to state dimension.\"\n )\n shape_in = self._input_dim * self._input_dim\n shape_out = (self._output_dim, self._output_dim)\n # Return evolved density matrix\n return np.reshape(\n np.dot(self._data, np.reshape(state, shape_in, 
order='F')),\n shape_out,\n order='F')\n # Subsystem evolution\n return self._evolve_subsystem(state, qargs)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the composition channel of the subsystem.", "response": "def _compose_subsystem(self, other, qargs, front=False):\n \"\"\"Return the composition channel.\"\"\"\n # Compute tensor contraction indices from qargs\n input_dims = list(self.input_dims())\n output_dims = list(self.output_dims())\n if front:\n num_indices = len(self.input_dims())\n shift = 2 * len(self.output_dims())\n right_mul = True\n for pos, qubit in enumerate(qargs):\n input_dims[qubit] = other._input_dims[pos]\n else:\n num_indices = len(self.output_dims())\n shift = 0\n right_mul = False\n for pos, qubit in enumerate(qargs):\n output_dims[qubit] = other._output_dims[pos]\n # Reshape current matrix\n # Note that we must reverse the subsystem dimension order as\n # qubit 0 corresponds to the right-most position in the tensor\n # product, which is the last tensor wire index.\n tensor = np.reshape(self.data, self._shape)\n mat = np.reshape(other.data, other._shape)\n # Add first set of indices\n indices = [2 * num_indices - 1 - qubit for qubit in qargs\n ] + [num_indices - 1 - qubit for qubit in qargs]\n final_shape = [np.product(output_dims)**2, np.product(input_dims)**2]\n data = np.reshape(\n self._einsum_matmul(tensor, mat, indices, shift, right_mul),\n final_shape)\n return SuperOp(data, input_dims, output_dims)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nevolving a state by the operator on the subsystem.", "response": "def _evolve_subsystem(self, state, qargs):\n \"\"\"Evolve a quantum state by the operator.\n\n Args:\n state (QuantumState): The input statevector or density matrix.\n qargs (list): a list of QuantumState subsystem positions to apply\n the operator on.\n\n Returns:\n QuantumState: the output quantum state.\n\n Raises:\n QiskitError: if the operator dimension 
does not match the\n specified QuantumState subsystem dimensions.\n \"\"\"\n mat = np.reshape(self.data, self._shape)\n # Hack to assume state is an N-qubit state until a proper class for states\n # is in place\n state_size = len(state)\n state_dims = self._automatic_dims(None, state_size)\n if self.input_dims() != len(qargs) * (2, ):\n raise QiskitError(\n \"Channel input dimensions are not compatible with state subsystem dimensions.\"\n )\n # Return evolved density matrix\n tensor = np.reshape(state, 2 * state_dims)\n num_indices = len(state_dims)\n indices = [num_indices - 1 - qubit for qubit in qargs\n ] + [2 * num_indices - 1 - qubit for qubit in qargs]\n tensor = self._einsum_matmul(tensor, mat, indices)\n return np.reshape(tensor, [state_size, state_size])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nconverting a QuantumCircuit or Instruction to a SuperOp.", "response": "def _instruction_to_superop(cls, instruction):\n \"\"\"Convert a QuantumCircuit or Instruction to a SuperOp.\"\"\"\n # Convert circuit to an instruction\n if isinstance(instruction, QuantumCircuit):\n instruction = instruction.to_instruction()\n # Initialize an identity superoperator of the correct size\n # of the circuit\n op = SuperOp(np.eye(4 ** instruction.num_qubits))\n op._append_instruction(instruction)\n return op"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _append_instruction(self, obj, qargs=None):\n if isinstance(obj, Instruction):\n chan = None\n if obj.name == 'reset':\n # For superoperator evolution we can simulate a reset as\n # a non-unitary superoperator matrix\n chan = SuperOp(\n np.array([[1, 0, 0, 1], [0, 0, 0, 0], [0, 0, 0, 0],\n [0, 0, 0, 0]]))\n if obj.name == 'kraus':\n kraus = obj.params\n dim = len(kraus[0])\n chan = SuperOp(_to_superop('Kraus', (kraus, None), dim, dim))\n elif hasattr(obj, 'to_matrix'):\n # If instruction is a gate, first we see if it has a\n # 
`to_matrix` definition and if so use that.\n try:\n kraus = [obj.to_matrix()]\n dim = len(kraus[0])\n chan = SuperOp(\n _to_superop('Kraus', (kraus, None), dim, dim))\n except QiskitError:\n pass\n if chan is not None:\n # Perform the composition and inplace update the current state\n # of the operator\n op = self.compose(chan, qargs=qargs)\n self._data = op.data\n else:\n # If the instruction doesn't have a matrix defined we use its\n # circuit decomposition definition if it exists, otherwise we\n # cannot compose this gate and raise an error.\n if obj.definition is None:\n raise QiskitError('Cannot apply Instruction: {}'.format(\n obj.name))\n for instr, qregs, cregs in obj.definition:\n if cregs:\n raise QiskitError(\n 'Cannot apply instruction with classical registers: {}'\n .format(instr.name))\n # Get the integer position of the flat register\n new_qargs = [tup[1] for tup in qregs]\n self._append_instruction(instr, qargs=new_qargs)\n else:\n raise QiskitError('Input is not an instruction.')", "response": "Update the current Operator by apply an instruction."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a DAGCircuit with a barrier before last measurements.", "response": "def run(self, dag):\n \"\"\"Return a circuit with a barrier before last measurements.\"\"\"\n\n # Collect DAG nodes which are followed only by barriers or other measures.\n final_op_types = ['measure', 'barrier']\n final_ops = []\n for candidate_node in dag.named_nodes(*final_op_types):\n is_final_op = True\n\n for _, child_successors in dag.bfs_successors(candidate_node):\n\n if any(suc.type == 'op' and suc.name not in final_op_types\n for suc in child_successors):\n is_final_op = False\n break\n\n if is_final_op:\n final_ops.append(candidate_node)\n\n if not final_ops:\n return dag\n\n # Create a layer with the barrier and add registers from the original dag.\n barrier_layer = DAGCircuit()\n for qreg in dag.qregs.values():\n 
barrier_layer.add_qreg(qreg)\n for creg in dag.cregs.values():\n barrier_layer.add_creg(creg)\n\n final_qubits = set(final_op.qargs[0] for final_op in final_ops)\n\n barrier_layer.apply_operation_back(\n Barrier(len(final_qubits)), list(final_qubits), [])\n\n # Preserve order of final ops collected earlier from the original DAG.\n ordered_final_nodes = [node for node in dag.topological_op_nodes()\n if node in set(final_ops)]\n\n # Move final ops to the new layer and append the new layer to the DAG.\n for final_node in ordered_final_nodes:\n barrier_layer.apply_operation_back(final_node.op,\n final_node.qargs,\n final_node.cargs)\n\n for final_op in final_ops:\n dag.remove_op_node(final_op)\n\n dag.extend_back(barrier_layer)\n\n # Merge the new barrier into any other barriers\n adjacent_pass = MergeAdjacentBarriers()\n return adjacent_pass.run(dag)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nconvert a list of QuantumCircuits into a Qobj.", "response": "def circuits_to_qobj(circuits, qobj_header=None,\n qobj_id=None, backend_name=None,\n config=None, shots=None, max_credits=None,\n basis_gates=None,\n coupling_map=None, seed=None, memory=None):\n \"\"\"Convert a list of circuits into a qobj.\n\n Args:\n circuits (list[QuantumCircuits] or QuantumCircuit): circuits to compile\n qobj_header (QobjHeader): header to pass to the results\n qobj_id (int): TODO: delete after qiskit-terra 0.8\n backend_name (str): TODO: delete after qiskit-terra 0.8\n config (dict): TODO: delete after qiskit-terra 0.8\n shots (int): TODO: delete after qiskit-terra 0.8\n max_credits (int): TODO: delete after qiskit-terra 0.8\n basis_gates (str): TODO: delete after qiskit-terra 0.8\n coupling_map (list): TODO: delete after qiskit-terra 0.8\n seed (int): TODO: delete after qiskit-terra 0.8\n memory (bool): TODO: delete after qiskit-terra 0.8\n\n Returns:\n Qobj: the Qobj to be run on the backends\n \"\"\"\n warnings.warn('circuits_to_qobj is deprecated and 
will be removed in Qiskit Terra 0.9. '\n 'Use qiskit.compiler.assemble() to serialize circuits into a qobj.',\n DeprecationWarning)\n\n qobj_header = qobj_header or QobjHeader()\n\n if backend_name:\n qobj_header.backend_name = backend_name\n if basis_gates:\n warnings.warn('basis_gates was unused and will be removed.', DeprecationWarning)\n if coupling_map:\n warnings.warn('coupling_map was unused and will be removed.', DeprecationWarning)\n\n qobj = assemble(experiments=circuits,\n qobj_id=qobj_id,\n qobj_header=qobj_header,\n shots=shots,\n memory=memory,\n max_credits=max_credits,\n seed_simulator=seed,\n config=config)\n\n return qobj"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nexpand 3 + qubit gates using their decomposition rules.", "response": "def run(self, dag):\n \"\"\"Expand 3+ qubit gates using their decomposition rules.\n\n Args:\n dag(DAGCircuit): input dag\n Returns:\n DAGCircuit: output dag with maximum node degrees of 2\n Raises:\n QiskitError: if a 3q+ gate is not decomposable\n \"\"\"\n for node in dag.threeQ_or_more_gates():\n # TODO: allow choosing other possible decompositions\n rule = node.op.definition\n if not rule:\n raise QiskitError(\"Cannot unroll all 3q or more gates. 
"\n \"No rule to expand instruction %s.\" %\n node.op.name)\n\n # hacky way to build a dag on the same register as the rule is defined\n # TODO: need anonymous rules to address wires by index\n decomposition = DAGCircuit()\n decomposition.add_qreg(rule[0][1][0][0])\n for inst in rule:\n decomposition.apply_operation_back(*inst)\n decomposition = self.run(decomposition) # recursively unroll\n dag.substitute_node_with_dag(node, decomposition)\n return dag"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef run(self, dag):\n # Walk through the DAG and expand each non-basis node\n for node in dag.op_nodes(self.gate):\n # opaque or built-in gates are not decomposable\n if not node.op.definition:\n continue\n # TODO: allow choosing among multiple decomposition rules\n rule = node.op.definition\n # hacky way to build a dag on the same register as the rule is defined\n # TODO: need anonymous rules to address wires by index\n decomposition = DAGCircuit()\n decomposition.add_qreg(rule[0][1][0][0])\n if rule[0][2]:\n decomposition.add_creg(rule[0][2][0][0])\n for inst in rule:\n decomposition.apply_operation_back(*inst)\n dag.substitute_node_with_dag(node, decomposition)\n return dag", "response": "Expand a given gate into its decomposition."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\napplies the unitary gate obj to the given qubits.", "response": "def unitary(self, obj, qubits, label=None):\n \"\"\"Apply unitary gate obj to the given qubits.\"\"\"\n if isinstance(qubits, QuantumRegister):\n qubits = qubits[:]\n return self.append(UnitaryGate(obj, label=label), qubits, [])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncalculate a subcircuit that implements this unitary.", "response": "def _define(self):\n \"\"\"Calculate a subcircuit that implements this unitary.\"\"\"\n if self.num_qubits == 1:\n q = QuantumRegister(1, \"q\")\n angles = euler_angles_1q(self.to_matrix())\n self.definition = [(U3Gate(*angles), 
[q[0]], [])]\n if self.num_qubits == 2:\n self.definition = two_qubit_kak(self.to_matrix())"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef check_type(self, value, attr, data):\n if self.many and not is_collection(value):\n raise self._not_expected_type(\n value, Iterable, fields=[self], field_names=attr, data=data)\n\n _check_type = super().check_type\n\n errors = []\n values = value if self.many else [value]\n for idx, v in enumerate(values):\n try:\n _check_type(v, idx, values)\n except ValidationError as err:\n errors.append(err.messages)\n\n if errors:\n errors = errors if self.many else errors[0]\n raise ValidationError(errors)\n\n return value", "response": "Validate if the value is of the type of the schema's model."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef check_type(self, value, attr, data):\n super().check_type(value, attr, data)\n\n errors = []\n for idx, v in enumerate(value):\n try:\n self.container.check_type(v, idx, value)\n except ValidationError as err:\n errors.append(err.messages)\n\n if errors:\n raise ValidationError(errors)\n\n return value", "response": "Validate if the value is a list of valid item-field values."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndraws the graph of the dag.", "response": "def dag_drawer(dag, scale=0.7, filename=None, style='color'):\n \"\"\"Plot the directed acyclic graph (dag) to represent operation dependencies\n in a quantum circuit.\n\n Note this function leverages\n `pydot`_ (via\n `nxpd`_); having Graphviz installed on your\n system is required for this to work.\n\n Args:\n dag (DAGCircuit): The dag to draw.\n scale (float): scaling factor\n filename (str): file path to save image to (format inferred from name)\n style (str): 'plain': B&W graph\n 'color' (default): color input/output/op nodes\n\n Returns:\n IPython.display.Image: 
if in Jupyter notebook and not saving to file,\n otherwise None.\n\n Raises:\n VisualizationError: when style is not recognized.\n ImportError: when nxpd or pydot not installed.\n \"\"\"\n try:\n import nxpd\n import pydot # pylint: disable=unused-import\n except ImportError:\n raise ImportError("dag_drawer requires nxpd, pydot, and Graphviz. "\n "Run 'pip install nxpd pydot', and install graphviz")\n\n G = dag.to_networkx()\n G.graph['dpi'] = 100 * scale\n\n if style == 'plain':\n pass\n elif style == 'color':\n for node in G.nodes:\n n = G.nodes[node]\n n['label'] = node.name\n if node.type == 'op':\n n['color'] = 'blue'\n n['style'] = 'filled'\n n['fillcolor'] = 'lightblue'\n if node.type == 'in':\n n['color'] = 'black'\n n['style'] = 'filled'\n n['fillcolor'] = 'green'\n if node.type == 'out':\n n['color'] = 'black'\n n['style'] = 'filled'\n n['fillcolor'] = 'red'\n for e in G.edges(data=True):\n e[2]['label'] = e[2]['name']\n else:\n raise VisualizationError("Unrecognized style for the dag_drawer.")\n\n if filename:\n show = False\n elif ('ipykernel' in sys.modules) and ('spyder' not in sys.modules):\n show = 'ipynb'\n else:\n show = True\n\n return nxpd.draw(G, filename=filename, show=show)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _atol(self, atol):\n # NOTE: that this overrides the class value so applies to all\n # instances of the class.\n max_tol = self.__class__.MAX_TOL\n if atol < 0:\n raise QiskitError(\"Invalid atol: must be non-negative.\")\n if atol > max_tol:\n raise QiskitError(\n \"Invalid atol: must be less than {}.\".format(max_tol))\n self.__class__.ATOL = atol", "response": "Set the absolute tolerance parameter for float comparisons."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsetting the relative tolerance parameter for float comparisons.", "response": "def _rtol(self, rtol):\n \"\"\"Set the relative tolerance parameter for float
comparisons.\"\"\"\n # NOTE: that this overrides the class value so applies to all\n # instances of the class.\n max_tol = self.__class__.MAX_TOL\n if rtol < 0:\n raise QiskitError(\"Invalid rtol: must be non-negative.\")\n if rtol > max_tol:\n raise QiskitError(\n \"Invalid rtol: must be less than {}.\".format(max_tol))\n self.__class__.RTOL = rtol"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn tuple of input dimension for specified subsystems.", "response": "def input_dims(self, qargs=None):\n \"\"\"Return tuple of input dimension for specified subsystems.\"\"\"\n if qargs is None:\n return self._input_dims\n return tuple(self._input_dims[i] for i in qargs)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef output_dims(self, qargs=None):\n if qargs is None:\n return self._output_dims\n return tuple(self._output_dims[i] for i in qargs)", "response": "Return tuple of output dimension for specified subsystems."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nmaking a copy of current operator.", "response": "def copy(self):\n \"\"\"Make a copy of current operator.\"\"\"\n # pylint: disable=no-value-for-parameter\n # The constructor of subclasses from raw data should be a copy\n return self.__class__(self.data, self.input_dims(), self.output_dims())"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef power(self, n):\n # NOTE: if a subclass can have negative or non-integer powers\n # this method should be overriden in that class.\n if not isinstance(n, (int, np.integer)) or n < 1:\n raise QiskitError(\"Can only power with positive integer powers.\")\n if self._input_dim != self._output_dim:\n raise QiskitError(\"Can only power with input_dim = output_dim.\")\n ret = self.copy()\n for _ in range(1, n):\n ret = ret.compose(self)\n return ret", "response": "Return a copy of 
the operator composed with itself n times."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _automatic_dims(cls, dims, size):\n if dims is None:\n dims = size\n elif np.product(dims) != size:\n raise QiskitError(\"dimensions do not match size.\")\n if isinstance(dims, (int, np.integer)):\n num_qubits = int(np.log2(dims))\n if 2 ** num_qubits == size:\n return num_qubits * (2,)\n return (dims,)\n return tuple(dims)", "response": "Check if input dimension corresponds to qubit subsystems."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nperforms a contraction using Numpy.einsum.", "response": "def _einsum_matmul(cls, tensor, mat, indices, shift=0, right_mul=False):\n \"\"\"Perform a contraction using Numpy.einsum\n\n Args:\n tensor (np.array): a vector or matrix reshaped to a rank-N tensor.\n mat (np.array): a matrix reshaped to a rank-2M tensor.\n indices (list): tensor indices to contract with mat.\n shift (int): shift for indices of tensor to contract [Default: 0].\n right_mul (bool): if True right multiply tensor by mat\n (else left multiply) [Default: False].\n\n Returns:\n Numpy.ndarray: the matrix multiplied rank-N tensor.\n\n Raises:\n QiskitError: if mat is not an even rank tensor.\n \"\"\"\n rank = tensor.ndim\n rank_mat = mat.ndim\n if rank_mat % 2 != 0:\n raise QiskitError(\n \"Contracted matrix must have an even number of indices.\")\n # Get einsum indices for tensor\n indices_tensor = list(range(rank))\n for j, index in enumerate(indices):\n indices_tensor[index + shift] = rank + j\n # Get einsum indices for mat\n mat_contract = list(reversed(range(rank, rank + len(indices))))\n mat_free = [index + shift for index in reversed(indices)]\n if right_mul:\n indices_mat = mat_contract + mat_free\n else:\n indices_mat = mat_free + mat_contract\n return np.einsum(tensor, indices_tensor, mat, indices_mat)"} {"SOURCE": "codesearchnet", "instruction": "Can you create
a Python 3 function that\noverrides _deserialize for customizing the exception raised.", "response": "def _deserialize(self, value, attr, data):\n \"\"\"Override ``_deserialize`` for customizing the exception raised.\"\"\"\n try:\n return super()._deserialize(value, attr, data)\n except ValidationError as ex:\n if 'deserialization_schema_selector' in ex.messages[0]:\n ex.messages[0] = 'Cannot find a valid schema among the choices'\n raise"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _serialize(self, value, key, obj):\n try:\n return super()._serialize(value, key, obj)\n except TypeError as ex:\n if 'serialization_schema_selector' in str(ex):\n raise ValidationError('Data from an invalid schema')\n raise", "response": "Override _serialize for customizing the exception raised."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks if at least one of the possible choices validates the value.", "response": "def check_type(self, value, attr, data):\n \"\"\"Check if at least one of the possible choices validates the value.\n\n Possible choices are assumed to be ``ModelTypeValidator`` fields.\n \"\"\"\n for field in self.choices:\n if isinstance(field, ModelTypeValidator):\n try:\n return field.check_type(value, attr, data)\n except ValidationError:\n pass\n\n raise self._not_expected_type(\n value, [field.__class__ for field in self.choices],\n fields=[self], field_names=attr, data=data)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the fidelity between two quantum states.", "response": "def state_fidelity(state1, state2):\n \"\"\"Return the state fidelity between two quantum states.\n\n Either input may be a state vector, or a density matrix. 
The state\n fidelity (F) for two density matrices is defined as::\n\n F(rho1, rho2) = Tr[sqrt(sqrt(rho1).rho2.sqrt(rho1))] ^ 2\n\n For a pure state and mixed state the fidelity is given by::\n\n F(|psi1>, rho2) = <psi1|rho2|psi1>\n\n For two pure states the fidelity is given by::\n\n F(|psi1>, |psi2>) = |<psi1|psi2>|^2\n\n Args:\n state1 (array_like): a quantum state vector or density matrix.\n state2 (array_like): a quantum state vector or density matrix.\n\n Returns:\n array_like: The state fidelity F(state1, state2).\n \"\"\"\n # convert input to numpy arrays\n s1 = np.array(state1)\n s2 = np.array(state2)\n\n # fidelity of two state vectors\n if s1.ndim == 1 and s2.ndim == 1:\n return np.abs(s2.conj().dot(s1)) ** 2\n # fidelity of vector and density matrix\n elif s1.ndim == 1:\n # psi = s1, rho = s2\n return np.abs(s1.conj().dot(s2).dot(s1))\n elif s2.ndim == 1:\n # psi = s2, rho = s1\n return np.abs(s2.conj().dot(s1).dot(s2))\n # fidelity of two density matrices\n s1sq = _funm_svd(s1, np.sqrt)\n s2sq = _funm_svd(s2, np.sqrt)\n return np.linalg.norm(s1sq.dot(s2sq), ord='nuc') ** 2"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\napply real scalar function to singular values of a matrix.", "response": "def _funm_svd(a, func):\n \"\"\"Apply real scalar function to singular values of a matrix.\n\n Args:\n a (array_like): (N, N) Matrix at which to evaluate the function.\n func (callable): Callable object that evaluates a scalar function f.\n\n Returns:\n ndarray: funm (N, N) Value of the matrix function specified by func\n evaluated at `A`.\n \"\"\"\n U, s, Vh = la.svd(a, lapack_driver='gesvd')\n S = np.diag(func(s))\n return U.dot(S).dot(Vh)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ntakes a statevector snapshot of the internal simulator representation.", "response": "def snapshot(self,\n label,\n snapshot_type='statevector',\n qubits=None,\n params=None):\n \"\"\"Take a statevector snapshot of the internal
simulator representation.\n Works on all qubits, and prevents reordering (like barrier).\n\n For other types of snapshots use the Snapshot extension directly.\n\n Args:\n label (str): a snapshot label to report the result\n snapshot_type (str): the type of the snapshot.\n qubits (list or None): the qubits to apply snapshot to [Default: None].\n params (list or None): the parameters for snapshot_type [Default: None].\n\n Returns:\n QuantumCircuit: with attached command\n\n Raises:\n ExtensionError: malformed command\n \"\"\"\n # Convert label to string for backwards compatibility\n if not isinstance(label, str):\n warnings.warn(\n \"Snapshot label should be a string, \"\n \"implicit conversion is depreciated.\", DeprecationWarning)\n label = str(label)\n # If no qubits are specified we add all qubits so it acts as a barrier\n # This is needed for full register snapshots like statevector\n if isinstance(qubits, QuantumRegister):\n qubits = qubits[:]\n if not qubits:\n tuples = []\n if isinstance(self, QuantumCircuit):\n for register in self.qregs:\n tuples.append(register)\n if not tuples:\n raise ExtensionError('no qubits for snapshot')\n qubits = []\n for tuple_element in tuples:\n if isinstance(tuple_element, QuantumRegister):\n for j in range(tuple_element.size):\n qubits.append((tuple_element, j))\n else:\n qubits.append(tuple_element)\n return self.append(\n Snapshot(\n label,\n snapshot_type=snapshot_type,\n num_qubits=len(qubits),\n params=params), qubits)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsets the label of the current object", "response": "def label(self, name):\n \"\"\"Set snapshot label to name\n\n Args:\n name (str or None): label to assign unitary\n\n Raises:\n TypeError: name is not string or None.\n \"\"\"\n if isinstance(name, str):\n self._label = name\n else:\n raise TypeError('label expects a string')"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 
to\nreturn True if completely - positive trace - preserving ( CPTP ).", "response": "def is_cptp(self, atol=None, rtol=None):\n \"\"\"Return True if completely-positive trace-preserving (CPTP).\"\"\"\n choi = _to_choi(self.rep, self._data, *self.dim)\n return self._is_cp_helper(choi, atol, rtol) and self._is_tp_helper(\n choi, atol, rtol)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ntests if a channel is completely - positive ( CP )", "response": "def is_tp(self, atol=None, rtol=None):\n \"\"\"Test if a channel is completely-positive (CP)\"\"\"\n choi = _to_choi(self.rep, self._data, *self.dim)\n return self._is_tp_helper(choi, atol, rtol)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef is_cp(self, atol=None, rtol=None):\n choi = _to_choi(self.rep, self._data, *self.dim)\n return self._is_cp_helper(choi, atol, rtol)", "response": "Test if Choi - matrix is completely - positive ( CP )."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning True if QuantumChannel is a unitary channel.", "response": "def is_unitary(self, atol=None, rtol=None):\n \"\"\"Return True if QuantumChannel is a unitary channel.\"\"\"\n try:\n op = self.to_operator()\n return op.is_unitary(atol=atol, rtol=rtol)\n except QiskitError:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ntries to convert channel to a unitary representation Operator.", "response": "def to_operator(self):\n \"\"\"Try to convert channel to a unitary representation Operator.\"\"\"\n mat = _to_operator(self.rep, self._data, *self.dim)\n return Operator(mat, self.input_dims(), self.output_dims())"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef to_instruction(self):\n from qiskit.circuit.instruction import Instruction\n # Check if input is an N-qubit CPTP channel.\n n_qubits = 
int(np.log2(self._input_dim))\n if self._input_dim != self._output_dim or 2**n_qubits != self._input_dim:\n raise QiskitError(\n 'Cannot convert QuantumChannel to Instruction: channel is not an N-qubit channel.'\n )\n if not self.is_cptp():\n raise QiskitError(\n 'Cannot convert QuantumChannel to Instruction: channel is not CPTP.'\n )\n # Next we convert to the Kraus representation. Since channel is CPTP we know\n # that there is only a single set of Kraus operators\n kraus, _ = _to_kraus(self.rep, self._data, *self.dim)\n # If we only have a single Kraus operator then the channel is\n # a unitary channel so can be converted to a UnitaryGate. We do this by\n # converting to an Operator and using its to_instruction method\n if len(kraus) == 1:\n return Operator(kraus[0]).to_instruction()\n return Instruction('kraus', n_qubits, 0, kraus)", "response": "Convert the current state of the object to a Kraus or UnitaryGate circuit instruction."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ntests if a channel is completely - positive ( CP )", "response": "def _is_cp_helper(self, choi, atol, rtol):\n \"\"\"Test if a channel is completely-positive (CP)\"\"\"\n if atol is None:\n atol = self._atol\n if rtol is None:\n rtol = self._rtol\n return is_positive_semidefinite_matrix(choi, rtol=rtol, atol=atol)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _is_tp_helper(self, choi, atol, rtol):\n if atol is None:\n atol = self._atol\n if rtol is None:\n rtol = self._rtol\n # Check if the partial trace is the identity matrix\n d_in, d_out = self.dim\n mat = np.trace(\n np.reshape(choi, (d_in, d_out, d_in, d_out)), axis1=1, axis2=3)\n return is_identity_matrix(mat, rtol=rtol, atol=atol)", "response": "Test if Choi - matrix is trace - preserving ( TP )."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nformatting input state so it is statevector or density matrix.", 
"response": "def _format_state(self, state, density_matrix=False):\n \"\"\"Format input state so it is statevector or density matrix\"\"\"\n state = np.array(state)\n shape = state.shape\n ndim = state.ndim\n if ndim > 2:\n raise QiskitError('Input state is not a vector or matrix.')\n # Flatten column-vector to vector\n if ndim == 2:\n if shape[1] != 1 and shape[1] != shape[0]:\n raise QiskitError('Input state is not a vector or matrix.')\n if shape[1] == 1:\n # flatten column-vector to vector\n state = np.reshape(state, shape[0])\n # Convert statevector to density matrix if required\n if density_matrix and ndim == 1:\n state = np.outer(state, np.transpose(np.conj(state)))\n return state"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _init_transformer(cls, data):\n # This handles common conversion for all QuantumChannel subclasses.\n # If the input is already a QuantumChannel subclass it will return\n # the original object\n if isinstance(data, QuantumChannel):\n return data\n if hasattr(data, 'to_quantumchannel'):\n # If the data object is not a QuantumChannel it will give\n # preference to a 'to_quantumchannel' attribute that allows\n # an arbitrary object to define its own conversion to any\n # quantum channel subclass.\n return data.to_quantumchannel()\n if hasattr(data, 'to_channel'):\n # TODO: this 'to_channel' method is the same case as the above\n # but is used by current version of Aer.
It should be removed\n # once Aer is updated to use `to_quantumchannel`\n # instead of `to_channel`.\n return data.to_channel()\n # Finally if the input is not a QuantumChannel and doesn't have a\n # 'to_quantumchannel' conversion method we try and initialize it as a\n # regular matrix Operator which can be converted into a QuantumChannel.\n return Operator(data)", "response": "Initialize the transformer object."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef run(self, dag):\n if self.layout is None:\n if self.property_set[\"layout\"]:\n self.layout = self.property_set[\"layout\"]\n else:\n self.layout = Layout.generate_trivial_layout(*dag.qregs.values())\n\n self.property_set['is_direction_mapped'] = True\n edges = self.coupling_map.get_edges()\n\n for gate in dag.twoQ_gates():\n physical_q0 = self.layout[gate.qargs[0]]\n physical_q1 = self.layout[gate.qargs[1]]\n\n if isinstance(gate.op, (CXBase, CnotGate)) and (\n physical_q0, physical_q1) not in edges:\n self.property_set['is_direction_mapped'] = False\n return", "response": "Runs the actual logic for the base class."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate a Graphene Enum for sorting a SQLAlchemy model class", "response": "def sort_enum_for_model(cls, name=None, symbol_name=_symbol_name):\n \"\"\"Create Graphene Enum for sorting a SQLAlchemy class query\n\n Parameters\n - cls : Sqlalchemy model class\n Model used to create the sort enumerator\n - name : str, optional, default None\n Name to use for the enumerator.
If not provided it will be set to `cls.__name__ + 'SortEnum'`\n - symbol_name : function, optional, default `_symbol_name`\n Function which takes the column name and a boolean indicating if the sort direction is ascending,\n and returns the symbol name for the current column and sort direction.\n The default function will create, for a column named 'foo', the symbols 'foo_asc' and 'foo_desc'\n\n Returns\n - Enum\n The Graphene enumerator\n \"\"\"\n enum, _ = _sort_enum_for_model(cls, name, symbol_name)\n return enum"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns an Argument that can be used to sort the result of a model.", "response": "def sort_argument_for_model(cls, has_default=True):\n \"\"\"Returns a Graphene argument for the sort field that accepts a list of sorting directions for a model.\n If `has_default` is True (the default) it will sort the result by the primary key(s)\n \"\"\"\n enum, default = _sort_enum_for_model(cls)\n if not has_default:\n default = None\n return Argument(List(enum), default_value=default)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsearch for all dates in the given string in a natural language.", "response": "def search_dates(text, languages=None, settings=None, add_detected_language=False):\n \"\"\"Find all substrings of the given string which represent date and/or time and parse them.\n\n :param text:\n A string in a natural language which may contain date and/or time expressions.\n :type text: str|unicode\n\n :param languages:\n A list of two letters language codes.e.g. ['en', 'es']. 
If languages are given, it will\n not attempt to detect the language.\n :type languages: list\n\n :param settings:\n Configure customized behavior using settings defined in :mod:`dateparser.conf.Settings`.\n :type settings: dict\n\n :param add_detected_language:\n Indicates if we want the detected language returned in the tuple.\n :type add_detected_language: bool\n\n :return: Returns list of tuples containing:\n substrings representing date and/or time, corresponding :mod:`datetime.datetime`\n object and detected language if *add_detected_language* is True.\n Returns None if no dates that can be parsed are found.\n :rtype: list\n :raises: ValueError - Unknown Language\n\n >>> from dateparser.search import search_dates\n >>> search_dates('The first artificial Earth satellite was launched on 4 October 1957.')\n [('on 4 October 1957', datetime.datetime(1957, 10, 4, 0, 0))]\n\n >>> search_dates('The first artificial Earth satellite was launched on 4 October 1957.', add_detected_language=True)\n [('on 4 October 1957', datetime.datetime(1957, 10, 4, 0, 0), 'en')]\n\n >>> search_dates(\"The client arrived to the office for the first time in March 3rd, 2004 and got serviced, after a couple of months, on May 6th 2004, the customer returned indicating a defect on the part\")\n [('in March 3rd, 2004 and', datetime.datetime(2004, 3, 3, 0, 0)),\n ('on May 6th 2004', datetime.datetime(2004, 5, 6, 0, 0))]\n\n\n \"\"\"\n result = _search_with_detection.search_dates(\n text=text, languages=languages, settings=settings\n )\n language, dates = result.get('Language'), result.get('Dates')\n if dates:\n if add_detected_language:\n dates = [date + (language, ) for date in dates]\n return dates"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nmonkey patching _strptime to avoid problems related with non - English locale changes on the system.", "response": "def patch_strptime():\n \"\"\"Monkey patching _strptime to avoid problems related with non-english\n locale 
changes on the system.\n\n For example, if system's locale is set to fr_FR. Parser won't recognize\n any date since all languages are translated to english dates.\n \"\"\"\n\n _strptime = imp.load_module(\n 'strptime_patched', *imp.find_module('_strptime')\n )\n\n _calendar = imp.load_module(\n 'calendar_patched', *imp.find_module('_strptime')\n )\n\n _strptime._getlang = lambda: ('en_US', 'UTF-8')\n _strptime.calendar = _calendar\n _strptime.calendar.day_abbr = [\n 'mon', 'tue', 'wed', 'thu', 'fri', 'sat', 'sun'\n ]\n _strptime.calendar.day_name = [\n 'monday', 'tuesday', 'wednesday', 'thursday',\n 'friday', 'saturday', 'sunday'\n ]\n _strptime.calendar.month_abbr = [\n '', 'jan', 'feb', 'mar', 'apr', 'may', 'jun',\n 'jul', 'aug', 'sep', 'oct', 'nov', 'dec'\n ]\n _strptime.calendar.month_name = [\n '', 'january', 'february', 'march', 'april',\n 'may', 'june', 'july', 'august', 'september',\n 'october', 'november', 'december'\n ]\n\n return _strptime._strptime_time"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_locale_map(self, languages=None, locales=None, region=None,\n use_given_order=False, allow_conflicting_locales=False):\n \"\"\"\n Get an ordered mapping with locale codes as keys\n and corresponding locale instances as values.\n\n :param languages:\n A list of language codes, e.g. ['en', 'es', 'zh-Hant'].\n If locales are not given, languages and region are\n used to construct locales to load.\n :type languages: list\n\n :param locales:\n A list of codes of locales which are to be loaded,\n e.g. ['fr-PF', 'qu-EC', 'af-NA']\n :type locales: list\n\n :param region:\n A region code, e.g. 
'IN', '001', 'NE'.\n If locales are not given, languages and region are\n used to construct locales to load.\n :type region: str|unicode\n\n :param use_given_order:\n If True, the returned mapping is ordered in the order locales are given.\n :type allow_redetect_language: bool\n\n :param allow_conflicting_locales:\n if True, locales with same language and different region can be loaded.\n :type allow_conflicting_locales: bool\n\n :return: ordered locale code to locale instance mapping\n \"\"\"\n return OrderedDict(self._load_data(\n languages=languages, locales=locales, region=region, use_given_order=use_given_order,\n allow_conflicting_locales=allow_conflicting_locales))", "response": "Returns an ordered mapping of locale codes to locale instances."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_locales(self, languages=None, locales=None, region=None,\n use_given_order=False, allow_conflicting_locales=False):\n \"\"\"\n Yield locale instances.\n\n :param languages:\n A list of language codes, e.g. ['en', 'es', 'zh-Hant'].\n If locales are not given, languages and region are\n used to construct locales to load.\n :type languages: list\n\n :param locales:\n A list of codes of locales which are to be loaded,\n e.g. ['fr-PF', 'qu-EC', 'af-NA']\n :type locales: list\n\n :param region:\n A region code, e.g. 
'IN', '001', 'NE'.\n If locales are not given, languages and region are\n used to construct locales to load.\n :type region: str|unicode\n\n :param use_given_order:\n If True, the returned mapping is ordered in the order locales are given.\n :type allow_redetect_language: bool\n\n :param allow_conflicting_locales:\n if True, locales with same language and different region can be loaded.\n :type allow_conflicting_locales: bool\n\n :yield: locale instances\n \"\"\"\n for _, locale in self._load_data(\n languages=languages, locales=locales, region=region,\n use_given_order=use_given_order,\n allow_conflicting_locales=allow_conflicting_locales):\n yield locale", "response": "Yields a list of all available locale instances."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef are_tokens_valid(self, tokens):\n match_relative_regex = self._get_match_relative_regex_cache()\n for token in tokens:\n if any([match_relative_regex.match(token),\n token in self, token.isdigit()]):\n continue\n else:\n return False\n else:\n return True", "response": "Check if tokens are valid for the locale."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef split(self, string, keep_formatting=False):\n if not string:\n return string\n\n split_relative_regex = self._get_split_relative_regex_cache()\n match_relative_regex = self._get_match_relative_regex_cache()\n\n tokens = split_relative_regex.split(string)\n\n for i, token in enumerate(tokens):\n if match_relative_regex.match(token):\n tokens[i] = [token]\n continue\n tokens[i] = self._split_by_known_words(token, keep_formatting)\n\n return list(filter(bool, chain(*tokens)))", "response": "Splits the date string using translations in locale info."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsearch for dates in a given string in a natural language.", "response": "def search_dates(self, text, 
languages=None, settings=None):\n \"\"\"\n Find all substrings of the given string which represent date and/or time and parse them.\n\n :param text:\n A string in a natural language which may contain date and/or time expressions.\n :type text: str|unicode\n :param languages:\n A list of two letters language codes.e.g. ['en', 'es']. If languages are given, it will not attempt\n to detect the language.\n :type languages: list\n :param settings:\n Configure customized behavior using settings defined in :mod:`dateparser.conf.Settings`.\n :type settings: dict\n\n :return: a dict mapping keys to two letter language code and a list of tuples of pairs:\n substring representing date expressions and corresponding :mod:`datetime.datetime` object.\n For example:\n {'Language': 'en', 'Dates': [('on 4 October 1957', datetime.datetime(1957, 10, 4, 0, 0))]}\n If language of the string isn't recognised returns:\n {'Language': None, 'Dates': None}\n :raises: ValueError - Unknown Language\n \"\"\"\n\n language_shortname = self.detect_language(text=text, languages=languages)\n if not language_shortname:\n return {'Language': None, 'Dates': None}\n return {'Language': language_shortname, 'Dates': self.search.search_parse(language_shortname, text,\n settings=settings)}"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef parse(date_string, date_formats=None, languages=None, locales=None, region=None, settings=None):\n parser = _default_parser\n\n if any([languages, locales, region, not settings._default]):\n parser = DateDataParser(languages=languages, locales=locales,\n region=region, settings=settings)\n\n data = parser.get_date_data(date_string, date_formats)\n\n if data:\n return data['date_obj']", "response": "Parse a date and time from a given string."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nparse the time part of a string like 1 day ago 2 PM", "response": "def _parse_time(self, date_string, 
settings):\n \"\"\"Attempts to parse time part of date strings like '1 day ago, 2 PM' \"\"\"\n date_string = PATTERN.sub('', date_string)\n date_string = re.sub(r'\\b(?:ago|in)\\b', '', date_string)\n try:\n return time_parser(date_string)\n except:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if the locale is applicable to translate a date string.", "response": "def is_applicable(self, date_string, strip_timezone=False, settings=None):\n \"\"\"\n Check if the locale is applicable to translate date string.\n\n :param date_string:\n A string representing date and/or time in a recognizably valid format.\n :type date_string: str|unicode\n\n :param strip_timezone:\n If True, timezone is stripped from date string.\n :type strip_timezone: bool\n\n :return: boolean value representing if the locale is applicable for the date string or not.\n \"\"\"\n if strip_timezone:\n date_string, _ = pop_tz_offset_from_string(date_string, as_offset=False)\n\n date_string = self._translate_numerals(date_string)\n if settings.NORMALIZE:\n date_string = normalize_unicode(date_string)\n date_string = self._simplify(date_string, settings=settings)\n dictionary = self._get_dictionary(settings)\n date_tokens = dictionary.split(date_string)\n\n return dictionary.are_tokens_valid(date_tokens)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ntranslating the date string to its English equivalent.", "response": "def translate(self, date_string, keep_formatting=False, settings=None):\n \"\"\"\n Translate the date string to its English equivalent.\n\n :param date_string:\n A string representing date and/or time in a recognizably valid format.\n :type date_string: str|unicode\n\n :param keep_formatting:\n If True, retain formatting of the date string after translation.\n :type keep_formatting: bool\n\n :return: translated date string.\n \"\"\"\n date_string = self._translate_numerals(date_string)\n if
settings.NORMALIZE:\n date_string = normalize_unicode(date_string)\n date_string = self._simplify(date_string, settings=settings)\n dictionary = self._get_dictionary(settings)\n date_string_tokens = dictionary.split(date_string, keep_formatting)\n\n relative_translations = self._get_relative_translations(settings=settings)\n\n for i, word in enumerate(date_string_tokens):\n word = word.lower()\n for pattern, replacement in relative_translations.items():\n if pattern.match(word):\n date_string_tokens[i] = pattern.sub(replacement, word)\n else:\n if word in dictionary:\n date_string_tokens[i] = dictionary[word] or ''\n if \"in\" in date_string_tokens:\n date_string_tokens = self._clear_future_words(date_string_tokens)\n\n return self._join(list(filter(bool, date_string_tokens)),\n separator=\"\" if keep_formatting else \" \", settings=settings)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nparses with formats and return a dictionary with period and obj_date.", "response": "def parse_with_formats(date_string, date_formats, settings):\n \"\"\" Parse with formats and return a dictionary with 'period' and 'obj_date'.\n\n :returns: :class:`datetime.datetime`, dict or None\n\n \"\"\"\n period = 'day'\n for date_format in date_formats:\n try:\n date_obj = datetime.strptime(date_string, date_format)\n except ValueError:\n continue\n else:\n # If format does not include the day, use last day of the month\n # instead of first, because the first is usually out of range.\n if '%d' not in date_format:\n period = 'month'\n date_obj = date_obj.replace(\n day=get_last_day_of_month(date_obj.year, date_obj.month))\n\n if not ('%y' in date_format or '%Y' in date_format):\n today = datetime.today()\n date_obj = date_obj.replace(year=today.year)\n\n date_obj = apply_timezone_from_settings(date_obj, settings)\n\n return {'date_obj': date_obj, 'period': period}\n else:\n return {'date_obj': None, 'period': period}"} {"SOURCE": "codesearchnet", 
"instruction": "Implement a Python 3 function for\nparsing a string representing a date and time and return a dictionary mapping keys to datetime objects.", "response": "def get_date_data(self, date_string, date_formats=None):\n \"\"\"\n Parse string representing date and/or time in recognizable localized formats.\n Supports parsing multiple languages and timezones.\n\n :param date_string:\n A string representing date and/or time in a recognizably valid format.\n :type date_string: str|unicode\n :param date_formats:\n A list of format strings using directives as given\n `here `_.\n The parser applies formats one by one, taking into account the detected languages.\n :type date_formats: list\n\n :return: a dict mapping keys to :mod:`datetime.datetime` object and *period*. For example:\n {'date_obj': datetime.datetime(2015, 6, 1, 0, 0), 'period': u'day'}\n\n :raises: ValueError - Unknown Language\n\n .. note:: *Period* values can be a 'day' (default), 'week', 'month', 'year'.\n\n *Period* represents the granularity of date parsed from the given string.\n\n In the example below, since no day information is present, the day is assumed to be current\n day ``16`` from *current date* (which is June 16, 2015, at the moment of writing this).\n Hence, the level of precision is ``month``:\n\n >>> DateDataParser().get_date_data(u'March 2015')\n {'date_obj': datetime.datetime(2015, 3, 16, 0, 0), 'period': u'month'}\n\n Similarly, for date strings with no day and month information present, level of precision\n is ``year`` and day ``16`` and month ``6`` are from *current_date*.\n\n >>> DateDataParser().get_date_data(u'2014')\n {'date_obj': datetime.datetime(2014, 6, 16, 0, 0), 'period': u'year'}\n\n Dates with time zone indications or UTC offsets are returned in UTC time unless\n specified using `Settings`_.\n\n >>> DateDataParser().get_date_data(u'23 March 2000, 1:21 PM CET')\n {'date_obj': datetime.datetime(2000, 3, 23, 14, 21), 'period': 'day'}\n\n \"\"\"\n if 
not(isinstance(date_string, six.text_type) or isinstance(date_string, six.string_types)):\n raise TypeError('Input type must be str or unicode')\n\n if isinstance(date_string, bytes):\n date_string = date_string.decode('utf-8')\n\n res = parse_with_formats(date_string, date_formats or [], self._settings)\n if res['date_obj']:\n return res\n\n date_string = sanitize_date(date_string)\n\n for locale in self._get_applicable_locales(date_string):\n parsed_date = _DateLocaleParser.parse(\n locale, date_string, date_formats, settings=self._settings)\n if parsed_date:\n parsed_date['locale'] = locale.shortname\n if self.try_previous_locales:\n self.previous_locales.insert(0, locale)\n return parsed_date\n else:\n return {'date_obj': None, 'period': 'day', 'locale': None}"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns load plan for the current instance of the class.", "response": "def get_load_plan(self):\n \"\"\"\n return load plan (timestamps generator)\n \"\"\"\n if self.rps_schedule and self.instances_schedule:\n raise StepperConfigurationError(\n 'Both rps and instances schedules specified. 
You must specify only one of them'\n )\n elif self.rps_schedule:\n info.status.publish('loadscheme', self.rps_schedule)\n return lp.create(self.rps_schedule)\n elif self.instances_schedule:\n info.status.publish('loadscheme', self.instances_schedule)\n return ip.create(self.instances_schedule)\n else:\n self.instances_schedule = []\n info.status.publish('loadscheme', self.instances_schedule)\n return ip.create(self.instances_schedule)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_ammo_generator(self):\n af_readers = {\n 'phantom': missile.AmmoFileReader,\n 'slowlog': missile.SlowLogReader,\n 'line': missile.LineReader,\n 'uri': missile.UriReader,\n 'uripost': missile.UriPostReader,\n 'access': missile.AccessLogReader,\n 'caseline': missile.CaseLineReader,\n }\n if self.uris and self.ammo_file:\n raise StepperConfigurationError(\n 'Both uris and ammo file specified. You must specify only one of them'\n )\n elif self.uris:\n ammo_gen = missile.UriStyleGenerator(\n self.uris, self.headers, http_ver=self.http_ver)\n elif self.ammo_file:\n if self.ammo_type in af_readers:\n if self.ammo_type == 'phantom':\n opener = resource.get_opener(self.ammo_file)\n with opener(self.use_cache) as ammo:\n try:\n if not ammo.next()[0].isdigit():\n self.ammo_type = 'uri'\n self.log.info(\n \"Setting ammo_type 'uri' because ammo is not started with digit and you did not specify ammo format\"\n )\n else:\n self.log.info(\n \"Default ammo type ('phantom') used, use 'phantom.ammo_type' option to override it\"\n )\n except StopIteration:\n self.log.exception(\n \"Couldn't read first line of ammo file\")\n raise AmmoFileError(\n \"Couldn't read first line of ammo file\")\n else:\n raise NotImplementedError(\n 'No such ammo type implemented: \"%s\"' % self.ammo_type)\n ammo_gen = af_readers[self.ammo_type](\n self.ammo_file, headers=self.headers, http_ver=self.http_ver, use_cache=self.use_cache)\n else:\n raise 
StepperConfigurationError(\n 'Ammo not found. Specify uris or ammo file')\n self.log.info(\"Using %s ammo reader\" % type(ammo_gen).__name__)\n return ammo_gen", "response": "Returns an ammo generator for the current state of the current object."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _exc_to_net(param1, success):\n if len(param1) <= 3:\n # FIXME: we're unable to use better logic here, because we should support non-http codes\n # but, we should look for core.util.HTTP or some other common logic\n # here\n if success:\n return 0\n else:\n return 314\n\n exc = param1.split(' ')[-1]\n if exc in KNOWN_EXC.keys():\n return KNOWN_EXC[exc]\n else:\n logger.warning(\n \"Unknown Java exception, consider adding it to dictionary: %s\",\n param1)\n return 41", "response": "Translate Java exception to net code."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _exc_to_http(param1):\n if len(param1) <= 3:\n try:\n int(param1)\n except BaseException:\n logger.error(\n \"JMeter wrote some strange data into codes column: %s\", param1)\n else:\n return int(param1)\n\n exc = param1.split(' ')[-1]\n if exc in KNOWN_EXC.keys():\n return 0\n else:\n logger.warning(\"Unknown Java exception. 
%s\", param1)\n return 0", "response": "translate exception str to http code"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreading phantom tool specific options.", "response": "def read_config(self):\n \"\"\" Read phantom tool specific options \"\"\"\n self.threads = self.cfg[\"threads\"] or str(int(multiprocessing.cpu_count() / 2) + 1)\n self.phantom_modules_path = self.cfg[\"phantom_modules_path\"]\n self.additional_libs = ' '.join(self.cfg[\"additional_libs\"])\n self.answ_log_level = self.cfg[\"writelog\"]\n if self.answ_log_level.lower() in ['0', 'false']:\n self.answ_log_level = 'none'\n elif self.answ_log_level.lower() in ['1', 'true']:\n self.answ_log_level = 'all'\n self.timeout = parse_duration(self.cfg[\"timeout\"])\n if self.timeout > 120000:\n logger.warning(\n \"You've set timeout over 2 minutes.\"\n \" Are you a functional tester?\")\n self.answ_log = self.core.mkstemp(\".log\", \"answ_\")\n self.core.add_artifact_file(self.answ_log)\n self.core.add_artifact_file(self.phout_file)\n self.core.add_artifact_file(self.stat_log)\n self.phantom_log = self.core.mkstemp(\".log\", \"phantom_\")\n self.core.add_artifact_file(self.phantom_log)\n\n main_stream = StreamConfig(\n self.core,\n len(self.streams), self.phout_file, self.answ_log,\n self.answ_log_level, self.timeout, self.cfg, True)\n self.streams.append(main_stream)\n\n for section in self.multi():\n self.streams.append(\n StreamConfig(\n self.core,\n len(self.streams), self.phout_file, self.answ_log,\n self.answ_log_level, self.timeout, section))\n\n for stream in self.streams:\n stream.read_config()\n\n if any(stream.ssl for stream in self.streams):\n self.additional_libs += ' ssl io_benchmark_method_stream_transport_ssl'"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ngenerate phantom config for all streams and create a new file with the same name.", "response": "def compose_config(self):\n \"\"\" Generate phantom tool run config 
\"\"\"\n streams_config = ''\n stat_benchmarks = ''\n for stream in self.streams:\n streams_config += stream.compose_config()\n if not stream.is_main:\n stat_benchmarks += \" \" + \"benchmark_io%s\" % stream.sequence_no\n\n kwargs = {}\n kwargs['threads'] = self.threads\n kwargs['phantom_log'] = self.phantom_log\n kwargs['stat_log'] = self.stat_log\n kwargs['benchmarks_block'] = streams_config\n kwargs['stat_benchmarks'] = stat_benchmarks\n kwargs['additional_libs'] = self.additional_libs\n kwargs['phantom_modules_path'] = self.phantom_modules_path\n filename = self.core.mkstemp(\".conf\", \"phantom_\")\n self.core.add_artifact_file(filename)\n logger.debug(\"Generating phantom config: %s\", filename)\n template_str = resource_string(__name__, \"config/phantom.conf.tpl\")\n tpl = string.Template(template_str)\n config = tpl.substitute(kwargs)\n\n with open(filename, 'w') as conffile:\n conffile.write(config)\n return filename"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets merged info about phantom conf", "response": "def get_info(self):\n \"\"\" get merged info about phantom conf \"\"\"\n result = copy.copy(self.streams[0])\n result.stat_log = self.stat_log\n result.steps = []\n result.ammo_file = ''\n result.rps_schedule = None\n result.ammo_count = 0\n result.duration = 0\n\n result.instances = 0\n result.loadscheme = []\n result.loop_count = 0\n\n for stream in self.streams:\n sec_no = 0\n logger.debug(\"Steps: %s\", stream.stepper_wrapper.steps)\n for item in stream.stepper_wrapper.steps:\n for x in range(0, item[1]):\n if len(result.steps) > sec_no:\n result.steps[sec_no][0] += item[0]\n else:\n result.steps.append([item[0], 1])\n sec_no += 1\n\n if result.rps_schedule:\n result.rps_schedule = []\n else:\n result.rps_schedule = stream.stepper_wrapper.loadscheme\n if result.loadscheme:\n result.loadscheme = ''\n else:\n # FIXME: add formatted load scheme for server:\n # \n # as a string\n result.loadscheme = 
''\n\n if result.loop_count:\n result.loop_count = u'0'\n else:\n result.loop_count = stream.stepper_wrapper.loop_count\n\n result.ammo_file += '{} '.format(stream.stepper_wrapper.ammo_file)\n result.ammo_count += stream.stepper_wrapper.ammo_count\n result.duration = max(\n result.duration, stream.stepper_wrapper.duration)\n result.instances += stream.instances\n\n if not result.ammo_count:\n raise ValueError(\"Total ammo count cannot be zero\")\n return result"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef read_config(self):\n # multi-options\n self.ssl = self.get_option(\"ssl\")\n self.tank_type = self.get_option(\"tank_type\")\n # TODO: refactor. Maybe we should decide how to interact with\n # StepperWrapper here.\n # self.instances = self.get_option('instances')\n self.gatling = ' '.join(self.get_option('gatling_ip').split(\"\\n\"))\n self.method_prefix = self.get_option(\"method_prefix\")\n self.method_options = self.get_option(\"method_options\")\n self.source_log_prefix = self.get_option(\"source_log_prefix\")\n\n self.phantom_http_line = self.get_option(\"phantom_http_line\")\n self.phantom_http_field_num = self.get_option(\"phantom_http_field_num\")\n self.phantom_http_field = self.get_option(\"phantom_http_field\")\n self.phantom_http_entity = self.get_option(\"phantom_http_entity\")\n\n self.address = self.get_option('address')\n do_test_connect = self.get_option(\"connection_test\")\n explicit_port = self.get_option('port', '')\n self.ipv6, self.resolved_ip, self.port, self.address = self.address_wizard.resolve(\n self.address, do_test_connect, explicit_port)\n\n logger.info(\n \"Resolved %s into %s:%s\", self.address, self.resolved_ip, self.port)\n\n self.client_cipher_suites = self.get_option(\"client_cipher_suites\", \"\")\n self.client_certificate = self.get_option(\"client_certificate\", \"\")\n self.client_key = self.get_option(\"client_key\", \"\")\n 
self.stepper_wrapper.read_config()", "response": "reads config from options and returns a new instance of the class"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef patch_config(self, config):\n # get expvar parameters\n if config.get(\"monitoring\"):\n if config[\"monitoring\"].get(\"expvar\"):\n self.expvar = config[\"monitoring\"][\"expvar\"].get(\"enabled\")\n if config[\"monitoring\"][\"expvar\"].get(\"port\"):\n self.expvar_port = config[\"monitoring\"][\"expvar\"].get(\"port\")\n else:\n self.expvar_port = self.DEFAULT_EXPVAR_PORT\n # or set if expvar not exists\n else:\n config[\"monitoring\"] = {\n \"expvar\": {\n \"enabled\": True,\n }\n }\n self.expvar = True\n self.expvar_port = self.DEFAULT_EXPVAR_PORT\n\n # FIXME this is broken for custom ammo providers due to interface incompatibility\n # FIXME refactor pandora plx\n for pool in config['pools']:\n if pool.get('ammo', {}).get('file', ''):\n self.ammofile = pool['ammo']['file']\n pool['ammo']['file'] = resource_manager.resource_filename(\n self.ammofile\n )\n if not pool.get('result') or 'phout' not in pool.get('result', {}).get('type', ''):\n logger.warning('Seems like pandora result file not specified... 
adding defaults')\n pool['result'] = dict(\n destination=self.DEFAULT_REPORT_FILE,\n type='phout',\n )\n return config", "response": "patch config with new values"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nvalidate the duration of a load.", "response": "def validate_duration(self, field, duration):\n '''\n 2h\n 2h5m\n 5m\n 180\n 1h4m3\n :param duration:\n :return:\n '''\n DURATION_RE = r'^(\\d+d)?(\\d+h)?(\\d+m)?(\\d+s?)?$'\n if not re.match(DURATION_RE, duration):\n self._error(field, 'Load duration examples: 2h30m; 5m15; 180')"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef __parse_enabled_plugins(self):\n return [\n (\n plugin_name,\n plugin['package'],\n plugin) for plugin_name,\n plugin in self.raw_config_dict.items() if (\n plugin_name not in self.BASE_SCHEMA.keys()) and isinstance(\n plugin,\n dict) and plugin.get('enabled')]", "response": ":returns: [(plugin_name, plugin_package, plugin_config), ...]\n :rtype: list of tuple"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef plugins(self):\n if not self._plugins:\n self._plugins = [\n (plugin_name,\n plugin_cfg['package'],\n plugin_cfg) for plugin_name, plugin_cfg in self.validated.items() if (\n plugin_name not in self.base_schema.keys()) and plugin_cfg['enabled']]\n return self._plugins", "response": "returns a list of tuples containing the name package and config of all the plugins that are not in the base schema"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef log_stdout_stderr(log, stdout, stderr, comment=\"\"):\n readable = select.select([stdout], [], [], 0)[0]\n if stderr:\n exceptional = select.select([stderr], [], [], 0)[0]\n else:\n exceptional = []\n\n log.debug(\"Selected: %s, %s\", readable, exceptional)\n\n for handle in readable:\n line = handle.read()\n readable.remove(handle)\n if line:\n log.debug(\"%s 
stdout: %s\", comment, line.strip())\n\n for handle in exceptional:\n line = handle.read()\n exceptional.remove(handle)\n if line:\n log.warn(\"%s stderr: %s\", comment, line.strip())", "response": "This function polls stdout and stderr streams and writes their contents to log\n "} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef pid_exists(pid):\n if pid < 0:\n return False\n try:\n os.kill(pid, 0)\n except OSError as exc:\n logging.debug(\"No process[%s]: %s\", exc.errno, exc)\n return exc.errno == errno.EPERM\n else:\n p = psutil.Process(pid)\n return p.status != psutil.STATUS_ZOMBIE", "response": "Check whether a process with the given pid exists in the current process table."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef splitstring(string):\n patt = re.compile(r'\"[\\w ]+\"')\n if patt.search(string):\n quoted_item = patt.search(string).group()\n newstring = patt.sub('', string)\n return newstring.split() + [quoted_item]\n else:\n return string.split()", "response": ">>> splitstring - Split string into list of items"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nread from the file and returns the line and the stop position.", "response": "def read_with_lock(self, pos, _len=None):\n \"\"\"\n Reads {_len} characters if _len is not None else reads line\n :param pos: start reading position\n :param _len: number of characters to read\n :rtype: (string, int)\n \"\"\"\n self.wait_lock()\n try:\n self._opened_file.seek(pos)\n result = self._opened_file.read(_len) if _len is not None else self._opened_file.readline()\n stop_pos = self._opened_file.tell()\n finally:\n self.unlock()\n if not result and self.stop.is_set():\n result = None\n return result, stop_pos"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nread the configuration of the current 
stepper class", "response": "def read_config(self):\n ''' stepper part of reading options '''\n self.log.info(\"Configuring StepperWrapper...\")\n self.ammo_file = self.get_option(self.OPTION_AMMOFILE)\n self.ammo_type = self.get_option('ammo_type')\n if self.ammo_file:\n self.ammo_file = os.path.expanduser(self.ammo_file)\n self.loop_limit = self.get_option(self.OPTION_LOOP)\n self.ammo_limit = self.get_option(\"ammo_limit\")\n\n self.load_profile = LoadProfile(**self.get_option('load_profile'))\n\n self.instances = int(\n self.get_option(self.OPTION_INSTANCES_LIMIT, '1000'))\n self.uris = self.get_option(\"uris\", [])\n while '' in self.uris:\n self.uris.remove('')\n self.headers = self.get_option(\"headers\")\n self.http_ver = self.get_option(\"header_http\")\n self.autocases = self.get_option(\"autocases\")\n self.enum_ammo = self.get_option(\"enum_ammo\")\n self.use_caching = self.get_option(\"use_caching\")\n\n self.file_cache = self.get_option('file_cache')\n cache_dir = self.get_option(\"cache_dir\") or self.core.artifacts_base_dir\n self.cache_dir = os.path.expanduser(cache_dir)\n self.force_stepping = self.get_option(\"force_stepping\")\n if self.get_option(self.OPTION_LOAD)[self.OPTION_LOAD_TYPE] == 'stpd_file':\n self.stpd = self.get_option(self.OPTION_LOAD)[self.OPTION_SCHEDULE]\n\n self.chosen_cases = self.get_option(\"chosen_cases\").split()\n if self.chosen_cases:\n self.log.info(\"chosen_cases LIMITS: %s\", self.chosen_cases)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef prepare_stepper(self):\n ''' Generate test data if necessary '''\n\n def publish_info(stepper_info):\n info.status.publish('loadscheme', stepper_info.loadscheme)\n info.status.publish('loop_count', stepper_info.loop_count)\n info.status.publish('steps', stepper_info.steps)\n info.status.publish('duration', stepper_info.duration)\n info.status.ammo_count = stepper_info.ammo_count\n info.status.publish('instances', 
stepper_info.instances)\n self.core.publish('stepper', 'loadscheme', stepper_info.loadscheme)\n self.core.publish('stepper', 'loop_count', stepper_info.loop_count)\n self.core.publish('stepper', 'steps', stepper_info.steps)\n self.core.publish('stepper', 'duration', stepper_info.duration)\n self.core.publish('stepper', 'ammo_count', stepper_info.ammo_count)\n self.core.publish('stepper', 'instances', stepper_info.instances)\n return stepper_info\n\n if not self.stpd:\n self.stpd = self.__get_stpd_filename()\n if self.use_caching and not self.force_stepping and os.path.exists(\n self.stpd) and os.path.exists(self.__si_filename()):\n self.log.info(\"Using cached stpd-file: %s\", self.stpd)\n stepper_info = self.__read_cached_options()\n if self.instances and self.load_profile.is_rps():\n self.log.info(\n \"rps_schedule is set. Overriding cached instances param from config: %s\",\n self.instances)\n stepper_info = stepper_info._replace(\n instances=self.instances)\n publish_info(stepper_info)\n else:\n if (\n self.force_stepping and os.path.exists(self.__si_filename())):\n os.remove(self.__si_filename())\n self.__make_stpd_file()\n stepper_info = info.status.get_info()\n self.__write_cached_options(stepper_info)\n else:\n self.log.info(\"Using specified stpd-file: %s\", self.stpd)\n stepper_info = publish_info(self.__read_cached_options())\n self.ammo_count = stepper_info.ammo_count\n self.duration = stepper_info.duration\n self.loop_count = stepper_info.loop_count\n self.loadscheme = stepper_info.loadscheme\n self.steps = stepper_info.steps\n if stepper_info.instances:\n self.instances = stepper_info.instances", "response": "Prepare the stepper file for the current instance."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchooses the name for stepped data file", "response": "def __get_stpd_filename(self):\n ''' Choose the name for stepped data file '''\n if self.use_caching:\n sep = \"|\"\n hasher = hashlib.md5()\n 
hashed_str = \"cache version 6\" + sep + \\\n ';'.join(self.load_profile.schedule) + sep + str(self.loop_limit)\n hashed_str += sep + str(self.ammo_limit) + sep + ';'.join(\n self.load_profile.schedule) + sep + str(self.autocases)\n hashed_str += sep + \";\".join(self.uris) + sep + \";\".join(\n self.headers) + sep + self.http_ver + sep + \";\".join(\n self.chosen_cases)\n hashed_str += sep + str(self.enum_ammo) + sep + str(self.ammo_type)\n if self.load_profile.is_instances():\n hashed_str += sep + str(self.instances)\n if self.ammo_file:\n opener = resource.get_opener(self.ammo_file)\n hashed_str += sep + opener.hash\n else:\n if not self.uris:\n raise RuntimeError(\"Neither ammofile nor uris specified\")\n hashed_str += sep + \\\n ';'.join(self.uris) + sep + ';'.join(self.headers)\n self.log.debug(\"stpd-hash source: %s\", hashed_str)\n hasher.update(hashed_str.encode('utf8'))\n if not os.path.exists(self.cache_dir):\n os.makedirs(self.cache_dir)\n stpd = self.cache_dir + '/' + \\\n os.path.basename(self.ammo_file) + \\\n \"_\" + hasher.hexdigest() + \".stpd\"\n else:\n stpd = os.path.realpath(\"ammo.stpd\")\n self.log.debug(\"Generated cache file name: %s\", stpd)\n return stpd"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreads the stepper info from json and return it as a StepperInfo object", "response": "def __read_cached_options(self):\n '''\n Read stepper info from json\n '''\n self.log.debug(\"Reading cached stepper info: %s\", self.__si_filename())\n with open(self.__si_filename(), 'r') as si_file:\n si = info.StepperInfo(**json.load(si_file))\n return si"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef __write_cached_options(self, si):\n '''\n Write stepper info to json\n '''\n self.log.debug(\"Saving stepper info: %s\", self.__si_filename())\n with open(self.__si_filename(), 'w') as si_file:\n json.dump(si._asdict(), si_file, indent=4)", 
"response": "Write the stepper info to json"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef __make_stpd_file(self):\n ''' stpd generation using Stepper class '''\n self.log.info(\"Making stpd-file: %s\", self.stpd)\n stepper = Stepper(\n self.core,\n rps_schedule=self.load_profile.schedule if self.load_profile.is_rps() else None,\n http_ver=self.http_ver,\n ammo_file=self.ammo_file,\n instances_schedule=self.load_profile.schedule if self.load_profile.is_instances() else None,\n instances=self.instances,\n loop_limit=self.loop_limit,\n ammo_limit=self.ammo_limit,\n uris=self.uris,\n headers=[header.strip('[]') for header in self.headers],\n autocases=self.autocases,\n enum_ammo=self.enum_ammo,\n ammo_type=self.ammo_type,\n chosen_cases=self.chosen_cases,\n use_cache=self.use_caching)\n with open(self.stpd, 'w', self.file_cache) as os:\n stepper.write(os)", "response": "Make the TPTD file using the current state of the current instance."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef create(rps_schedule):\n if len(rps_schedule) > 1:\n lp = Composite(\n [StepFactory.produce(step_config) for step_config in rps_schedule])\n else:\n lp = StepFactory.produce(rps_schedule[0])\n info.status.publish('duration', lp.get_duration() / 1000)\n info.status.publish('steps', lp.get_rps_list())\n info.status.lp_len = len(lp)\n return lp", "response": "Create Load Plan as defined in schedule."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns the time at which the n - th entry is shooted", "response": "def ts(self, n):\n \"\"\"\n :param n: number of charge\n :return: when to shoot nth charge, milliseconds\n \"\"\"\n try:\n root1, root2 = solve_quadratic(self.slope / 2.0, self.minrps, -n)\n except ZeroDivisionError:\n root2 = float(n) / self.minrps\n return int(root2 * 1000)"} {"SOURCE": "codesearchnet", "instruction": 
"Make a summary of the following Python 3 code\ndef rps_at(self, t):\n '''Return rps for second t'''\n if 0 <= t <= self.duration:\n return self.minrps + \\\n float(self.maxrps - self.minrps) * t / self.duration\n else:\n return 0", "response": "Return rps for second t"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_float_rps_list(self):\n '''\n get list of constant load parts (we have no constant load at all, but tank will think so),\n with parts durations (float)\n '''\n int_rps = range(int(self.minrps), int(self.maxrps) + 1)\n step_duration = float(self.duration) / len(int_rps)\n rps_list = [(rps, int(step_duration)) for rps in int_rps]\n return rps_list", "response": "get list of constant load parts with parts durations"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting list of each second s rps", "response": "def get_rps_list(self):\n \"\"\"\n get list of each second's rps\n :returns: list of tuples (rps, duration of corresponding rps in seconds)\n :rtype: list\n \"\"\"\n seconds = range(0, int(self.duration) + 1)\n rps_groups = groupby([proper_round(self.rps_at(t)) for t in seconds],\n lambda x: x)\n rps_list = [(rps, len(list(rpl))) for rps, rpl in rps_groups]\n return rps_list"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nexecuting and check exit code", "response": "def execute(self, cmd):\n \"\"\"\n Execute and check exit code\n \"\"\"\n self.log.info(\"Executing: %s\", cmd)\n retcode = execute(\n cmd, shell=True, poll_period=0.1, catch_out=self.catch_out)[0]\n if retcode:\n raise RuntimeError(\"Subprocess returned %s\" % retcode)\n return retcode"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef __make_points_for_label(self, ts, data, label, prefix, gun_stats):\n label_points = list()\n\n label_points.extend(\n (\n # overall quantiles for label\n self.__make_points(\n prefix + 
\"overall_quantiles\",\n {\"label\": label},\n ts,\n self.__make_quantile_fields(data)\n ),\n # overall meta (gun status) for label\n self.__make_points(\n prefix + \"overall_meta\",\n {\"label\": label},\n ts,\n self.__make_overall_meta_fields(data, gun_stats)\n ),\n # net codes for label\n self.__make_points(\n prefix + \"net_codes\",\n {\"label\": label},\n ts,\n self.__make_netcodes_fields(data)\n ),\n # proto codes for label\n self.__make_points(\n prefix + \"proto_codes\",\n {\"label\": label},\n ts,\n self.__make_protocodes_fields(data)\n )\n )\n )\n # histograms, one row for each bin\n if self.histograms:\n for bin_, count in zip(data[\"interval_real\"][\"hist\"][\"bins\"],\n data[\"interval_real\"][\"hist\"][\"data\"]):\n label_points.append(\n self.__make_points(\n prefix + \"histograms\",\n {\"label\": label},\n ts,\n {\"bin\": bin_, \"count\": count}\n )\n )\n return label_points", "response": "x\n Make a set of points for a given label"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating dict of points for InfluxDB client", "response": "def __make_points(self, measurement, additional_tags, ts, fields):\n \"\"\"\n Parameters\n ----------\n measurement : string\n measurement type (e.g. 
monitoring, overall_meta, net_codes, proto_codes, overall_quantiles)\n additional_tags : dict\n custom additional tags for this points\n ts : integer\n timestamp\n fields : dict\n influxdb columns\n\n Returns\n -------\n dict\n points for InfluxDB client\n \"\"\"\n tags = self.tags.copy()\n tags.update(additional_tags)\n return {\n \"measurement\": measurement,\n \"tags\": tags,\n \"time\": int(ts),\n \"fields\": fields,\n }"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\npublish value to status", "response": "def publish(self, key, value):\n \"\"\"publish value to status\"\"\"\n self.log.debug(\n \"Publishing status: %s/%s: %s\", self.__class__.__name__, key, value)\n self.core.publish(self.__class__.__name__, key, value)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nstopping the worker threads.", "response": "def stop(self):\n \"\"\"\n Say the workers to finish their jobs and quit.\n \"\"\"\n self.quit.set()\n # yapf:disable\n while sorted([\n self.pool[i].is_alive()\n for i in xrange(len(self.pool))])[-1]:\n time.sleep(1)\n # yapf:enable\n try:\n while not self.task_queue.empty():\n self.task_queue.get(timeout=0.1)\n self.task_queue.close()\n self.feeder.join()\n except Exception as ex:\n logger.info(ex)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfeeds the current thread into the queue.", "response": "def _feed(self):\n \"\"\"\n A feeder that runs in distinct thread in main process.\n \"\"\"\n self.plan = StpdReader(self.stpd_filename)\n if self.cached_stpd:\n self.plan = list(self.plan)\n for task in self.plan:\n if self.quit.is_set():\n logger.info(\"Stop feeding: gonna quit\")\n return\n # try putting a task to a queue unless there is a quit flag\n # or all workers have exited\n while True:\n try:\n self.task_queue.put(task, timeout=1)\n break\n except Full:\n if self.quit.is_set() or self.workers_finished:\n return\n else:\n 
continue\n workers_count = self.instances\n logger.info(\n \"Feeded all data. Publishing %d killer tasks\" % (workers_count))\n retry_delay = 1\n for _ in range(5):\n try:\n [\n self.task_queue.put(None, timeout=1)\n for _ in xrange(0, workers_count)\n ]\n break\n except Full:\n logger.debug(\n \"Couldn't post killer tasks\"\n \" because queue is full. Retrying in %ss\", retry_delay)\n time.sleep(retry_delay)\n retry_delay *= 2\n\n try:\n logger.info(\"Waiting for workers\")\n map(lambda x: x.join(), self.pool)\n logger.info(\"All workers exited.\")\n self.workers_finished = True\n except (KeyboardInterrupt, SystemExit):\n self.task_queue.close()\n self.results.close()\n self.quit.set()\n logger.info(\"Going to quit. Waiting for workers\")\n map(lambda x: x.join(), self.pool)\n self.workers_finished = True"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef init_logging(self, log_filename=\"tank.log\"):\n logger = logging.getLogger('')\n self.log_filename = log_filename\n self.core.add_artifact_file(self.log_filename)\n\n file_handler = logging.FileHandler(self.log_filename)\n file_handler.setLevel(logging.DEBUG)\n file_handler.setFormatter(\n logging.Formatter(\n \"%(asctime)s [%(levelname)s] %(name)s %(message)s\"))\n logger.addHandler(file_handler)\n console_handler = logging.StreamHandler(sys.stdout)\n stderr_hdl = logging.StreamHandler(sys.stderr)\n\n # fmt_verbose = logging.Formatter(\n # \"%(asctime)s [%(levelname)s] %(name)s %(message)s\")\n fmt_regular = logging.Formatter(\n \"%(asctime)s %(levelname)s: %(message)s\", \"%H:%M:%S\")\n\n console_handler.setLevel(logging.INFO)\n console_handler.setFormatter(fmt_regular)\n stderr_hdl.setFormatter(fmt_regular)\n\n f_err = SingleLevelFilter(logging.ERROR, True)\n f_warn = SingleLevelFilter(logging.WARNING, True)\n f_crit = SingleLevelFilter(logging.CRITICAL, True)\n console_handler.addFilter(f_err)\n console_handler.addFilter(f_warn)\n 
console_handler.addFilter(f_crit)\n logger.addHandler(console_handler)\n\n f_info = SingleLevelFilter(logging.INFO, True)\n f_debug = SingleLevelFilter(logging.DEBUG, True)\n stderr_hdl.addFilter(f_info)\n stderr_hdl.addFilter(f_debug)\n logger.addHandler(stderr_hdl)", "response": "Initialize the logging for this instance."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __add_user_options(self):\n if self.options.get('user_options', None):\n self.core.apply_shorthand_options(self.options['user_options'])", "response": "override config options with user specified options"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_default_configs(self):\n # initialize basic defaults\n configs = [resource_filename(__name__, 'config/00-base.ini')]\n try:\n conf_files = sorted(os.listdir(self.baseconfigs_location))\n for filename in conf_files:\n if fnmatch.fnmatch(filename, '*.ini'):\n configs += [\n os.path.realpath(\n self.baseconfigs_location + os.sep + filename)\n ]\n except OSError:\n self.log.warn(\n self.baseconfigs_location + ' is not accessible to get configs list')\n\n configs += [os.path.expanduser('~/.yandex-tank')]\n return configs", "response": "returns default configs list from the base configs location and package_data"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncollect data from the queue and cache it and send to listeners", "response": "def _collect_data(self, end=False):\n \"\"\"\n Collect data, cache it and send to listeners\n \"\"\"\n data = get_nowait_from_queue(self.results)\n stats = get_nowait_from_queue(self.stats_results)\n logger.debug(\"Data timestamps: %s\" % [d.get('ts') for d in data])\n logger.debug(\"Stats timestamps: %s\" % [d.get('ts') for d in stats])\n for item in data:\n ts = item['ts']\n if ts in self.stat_cache:\n # send items\n data_item = item\n stat_item = self.stat_cache.pop(ts)\n 
self.__notify_listeners(data_item, stat_item)\n else:\n self.data_cache[ts] = item\n for item in stats:\n ts = item['ts']\n if ts in self.data_cache:\n # send items\n data_item = self.data_cache.pop(ts)\n stat_item = item\n self.__notify_listeners(data_item, stat_item)\n else:\n self.stat_cache[ts] = item\n if end and len(self.data_cache) > 0:\n logger.info('Timestamps without stats:')\n for ts, data_item in sorted(self.data_cache.items(), key=lambda i: i[0]):\n logger.info(ts)\n self.__notify_listeners(data_item, StatsReader.stats_item(ts, 0, 0))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef __notify_listeners(self, data, stats):\n for listener in self.listeners:\n listener.on_aggregated_data(data, stats)", "response": "notify all listeners about aggregate data and stats"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning a marker function of the requested marker_type", "response": "def get_marker(marker_type, enum_ammo=False):\n '''\n Returns a marker function of the requested marker_type\n\n >>> marker = get_marker('uniq')(__test_missile)\n >>> type(marker)\n \n >>> len(marker)\n 32\n\n >>> get_marker('uri')(__test_missile)\n '_example_search_hello_help_us'\n\n >>> marker = get_marker('non-existent')(__test_missile)\n Traceback (most recent call last):\n ...\n NotImplementedError: No such marker: \"non-existent\"\n\n >>> get_marker('3')(__test_missile)\n '_example_search_hello'\n\n >>> marker = get_marker('3', True)\n >>> marker(__test_missile)\n '_example_search_hello#0'\n >>> marker(__test_missile)\n '_example_search_hello#1'\n '''\n try:\n limit = int(marker_type)\n if limit:\n marker = __UriMarker(limit)\n else:\n\n def marker(m):\n return ''\n except ValueError:\n if marker_type in __markers:\n marker = __markers[marker_type]\n else:\n raise NotImplementedError('No such marker: \"%s\"' % marker_type)\n\n # todo: fix u'False'\n if enum_ammo:\n 
marker = __Enumerator(marker)\n return marker"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a function that uploads the data to the server.", "response": "def get_uploader(data_session, column_mapping, overall_only=False):\n \"\"\"\n :type column_mapping: dict\n :type data_session: DataSession\n \"\"\"\n overall = {col_name: data_session.new_aggregated_metric(name + ' overall')\n for col_name, name in column_mapping.items()}\n\n def upload_df(df):\n for col_name, metric in overall.items():\n df['value'] = df[col_name]\n metric.put(df)\n return upload_df"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nloading all YAML files in a folder.", "response": "def cfg_folder_loader(path):\n \"\"\"\n :type path: str\n \"\"\"\n CFG_WILDCARD = '*.yaml'\n return [load_cfg(filename) for filename in sorted(glob.glob(os.path.join(path, CFG_WILDCARD)))]"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef parse_options(options):\n if options is None:\n return []\n else:\n return [\n convert_single_option(key.strip(), value.strip())\n for key, value\n in [option.split('=', 1) for option in options]\n ]", "response": "Parse options into a list of dicts."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns default configs list from config files from the base directory", "response": "def get_default_configs():\n \"\"\" returns default configs list, from /etc and home dir \"\"\"\n # initialize basic defaults\n configs = [resource_filename(__name__, 'config/00-base.ini')]\n baseconfigs_location = '/etc/yandex-tank'\n try:\n conf_files = sorted(os.listdir(baseconfigs_location))\n for filename in conf_files:\n if fnmatch.fnmatch(filename, '*.ini'):\n configs += [\n os.path.realpath(\n baseconfigs_location + os.sep + filename)\n ]\n except OSError:\n logger.info(\n baseconfigs_location + ' is not 
accessible to get configs list')\n\n configs += [os.path.expanduser('~/.yandex-tank')]\n return configs"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef clean_markup(self, orig_str):\n ''' clean markup from string '''\n for val in [\n self.YELLOW, self.RED, self.RESET, self.CYAN, self.BG_MAGENTA,\n self.WHITE, self.BG_GREEN, self.GREEN, self.BG_BROWN,\n self.RED_DARK, self.MAGENTA, self.BG_CYAN\n ]:\n orig_str = orig_str.replace(val, '')\n return orig_str", "response": "clean markup from string"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nparsing a duration string such as '3h2m3s' into milliseconds", "response": "def parse_duration(duration):\n '''\n Parse duration string, such as '3h2m3s' into milliseconds\n\n >>> parse_duration('3h2m3s')\n 10923000\n\n >>> parse_duration('0.3s')\n 300\n\n >>> parse_duration('5')\n 5000\n '''\n _re_token = re.compile(\"([0-9.]+)([dhms]?)\")\n\n def parse_token(time, multiplier):\n multipliers = {\n 'd': 86400,\n 'h': 3600,\n 'm': 60,\n 's': 1,\n }\n if multiplier:\n if multiplier in multipliers:\n return int(float(time) * multipliers[multiplier] * 1000)\n else:\n raise StepperConfigurationError(\n 'Failed to parse duration: %s' % duration)\n else:\n return int(float(time) * 1000)\n\n return sum(parse_token(*token) for token in _re_token.findall(duration))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef solve_quadratic(a, b, c):\n '''\n >>> solve_quadratic(1.0, 2.0, 1.0)\n (-1.0, -1.0)\n '''\n discRoot = math.sqrt((b * b) - 4 * a * c)\n root1 = (-b - discRoot) / (2 * a)\n root2 = (-b + discRoot) 
/ (2 * a)\n return (root1, root2)", "response": "Solves a quadratic equation and returns both roots."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nrounds a float to the closest int", "response": "def proper_round(n):\n \"\"\"\n rounds float to closest int\n :rtype: int\n :param n: float\n \"\"\"\n return int(n) + (n / abs(n)) * int(abs(n - int(n)) >= 0.5) if n != 0 else 0"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef uninstall(self):\n if self.session:\n logger.info('Waiting monitoring data...')\n self.session.terminate()\n self.session.wait()\n self.session = None\n log_filename = \"agent_{host}.log\".format(host=\"localhost\")\n data_filename = \"agent_{host}.rawdata\".format(host=\"localhost\")\n try:\n logger.info('Saving monitoring artefacts from localhost')\n copyfile(self.workdir + \"/_agent.log\", log_filename)\n copyfile(self.workdir + \"/monitoring.rawdata\", data_filename)\n logger.info('Deleting temp directory: %s', self.workdir)\n rmtree(self.workdir)\n except Exception:\n logger.error(\"Exception while uninstalling agent\", exc_info=True)\n\n logger.info(\"Removing agent from: localhost\")\n return log_filename, data_filename", "response": "Uninstalls the agent from the local host."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef install(self):\n logger.info(\n \"Installing monitoring agent at %s@%s...\",\n self.username,\n self.host)\n\n # create remote temp dir\n cmd = self.python + ' -c \"import tempfile; print tempfile.mkdtemp();\"'\n logger.info(\"Creating temp dir on %s\", self.host)\n try:\n out, errors, err_code = self.ssh.execute(cmd)\n except Exception:\n logger.error(\n \"Failed to install monitoring agent to %s\",\n self.host,\n exc_info=True)\n return None, None, None\n if errors:\n logger.error(\"[%s] error: '%s'\", self.host, errors)\n logger.error(\"Cancelling 
agent installation on %s\", self.host)\n return None, None, None\n\n if err_code:\n logger.error(\n \"Failed to create remote dir via SSH at %s@%s, code %s: %s\" %\n (self.username, self.host, err_code, out.strip()))\n return None, None, None\n\n remote_dir = out.strip()\n if remote_dir:\n self.path['AGENT_REMOTE_FOLDER'] = remote_dir\n self.agent_remote_folder = remote_dir\n logger.debug(\n \"Remote dir at %s:%s\", self.host, self.path['AGENT_REMOTE_FOLDER'])\n\n # create collector config\n agent_config = self.config.create_collector_config(\n self.path['AGENT_REMOTE_FOLDER'])\n startup_config = self.config.create_startup_config()\n customs_script = self.config.create_custom_exec_script()\n\n # trying to detect os version/architecture and get information about telegraf client\n # DO NOT DELETE indices in string format below. Python 2.6 does not\n # support string formatting without indices\n remote_cmd = 'import os; print os.path.isfile(\"' + self.path[\n 'TELEGRAF_REMOTE_PATH'] + '\")'\n cmd = self.python + ' -c \\'{cmd}\\''.format(cmd=remote_cmd)\n remote_telegraf_exists = \"False\"\n try:\n out, err, err_code = self.ssh.execute(cmd)\n except Exception:\n logger.error(\n \"SSH execute error trying to check telegraf availability on host %s\",\n self.host,\n exc_info=True)\n else:\n if err:\n logger.error(\"[%s] error: '%s'\", self.host, errors)\n if out.strip():\n remote_telegraf_exists = out.strip()\n\n try:\n if remote_telegraf_exists in \"True\":\n logger.debug('Found telegraf client on %s..', self.host)\n else:\n logger.debug(\n 'Not found telegraf client on %s, trying to install from tank. 
Copying..',\n self.host)\n if os.path.isfile(self.path['TELEGRAF_LOCAL_PATH']):\n self.ssh.send_file(\n self.path['TELEGRAF_LOCAL_PATH'],\n self.path['TELEGRAF_REMOTE_PATH'])\n elif os.path.isfile(\"/usr/bin/telegraf\"):\n self.ssh.send_file(\n '/usr/bin/telegraf', self.path['TELEGRAF_REMOTE_PATH'])\n else:\n logger.error(\n 'Telegraf binary not found neither on %s nor on localhost at specified path: %s\\n'\n 'You can download telegraf binaries here: https://github.com/influxdata/telegraf\\n'\n 'or install debian package: `telegraf`', self.host, self.path['TELEGRAF_LOCAL_PATH'])\n return None, None, None\n\n self.ssh.send_file(\n os.path.join(\n self.path['AGENT_LOCAL_FOLDER'],\n self.AGENT_FILENAME),\n os.path.join(\n self.path['AGENT_REMOTE_FOLDER'],\n self.AGENT_FILENAME))\n self.ssh.send_file(\n agent_config,\n os.path.join(\n self.path['AGENT_REMOTE_FOLDER'],\n 'agent.cfg'))\n self.ssh.send_file(\n startup_config,\n os.path.join(\n self.path['AGENT_REMOTE_FOLDER'],\n 'agent_startup.cfg'))\n self.ssh.send_file(\n customs_script,\n os.path.join(\n self.path['AGENT_REMOTE_FOLDER'],\n 'agent_customs.sh'))\n\n except Exception:\n logger.error(\n \"Failed to install agent on %s\", self.host, exc_info=True)\n return None, None, None\n\n return agent_config, startup_config, customs_script", "response": "Installs the monitoring agent on the remote host and returns the paths of the generated agent config, startup config and customs script."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nremove agent's files from remote host", "response": "def uninstall(self):\n \"\"\"\n Remove agent's files from remote host\n \"\"\"\n log_filename = \"agent_{host}.log\".format(host=self.host)\n data_filename = \"agent_{host}.rawdata\".format(host=self.host)\n\n try:\n if self.session:\n self.session.send(\"stop\\n\")\n self.session.close()\n self.session = None\n except BaseException:\n logger.warning(\n 'Unable to correctly stop monitoring 
agent - session is broken. Pay attention to agent log (%s).',\n log_filename,\n exc_info=True)\n else:\n try:\n self.ssh.get_file(\n os.path.join(\n self.path['AGENT_REMOTE_FOLDER'],\n \"_agent.log\"),\n log_filename)\n self.ssh.get_file(\n os.path.join(\n self.path['AGENT_REMOTE_FOLDER'],\n \"monitoring.rawdata\"),\n data_filename)\n self.ssh.rm_r(self.path['AGENT_REMOTE_FOLDER'])\n except Exception:\n logger.error(\"Unable to get agent artefacts\", exc_info=True)\n self._kill_agent()\n return log_filename, data_filename"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef parse_sections(cfg_ini):\n return [Section(section.lower(),\n guess_plugin(section.lower()),\n without_defaults(cfg_ini, section))\n for section in cfg_ini.sections()\n if not re.match(CORE_SECTION_PATTERN, section.lower()) and section.lower() not in DEPRECATED_SECTIONS]", "response": "Parses the sections from a ConfigParser object."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef combine_sections(sections):\n PLUGINS_TO_COMBINE = {\n 'Phantom': ('phantom', 'multi', True),\n 'Bfg': ('bfg', 'gun_config', False)\n }\n plugins = {}\n ready_sections = []\n for section in sections:\n if section.plugin in PLUGINS_TO_COMBINE.keys():\n try:\n plugins[section.plugin].append(section)\n except KeyError:\n plugins[section.plugin] = [section]\n else:\n ready_sections.append(section)\n\n for plugin_name, _sections in plugins.items():\n if isinstance(_sections, list):\n parent_name, child_name, is_list = PLUGINS_TO_COMBINE[plugin_name]\n ready_sections.append(Section.from_multiple(_sections, parent_name, child_name, is_list))\n return ready_sections", "response": "Combine a list of Sections into a single list of Sections."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn the converted object", "response": "def converted(self):\n \"\"\"\n :rtype: {str: 
object}\n \"\"\"\n if self._converted is None:\n self._converted = self.converter(self.name, self.value)\n return self._converted"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef as_tuple(self):\n if self._as_tuple is None:\n self._as_tuple = self.converted.items()[0]\n return self._as_tuple", "response": "Returns a tuple of the names and values of the object."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the converter for this object.", "response": "def converter(self):\n \"\"\"\n :rtype: callable\n \"\"\"\n if self._converter is None:\n try:\n self._converter = self.SPECIAL_CONVERTERS[self.plugin][self.name]\n except KeyError:\n try:\n self._converter = self._get_scheme_converter()\n except UnknownOption:\n self._converter = self.CONVERTERS_FOR_UNKNOWN.get(self.plugin, self.dummy_converter)\n return self._converter"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a new config dictionary from multiple sections.", "response": "def from_multiple(cls, sections, parent_name=None, child_name=None, is_list=True):\n \"\"\"\n :type parent_name: str\n :type sections: list of Section\n \"\"\"\n if len(sections) == 1:\n return sections[0]\n if parent_name:\n master_section = filter(lambda section: section.name == parent_name, sections)[0]\n rest = filter(lambda section: section.name != parent_name, sections)\n else:\n master_section = sections[0]\n parent_name = master_section.name\n rest = sections[1:]\n child = {'multi': [section.get_cfg_dict(with_meta=False) for section in rest]} if is_list \\\n else {child_name: cls._select_one(master_section, rest).get_cfg_dict(with_meta=False)}\n master_section.merged_options.update(child)\n return master_section"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn unicode title of content.", "response": "def title(content, new_line_replacement=' ', 
tab_replacement=' '):\n \"\"\"\n Underlines content with '='. New lines and tabs will be replaced\n :param str content:\n :param str new_line_replacement:\n :param str tab_replacement:\n :return: unicode\n \"\"\"\n prepared_content = content.strip().replace('\\n', new_line_replacement).replace('\\t', tab_replacement)\n return u'{}\\n{}'.format(prepared_content, '=' * len(prepared_content))"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsearches for line in jmeter. log such as Waiting for possible shutdown message on port 4445", "response": "def __discover_jmeter_udp_port(self):\n \"\"\"Searching for line in jmeter.log such as\n Waiting for possible shutdown message on port 4445\n \"\"\"\n r = re.compile(self.DISCOVER_PORT_PATTERN)\n with open(self.process_stderr.name, 'r') as f:\n cnt = 0\n while self.process.pid and cnt < 10:\n line = f.readline()\n m = r.match(line)\n if m is None:\n cnt += 1\n time.sleep(1)\n else:\n port = int(m.group('port'))\n return port\n else:\n logger.warning('JMeter UDP port wasn\\'t discovered')\n return None"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef __add_jmeter_components(self, jmx, jtl, variables):\n logger.debug(\"Original JMX: %s\", os.path.realpath(jmx))\n with open(jmx, 'r') as src_jmx:\n source_lines = src_jmx.readlines()\n\n try:\n # In new Jmeter version (3.2 as example) WorkBench's plugin checkbox enabled by default\n # It totally crashes Yandex tank injection and raises XML Parse Exception\n closing = source_lines.pop(-1)\n if \"WorkBenchGui\" in source_lines[-5]:\n logger.info(\"WorkBench checkbox enabled...bypassing\")\n last_string_count = 6\n else:\n last_string_count = 2\n while last_string_count > 0:\n closing = source_lines.pop(-1) + closing\n last_string_count -= 1\n logger.debug(\"Closing statement: %s\", closing)\n except Exception as exc:\n raise RuntimeError(\"Failed to find the end of JMX XML: %s\" % 
exc)\n\n udv_tpl = resource_string(__name__, 'config/jmeter_var_template.xml')\n udv_set = []\n for var_name, var_value in variables.iteritems():\n udv_set.append(udv_tpl % (var_name, var_name, var_value))\n udv = \"\\n\".join(udv_set)\n\n if self.jmeter_ver >= 2.13:\n save_connect = 'true'\n else:\n save_connect = ''\n\n if self.ext_log in ['errors', 'all']:\n level_map = {'errors': 'true', 'all': 'false'}\n tpl_resource = 'jmeter_writer_ext.xml'\n tpl_args = {\n 'jtl': self.jtl_file,\n 'udv': udv,\n 'ext_log': self.ext_log_file,\n 'ext_level': level_map[self.ext_log],\n 'save_connect': save_connect\n }\n else:\n tpl_resource = 'jmeter_writer.xml'\n tpl_args = {\n 'jtl': self.jtl_file,\n 'udv': udv,\n 'save_connect': save_connect\n }\n\n tpl = resource_string(__name__, 'config/' + tpl_resource)\n\n try:\n new_jmx = self.core.mkstemp(\n '.jmx', 'modified_', os.path.dirname(os.path.realpath(jmx)))\n except OSError as exc:\n logger.debug(\"Can't create modified jmx near original: %s\", exc)\n new_jmx = self.core.mkstemp('.jmx', 'modified_')\n logger.debug(\"Modified JMX: %s\", new_jmx)\n with open(new_jmx, \"wb\") as fh:\n fh.write(''.join(source_lines))\n fh.write(tpl % tpl_args)\n fh.write(closing)\n return new_jmx", "response": "Add JMX components to the UDV file."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _read_data(self, lines):\n\n results = []\n for line in lines:\n timestamp, rps, instances = line.split(\"\\t\")\n curr_ts = int(float(timestamp)) # We allow floats here, but tank expects only seconds\n if self.__last_ts < curr_ts:\n self.__last_ts = curr_ts\n results.append(self.stats_item(self.__last_ts, float(rps), float(instances)))\n return results", "response": "Parse lines and return stats\n "} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef __create_criterion(self, criterion_str):\n parsed = criterion_str.split(\"(\")\n 
type_str = parsed[0].strip().lower()\n parsed[1] = parsed[1].split(\")\")[0].strip()\n\n for criterion_class in self.custom_criterions:\n if criterion_class.get_type_string() == type_str:\n return criterion_class(self, parsed[1])\n raise ValueError(\n \"Unsupported autostop criterion type: %s\" % criterion_str)", "response": "instantiate criterion from config string"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef getconfig(self, filename, target_hint):\n try:\n tree = self.parse_xml(filename)\n except IOError as exc:\n logger.error(\"Error loading config: %s\", exc)\n raise RuntimeError(\"Can't read monitoring config %s\" % filename)\n hosts = tree.findall('Host')\n config = []\n for host in hosts:\n host_config = self.get_host_config(host, target_hint)\n config.append(host_config)\n return config", "response": "Get monitoring config data."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncreate startup config file for the agent.", "response": "def create_startup_config(self):\n \"\"\" Startup and shutdown commands config\n Used by agent.py on the target\n\n \"\"\"\n cfg_path = \"agent_startup_{}.cfg\".format(self.host)\n if os.path.isfile(cfg_path):\n logger.info(\n 'Found agent startup config file in working directory with the same name as created for host %s.\\n'\n 'Creating new one via tempfile. 
This will affect predictable filenames for agent artefacts',\n self.host)\n handle, cfg_path = tempfile.mkstemp('.cfg', 'agent_')\n os.close(handle)\n try:\n config = ConfigParser.RawConfigParser()\n # FIXME incinerate such a string formatting inside a method call\n # T_T\n config.add_section('startup')\n [\n config.set('startup', \"cmd%s\" % idx, cmd)\n for idx, cmd in enumerate(self.startups)\n ]\n config.add_section('shutdown')\n [\n config.set('shutdown', \"cmd%s\" % idx, cmd)\n for idx, cmd in enumerate(self.shutdowns)\n ]\n config.add_section('source')\n [\n config.set('source', \"file%s\" % idx, path)\n for idx, path in enumerate(self.sources)\n ]\n with open(cfg_path, 'w') as fds:\n config.write(fds)\n\n except Exception as exc:\n logger.error(\n 'Error trying to create monitoring startups config. Malformed? %s',\n exc,\n exc_info=True)\n return cfg_path"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncreates a bash script that will execute the custom commands inside the local config file.", "response": "def create_custom_exec_script(self):\n \"\"\" bash script w/ custom commands inside\n inspired by half a night trying to avoid escaping bash special characters\n\n \"\"\"\n cfg_path = \"agent_customs_{}.cfg\".format(self.host)\n if os.path.isfile(cfg_path):\n logger.info(\n 'Found agent custom execs config file in working directory with the same name as created for host %s.\\n'\n 'Creating new one via tempfile. 
This will affect predictable filenames for agent artefacts',\n self.host)\n handle, cfg_path = tempfile.mkstemp('.sh', 'agent_customs_')\n os.close(handle)\n\n cmds = \"\"\n for idx, cmd in enumerate(self.custom):\n cmds += \"-{idx}) {cmd};;\\n\".format(idx=idx, cmd=cmd['cmd'])\n customs_script = \"\"\"\n #!/bin/sh\n while :\n do\n case \"$1\" in\n {cmds}\n *) break;;\n esac\n shift\n done\n \"\"\".format(cmds=cmds)\n\n with open(cfg_path, 'w') as fds:\n fds.write(customs_script)\n return cfg_path"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef create_collector_config(self, workdir):\n cfg_path = \"agent_collector_{}.cfg\".format(self.host)\n if os.path.isfile(cfg_path):\n logger.info(\n 'Found agent config file in working directory with the same name as created for host %s.\\n'\n 'Creating new one via tempfile. This will affect predictable filenames for agent artefacts',\n self.host)\n handle, cfg_path = tempfile.mkstemp('.cfg', 'agent_collector_')\n os.close(handle)\n\n self.monitoring_data_output = \"{remote_folder}/monitoring.rawdata\".format(\n remote_folder=workdir)\n\n defaults_old_enabled = ['CPU', 'Memory', 'Disk', 'Net', 'System']\n\n try:\n config = ConfigParser.RawConfigParser()\n\n config.add_section(\"global_tags\")\n config.add_section(\"agent\")\n config.set(\n \"agent\",\n \"interval\",\n \"'{interval}s'\".format(interval=self.interval))\n config.set(\"agent\", \"round_interval\", \"true\")\n config.set(\"agent\", \"flush_interval\", \"'1s'\")\n config.set(\"agent\", \"collection_jitter\", \"'0s'\")\n config.set(\"agent\", \"flush_jitter\", \"'1s'\")\n\n for section in self.host_config.keys():\n # telegraf-style config\n if not self.old_style_configs:\n config.add_section(\n \"{section_name}\".format(\n section_name=self.host_config[section]['name']))\n for key, value in iteritems(self.host_config[section]):\n if key != 'name':\n config.set(\n \"{section_name}\".format(\n 
section_name=self.host_config[section][\n 'name']),\n \"{key}\".format(key=key),\n \"{value}\".format(value=value))\n # monitoring-style config\n else:\n if section in defaults_old_enabled:\n config.add_section(\n \"{section_name}\".format(\n section_name=self.host_config[section]['name']))\n for key, value in iteritems(self.host_config[section]):\n if key in [\n 'fielddrop', 'fieldpass', 'percpu',\n 'devices', 'interfaces'\n ]:\n config.set(\n \"{section_name}\".format(\n section_name=self.host_config[section][\n 'name']),\n \"{key}\".format(key=key),\n \"{value}\".format(value=value))\n\n # outputs\n config.add_section(\"[outputs.file]\")\n config.set(\n \"[outputs.file]\",\n \"files\",\n \"['{config}']\".format(config=self.monitoring_data_output))\n config.set(\"[outputs.file]\", \"data_format\", \"'json'\")\n\n with open(cfg_path, 'w') as fds:\n config.write(fds)\n\n # dirty hack, this allow to avoid bash escape quoting, we're pushing shell script w/ arguments\n # index of argument is index of custom metric in our config\n inputs = \"\"\n for idx, cmd in enumerate(self.custom):\n inputs += \"[[inputs.exec]]\\n\"\n inputs += \"commands = ['/bin/sh {workdir}/agent_customs.sh -{idx}']\\n\".format(\n workdir=workdir, idx=idx)\n inputs += \"data_format = 'value'\\n\"\n inputs += \"data_type = 'float'\\n\"\n inputs += \"name_prefix = '{}_'\\n\\n\".format(cmd.get('label'))\n if cmd['diff']:\n decoder.diff_metrics['custom'].append(\n decoder.find_common_names(cmd.get('label')))\n\n with open(cfg_path, 'a') as fds:\n fds.write(inputs)\n\n # telegraf raw configuration into xml\n telegraf_raw = \"\"\n for element in self.telegrafraw:\n telegraf_raw += element\n\n with open(cfg_path, 'a') as fds:\n fds.write(telegraf_raw)\n\n except Exception as exc:\n logger.error(\n 'Error trying to create monitoring config. Malformed? %s',\n exc,\n exc_info=True)\n return cfg_path", "response": "Create the Telegraf collector config file and store it in the self. 
monitor_data_output."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking if disk space is exceeded", "response": "def __check_disk(self):\n ''' raise exception on disk space exceeded '''\n cmd = \"sh -c \\\"df --no-sync -m -P -l -x fuse -x tmpfs -x devtmpfs -x davfs -x nfs \"\n cmd += self.core.artifacts_base_dir\n cmd += \" | tail -n 1 | awk '{print \\$4}' \\\"\"\n res = execute(cmd, True, 0.1, True)\n logging.debug(\"Result: %s\", res)\n if not len(res[1]):\n self.log.debug(\"No disk usage info: %s\", res[2])\n return\n disk_free = res[1]\n self.log.debug(\n \"Disk free space: %s/%s\", disk_free.strip(), self.disk_limit)\n if int(disk_free.strip()) < self.disk_limit:\n raise RuntimeError(\n \"Not enough local resources: disk space less than %sMB in %s: %sMB\"\n % (\n self.disk_limit, self.core.artifacts_base_dir,\n int(disk_free.strip())))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef __check_mem(self):\n ''' raise exception on RAM exceeded '''\n mem_free = psutil.virtual_memory().available / 2**20\n self.log.debug(\"Memory free: %s/%s\", mem_free, self.mem_limit)\n if mem_free < self.mem_limit:\n raise RuntimeError(\n \"Not enough resources: free memory less \"\n \"than %sMB: %sMB\" % (self.mem_limit, mem_free))", "response": "check if memory is not exceeded"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_terminal_size():\n '''\n Gets width and height of terminal viewport\n '''\n default_size = (30, 120)\n env = os.environ\n\n def ioctl_gwinsz(file_d):\n '''\n Helper to get console size\n '''\n try:\n sizes = struct.unpack(\n 'hh', fcntl.ioctl(file_d, termios.TIOCGWINSZ, '1234'))\n except Exception:\n sizes = default_size\n return sizes\n\n sizes = ioctl_gwinsz(0) or ioctl_gwinsz(1) or ioctl_gwinsz(2)\n if not sizes:\n try:\n file_d = os.open(os.ctermid(), os.O_RDONLY)\n sizes = ioctl_gwinsz(file_d)\n 
os.close(file_d)\n except Exception:\n pass\n if not sizes:\n try:\n sizes = (env['LINES'], env['COLUMNS'])\n except Exception:\n sizes = default_size\n return int(sizes[1]), int(sizes[0])", "response": "Gets width and height of terminal viewport\n "} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef __get_right_line(self, widget_output):\n ''' Gets next line for right panel '''\n right_line = ''\n if widget_output:\n right_line = widget_output.pop(0)\n if len(right_line) > self.right_panel_width:\n right_line_plain = self.markup.clean_markup(right_line)\n if len(right_line_plain) > self.right_panel_width:\n right_line = right_line[:self.right_panel_width] + self.markup.RESET\n return right_line", "response": "Gets next line for right panel "} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ntruncates line_arr according to its visible length", "response": "def __truncate(self, line_arr, max_width):\n ''' Cut tuple of line chunks according to its visible length '''\n def is_space(chunk):\n return all([True if i == ' ' else False for i in chunk])\n\n def is_empty(chunks, markups):\n result = []\n for chunk in chunks:\n if chunk in markups:\n result.append(True)\n elif is_space(chunk):\n result.append(True)\n else:\n result.append(False)\n return all(result)\n left = max_width\n result = ''\n markups = self.markup.get_markup_vars()\n for num, chunk in enumerate(line_arr):\n if chunk in markups:\n result += chunk\n else:\n if left > 0:\n if len(chunk) <= left:\n result += chunk\n left -= len(chunk)\n else:\n leftover = (chunk[left:],) + line_arr[num + 1:]\n was_cut = not is_empty(leftover, markups)\n if was_cut:\n result += chunk[:left - 1] + self.markup.RESET + u'\\u2026'\n else:\n result += chunk[:left]\n left = 0\n return result"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef 
__render_left_panel(self):\n ''' Render left blocks '''\n self.log.debug(\"Rendering left blocks\")\n left_block = self.left_panel\n left_block.render()\n blank_space = self.left_panel_width - left_block.width\n\n lines = []\n pre_space = ' ' * int(blank_space / 2)\n if not left_block.lines:\n lines = [(''), (self.markup.RED + 'BROKEN LEFT PANEL' + self.markup.RESET)]\n else:\n while self.left_panel.lines:\n src_line = self.left_panel.lines.pop(0)\n line = pre_space + self.__truncate(src_line, self.left_panel_width)\n post_space = ' ' * (self.left_panel_width - len(self.markup.clean_markup(line)))\n line += post_space + self.markup.RESET\n lines.append(line)\n return lines", "response": "Render the left panel."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef render_screen(self):\n ''' Main method to render screen view '''\n self.term_width, self.term_height = get_terminal_size()\n self.log.debug(\n \"Terminal size: %sx%s\", self.term_width, self.term_height)\n self.right_panel_width = int(\n (self.term_width - len(self.RIGHT_PANEL_SEPARATOR))\n * (float(self.info_panel_percent) / 100)) - 1\n if self.right_panel_width > 0:\n self.left_panel_width = self.term_width - \\\n self.right_panel_width - len(self.RIGHT_PANEL_SEPARATOR) - 2\n else:\n self.right_panel_width = 0\n self.left_panel_width = self.term_width - 1\n self.log.debug(\n \"Left/right panels width: %s/%s\", self.left_panel_width,\n self.right_panel_width)\n\n widget_output = []\n if self.right_panel_width:\n widget_output = []\n self.log.debug(\"There are %d info widgets\" % len(self.info_widgets))\n for index, widget in sorted(\n self.info_widgets.iteritems(),\n key=lambda item: (item[1].get_index(), item[0])):\n self.log.debug(\"Rendering info widget #%s: %s\", index, widget)\n widget_out = widget.render(self).strip()\n if widget_out:\n widget_output += widget_out.split(\"\\n\")\n widget_output += [\"\"]\n\n left_lines = 
self.__render_left_panel()\n\n self.log.debug(\"Composing final screen output\")\n output = []\n for line_no in range(1, self.term_height):\n line = \" \"\n\n if line_no > 1 and left_lines:\n left_line = left_lines.pop(0)\n left_line_plain = self.markup.clean_markup(left_line)\n\n left_line += (\n ' ' * (self.left_panel_width - len(left_line_plain)))\n line += left_line\n else:\n line += ' ' * self.left_panel_width\n if self.right_panel_width:\n line += self.markup.RESET\n line += self.markup.WHITE\n line += self.RIGHT_PANEL_SEPARATOR\n line += self.markup.RESET\n right_line = self.__get_right_line(widget_output)\n line += right_line\n\n output.append(line)\n return self.markup.new_line.join(output) + self.markup.new_line", "response": "This method renders the screen."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nadds widget string to right panel of the screen", "response": "def add_info_widget(self, widget):\n '''\n Add widget string to right panel of the screen\n '''\n index = widget.get_index()\n while index in self.info_widgets.keys():\n index += 1\n self.info_widgets[index] = widget"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fill_rectangle(self, prepared):\n ''' Right-pad lines of block to equal width '''\n result = []\n width = max([self.clean_len(line) for line in prepared])\n for line in prepared:\n spacer = ' ' * (width - self.clean_len(line))\n result.append(line + (self.screen.markup.RESET, spacer))\n return (width, result)", "response": "Right - pad lines of block to equal width"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef clean_len(self, line):\n ''' Calculate wisible length of string '''\n if isinstance(line, basestring):\n return len(self.screen.markup.clean_markup(line))\n elif isinstance(line, tuple) or isinstance(line, list):\n markups = 
self.screen.markup.get_markup_vars()\n length = 0\n for i in line:\n if i not in markups:\n length += len(i)\n return length", "response": "Calculate wisible length of string"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating load plan timestamps generator", "response": "def create(instances_schedule):\n '''\n Creates load plan timestamps generator\n\n >>> from util import take\n\n >>> take(7, LoadPlanBuilder().ramp(5, 4000).create())\n [0, 1000, 2000, 3000, 4000, 0, 0]\n\n >>> take(7, create(['ramp(5, 4s)']))\n [0, 1000, 2000, 3000, 4000, 0, 0]\n\n >>> take(12, create(['ramp(5, 4s)', 'wait(5s)', 'ramp(5,4s)']))\n [0, 1000, 2000, 3000, 4000, 9000, 10000, 11000, 12000, 13000, 0, 0]\n\n >>> take(7, create(['wait(5s)', 'ramp(5, 0)']))\n [5000, 5000, 5000, 5000, 5000, 0, 0]\n\n >>> take(7, create([]))\n [0, 0, 0, 0, 0, 0, 0]\n\n >>> take(12, create(['line(1, 9, 4s)']))\n [0, 500, 1000, 1500, 2000, 2500, 3000, 3500, 4000, 0, 0, 0]\n\n >>> take(12, create(['const(3, 5s)', 'line(7, 11, 2s)']))\n [0, 0, 0, 5000, 5000, 5000, 5000, 5500, 6000, 6500, 7000, 0]\n\n >>> take(12, create(['step(2, 10, 2, 3s)']))\n [0, 0, 3000, 3000, 6000, 6000, 9000, 9000, 12000, 12000, 0, 0]\n\n >>> take(12, LoadPlanBuilder().const(3, 1000).line(5, 10, 5000).steps)\n [(3, 1), (5, 1), (6, 1), (7, 1), (8, 1), (9, 1), (10, 1)]\n\n >>> take(12, LoadPlanBuilder().stairway(100, 950, 100, 30000).steps)\n [(100, 30), (200, 30), (300, 30), (400, 30), (500, 30), (600, 30), (700, 30), (800, 30), (900, 30), (950, 30)]\n\n >>> LoadPlanBuilder().stairway(100, 950, 100, 30000).instances\n 950\n\n >>> LoadPlanBuilder().const(3, 1000).line(5, 10, 5000).instances\n 10\n\n >>> LoadPlanBuilder().line(1, 100, 60000).instances\n 100\n '''\n lpb = LoadPlanBuilder().add_all_steps(instances_schedule)\n lp = lpb.create()\n info.status.publish('duration', 0)\n # info.status.publish('steps', lpb.steps)\n info.status.publish('steps', [])\n info.status.publish('instances', 
lpb.instances)\n return lp"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncalculating the measurement error of a set of tangents.", "response": "def calc_measurement_error(self, tangents):\n '''\n formula for measurement error\n sqrt( sum(1, n, (k_i - avg_k)**2) / (n*(n-1)) )\n '''\n\n if len(tangents) < 2:\n return 0.0\n\n avg_tan = float(sum(tangents) / len(tangents))\n numerator = float()\n for i in tangents:\n numerator += (i - avg_tan) * (i - avg_tan)\n\n return math.sqrt(numerator / len(tangents) / (len(tangents) - 1))"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nadds right panel widget", "response": "def add_info_widget(self, widget):\n ''' add right panel widget '''\n if not self.screen:\n self.log.debug(\"No screen instance to add widget\")\n else:\n self.screen.add_info_widget(widget)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef clean_markup(self, orig_str):\n ''' clean markup from string '''\n for val in self.get_markup_vars():\n orig_str = orig_str.replace(val, '')\n return orig_str", "response": "clean markup from string"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef __make_writer_request(\n self,\n params=None,\n json=None,\n http_method=\"POST\",\n trace=False):\n '''\n Send request to writer service.\n '''\n request = requests.Request(\n http_method,\n self.writer_url,\n params=params,\n json=json,\n headers={\n 'User-Agent': self.user_agent})\n ids = id_gen(str(uuid.uuid4()))\n network_timeouts = self.network_timeouts()\n maintenance_timeouts = self.maintenance_timeouts()\n while True:\n try:\n response = self.__send_single_request(request, ids.next(), trace=trace)\n return response\n except (Timeout, ConnectionError, ProtocolError):\n logger.warn(traceback.format_exc())\n try:\n timeout = next(network_timeouts)\n logger.warn(\n \"Network error, will retry 
in %ss...\" %\n timeout)\n time.sleep(timeout)\n continue\n except StopIteration:\n raise self.NetworkError()\n except self.UnderMaintenance as e:\n try:\n timeout = next(maintenance_timeouts)\n logger.warn(\n \"Writer is under maintenance, will retry in %ss...\" %\n timeout)\n time.sleep(timeout)\n continue\n except StopIteration:\n raise e", "response": "Send request to writer service."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef new_job(\n self,\n task,\n person,\n tank,\n target_host,\n target_port,\n loadscheme=None,\n detailed_time=None,\n notify_list=None,\n trace=False):\n \"\"\"\n :return: job_nr, upload_token\n :rtype: tuple\n \"\"\"\n if not notify_list:\n notify_list = []\n data = {\n 'task': task,\n 'person': person,\n 'tank': tank,\n 'host': target_host,\n 'port': target_port,\n 'loadscheme': loadscheme,\n 'detailed_time': detailed_time,\n 'notify': notify_list\n }\n\n logger.debug(\"Job create request: %s\", data)\n api_timeouts = self.api_timeouts()\n while True:\n try:\n response = self.__post(\n \"api/job/create.json\", data, trace=trace)[0]\n # [{\"upload_token\": \"1864a3b2547d40f19b5012eb038be6f6\", \"job\": 904317}]\n return response['job'], response['upload_token']\n except (self.NotAvailable, self.StoppedFromOnline) as e:\n try:\n timeout = next(api_timeouts)\n logger.warn(\"API error, will retry in %ss...\" % timeout)\n time.sleep(timeout)\n continue\n except StopIteration:\n logger.warn('Failed to create job on lunapark')\n raise self.JobNotCreated(e.message)\n except requests.HTTPError as e:\n raise self.JobNotCreated('Failed to create job on lunapark\\n{}'.format(e.response.content))\n except Exception as e:\n logger.warn('Failed to create job on lunapark')\n logger.warn(repr(e), )\n raise self.JobNotCreated()", "response": "This method creates a new job on the lunapark."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 
code\ndef plugins(self):\n if self._plugins is None:\n self.load_plugins()\n if self._plugins is None:\n self._plugins = {}\n return self._plugins", "response": "returns a dict of all available plugins"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef load_plugins(self):\n logger.info(\"Loading plugins...\")\n for (plugin_name, plugin_path, plugin_cfg) in self.config.plugins:\n logger.debug(\"Loading plugin %s from %s\", plugin_name, plugin_path)\n if plugin_path == \"yandextank.plugins.Overload\":\n logger.warning(\n \"Deprecated plugin name: 'yandextank.plugins.Overload'\\n\"\n \"There is a new generic plugin now.\\n\"\n \"Correcting to 'yandextank.plugins.DataUploader overload'\")\n plugin_path = \"yandextank.plugins.DataUploader overload\"\n try:\n plugin = il.import_module(plugin_path)\n except ImportError:\n logger.warning('Plugin name %s path %s import error', plugin_name, plugin_path)\n logger.debug('Plugin name %s path %s import error', plugin_name, plugin_path, exc_info=True)\n raise\n try:\n instance = getattr(plugin, 'Plugin')(self, cfg=plugin_cfg, name=plugin_name)\n except AttributeError:\n logger.warning('Plugin %s classname should be `Plugin`', plugin_name)\n raise\n else:\n self.register_plugin(self.PLUGIN_PREFIX + plugin_name, instance)\n logger.debug(\"Plugin instances: %s\", self._plugins)", "response": "Load all plugins and instantiate their class instances."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef plugins_configure(self):\n self.publish(\"core\", \"stage\", \"configure\")\n\n logger.info(\"Configuring plugins...\")\n self.taskset_affinity = self.get_option(self.SECTION, 'affinity')\n if self.taskset_affinity:\n self.__setup_taskset(self.taskset_affinity, pid=os.getpid())\n\n for plugin in self.plugins.values():\n if not self.interrupted.is_set():\n logger.debug(\"Configuring %s\", plugin)\n 
plugin.configure()", "response": "Configure all the plugins."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwaiting for all plugins to finish and return the exit status.", "response": "def wait_for_finish(self):\n \"\"\"\n Call is_test_finished() on all plugins 'till one of them initiates exit\n \"\"\"\n if not self.interrupted.is_set():\n logger.info(\"Waiting for test to finish...\")\n logger.info('Artifacts dir: {dir}'.format(dir=self.artifacts_dir))\n self.publish(\"core\", \"stage\", \"shoot\")\n if not self.plugins:\n raise RuntimeError(\"It's strange: we have no plugins loaded...\")\n\n while not self.interrupted.is_set():\n begin_time = time.time()\n aggr_retcode = self.job.aggregator.is_test_finished()\n if aggr_retcode >= 0:\n return aggr_retcode\n for plugin in self.plugins.values():\n logger.debug(\"Polling %s\", plugin)\n retcode = plugin.is_test_finished()\n if retcode >= 0:\n return retcode\n end_time = time.time()\n diff = end_time - begin_time\n logger.debug(\"Polling took %s\", diff)\n logger.debug(\"Tank status: %s\", json.dumps(self.status))\n # screen refresh every 0.5 s\n if diff < 0.5:\n time.sleep(0.5 - diff)\n return 1"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef plugins_post_process(self, retcode):\n logger.info(\"Post-processing test...\")\n self.publish(\"core\", \"stage\", \"post_process\")\n for plugin in self.plugins.values():\n logger.debug(\"Post-process %s\", plugin)\n try:\n logger.debug(\"RC before: %s\", retcode)\n retcode = plugin.post_process(retcode)\n logger.debug(\"RC after: %s\", retcode)\n except Exception: # FIXME too broad exception clause\n logger.error(\"Failed post-processing plugin %s\", plugin, exc_info=True)\n if not retcode:\n retcode = 1\n return retcode", "response": "Call post_process on all plugins and return the retcode."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following 
Python 3 function does\ndef get_plugin_of_type(self, plugin_class):\n logger.debug(\"Searching for plugin: %s\", plugin_class)\n matches = [plugin for plugin in self.plugins.values() if isinstance(plugin, plugin_class)]\n if matches:\n if len(matches) > 1:\n logger.debug(\n \"More than one plugin of type %s found. Using first one.\",\n plugin_class)\n return matches[0]\n else:\n raise KeyError(\"Requested plugin type not found: %s\" % plugin_class)", "response": "Retrieve a specific plugin of the specified class. Raises KeyError otherwise."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nretrieving a list of plugins of the specified class. KeyError raised otherwise.", "response": "def get_plugins_of_type(self, plugin_class):\n \"\"\"\n Retrieve a list of plugins of desired class, KeyError raised otherwise\n \"\"\"\n logger.debug(\"Searching for plugins: %s\", plugin_class)\n matches = [plugin for plugin in self.plugins.values() if isinstance(plugin, plugin_class)]\n if matches:\n return matches\n else:\n raise KeyError(\"Requested plugin type not found: %s\" % plugin_class)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef __collect_file(self, filename, keep_original=False):\n dest = self.artifacts_dir + '/' + os.path.basename(filename)\n logger.debug(\"Collecting file: %s to %s\", filename, dest)\n if not filename or not os.path.exists(filename):\n logger.warning(\"File not found to collect: %s\", filename)\n return\n\n if os.path.exists(dest):\n # FIXME: 3 find a way to store artifacts anyway\n logger.warning(\"File already exists: %s\", dest)\n return\n\n if keep_original:\n shutil.copy(filename, self.artifacts_dir)\n else:\n shutil.move(filename, self.artifacts_dir)\n\n os.chmod(dest, 0o644)", "response": "Move or copy a single file to artifacts dir"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_artifact_file(self, 
filename, keep_original=False):\n if filename:\n logger.debug(\n \"Adding artifact file to collect (keep=%s): %s\", keep_original,\n filename)\n self.artifact_files[filename] = keep_original", "response": "Add file to be stored as result artifact on post - process phase."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef mkstemp(self, suffix, prefix, directory=None):\n if not directory:\n directory = self.artifacts_dir\n fd, fname = tempfile.mkstemp(suffix, prefix, directory)\n os.close(fd)\n os.chmod(fname, 0o644) # FIXME: chmod to parent dir's mode?\n return fname", "response": "Generate a temporary file name in artifacts base dir and return its name."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef close(self):\n logger.info(\"Close allocated resources...\")\n for plugin in self.plugins.values():\n logger.debug(\"Close %s\", plugin)\n try:\n plugin.close()\n except Exception as ex:\n logger.error(\"Failed closing plugin %s: %s\", plugin, ex)\n logger.debug(\n \"Failed closing plugin: %s\", traceback.format_exc(ex))", "response": "Close all allocated resources for all plugins and free resources."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef load_files(self, configs):\n logger.debug(\"Reading configs: %s\", configs)\n config_filenames = [resource.resource_filename(config) for config in configs]\n try:\n self.config.read(config_filenames)\n except Exception as ex:\n logger.error(\"Can't load configs: %s\", ex)\n raise ex", "response": "Read configs set into storage "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nflushing current stat to file", "response": "def flush(self, filename=None):\n \"\"\" Flush current stat to file \"\"\"\n if not filename:\n filename = self.file\n\n if filename:\n with open(filename, 'w') as handle:\n self.config.write(handle)"} {"SOURCE": "codesearchnet", 
"instruction": "Here you have a function in Python 3, explain what it does\ndef get_options(self, section, prefix=''):\n res = []\n try:\n for option in self.config.options(section):\n if not prefix or option.find(prefix) == 0:\n res += [(\n option[len(prefix):], self.config.get(section, option))]\n except ConfigParser.NoSectionError as ex:\n logger.warning(\"No section: %s\", ex)\n\n logger.debug(\n \"Section: [%s] prefix: '%s' options:\\n%s\", section, prefix, res)\n return res", "response": "Get options list with requested prefix"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef find_sections(self, prefix):\n res = []\n for section in self.config.sections():\n if section.startswith(prefix):\n res.append(section)\n return res", "response": "return a list of all sections with specified prefix"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn all items found in this chunk", "response": "def _decode_stat_data(self, chunk):\n \"\"\"\n Return all items found in this chunk\n \"\"\"\n for date_str, statistics in chunk.iteritems():\n date_obj = datetime.datetime.strptime(\n date_str.split(\".\")[0], '%Y-%m-%d %H:%M:%S')\n chunk_date = int(time.mktime(date_obj.timetuple()))\n instances = 0\n for benchmark_name, benchmark in statistics.iteritems():\n if not benchmark_name.startswith(\"benchmark_io\"):\n continue\n for method, meth_obj in benchmark.iteritems():\n if \"mmtasks\" in meth_obj:\n instances += meth_obj[\"mmtasks\"][2]\n\n offset = chunk_date - 1 - self.start_time\n reqps = 0\n if 0 <= offset < len(self.phantom_info.steps):\n reqps = self.phantom_info.steps[offset][0]\n yield self.stats_item(chunk_date - 1, instances, reqps)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\npreparing for monitoring - install agents etc", "response": "def prepare(self):\n \"\"\"Prepare for monitoring - install agents etc\"\"\"\n\n # Parse config\n 
agent_configs = []\n if self.config:\n agent_configs = self.config_manager.getconfig(\n self.config, self.default_target)\n\n # Creating agent for hosts\n for config in agent_configs:\n if config['host'] in ['localhost', '127.0.0.1', '::1']:\n client = self.clients['localhost'](\n config, self.old_style_configs, kill_old=self.kill_old)\n else:\n client = self.clients['ssh'](\n config, self.old_style_configs, timeout=5, kill_old=self.kill_old)\n logger.debug('Installing monitoring agent. Host: %s', client.host)\n agent_config, startup_config, customs_script = client.install()\n if agent_config:\n self.agents.append(client)\n self.artifact_files.append(agent_config)\n if startup_config:\n self.artifact_files.append(startup_config)\n if customs_script:\n self.artifact_files.append(customs_script)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef start(self):\n [agent.start() for agent in self.agents]\n [agent.reader_thread.start() for agent in self.agents]", "response": "Start agents with the same names execute popen of agent. 
py on target and start output reader thread."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\npoll agents for data and send them to the server.", "response": "def poll(self):\n \"\"\" Poll agents for data\n \"\"\"\n start_time = time.time()\n for agent in self.agents:\n for collect in agent.reader:\n # don't crush if trash or traceback came from agent to stdout\n if not collect:\n return 0\n for chunk in collect:\n ts, prepared_results = chunk\n if self.load_start_time and int(\n ts) >= self.load_start_time:\n ready_to_send = {\n \"timestamp\": int(ts),\n \"data\": {\n self.hash_hostname(agent.host): {\n \"comment\": agent.config.comment,\n \"metrics\": prepared_results\n }\n }\n }\n self.__collected_data.append(ready_to_send)\n\n logger.debug(\n 'Polling/decoding agents data took: %.2fms',\n (time.time() - start_time) * 1000)\n\n collected_data_length = len(self.__collected_data)\n\n if not self.first_data_received and self.__collected_data:\n self.first_data_received = True\n logger.info(\"Monitoring received first data.\")\n else:\n self.send_collected_data()\n return collected_data_length"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef stop(self):\n logger.debug(\"Uninstalling monitoring agents\")\n for agent in self.agents:\n log_filename, data_filename = agent.uninstall()\n self.artifact_files.append(log_filename)\n self.artifact_files.append(data_filename)\n for agent in self.agents:\n try:\n logger.debug(\n 'Waiting for agent %s reader thread to finish.', agent)\n agent.reader_thread.join(10)\n except BaseException:\n logger.error('Monitoring reader thread stuck!', exc_info=True)", "response": "Shutdown agents and wait for reader thread to finish."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsend pending data set to all listeners", "response": "def send_collected_data(self):\n \"\"\"sends pending data set to listeners\"\"\"\n 
data = self.__collected_data\n self.__collected_data = []\n for listener in self.listeners:\n # deep copy to ensure each listener gets it's own copy\n listener.monitoring_data(copy.deepcopy(data))"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ndetecting the configuration of the current plugin and returns the section name or None if no configuration is found.", "response": "def __detect_configuration(self):\n \"\"\"\n we need to be flexible in order to determine which plugin's configuration\n specified and make appropriate configs to metrics collector\n\n :return: SECTION name or None for defaults\n \"\"\"\n try:\n is_telegraf = self.core.get_option('telegraf', \"config\")\n except KeyError:\n is_telegraf = None\n try:\n is_monitoring = self.core.get_option('monitoring', \"config\")\n except KeyError:\n is_monitoring = None\n\n if is_telegraf and is_monitoring:\n raise ValueError(\n 'Both telegraf and monitoring configs specified. '\n 'Clean up your config and delete one of them')\n if is_telegraf and not is_monitoring:\n return 'telegraf'\n if not is_telegraf and is_monitoring:\n return 'monitoring'\n if not is_telegraf and not is_monitoring:\n # defaults target logic\n try:\n is_telegraf_dt = self.core.get_option('telegraf')\n except NoOptionError:\n is_telegraf_dt = None\n try:\n is_monitoring_dt = self.core.get_option('monitoring')\n except BaseException:\n is_monitoring_dt = None\n if is_telegraf_dt and is_monitoring_dt:\n raise ValueError(\n 'Both telegraf and monitoring default targets specified. 
'\n 'Clean up your config and delete one of them')\n if is_telegraf_dt and not is_monitoring_dt:\n return\n if not is_telegraf_dt and is_monitoring_dt:\n self.core.set_option(\n \"telegraf\", \"default_target\", is_monitoring_dt)\n if not is_telegraf_dt and not is_monitoring_dt:\n return"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nhandles the data items in the data tree and store the values in the data tree and calc offset", "response": "def __handle_data_items(self, host, data):\n \"\"\" store metric in data tree and calc offset signs\n\n sign < 0 is CYAN, means metric value is lower then previous,\n sign > 1 is YELLOW, means metric value is higher then prevoius,\n sign == 0 is WHITE, means initial or equal metric value\n \"\"\"\n for metric, value in data.iteritems():\n if value == '':\n self.sign[host][metric] = -1\n self.data[host][metric] = value\n else:\n if not self.data[host].get(metric, None):\n self.sign[host][metric] = 1\n elif float(value) > float(self.data[host][metric]):\n self.sign[host][metric] = 1\n elif float(value) < float(self.data[host][metric]):\n self.sign[host][metric] = -1\n else:\n self.sign[host][metric] = 0\n self.data[host][metric] = \"%.2f\" % float(value)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _decode_agents_data(self, block):\n collect = []\n if block:\n for chunk in block.split('\\n'):\n try:\n if chunk:\n prepared_results = {}\n jsn = json.loads(chunk)\n for ts, values in jsn.iteritems():\n for key, value in values.iteritems():\n # key sample: diskio-sda1_io_time\n # key_group sample: diskio\n # key_name sample: io_time\n try:\n key_group, key_name = key.split('_')[0].split('-')[0], '_'.join(key.split('_')[1:])\n except: # noqa: E722\n key_group, key_name = key.split('_')[0], '_'.join(key.split('_')[1:])\n if key_group in decoder.diff_metrics.keys():\n if key_name in 
decoder.diff_metrics[key_group]:\n decoded_key = decoder.find_common_names(\n key)\n if self.prev_check:\n try:\n value = jsn[ts][key] - \\\n self.prev_check[key]\n except KeyError:\n logger.debug(\n 'There is no diff value for metric %s.\\n'\n 'Timestamp: %s. Is it initial data?', key, ts, exc_info=True)\n value = 0\n prepared_results[decoded_key] = value\n else:\n decoded_key = decoder.find_common_names(\n key)\n prepared_results[decoded_key] = value\n else:\n decoded_key = decoder.find_common_names(\n key)\n prepared_results[decoded_key] = value\n self.prev_check = jsn[ts]\n collect.append((ts, prepared_results))\n except ValueError:\n logger.error(\n 'Telegraf agent send trash to output: %s', chunk)\n logger.debug(\n 'Telegraf agent data block w/ trash: %s',\n exc_info=True)\n return []\n except BaseException:\n logger.error(\n 'Exception trying to parse agent data: %s',\n chunk,\n exc_info=True)\n return []\n if collect:\n return collect", "response": "Decode agents jsons count diffs\n "} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nstarts subscribing channels. 
If the necessary connection isn't open yet, it opens now.", "response": "async def subscribe(self, channels):\n '''Start subscribing channels.\n If the necessary connection isn't open yet, it opens now.\n '''\n ws_channels = []\n nats_channels = []\n for c in channels:\n if c.startswith(('Q.', 'T.', 'A.', 'AM.',)):\n nats_channels.append(c)\n else:\n ws_channels.append(c)\n\n if len(ws_channels) > 0:\n await self._ensure_ws()\n await self._ws.send(json.dumps({\n 'action': 'listen',\n 'data': {\n 'streams': ws_channels,\n }\n }))\n\n if len(nats_channels) > 0:\n await self._ensure_nats()\n await self.polygon.subscribe(nats_channels)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nruns forever and blocks until an exception is raised.", "response": "def run(self, initial_channels=[]):\n '''Run forever and block until an exception is raised.\n initial_channels is the channels to start with.\n '''\n loop = asyncio.get_event_loop()\n try:\n loop.run_until_complete(self.subscribe(initial_channels))\n loop.run_forever()\n finally:\n loop.run_until_complete(self.close())"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\nasync def close(self):\n '''Close any of open connections'''\n if self._ws is not None:\n await self._ws.close()\n if self.polygon is not None:\n await self.polygon.close()", "response": "Close any of open connections"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _one_request(self, method, url, opts, retry):\n '''\n Perform one request, possibly raising RetryException in the case\n the response is 429. 
Otherwise, if error text contain \"code\" string,\n then it decodes to json object and returns APIError.\n Returns the body json in the 200 status.\n '''\n retry_codes = self._retry_codes\n resp = self._session.request(method, url, **opts)\n try:\n resp.raise_for_status()\n except HTTPError as http_error:\n # retry if we hit Rate Limit\n if resp.status_code in retry_codes and retry > 0:\n raise RetryException()\n if 'code' in resp.text:\n error = resp.json()\n if 'code' in error:\n raise APIError(error, http_error)\n else:\n raise\n if resp.text != '':\n return resp.json()\n return None", "response": "Perform one request to the specified url and return the response."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget a list of orders for a specific node.", "response": "def list_orders(self, status=None, limit=None, after=None, until=None,\n direction=None, params=None):\n '''\n Get a list of orders\n https://docs.alpaca.markets/web-api/orders/#get-a-list-of-orders\n '''\n if params is None:\n params = dict()\n if limit is not None:\n params['limit'] = limit\n if after is not None:\n params['after'] = after\n if until is not None:\n params['until'] = until\n if direction is not None:\n params['direction'] = direction\n if status is not None:\n params['status'] = status\n resp = self.get('/orders', params)\n return [Order(o) for o in resp]"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsubmits a new order to the server.", "response": "def submit_order(self, symbol, qty, side, type, time_in_force,\n limit_price=None, stop_price=None, client_order_id=None):\n '''Request a new order'''\n params = {\n 'symbol': symbol,\n 'qty': qty,\n 'side': side,\n 'type': type,\n 'time_in_force': time_in_force,\n }\n if limit_price is not None:\n params['limit_price'] = limit_price\n if stop_price is not None:\n params['stop_price'] = stop_price\n if client_order_id is not None:\n params['client_order_id'] = 
client_order_id\n resp = self.post('/orders', params)\n return Order(resp)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets an open position", "response": "def get_position(self, symbol):\n '''Get an open position'''\n resp = self.get('/positions/{}'.format(symbol))\n return Position(resp)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets a list of assets", "response": "def list_assets(self, status=None, asset_class=None):\n '''Get a list of assets'''\n params = {\n 'status': status,\n 'asset_class': asset_class,\n }\n resp = self.get('/assets', params)\n return [Asset(o) for o in resp]"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_barset(self,\n symbols,\n timeframe,\n limit=None,\n start=None,\n end=None,\n after=None,\n until=None):\n '''Get BarSet(dict[str]->list[Bar])\n The parameter symbols can be either a comma-split string\n or a list of string. 
Each symbol becomes the key of\n the returned value.\n '''\n if not isinstance(symbols, str):\n symbols = ','.join(symbols)\n params = {\n 'symbols': symbols,\n }\n if limit is not None:\n params['limit'] = limit\n if start is not None:\n params['start'] = start\n if end is not None:\n params['end'] = end\n if after is not None:\n params['after'] = after\n if until is not None:\n params['until'] = until\n resp = self.data_get('/bars/{}'.format(timeframe), params)\n return BarSet(resp)", "response": "Get a BarSet object from the specified timeframe."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef lambda_solid(name=None, inputs=None, output=None, description=None):\n '''(decorator) Create a simple solid.\n\n This shortcut allows the creation of simple solids that do not require\n configuration and whose implementations do not require a context.\n\n Lambda solids take inputs and produce a single output. The body of the function\n should return a single value.\n\n Args:\n name (str): Name of solid.\n inputs (list[InputDefinition]): List of inputs.\n output (OutputDefinition): The output of the solid. Defaults to ``OutputDefinition()``.\n description (str): Solid description.\n\n Examples:\n\n .. 
code-block:: python\n\n @lambda_solid\n def hello_world():\n return 'hello'\n\n @lambda_solid(inputs=[InputDefinition(name='foo')])\n def hello_world(foo):\n return foo\n\n '''\n output = output or OutputDefinition()\n\n if callable(name):\n check.invariant(inputs is None)\n check.invariant(description is None)\n return _LambdaSolid(output=output)(name)\n\n return _LambdaSolid(name=name, inputs=inputs, output=output, description=description)", "response": "Decorator for creating a simple solid."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef from_dict(result_dict):\n '''Create a new ``MultipleResults`` object from a dictionary.\n \n Keys of the dictionary are unpacked into result names.\n \n Args:\n result_dict (dict) - The dictionary to unpack.\n \n Returns:\n (:py:class:`MultipleResults `) A new ``MultipleResults`` object\n\n '''\n check.dict_param(result_dict, 'result_dict', key_type=str)\n results = []\n for name, value in result_dict.items():\n results.append(Result(value, name))\n return MultipleResults(*results)", "response": "Create a new MultipleResults object from a dictionary."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_joining_subplan(\n pipeline_def, solid, join_step_key, parallel_steps, parallel_step_output\n):\n '''\n This captures a common pattern of fanning out a single value to N steps,\n where each step has similar structure. The strict requirement here is that each step\n must provide an output named the parameters parallel_step_output.\n\n This takes those steps and then uses a join node to coalesce them so that downstream\n steps can depend on a single output.\n\n Currently the join step just does a passthrough with no computation. 
It remains\n to be seen if there should be any work or verification done in this step, especially\n in multi-process environments that require persistence between steps.\n '''\n check.inst_param(pipeline_def, 'pipeline_def', PipelineDefinition)\n check.inst_param(solid, 'solid', Solid)\n check.str_param(join_step_key, 'join_step_key')\n check.list_param(parallel_steps, 'parallel_steps', of_type=ExecutionStep)\n check.str_param(parallel_step_output, 'parallel_step_output')\n\n for parallel_step in parallel_steps:\n check.invariant(parallel_step.has_step_output(parallel_step_output))\n\n join_step = create_join_step(\n pipeline_def, solid, join_step_key, parallel_steps, parallel_step_output\n )\n\n output_name = join_step.step_outputs[0].name\n return ExecutionValueSubplan(\n parallel_steps + [join_step], StepOutputHandle.from_step(join_step, output_name)\n )", "response": "Creates a subplan that can be used to join a single value to N steps."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef gunzipper(gzip_file):\n '''gunzips /path/to/foo.gz to /path/to/raw/2019/01/01/data.json\n '''\n # TODO: take date as an input\n\n path_prefix = os.path.dirname(gzip_file)\n output_folder = os.path.join(path_prefix, 'raw/2019/01/01')\n outfile = os.path.join(output_folder, 'data.json')\n\n if not safe_isfile(outfile):\n mkdir_p(output_folder)\n\n with gzip.open(gzip_file, 'rb') as f_in, open(outfile, 'wb') as f_out:\n shutil.copyfileobj(f_in, f_out)\n\n return [path_prefix]", "response": "gunzips a gzipped file to a base folder"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _check_key_value_types(obj, key_type, value_type, key_check=isinstance, value_check=isinstance):\n '''Ensures argument obj is a dictionary, and enforces that the keys/values conform to the types\n specified by key_type, value_type.\n '''\n if not isinstance(obj, dict):\n 
raise_with_traceback(_type_mismatch_error(obj, dict))\n\n if key_type is str:\n key_type = string_types\n\n if value_type is str:\n value_type = string_types\n\n for key, value in obj.items():\n if key_type and not key_check(key, key_type):\n raise_with_traceback(\n CheckError(\n 'Key in dictionary mismatches type. Expected {key_type}. Got {obj_repr}'.format(\n key_type=repr(key_type), obj_repr=repr(key)\n )\n )\n )\n if value_type and not value_check(value, value_type):\n raise_with_traceback(\n CheckError(\n 'Value in dictionary mismatches expected type for key {key}. Expected value '\n 'of type {vtype}. Got value {value} of type {obj_type}.'.format(\n vtype=repr(value_type), obj_type=type(value), key=key, value=value\n )\n )\n )\n return obj", "response": "Ensures that the keys and values conform to the types\n specified by key_type value_type."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nensures argument obj is a native Python dictionary raises an exception if not otherwise returns obj.", "response": "def dict_param(obj, param_name, key_type=None, value_type=None):\n '''Ensures argument obj is a native Python dictionary, raises an exception if not, and otherwise\n returns obj.\n '''\n if not isinstance(obj, dict):\n raise_with_traceback(_param_type_mismatch_exception(obj, dict, param_name))\n\n if not (key_type or value_type):\n return obj\n\n return _check_key_value_types(obj, key_type, value_type)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef opt_dict_param(obj, param_name, key_type=None, value_type=None, value_class=None):\n '''Ensures argument obj is either a dictionary or None; if the latter, instantiates an empty\n dictionary.\n '''\n if obj is not None and not isinstance(obj, dict):\n raise_with_traceback(_param_type_mismatch_exception(obj, dict, param_name))\n\n if not obj:\n return {}\n\n if value_class:\n return _check_key_value_types(obj, key_type, 
value_type=value_class, value_check=issubclass)\n return _check_key_value_types(obj, key_type, value_type)", "response": "Ensures argument obj is either a dictionary or None ; returns an empty dict."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nconstructing a new event - logger.", "response": "def construct_event_logger(event_record_callback):\n '''\n Callback receives a stream of event_records\n '''\n check.callable_param(event_record_callback, 'event_record_callback')\n\n return construct_single_handler_logger(\n 'event-logger',\n DEBUG,\n StructuredLoggerHandler(\n lambda logger_message: event_record_callback(construct_event_record(logger_message))\n ),\n )"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef construct_json_event_logger(json_path):\n '''Record a stream of event records to json'''\n check.str_param(json_path, 'json_path')\n return construct_single_handler_logger(\n \"json-event-record-logger\",\n DEBUG,\n JsonEventLoggerHandler(\n json_path,\n lambda record: construct_event_record(\n StructuredLoggerMessage(\n name=record.name,\n message=record.msg,\n level=record.levelno,\n meta=record.dagster_meta,\n record=record,\n )\n ),\n ),\n )", "response": "Constructs a logger for a stream of event records to json"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef from_file(cls, path=None):\n path = path or cls.CONFIG_PATH\n if not os.path.exists(path):\n error = 'Config file not found: {0!r}'.format(path)\n raise ConfigFileError(error)\n config = read_config(path)\n return cls(config)", "response": "Read a config file and instantiate the RCParser."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets the config dictionary for the given repository.", "response": "def get_repository_config(self, repository):\n \"\"\"Get config dictionary for the given repository.\n\n If the 
repository section is not found in the config file,\n        return ``None``. If the file is invalid, raise\n        :exc:`configparser.Error`.\n\n        Otherwise return a dictionary with:\n\n        * ``'repository'`` -- the repository URL\n        * ``'username'`` -- username for authentication\n        * ``'password'`` -- password for authentication\n\n        :param repository:\n            Name or URL of the repository to find in the ``.pypirc`` file.\n            The repository section must be defined in the config file.\n\n        \"\"\"\n        servers = self._read_index_servers()\n        repo_config = self._find_repo_config(servers, repository)\n        return repo_config"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreplace the parameters cell with the new version of the parameters cell.", "response": "def replace_parameters(context, nb, parameters):\n    # Uma: This is a copy-paste from papermill papermill/execute.py:104 (execute_parameters).\n    # Typically, papermill injects the injected-parameters cell *below* the parameters cell\n    # but we want to *replace* the parameters cell, which is what this function does.\n\n    '''Assigns parameters into the appropriate place in the input notebook\n    Args:\n        nb (NotebookNode): Executable notebook object\n        parameters (dict): Arbitrary keyword arguments to pass to the notebook parameters.\n    '''\n\n    # Copy the nb object to avoid polluting the input\n    nb = copy.deepcopy(nb)\n\n    # Generate parameter content based on the kernel_name\n    param_content = DagsterTranslator.codify(parameters)\n    # papermill chooses a translator based on kernel_name and language,\n    # but we just call the DagsterTranslator\n    # translate_parameters(kernel_name, language, parameters)\n    newcell = nbformat.v4.new_code_cell(source=param_content)\n    newcell.metadata['tags'] = ['injected-parameters']\n\n    param_cell_index = _find_first_tagged_cell_index(nb, 'parameters')\n    injected_cell_index = _find_first_tagged_cell_index(nb, 'injected-parameters')\n    if injected_cell_index >= 0:\n        # Replace the 
injected cell with a new version\n before = nb.cells[:injected_cell_index]\n after = nb.cells[injected_cell_index + 1 :]\n check.int_value_param(param_cell_index, -1, 'param_cell_index')\n # We should have blown away the parameters cell if there is an injected-parameters cell\n elif param_cell_index >= 0:\n # Replace the parameter cell with the injected-parameters cell\n before = nb.cells[:param_cell_index]\n after = nb.cells[param_cell_index + 1 :]\n else:\n # Inject to the top of the notebook, presumably first cell includes dagstermill import\n context.log.debug(\n (\n 'Warning notebook has no parameters cell, '\n 'so first cell must import dagstermill and call dm.register_repo()'\n )\n )\n before = nb.cells[:1]\n after = nb.cells[1:]\n\n nb.cells = before + [newcell] + after\n nb.metadata.papermill['parameters'] = parameters\n\n return nb"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef nonce_solid(name, n_inputs, n_outputs):\n\n @solid(\n name=name,\n inputs=[\n InputDefinition(name='input_{}'.format(i)) for i in range(n_inputs)\n ],\n outputs=[\n OutputDefinition(name='output_{}'.format(i))\n for i in range(n_outputs)\n ],\n )\n def solid_fn(context, **_kwargs):\n for i in range(200):\n time.sleep(0.02)\n if i % 1000 == 420:\n context.log.error(\n 'Error message seq={i} from solid {name}'.format(\n i=i, name=name\n )\n )\n elif i % 100 == 0:\n context.log.warning(\n 'Warning message seq={i} from solid {name}'.format(\n i=i, name=name\n )\n )\n elif i % 10 == 0:\n context.log.info(\n 'Info message seq={i} from solid {name}'.format(\n i=i, name=name\n )\n )\n else:\n context.log.debug(\n 'Debug message seq={i} from solid {name}'.format(\n i=i, name=name\n )\n )\n return MultipleResults.from_dict(\n {'output_{}'.format(i): 'foo' for i in range(n_outputs)}\n )\n\n return solid_fn", "response": "Creates a solid with the given number of inputs and outputs."} {"SOURCE": "codesearchnet", "instruction": 
"Can you tell what is the following Python 3 function doing\ndef format_config_for_graphql(config):\n '''This recursive descent thing formats a config dict for GraphQL.'''\n\n def _format_config_subdict(config, current_indent=0):\n check.dict_param(config, 'config', key_type=str)\n\n printer = IndentingStringIoPrinter(indent_level=2, current_indent=current_indent)\n printer.line('{')\n\n n_elements = len(config)\n for i, key in enumerate(sorted(config, key=lambda x: x[0])):\n value = config[key]\n with printer.with_indent():\n formatted_value = (\n _format_config_item(value, current_indent=printer.current_indent)\n .lstrip(' ')\n .rstrip('\\n')\n )\n printer.line(\n '{key}: {formatted_value}{comma}'.format(\n key=key,\n formatted_value=formatted_value,\n comma=',' if i != n_elements - 1 else '',\n )\n )\n printer.line('}')\n\n return printer.read()\n\n def _format_config_sublist(config, current_indent=0):\n printer = IndentingStringIoPrinter(indent_level=2, current_indent=current_indent)\n printer.line('[')\n\n n_elements = len(config)\n for i, value in enumerate(config):\n with printer.with_indent():\n formatted_value = (\n _format_config_item(value, current_indent=printer.current_indent)\n .lstrip(' ')\n .rstrip('\\n')\n )\n printer.line(\n '{formatted_value}{comma}'.format(\n formatted_value=formatted_value, comma=',' if i != n_elements - 1 else ''\n )\n )\n printer.line(']')\n\n return printer.read()\n\n def _format_config_item(config, current_indent=0):\n printer = IndentingStringIoPrinter(indent_level=2, current_indent=current_indent)\n\n if isinstance(config, dict):\n return _format_config_subdict(config, printer.current_indent)\n elif isinstance(config, list):\n return _format_config_sublist(config, printer.current_indent)\n elif isinstance(config, bool):\n return repr(config).lower()\n else:\n return repr(config).replace('\\'', '\"')\n\n check.dict_param(config, 'config', key_type=str)\n if not isinstance(config, dict):\n check.failed('Expected a dict to 
format as config, got: {item}'.format(item=repr(config)))\n\n return _format_config_subdict(config)", "response": "This recursive descent thing formats a config dict for GraphQL."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget a pipeline by name. Only constructs that pipeline and caches it.", "response": "def get_pipeline(self, name):\n '''Get a pipeline by name. Only constructs that pipeline and caches it.\n\n Args:\n name (str): Name of the pipeline to retriever\n\n Returns:\n PipelineDefinition: Instance of PipelineDefinition with that name.\n '''\n check.str_param(name, 'name')\n\n if name in self._pipeline_cache:\n return self._pipeline_cache[name]\n\n try:\n pipeline = self.pipeline_dict[name]()\n except KeyError:\n raise DagsterInvariantViolationError(\n 'Could not find pipeline \"{name}\". Found: {pipeline_names}.'.format(\n name=name,\n pipeline_names=', '.join(\n [\n '\"{pipeline_name}\"'.format(pipeline_name=pipeline_name)\n for pipeline_name in self.pipeline_dict.keys()\n ]\n ),\n )\n )\n check.invariant(\n pipeline.name == name,\n 'Name does not match. Name in dict {name}. 
Name in pipeline {pipeline.name}'.format(\n name=name, pipeline=pipeline\n ),\n )\n\n self._pipeline_cache[name] = check.inst(\n pipeline,\n PipelineDefinition,\n (\n 'Function passed into pipeline_dict with key {key} must return a '\n 'PipelineDefinition'\n ).format(key=name),\n )\n\n return pipeline"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_all_pipelines(self):\n '''Return all pipelines as a list\n\n Returns:\n List[PipelineDefinition]:\n\n '''\n pipelines = list(map(self.get_pipeline, self.pipeline_dict.keys()))\n # This does uniqueness check\n self._construct_solid_defs(pipelines)\n return pipelines", "response": "Return all pipelines as a list of PipelineDefinition objects"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsparking configuration. See the Spark documentation for reference: https://spark.apache.org/docs/latest/submitting-applications.html", "response": "def define_spark_config():\n '''Spark configuration.\n\n See the Spark documentation for reference:\n https://spark.apache.org/docs/latest/submitting-applications.html\n '''\n\n master_url = Field(\n String,\n description='The master URL for the cluster (e.g. spark://23.195.26.187:7077)',\n is_optional=False,\n )\n\n deploy_mode = Field(\n SparkDeployMode,\n description='''Whether to deploy your driver on the worker nodes (cluster) or locally as an\n external client (client) (default: client). A common deployment strategy is to submit your\n application from a gateway machine that is physically co-located with your worker machines\n (e.g. Master node in a standalone EC2 cluster). In this setup, client mode is appropriate. \n In client mode, the driver is launched directly within the spark-submit process which acts \n as a client to the cluster. The input and output of the application is attached to the \n console. 
Thus, this mode is especially suitable for applications that involve the REPL (e.g.\n Spark shell).''',\n is_optional=True,\n )\n\n application_jar = Field(\n Path,\n description='''Path to a bundled jar including your application and all\n dependencies. The URL must be globally visible inside of your cluster, for\n instance, an hdfs:// path or a file:// path that is present on all nodes.\n ''',\n is_optional=False,\n )\n\n application_arguments = Field(\n String,\n description='Arguments passed to the main method of your main class, if any',\n is_optional=True,\n )\n\n spark_home = Field(\n String,\n description='The path to your spark installation. Defaults to $SPARK_HOME at runtime if not provided.',\n is_optional=True,\n )\n\n spark_outputs = Field(List(String), description='The outputs that this Spark job will produce')\n\n return Field(\n Dict(\n fields={\n 'master_url': master_url,\n 'deploy_mode': deploy_mode,\n 'application_jar': application_jar,\n 'spark_conf': spark_config(),\n 'spark_home': spark_home,\n 'application_arguments': application_arguments,\n 'spark_outputs': spark_outputs,\n }\n )\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_next_event(process, queue):\n '''\n This function polls the process until it returns a valid\n item or returns PROCESS_DEAD_AND_QUEUE_EMPTY if it is in\n a state where the process has terminated and the queue is empty\n\n Warning: if the child process is in an infinite loop. This will\n also infinitely loop.\n '''\n while True:\n try:\n return queue.get(block=True, timeout=TICK)\n except multiprocessing.queues.Empty:\n if not process.is_alive():\n # There is a possibility that after the last queue.get the\n # process created another event and then died. 
In that case\n # we want to continue draining the queue.\n try:\n return queue.get(block=False)\n except multiprocessing.queues.Empty:\n # If the queue empty we know that there are no more events\n # and that the process has died.\n return PROCESS_DEAD_AND_QUEUE_EMPTY\n\n check.failed('unreachable')", "response": "This function polls the process until it returns a valid\n item or returns PROCESS_DEAD_AND_QUEUE_EMPTY"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nexecutes a pipeline through a message queue.", "response": "def execute_pipeline_through_queue(\n repository_info,\n pipeline_name,\n solid_subset,\n environment_dict,\n run_id,\n message_queue,\n reexecution_config,\n step_keys_to_execute,\n):\n \"\"\"\n Execute pipeline using message queue as a transport\n \"\"\"\n\n message_queue.put(ProcessStartedSentinel(os.getpid()))\n\n run_config = RunConfig(\n run_id,\n event_callback=message_queue.put,\n executor_config=InProcessExecutorConfig(raise_on_error=False),\n reexecution_config=reexecution_config,\n step_keys_to_execute=step_keys_to_execute,\n )\n\n repository_container = RepositoryContainer(repository_info)\n if repository_container.repo_error:\n message_queue.put(\n MultiprocessingError(\n serializable_error_info_from_exc_info(repository_container.repo_error)\n )\n )\n return\n\n try:\n result = execute_pipeline(\n repository_container.repository.get_pipeline(pipeline_name).build_sub_pipeline(\n solid_subset\n ),\n environment_dict,\n run_config=run_config,\n )\n return result\n except: # pylint: disable=W0702\n error_info = serializable_error_info_from_exc_info(sys.exc_info())\n message_queue.put(MultiprocessingError(error_info))\n finally:\n message_queue.put(MultiprocessingDone())\n message_queue.close()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nwaiting until all processes are enqueued.", "response": "def join(self):\n '''Waits until all there are no processes 
enqueued.'''\n while True:\n with self._processes_lock:\n if not self._processes and self._processing_semaphore.locked():\n return True\n gevent.sleep(0.1)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a new instance of a Field class that can be used to create a new instance of the type optionality defaults and description.", "response": "def Field(\n dagster_type,\n default_value=FIELD_NO_DEFAULT_PROVIDED,\n is_optional=INFER_OPTIONAL_COMPOSITE_FIELD,\n is_secret=False,\n description=None,\n):\n '''\n The schema for configuration data that describes the type, optionality, defaults, and description.\n\n Args:\n dagster_type (DagsterType):\n A ``DagsterType`` describing the schema of this field, ie `Dict({'example': Field(String)})`\n default_value (Any):\n A default value to use that respects the schema provided via dagster_type\n is_optional (bool): Whether the presence of this field is optional\n despcription (str):\n '''\n config_type = resolve_to_config_type(dagster_type)\n if not config_type:\n raise DagsterInvalidDefinitionError(\n (\n 'Attempted to pass {value_repr} to a Field that expects a valid '\n 'dagster type usable in config (e.g. Dict, NamedDict, Int, String et al).'\n ).format(value_repr=repr(dagster_type))\n )\n return FieldImpl(\n config_type=resolve_to_config_type(dagster_type),\n default_value=default_value,\n is_optional=is_optional,\n is_secret=is_secret,\n description=description,\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndefines the Snowflake configuration.", "response": "def define_snowflake_config():\n '''Snowflake configuration.\n\n See the Snowflake documentation for reference:\n https://docs.snowflake.net/manuals/user-guide/python-connector-api.html\n '''\n\n account = Field(\n String,\n description='Your Snowflake account name. 
For more details, see https://bit.ly/2FBL320.',\n is_optional=True,\n )\n\n user = Field(String, description='User login name.', is_optional=False)\n\n password = Field(String, description='User password.', is_optional=False)\n\n database = Field(\n String,\n description='''Name of the default database to use. After login, you can use USE DATABASE\n to change the database.''',\n is_optional=True,\n )\n\n schema = Field(\n String,\n description='''Name of the default schema to use. After login, you can use USE SCHEMA to \n change the schema.''',\n is_optional=True,\n )\n\n role = Field(\n String,\n description='''Name of the default role to use. After login, you can use USE ROLE to change\n the role.''',\n is_optional=True,\n )\n\n warehouse = Field(\n String,\n description='''Name of the default warehouse to use. After login, you can use USE WAREHOUSE\n to change the role.''',\n is_optional=True,\n )\n\n autocommit = Field(\n Bool,\n description='''None by default, which honors the Snowflake parameter AUTOCOMMIT. Set to True\n or False to enable or disable autocommit mode in the session, respectively.''',\n is_optional=True,\n )\n\n client_prefetch_threads = Field(\n Int,\n description='''Number of threads used to download the results sets (4 by default).\n Increasing the value improves fetch performance but requires more memory.''',\n is_optional=True,\n )\n\n client_session_keep_alive = Field(\n String,\n description='''False by default. Set this to True to keep the session active indefinitely,\n even if there is no activity from the user. Make certain to call the close method to\n terminate the thread properly or the process may hang.''',\n is_optional=True,\n )\n\n login_timeout = Field(\n Int,\n description='''Timeout in seconds for login. By default, 60 seconds. 
The login request gives\n up after the timeout length if the HTTP response is \"success\".''',\n is_optional=True,\n )\n\n network_timeout = Field(\n Int,\n description='''Timeout in seconds for all other operations. By default, none/infinite. A\n general request gives up after the timeout length if the HTTP response is not \"success\"''',\n is_optional=True,\n )\n\n ocsp_response_cache_filename = Field(\n Path,\n description='''URI for the OCSP response cache file.\n By default, the OCSP response cache file is created in the cache directory.''',\n is_optional=True,\n )\n\n validate_default_parameters = Field(\n Bool,\n description='''False by default. Raise an exception if either one of specified database,\n schema or warehouse doesn't exists if True.''',\n is_optional=True,\n )\n\n paramstyle = Field(\n # TODO should validate only against permissible values for this\n String,\n description='''pyformat by default for client side binding. Specify qmark or numeric to\n change bind variable formats for server side binding.''',\n is_optional=True,\n )\n\n timezone = Field(\n String,\n description='''None by default, which honors the Snowflake parameter TIMEZONE. Set to a\n valid time zone (e.g. 
America/Los_Angeles) to set the session time zone.''',\n is_optional=True,\n )\n\n return Field(\n Dict(\n fields={\n 'account': account,\n 'user': user,\n 'password': password,\n 'database': database,\n 'schema': schema,\n 'role': role,\n 'warehouse': warehouse,\n 'autocommit': autocommit,\n 'client_prefetch_threads': client_prefetch_threads,\n 'client_session_keep_alive': client_session_keep_alive,\n 'login_timeout': login_timeout,\n 'network_timeout': network_timeout,\n 'ocsp_response_cache_filename': ocsp_response_cache_filename,\n 'validate_default_parameters': validate_default_parameters,\n 'paramstyle': paramstyle,\n 'timezone': timezone,\n }\n ),\n description='Snowflake configuration',\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef build(self, pipeline_def, artifacts_persisted):\n '''Builds the execution plan.\n '''\n\n # Construct dependency dictionary\n deps = {step.key: set() for step in self.steps}\n\n for step in self.steps:\n for step_input in step.step_inputs:\n deps[step.key].add(step_input.prev_output_handle.step_key)\n\n step_dict = {step.key: step for step in self.steps}\n\n return ExecutionPlan(pipeline_def, step_dict, deps, artifacts_persisted)", "response": "Builds the execution plan."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nbuild a sub - pipeline which is a subset of another pipeline.", "response": "def _build_sub_pipeline(pipeline_def, solid_names):\n '''\n Build a pipeline which is a subset of another pipeline.\n Only includes the solids which are in solid_names.\n '''\n\n check.inst_param(pipeline_def, 'pipeline_def', PipelineDefinition)\n check.list_param(solid_names, 'solid_names', of_type=str)\n\n solid_name_set = set(solid_names)\n solids = list(map(pipeline_def.solid_named, solid_names))\n deps = {_dep_key_of(solid): {} for solid in solids}\n\n def _out_handle_of_inp(input_handle):\n if 
pipeline_def.dependency_structure.has_dep(input_handle):\n output_handle = pipeline_def.dependency_structure.get_dep(input_handle)\n if output_handle.solid.name in solid_name_set:\n return output_handle\n return None\n\n for solid in solids:\n for input_handle in solid.input_handles():\n output_handle = _out_handle_of_inp(input_handle)\n if output_handle:\n deps[_dep_key_of(solid)][input_handle.input_def.name] = DependencyDefinition(\n solid=output_handle.solid.name, output=output_handle.output_def.name\n )\n\n return PipelineDefinition(\n name=pipeline_def.name,\n solids=list({solid.definition for solid in solids}),\n context_definitions=pipeline_def.context_definitions,\n dependencies=deps,\n )"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the solid with correct name. Throws if it does not exist.", "response": "def solid_named(self, name):\n '''Return the solid named \"name\". Throws if it does not exist.\n\n Args:\n name (str): Name of solid\n\n Returns:\n SolidDefinition: SolidDefinition with correct name.\n '''\n check.str_param(name, 'name')\n if name not in self._solid_dict:\n raise DagsterInvariantViolationError(\n 'Pipeline {pipeline_name} has no solid named {name}.'.format(\n pipeline_name=self.name, name=name\n )\n )\n return self._solid_dict[name]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the shell commands we ll use to actually build and publish a package to PyPI.", "response": "def construct_publish_comands(additional_steps=None, nightly=False):\n '''Get the shell commands we'll use to actually build and publish a package to PyPI.'''\n publish_commands = (\n ['rm -rf dist']\n + (additional_steps if additional_steps else [])\n + [\n 'python setup.py sdist bdist_wheel{nightly}'.format(\n nightly=' --nightly' if nightly else ''\n ),\n 'twine upload dist/*',\n ]\n )\n\n return publish_commands"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following 
Python 3 function doing\ndef publish(nightly):\n\n    try:\n        RCParser.from_file()\n    except ConfigFileError:\n        raise ConfigFileError(PYPIRC_EXCEPTION_MESSAGE)\n\n    assert '\\nwheel' in subprocess.check_output(['pip', 'list']).decode('utf-8'), (\n        'You must have wheel installed in order to build packages for release -- run '\n        '`pip install wheel`.'\n    )\n\n    assert which_('twine'), (\n        'You must have twine installed in order to upload packages to PyPI -- run '\n        '`pip install twine`.'\n    )\n\n    assert which_('yarn'), (\n        'You must have yarn installed in order to build dagit for release -- see '\n        'https://yarnpkg.com/lang/en/docs/install/'\n    )\n\n    print('Checking that module versions are in lockstep')\n    check_versions(nightly=nightly)\n    if not nightly:\n        print('... and match git tag on most recent commit...')\n        check_git_status()\n\n    print('Publishing packages to PyPI...')\n\n    if nightly:\n        new_version = increment_nightly_versions()\n        commit_new_version('nightly: {nightly}'.format(nightly=new_version['__nightly__']))\n        set_git_tag('{nightly}'.format(nightly=new_version['__nightly__']))\n        git_push()\n        git_push(tags=True)\n    publish_all(nightly)", "response": "Publishes all submodules to PyPI."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef release(version):\n    check_new_version(version)\n    set_new_version(version)\n    commit_new_version(version)\n    set_git_tag(version)", "response": "Tags all submodules for a new release."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate a context definition from a pre - existing context. This can be useful in testing contexts where you want to use a one - off context.", "response": "def passthrough_context_definition(context_params):\n    '''Create a context definition from a pre-existing context. 
This can be useful\n in testing contexts where you may want to create a context manually and then\n pass it into a one-off PipelineDefinition\n\n Args:\n context (ExecutionContext): The context that will provided to the pipeline.\n Returns:\n PipelineContextDefinition: The passthrough context definition.\n '''\n\n check.inst_param(context_params, 'context', ExecutionContext)\n context_definition = PipelineContextDefinition(context_fn=lambda *_args: context_params)\n return {DEFAULT_CONTEXT_NAME: context_definition}"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef block(self, text, prefix=''):\n '''Automagically wrap a block of text.'''\n wrapper = TextWrapper(\n width=self.line_length - len(self.current_indent_str),\n initial_indent=prefix,\n subsequent_indent=prefix,\n break_long_words=False,\n break_on_hyphens=False,\n )\n for line in wrapper.wrap(text):\n self.line(line)", "response": "Automagically wrap a block of text."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ndefining the fields that are shared between both QueryJobConfig and LoadJobConfig.", "response": "def _define_shared_fields():\n '''The following fields are shared between both QueryJobConfig and LoadJobConfig.\n '''\n\n clustering_fields = Field(\n List(String),\n description='''Fields defining clustering for the table\n\n (Defaults to None).\n\n Clustering fields are immutable after table creation.\n ''',\n is_optional=True,\n )\n\n create_disposition = Field(\n BQCreateDisposition,\n description='''Specifies behavior for creating tables.\n See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.createDisposition\n ''',\n is_optional=True,\n )\n\n destination_encryption_configuration = Field(\n String,\n description='''Custom encryption configuration for the destination table.\n Custom encryption configuration (e.g., Cloud KMS keys) or None if using default encryption.\n See 
https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.destinationEncryptionConfiguration\n ''',\n is_optional=True,\n )\n\n schema_update_options = Field(\n List(BQSchemaUpdateOption),\n description='''Specifies updates to the destination table schema to allow as a side effect\n of the query job.''',\n is_optional=True,\n )\n\n time_partitioning = Field(\n Dict(\n fields={\n 'expiration_ms': Field(\n Int,\n description='''Number of milliseconds for which to keep the storage for a\n partition.''',\n is_optional=True,\n ),\n 'field': Field(\n String,\n description='''If set, the table is partitioned by this field. If not set, the\n table is partitioned by pseudo column _PARTITIONTIME. The field must be a\n top-level TIMESTAMP or DATE field. Its mode must be NULLABLE or REQUIRED.''',\n is_optional=True,\n ),\n 'require_partition_filter': Field(\n Bool,\n description='''If set to true, queries over the partitioned table require a\n partition filter that can be used for partition elimination to be specified.''',\n is_optional=True,\n ),\n }\n ),\n description='Specifies time-based partitioning for the destination table.',\n is_optional=True,\n )\n\n write_disposition = Field(\n BQWriteDisposition,\n description='''\n Action that occurs if the destination table already exists.\n See https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.writeDisposition\n ''',\n is_optional=True,\n )\n return {\n 'clustering_fields': clustering_fields,\n 'create_disposition': create_disposition,\n 'destination_encryption_configuration': destination_encryption_configuration,\n 'schema_update_options': schema_update_options,\n 'time_partitioning': time_partitioning,\n 'write_disposition': write_disposition,\n }"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndefines the bigquery query config.", "response": "def define_bigquery_query_config():\n '''See:\n 
https://googleapis.github.io/google-cloud-python/latest/bigquery/generated/google.cloud.bigquery.job.QueryJobConfig.html\n '''\n sf = _define_shared_fields()\n\n allow_large_results = Field(\n Bool,\n description='''Allow large query results tables (legacy SQL, only)\n See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.allowLargeResults\n ''',\n is_optional=True,\n )\n\n default_dataset = Field(\n Dataset,\n description='''the default dataset to use for unqualified table names in the query or None\n if not set. The default_dataset setter accepts a str of the fully-qualified dataset ID in\n standard SQL format. The value must include a project ID and dataset ID separated by \".\".\n For example: your-project.your_dataset.\n See https://g.co/cloud/bigquery/docs/reference/v2/jobs#configuration.query.defaultDataset\n ''',\n is_optional=True,\n )\n\n destination = Field(\n Table,\n description='''table where results are written or None if not set. The destination setter\n accepts a str of the fully-qualified table ID in standard SQL format. The value must\n include a project ID, dataset ID, and table ID, each separated by \".\". For example:\n your-project.your_dataset.your_table.\n See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.destinationTable\n ''',\n is_optional=True,\n )\n\n dry_run = Field(\n Bool,\n description='''True if this query should be a dry run to estimate costs.\n See https://g.co/cloud/bigquery/docs/reference/v2/jobs#configuration.dryRun\n ''',\n is_optional=True,\n )\n\n flatten_results = Field(\n Bool,\n description='''Flatten nested/repeated fields in results. (Legacy SQL only)\n See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.flattenResults\n ''',\n is_optional=True,\n )\n\n maximum_billing_tier = Field(\n Int,\n description='''Deprecated. 
Changes the billing tier to allow high-compute queries.\n See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.maximumBillingTier\n ''',\n is_optional=True,\n )\n\n maximum_bytes_billed = Field(\n Int,\n description='''Maximum bytes to be billed for this job or None if not set.\n\n See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.maximumBytesBilled\n ''',\n is_optional=True,\n )\n\n priority = Field(\n BQPriority,\n description='''Priority of the query.\n See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.priority\n ''',\n is_optional=True,\n )\n\n query_parameters = Field(\n List(String),\n description='''list of parameters for parameterized query (empty by default)\n See: https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.queryParameters\n ''',\n is_optional=True,\n )\n\n # TODO:\n # Type:\tDict[str, google.cloud.bigquery.external_config.ExternalConfig]\n # table_definitions = Field(\n # PermissiveDict(),\n # description='''Definitions for external tables or None if not set.\n # See https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.tableDefinitions\n # ''',\n # is_optional=True,\n # )\n\n # TODO: Need to add this\n # Type:\tList[google.cloud.bigquery.query.UDFResource]\n # udf_resources = Field(\n # String,\n # description='''user defined function resources (empty by default)\n # See: https://g.co/cloud/bigquery/docs/reference/rest/v2/jobs#configuration.query.userDefinedFunctionResources\n # ''',\n # is_optional=True\n # )\n\n use_legacy_sql = Field(\n Bool,\n description='''Use legacy SQL syntax.\n See https://g.co/cloud/bigquery/docs/reference/v2/jobs#configuration.query.useLegacySql\n ''',\n is_optional=True,\n )\n\n use_query_cache = Field(\n Bool,\n description='''Look for the query result in the cache.\n See https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.query.useQueryCache\n ''',\n 
is_optional=True,\n )\n\n return Field(\n Dict(\n fields={\n 'query_job_config': Field(\n Dict(\n fields={\n 'allow_large_results': allow_large_results,\n 'clustering_fields': sf['clustering_fields'],\n 'create_disposition': sf['create_disposition'],\n 'default_dataset': default_dataset,\n 'destination': destination,\n 'destination_encryption_configuration': sf[\n 'destination_encryption_configuration'\n ],\n 'dry_run': dry_run,\n 'flatten_results': flatten_results,\n # TODO: labels\n 'maximum_billing_tier': maximum_billing_tier,\n 'maximum_bytes_billed': maximum_bytes_billed,\n 'priority': priority,\n 'query_parameters': query_parameters,\n # TODO: table_definitions\n 'schema_update_options': sf['schema_update_options'],\n 'time_partitioning': sf['time_partitioning'],\n # TODO: udf_resources\n 'use_legacy_sql': use_legacy_sql,\n 'use_query_cache': use_query_cache,\n 'write_disposition': sf['write_disposition'],\n }\n )\n )\n }\n ),\n description='BigQuery query configuration',\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a new SQL solid that executes and materializes a SQL select statement.", "response": "def sql_solid(name, select_statement, materialization_strategy, table_name=None, inputs=None):\n '''Return a new solid that executes and materializes a SQL select statement.\n\n Args:\n name (str): The name of the new solid.\n select_statement (str): The select statement to execute.\n materialization_strategy (str): Must be 'table', the only currently supported\n materialization strategy. If 'table', the kwarg `table_name` must also be passed.\n Kwargs:\n table_name (str): The name of the new table to create, if the materialization strategy\n is 'table'. Default: None.\n inputs (list[InputDefinition]): Inputs, if any, for the new solid. 
Default: None.\n\n Returns:\n function:\n The new SQL solid.\n '''\n inputs = check.opt_list_param(inputs, 'inputs', InputDefinition)\n\n materialization_strategy_output_types = { # pylint:disable=C0103\n 'table': SqlTableName,\n # 'view': String,\n # 'query': SqlAlchemyQueryType,\n # 'subquery': SqlAlchemySubqueryType,\n # 'result_proxy': SqlAlchemyResultProxyType,\n # could also materialize as a Pandas table, as a Spark table, as an intermediate file, etc.\n }\n\n if materialization_strategy not in materialization_strategy_output_types:\n raise Exception(\n 'Invalid materialization strategy {materialization_strategy}, must '\n 'be one of {materialization_strategies}'.format(\n materialization_strategy=materialization_strategy,\n materialization_strategies=str(list(materialization_strategy_output_types.keys())),\n )\n )\n\n if materialization_strategy == 'table':\n if table_name is None:\n raise Exception('Missing table_name: required for materialization strategy \\'table\\'')\n\n output_description = (\n 'The string name of the new table created by the solid'\n if materialization_strategy == 'table'\n else 'The materialized SQL statement. 
If the materialization_strategy is '\n '\\'table\\', this is the string name of the new table created by the solid.'\n )\n\n description = '''This solid executes the following SQL statement:\n {select_statement}'''.format(\n select_statement=select_statement\n )\n\n # n.b., we will eventually want to make this resources key configurable\n sql_statement = (\n 'drop table if exists {table_name};\\n' 'create table {table_name} as {select_statement};'\n ).format(table_name=table_name, select_statement=select_statement)\n\n def transform_fn(context, _inputs):\n '''Inner function defining the new solid.\n\n Args:\n context (TransformExecutionContext): Must expose a `db` resource with an `execute` method,\n like a SQLAlchemy engine, that can execute raw SQL against a database.\n\n Returns:\n str:\n The table name of the newly materialized SQL select statement.\n '''\n\n context.log.info(\n 'Executing sql statement:\\n{sql_statement}'.format(sql_statement=sql_statement)\n )\n context.resources.db_info.engine.execute(text(sql_statement))\n yield Result(value=table_name, output_name='result')\n\n return SolidDefinition(\n name=name,\n inputs=inputs,\n outputs=[\n OutputDefinition(\n materialization_strategy_output_types[materialization_strategy],\n description=output_description,\n )\n ],\n transform_fn=transform_fn,\n description=description,\n metadata={'kind': 'sql', 'sql': sql_statement},\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndownloads an object from s3.", "response": "def download_from_s3(context):\n '''Download an object from s3.\n\n Args:\n info (ExpectationExecutionInfo): Must expose a boto3 S3 client as its `s3` resource.\n\n Returns:\n str:\n The path to the downloaded object.\n '''\n target_file = context.solid_config['target_file']\n return context.resources.download_manager.download_file_contents(context, target_file)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nupload a file to 
s3.", "response": "def upload_to_s3(context, file_obj):\n '''Upload a file to s3.\n\n Args:\n info (ExpectationExecutionInfo): Must expose a boto3 S3 client as its `s3` resource.\n\n Returns:\n (str, str):\n The bucket and key to which the file was uploaded.\n '''\n bucket = context.solid_config['bucket']\n key = context.solid_config['key']\n\n context.resources.s3.put_object(\n Bucket=bucket, Body=file_obj.read(), Key=key, **(context.solid_config.get('kwargs') or {})\n )\n yield Result(bucket, 'bucket')\n yield Result(key, 'key')"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nwrap the execution of user-space code in an error boundary. This places a uniform policy around any user code invoked by the framework. This ensures that all user errors are wrapped in the DagsterUserCodeExecutionError, and that the original stack trace of the user error is preserved, so that it can be reported without confusing framework code in the stack trace, if a tool author wishes to do so. This has been especially helpful in a notebooking context.", "response": "def user_code_error_boundary(error_cls, msg, **kwargs):\n '''\n Wraps the execution of user-space code in an error boundary. This places a uniform\n policy around any user code invoked by the framework. This ensures that all user\n errors are wrapped in the DagsterUserCodeExecutionError, and that the original stack\n trace of the user error is preserved, so that it can be reported without confusing\n framework code in the stack trace, if a tool author wishes to do so. 
This has\n been especially help in a notebooking context.\n '''\n check.str_param(msg, 'msg')\n check.subclass_param(error_cls, 'error_cls', DagsterUserCodeExecutionError)\n\n try:\n yield\n except Exception as e: # pylint: disable=W0703\n if isinstance(e, DagsterError):\n # The system has thrown an error that is part of the user-framework contract\n raise e\n else:\n # An exception has been thrown by user code and computation should cease\n # with the error reported further up the stack\n raise_from(\n error_cls(msg, user_exception=e, original_exc_info=sys.exc_info(), **kwargs), e\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef mkdir_p(newdir, mode=0o777):\n try:\n os.makedirs(newdir, mode)\n except OSError as err:\n # Reraise the error unless it's about an already existing directory\n if err.errno != errno.EEXIST or not os.path.isdir(newdir):\n raise", "response": "The missing mkdir - p functionality in os. makedirs."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwrapping the output of a user provided function that may yield or return a value and returns a generator that asserts it only yields a single value.", "response": "def user_code_context_manager(user_fn, error_cls, msg):\n '''Wraps the output of a user provided function that may yield or return a value and\n returns a generator that asserts it only yields a single value.\n '''\n check.callable_param(user_fn, 'user_fn')\n check.subclass_param(error_cls, 'error_cls', DagsterUserCodeExecutionError)\n\n with user_code_error_boundary(error_cls, msg):\n thing_or_gen = user_fn()\n gen = _ensure_gen(thing_or_gen)\n\n try:\n thing = next(gen)\n except StopIteration:\n check.failed('Must yield one item. You did not yield anything.')\n\n yield thing\n\n stopped = False\n\n try:\n next(gen)\n except StopIteration:\n stopped = True\n\n check.invariant(stopped, 'Must yield one item. 
Yielded more than one item')"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef construct_run_storage(run_config, environment_config):\n '''\n Construct the run storage for this pipeline. Our rules are the following:\n\n If the RunConfig has a storage_mode provided, we use that.\n\n Then we fallback to environment config.\n\n If there is no config, we default to in memory storage. This is mostly so\n that tests default to in-memory.\n '''\n check.inst_param(run_config, 'run_config', RunConfig)\n check.inst_param(environment_config, 'environment_config', EnvironmentConfig)\n\n if run_config.storage_mode:\n if run_config.storage_mode == RunStorageMode.FILESYSTEM:\n return FileSystemRunStorage()\n elif run_config.storage_mode == RunStorageMode.IN_MEMORY:\n return InMemoryRunStorage()\n elif run_config.storage_mode == RunStorageMode.S3:\n # TODO: Revisit whether we want to use S3 run storage\n return FileSystemRunStorage()\n else:\n check.failed('Unexpected enum {}'.format(run_config.storage_mode))\n elif environment_config.storage.storage_mode == 'filesystem':\n return FileSystemRunStorage()\n elif environment_config.storage.storage_mode == 'in_memory':\n return InMemoryRunStorage()\n elif environment_config.storage.storage_mode == 's3':\n # TODO: Revisit whether we want to use S3 run storage\n return FileSystemRunStorage()\n elif environment_config.storage.storage_mode is None:\n return InMemoryRunStorage()\n else:\n raise DagsterInvariantViolationError(\n 'Invalid storage specified {}'.format(environment_config.storage.storage_mode)\n )", "response": "Construct the run storage for this pipeline."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _create_context_free_log(run_config, pipeline_def):\n '''In the event of pipeline initialization failure, we want to be able to log the failure\n without a dependency on the ExecutionContext to initialize DagsterLog\n 
'''\n check.inst_param(run_config, 'run_config', RunConfig)\n check.inst_param(pipeline_def, 'pipeline_def', PipelineDefinition)\n\n # Use the default logger\n loggers = [define_colored_console_logger('dagster')]\n if run_config.event_callback:\n loggers += [construct_event_logger(run_config.event_callback)]\n elif run_config.loggers:\n loggers += run_config.loggers\n\n return DagsterLog(run_config.run_id, get_logging_tags(None, run_config, pipeline_def), loggers)", "response": "Create a DagsterLog object that can be used to log the failure of a pipeline."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef execute_pipeline_iterator(pipeline, environment_dict=None, run_config=None):\n '''Returns iterator that yields :py:class:`SolidExecutionResult` for each\n solid executed in the pipeline.\n\n This is intended to allow the caller to do things between each executed\n node. For the 'synchronous' API, see :py:func:`execute_pipeline`.\n\n Parameters:\n pipeline (PipelineDefinition): Pipeline to run\n environment_dict (dict): The environment configuration that parameterizes this run\n run_config (RunConfig): Configuration for how this pipeline will be executed\n\n Returns:\n Iterator[DagsterEvent]\n '''\n check.inst_param(pipeline, 'pipeline', PipelineDefinition)\n environment_dict = check.opt_dict_param(environment_dict, 'environment_dict')\n run_config = check_run_config_param(run_config)\n environment_config = create_environment_config(pipeline, environment_dict)\n intermediates_manager = construct_intermediates_manager(\n run_config, environment_config, pipeline\n )\n\n with _pipeline_execution_context_manager(\n pipeline, environment_config, run_config, intermediates_manager\n ) as pipeline_context:\n return _execute_pipeline_iterator(pipeline_context)", "response": "Returns an iterator that yields SolidExecutionResult for each\n solid executed in the pipeline."} {"SOURCE": "codesearchnet", "instruction": "Make a 
summary of the following Python 3 code\ndef execute_pipeline(pipeline, environment_dict=None, run_config=None):\n '''\n \"Synchronous\" version of :py:func:`execute_pipeline_iterator`.\n\n Note: raise_on_error is very useful in testing contexts when not testing for error\n conditions\n\n Parameters:\n pipeline (PipelineDefinition): Pipeline to run\n environment_dict (dict): The environment configuration that parameterizes this run\n run_config (RunConfig): Configuration for how this pipeline will be executed\n\n Returns:\n :py:class:`PipelineExecutionResult`\n '''\n\n check.inst_param(pipeline, 'pipeline', PipelineDefinition)\n environment_dict = check.opt_dict_param(environment_dict, 'environment_dict')\n run_config = check_run_config_param(run_config)\n environment_config = create_environment_config(pipeline, environment_dict)\n intermediates_manager = construct_intermediates_manager(\n run_config, environment_config, pipeline\n )\n\n with _pipeline_execution_context_manager(\n pipeline, environment_config, run_config, intermediates_manager\n ) as pipeline_context:\n event_list = list(_execute_pipeline_iterator(pipeline_context))\n\n return PipelineExecutionResult(\n pipeline,\n run_config.run_id,\n event_list,\n lambda: _pipeline_execution_context_manager(\n pipeline, environment_config, run_config, intermediates_manager\n ),\n )", "response": "Synchronous version of execute_pipeline_iterator."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget a :py:class:`SolidExecutionResult` object for a given solid name.", "response": "def result_for_solid(self, name):\n '''Get a :py:class:`SolidExecutionResult` for a given solid name.\n '''\n check.str_param(name, 'name')\n\n if not self.pipeline.has_solid(name):\n raise DagsterInvariantViolationError(\n 'Try to get result for solid {name} in {pipeline}. 
No such solid.'.format(\n name=name, pipeline=self.pipeline.display_name\n )\n )\n\n if name not in self.solid_result_dict:\n raise DagsterInvariantViolationError(\n 'Did not find result for solid {name} in pipeline execution result'.format(\n name=name\n )\n )\n\n return self.solid_result_dict[name]"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns dictionary of transformed results with keys being output names.", "response": "def transformed_values(self):\n '''Return dictionary of transformed results, with keys being output names.\n Returns None if execution result isn't a success.\n\n Reconstructs the pipeline context to materialize values.\n '''\n if self.success and self.transforms:\n with self.reconstruct_context() as context:\n values = {\n result.step_output_data.output_name: self._get_value(\n context, result.step_output_data\n )\n for result in self.transforms\n if result.is_successful_output\n }\n return values\n else:\n return None"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef transformed_value(self, output_name=DEFAULT_OUTPUT):\n '''Returns transformed value either for DEFAULT_OUTPUT or for the output\n given as output_name. 
Returns None if execution result isn't a success.\n\n Reconstructs the pipeline context to materialize value.\n '''\n check.str_param(output_name, 'output_name')\n\n if not self.solid.definition.has_output(output_name):\n raise DagsterInvariantViolationError(\n '{output_name} not defined in solid {solid}'.format(\n output_name=output_name, solid=self.solid.name\n )\n )\n\n if self.success:\n for result in self.transforms:\n if (\n result.is_successful_output\n and result.step_output_data.output_name == output_name\n ):\n with self.reconstruct_context() as context:\n value = self._get_value(context, result.step_output_data)\n return value\n\n raise DagsterInvariantViolationError(\n (\n 'Did not find result {output_name} in solid {self.solid.name} '\n 'execution result'\n ).format(output_name=output_name, self=self)\n )\n else:\n return None", "response": "Returns the value of the output_name in the pipeline context or None if the execution result isn't a success."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the failure data that happened during this solid's execution if any", "response": "def failure_data(self):\n '''Returns the failing step's data that happened during this solid's execution, if any'''\n for result in itertools.chain(\n self.input_expectations, self.output_expectations, self.transforms\n ):\n if result.event_type == DagsterEventType.STEP_FAILURE:\n return result.step_failure_data"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef NamedDict(name, fields, description=None, type_attributes=DEFAULT_TYPE_ATTRIBUTES):\n '''\n A :py:class:`Dict` with a name allowing it to be referenced by that name.\n '''\n check_user_facing_fields_dict(fields, 'NamedDict named \"{}\"'.format(name))\n\n class _NamedDict(_ConfigComposite):\n def __init__(self):\n super(_NamedDict, self).__init__(\n key=name,\n name=name,\n fields=fields,\n 
description=description,\n type_attributes=type_attributes,\n )\n\n return _NamedDict", "response": "A class to create a named dict."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef PermissiveDict(fields=None):\n '''A permissive dict will permit the user to partially specify the permitted fields. Any fields\n that are specified and passed in will be type checked. Other fields will be allowed, but\n will be ignored by the type checker.\n '''\n\n if fields:\n check_user_facing_fields_dict(fields, 'PermissiveDict')\n\n class _PermissiveDict(_ConfigComposite):\n def __init__(self):\n key = 'PermissiveDict.' + str(DictCounter.get_next_count())\n super(_PermissiveDict, self).__init__(\n name=None,\n key=key,\n fields=fields or dict(),\n description='A configuration dictionary with typed fields',\n type_attributes=ConfigTypeAttributes(is_builtin=True),\n )\n\n @property\n def is_permissive_composite(self):\n return True\n\n return _PermissiveDict", "response": "A permissive dict will permit the user to partially specify the permitted fields."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef Selector(fields):\n '''Selectors are used when you want to be able to present several different options to the user but\n force them to select one. For example, it would not make much sense to allow them\n to say that a single input should be sourced from a csv and a parquet file: They must choose.\n\n Note that in other type systems this might be called an \"input union.\"\n\n Args:\n fields (Dict[str, Field]):\n '''\n\n check_user_facing_fields_dict(fields, 'Selector')\n\n class _Selector(_ConfigSelector):\n def __init__(self):\n key = 'Selector.' 
+ str(DictCounter.get_next_count())\n super(_Selector, self).__init__(\n key=key,\n name=None,\n fields=fields,\n # description='A configuration dictionary with typed fields',\n type_attributes=ConfigTypeAttributes(is_builtin=True),\n )\n\n return _Selector", "response": "A class that returns a Selector that presents several different options to the user but\n forces them to select one."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _is_valid_dataset(config_value):\n '''Datasets must be of form \"project.dataset\" or \"dataset\"\n '''\n return re.match(\n # regex matches: project.table -- OR -- table\n r'^' + RE_PROJECT + r'\\.' + RE_DS_TABLE + r'$|^' + RE_DS_TABLE + r'$',\n config_value,\n )", "response": "Checks if the dataset is valid"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if a table name is of the form project.dataset.table or dataset.table.", "response": "def _is_valid_table(config_value):\n '''Tables must be of form \"project.dataset.table\" or \"dataset.table\"\n '''\n return re.match(\n r'^'\n + RE_PROJECT # project\n + r'\\.' # .\n + RE_DS_TABLE # dataset\n + r'\\.' # .\n + RE_DS_TABLE # table\n + r'$|^' # -- OR --\n + RE_DS_TABLE # dataset\n + r'\\.' # .\n + RE_DS_TABLE # table\n + r'$',\n config_value,\n )"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nexecuting the user-specified transform for the solid.", "response": "def _execute_core_transform(transform_context, inputs):\n '''\n Execute the user-specified transform for the solid. 
Wrap in an error boundary and do\n all relevant logging and metrics tracking\n '''\n check.inst_param(transform_context, 'transform_context', SystemTransformExecutionContext)\n check.dict_param(inputs, 'inputs', key_type=str)\n\n step = transform_context.step\n solid = step.solid\n\n transform_context.log.debug(\n 'Executing core transform for solid {solid}.'.format(solid=solid.name)\n )\n\n all_results = []\n for step_output in _yield_transform_results(transform_context, inputs):\n yield step_output\n if isinstance(step_output, StepOutputValue):\n all_results.append(step_output)\n\n if len(all_results) != len(solid.definition.output_defs):\n emitted_result_names = {r.output_name for r in all_results}\n solid_output_names = {output_def.name for output_def in solid.definition.output_defs}\n omitted_outputs = solid_output_names.difference(emitted_result_names)\n transform_context.log.info(\n 'Solid {solid} did not fire outputs {outputs}'.format(\n solid=solid.name, outputs=repr(omitted_outputs)\n )\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a Dagster type for the given python class.", "response": "def as_dagster_type(\n existing_type,\n name=None,\n description=None,\n input_schema=None,\n output_schema=None,\n serialization_strategy=None,\n storage_plugins=None,\n):\n '''\n Takes a python cls and creates a type for it in the Dagster domain.\n\n Args:\n existing_type (cls)\n The python type you want to project into the Dagster type system.\n name (Optional[str]):\n description (Optional[str]):\n input_schema (Optional[InputSchema]):\n An instance of a class that inherits from :py:class:`InputSchema` that\n can map config data to a value of this type.\n\n output_schema (Optional[OutputSchema]):\n An instance of a class that inherits from :py:class:`OutputSchema` that\n can map config data to persisting values of this type.\n\n serialization_strategy (Optional[SerializationStrategy]):\n The default behavior 
for how to serialize this value for\n persisting between execution steps.\n\n storage_plugins (Optional[Dict[RunStorageMode, TypeStoragePlugin]]):\n Storage type specific overrides for the serialization strategy.\n This allows for storage specific optimizations such as efficient\n distributed storage on S3.\n '''\n check.type_param(existing_type, 'existing_type')\n check.opt_str_param(name, 'name')\n check.opt_str_param(description, 'description')\n check.opt_inst_param(input_schema, 'input_schema', InputSchema)\n check.opt_inst_param(output_schema, 'output_schema', OutputSchema)\n check.opt_inst_param(serialization_strategy, 'serialization_strategy', SerializationStrategy)\n storage_plugins = check.opt_dict_param(storage_plugins, 'storage_plugins')\n\n if serialization_strategy is None:\n serialization_strategy = PickleSerializationStrategy()\n\n name = existing_type.__name__ if name is None else name\n\n return _decorate_as_dagster_type(\n existing_type,\n key=name,\n name=name,\n description=description,\n input_schema=input_schema,\n output_schema=output_schema,\n serialization_strategy=serialization_strategy,\n storage_plugins=storage_plugins,\n )"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef resource(config_field=None, description=None):\n '''A decorator for creating a resource. The decorated function will be used as the \n resource_fn in a ResourceDefinition.\n '''\n\n # This case is for when decorator is used bare, without arguments.\n # E.g. @resource versus @resource()\n if callable(config_field):\n return ResourceDefinition(resource_fn=config_field)\n\n def _wrap(resource_fn):\n return ResourceDefinition(resource_fn, config_field, description)\n\n return _wrap", "response": "A decorator for creating a resource. 
The decorated function will be used as the \n resource_fn in a ResourceDefinition."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nruns Spark s subprocess and returns the return code.", "response": "def run_spark_subprocess(cmd, logger):\n \"\"\"See https://bit.ly/2OpksJC for source of the subprocess stdout/stderr capture pattern in this\n function.\n \"\"\"\n\n # Spark sometimes logs in log4j format. In those cases, we detect and parse.\n # Example log line from Spark that this is intended to match:\n # 2019-03-27 16:00:19 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler...\n log4j_regex = r'^(\\d{4}\\-\\d{2}\\-\\d{2} \\d{2}:\\d{2}:\\d{2}) ([A-Z]{3,5})(.*?)$'\n\n def reader(pipe, pipe_name, p, msg_queue):\n try:\n with pipe:\n while p.poll() is None:\n for line in pipe.readlines():\n match = re.match(log4j_regex, line)\n if match:\n line = match.groups()[2]\n msg_queue.put((pipe_name, line))\n finally:\n # Use None as sentinel for done state, detected by iter() below\n msg_queue.put(None)\n\n p = subprocess.Popen(\n ' '.join(cmd),\n stdout=subprocess.PIPE,\n stderr=subprocess.PIPE,\n bufsize=0,\n universal_newlines=True,\n shell=True,\n )\n q = queue.Queue()\n Thread(target=reader, args=[p.stdout, 'stdout', p, q]).start()\n Thread(target=reader, args=[p.stderr, 'stderr', p, q]).start()\n for _ in range(2): # There will be two None sentinels, one for each stream\n for pipe_name, line in iter(q.get, None):\n if pipe_name == 'stdout':\n logger.info(line)\n elif pipe_name == 'stderr':\n logger.error(line)\n\n p.wait()\n return p.returncode"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef parse_spark_config(spark_conf):\n '''For each key-value pair in spark conf, we need to pass to CLI in format:\n\n --conf \"key=value\"\n '''\n\n spark_conf_list = flatten_dict(spark_conf)\n return list(\n itertools.chain.from_iterable([('--conf', '{}={}'.format(*c)) 
for c in spark_conf_list])\n )", "response": "Parse spark conf into a list of Spark config files"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef SystemNamedDict(name, fields, description=None):\n '''A SystemNamedDict object is simply a NamedDict intended for internal (dagster) use.\n '''\n return NamedDict(name, fields, description, ConfigTypeAttributes(is_system_config=True))", "response": "A function to create a SystemNamedDict object."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef coalesce_execution_steps(execution_plan):\n '''Groups execution steps by solid, in topological order of the solids.'''\n\n solid_order = _coalesce_solid_order(execution_plan)\n\n steps = defaultdict(list)\n\n for solid_name, solid_steps in itertools.groupby(\n execution_plan.topological_steps(), lambda x: x.solid_name\n ):\n steps[solid_name] += list(solid_steps)\n\n return OrderedDict([(solid_name, steps[solid_name]) for solid_name in solid_order])", "response": "Groups execution steps by solid in topological order of the solids."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndefaulting method to acquire database connection parameters. 
Sets connection parameters to match settings.py, and sets default values to blank fields.", "response": "def get_connection_params(self):\n \"\"\"\n Default method to acquire database connection parameters.\n\n Sets connection parameters to match settings.py, and sets\n default values to blank fields.\n \"\"\"\n valid_settings = {\n 'NAME': 'name',\n 'HOST': 'host',\n 'PORT': 'port',\n 'USER': 'username',\n 'PASSWORD': 'password',\n 'AUTH_SOURCE': 'authSource',\n 'AUTH_MECHANISM': 'authMechanism',\n 'ENFORCE_SCHEMA': 'enforce_schema',\n 'REPLICASET': 'replicaset',\n 'SSL': 'ssl',\n 'SSL_CERTFILE': 'ssl_certfile',\n 'SSL_CA_CERTS': 'ssl_ca_certs',\n 'READ_PREFERENCE': 'read_preference'\n }\n connection_params = {\n 'name': 'djongo_test',\n 'enforce_schema': True\n }\n for setting_name, kwarg in valid_settings.items():\n try:\n setting = self.settings_dict[setting_name]\n except KeyError:\n continue\n\n if setting or setting is False:\n connection_params[kwarg] = setting\n\n return connection_params"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a new connection to the database.", "response": "def get_new_connection(self, connection_params):\n \"\"\"\n Receives a dictionary connection_params to setup\n a connection to the database.\n\n Dictionary correct setup is made through the\n get_connection_params method.\n\n TODO: This needs to be made more generic to accept\n other MongoClient parameters.\n \"\"\"\n\n name = connection_params.pop('name')\n es = connection_params.pop('enforce_schema')\n\n connection_params['document_class'] = OrderedDict\n # connection_params['tz_aware'] = True\n # To prevent leaving unclosed connections behind,\n # client_conn must be closed before a new connection\n # is created.\n if self.client_connection is not None:\n self.client_connection.close()\n\n self.client_connection = Database.connect(**connection_params)\n database = self.client_connection[name]\n self.djongo_connection = 
DjongoClient(database, es)\n return self.client_connection[name]"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef create_cursor(self, name=None):\n return Cursor(self.client_connection, self.connection, self.djongo_connection)", "response": "Returns an active connection cursor to the database."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _close(self):\n if self.connection:\n with self.wrap_database_errors:\n self.connection.client.close()", "response": "Closes the client connection to the database."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nbuilding an instance of model from the model_dict.", "response": "def make_mdl(model, model_dict):\r\n \"\"\"\r\n Builds an instance of model from the model_dict.\r\n \"\"\"\r\n for field_name in model_dict:\r\n field = model._meta.get_field(field_name)\r\n model_dict[field_name] = field.to_python(model_dict[field_name])\r\n\r\n return model(**model_dict)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a list of all the fields in the given value that are in the correct format.", "response": "def to_python(self, value):\r\n \"\"\"\r\n Overrides standard to_python method from django models to allow\r\n correct translation of Mongo array to a python list.\r\n \"\"\"\r\n if value is None:\r\n return value\r\n\r\n assert isinstance(value, list)\r\n ret = []\r\n for mdl_dict in value:\r\n if isinstance(mdl_dict, self.model_container):\r\n ret.append(mdl_dict)\r\n continue\r\n mdl = make_mdl(self.model_container, mdl_dict)\r\n ret.append(mdl)\r\n\r\n return ret"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the formfield for the array.", "response": "def formfield(self, **kwargs):\r\n \"\"\"\r\n Returns the formfield for the array.\r\n \"\"\"\r\n defaults = {\r\n 'form_class': ArrayFormField,\r\n 
'model_container': self.model_container,\r\n 'model_form_class': self.model_form_class,\r\n 'name': self.attname,\r\n 'mdl_form_kw_l': self.model_form_kwargs_l\r\n\r\n }\r\n defaults.update(kwargs)\r\n return super().formfield(**defaults)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef to_python(self, value):\r\n if value is None or isinstance(value, self.model_container):\r\n return value\r\n assert isinstance(value, dict)\r\n\r\n instance = make_mdl(self.model_container, value)\r\n return instance", "response": "Converts a dictionary to a python dict."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\napplying core filters to the queryset.", "response": "def _apply_rel_filters(self, queryset):\r\n \"\"\"\r\n Filter the queryset for the instance this manager is bound to.\r\n \"\"\"\r\n queryset._add_hints(instance=self.instance)\r\n if self._db:\r\n queryset = queryset.using(self._db)\r\n queryset = queryset.filter(**self.core_filters)\r\n\r\n return queryset"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncompute the expected number of false positives caused by using l to approximate set sizes in the interval [ l u ) assuming uniform distribution of set sizes within the interval [ l u ).", "response": "def _compute_nfp_uniform(l, u, cum_counts, sizes):\n \"\"\"Computes the expected number of false positives caused by using\n u to approximate set sizes in the interval [l, u], assuming uniform\n distribution of set sizes within the interval.\n\n Args:\n l: the lower bound on set sizes.\n u: the upper bound on set sizes.\n cum_counts: the complete cummulative distribution of set sizes.\n sizes: the complete domain of set sizes.\n\n Return (float): the expected number of false positives.\n \"\"\"\n if l > u:\n raise ValueError(\"l must be less or equal to u\")\n if l == 0:\n n = cum_counts[u]\n else:\n n = cum_counts[u]-cum_counts[l-1]\n return 
n * float(sizes[u] - sizes[l]) / float(2*sizes[u])"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _compute_nfps_uniform(cum_counts, sizes):\n nfps = np.zeros((len(sizes), len(sizes)))\n # All u an l are inclusive bounds for intervals.\n # Compute p = 1, the NFPs\n for l in range(len(sizes)):\n for u in range(l, len(sizes)):\n nfps[l, u] = _compute_nfp_uniform(l, u, cum_counts, sizes)\n return nfps", "response": "Computes the matrix of expected false positives for all possible\n sub - intervals assuming uniform\n distribution of set_sizes within each sub - interval."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _compute_nfp_real(l, u, counts, sizes):\n if l > u:\n raise ValueError(\"l must be less or equal to u\")\n return np.sum((float(sizes[u])-sizes[l:u+1])/float(sizes[u])*counts[l:u+1])", "response": "Compute the expected number of false positives caused by using\n l to approximate set sizes in the interval [ l u )."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _compute_nfps_real(counts, sizes):\n nfps = np.zeros((len(sizes), len(sizes)))\n # All u an l are inclusive bounds for intervals.\n # Compute p = 1, the NFPs\n for l in range(len(sizes)):\n for u in range(l, len(sizes)):\n nfps[l, u] = _compute_nfp_real(l, u, counts, sizes)\n return nfps", "response": "Compute the matrix of expected false positives for all possible\n sub - intervals of the complete distribution of set sizes."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncomputes the optimal partitions given the size distributions and computed number of expected false positives for all sub - intervals.", "response": "def _compute_best_partitions(num_part, sizes, nfps):\n \"\"\"Computes the optimal partitions given the size distributions\n and computed number of expected false 
positives for all sub-intervals.\n\n Args:\n num_part (int): The number of partitions to create.\n sizes (numpy.array): The complete domain of set sizes in sorted order.\n nfps (numpy.array): The computed number of expected false positives\n for all sub-intervals; axis-0 is for the indexes of lower bounds and\n axis-1 is for the indexes of upper bounds.\n\n Returns:\n partitions (list): list of lower and upper bounds of set sizes for\n all partitions.\n total_nfps (float): total number of expected false positives from all\n partitions.\n cost (numpy.array): a N x p-1 matrix of the computed optimal NFPs for\n all sub-problems given upper bound set size and number of partitions.\n \"\"\"\n\n if num_part < 2:\n raise ValueError(\"num_part cannot be less than 2\")\n if num_part > len(sizes):\n raise ValueError(\"num_part cannot be greater than the domain size of \"\n \"all set sizes\")\n\n # If number of partitions is 2, then simply find the upper bound\n # of the first partition.\n if num_part == 2:\n total_nfps, u = min((nfps[0, u1]+nfps[u1+1, len(sizes)-1], u1)\n for u1 in range(0, len(sizes)-1))\n return [(sizes[0], sizes[u]), (sizes[u+1], sizes[-1]),], \\\n total_nfps, None\n\n # Initialize subproblem total NFPs.\n cost = np.zeros((len(sizes), num_part-2))\n\n # Note: p is the number of partitions in the subproblem.\n # p2i translates the number of partition into the index in the matrix.\n p2i = lambda p : p - 2\n\n # Compute p >= 2 until before p = num_part.\n for p in range(2, num_part):\n # Compute best partition for subproblems with increasing\n # max index u, starting from the smallest possible u given the p.\n # The smallest possible u can be considered as the max index that\n # generates p partitions each with only one size.\n for u in range(p-1, len(sizes)):\n if p == 2:\n cost[u, p2i(p)] = min(nfps[0, u1]+nfps[u1+1,u]\n for u1 in range(u))\n else:\n cost[u, p2i(p)] = min(cost[u1, p2i(p-1)] + nfps[u1+1, u]\n for u1 in range((p-1)-1, u))\n p = num_part\n # 
Find the optimal upper bound index of the 2nd right-most partition given\n # the number of partitions (p).\n total_nfps, u = min((cost[u1, p2i(p-1)]+nfps[u1+1, len(sizes)-1], u1)\n for u1 in range((p-1)-1, len(sizes)-1))\n partitions = [(sizes[u+1], sizes[-1]),]\n p -= 1\n # Back track to find the best partitions.\n while p > 1:\n # Find the optimal upper bound index of the 2nd right-most partition\n # givne the number of partitions (p) and upper bound index (u) in this\n # sub-problem.\n _, u1_best = min((cost[u1, p2i(p)]+nfps[u1+1, u], u1)\n for u1 in range((p-1)-1, u))\n partitions.insert(0, (sizes[u1_best+1], sizes[u]))\n u = u1_best\n p -= 1\n partitions.insert(0, (sizes[0], sizes[u]))\n return [partitions, total_nfps, cost]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncompute the optimal partitions given a distribution of set sizes.", "response": "def optimal_partitions(sizes, counts, num_part):\n \"\"\"Compute the optimal partitions given a distribution of set sizes.\n\n Args:\n sizes (numpy.array): The complete domain of set sizes in ascending\n order.\n counts (numpy.array): The frequencies of all set sizes in the same\n order as `sizes`.\n num_part (int): The number of partitions to create.\n\n Returns:\n list: A list of partitions in the form of `(lower, upper)` tuples,\n where `lower` and `upper` are lower and upper bound (inclusive)\n set sizes of each partition.\n \"\"\"\n if num_part < 2:\n return [(sizes[0], sizes[-1])]\n if num_part >= len(sizes):\n partitions = [(x, x) for x in sizes]\n return partitions\n nfps = _compute_nfps_real(counts, sizes)\n partitions, _, _ = _compute_best_partitions(num_part, sizes, nfps)\n return partitions"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef jaccard(self, other):\n '''\n Estimate the Jaccard similarity (resemblance) between this b-bit\n MinHash and the other.\n '''\n if self.b != other.b:\n raise ValueError(\"Cannot 
compare two b-bit MinHashes with different\\\n b values\")\n if self.seed != other.seed:\n raise ValueError(\"Cannot compare two b-bit MinHashes with different\\\n set of permutations\")\n intersection = np.count_nonzero(self.hashvalues==other.hashvalues)\n raw_est = float(intersection) / float(self.hashvalues.size)\n a1 = self._calc_a(self.r, self.b)\n a2 = self._calc_a(other.r, other.b)\n c1, c2 = self._calc_c(a1, a2, self.r, other.r)\n return (raw_est - c1) / (1 - c2)", "response": "Estimate the Jaccard similarity between this b - bit MinHash and the other."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _calc_a(self, r, b):\n '''\n Compute the function A(r, b)\n '''\n if r == 0.0:\n # Find the limit of A(r, b) as r -> 0.\n return 1.0 / (1 << b)\n return r * (1 - r) ** (2 ** b - 1) / (1 - (1 - r) ** (2 * b))", "response": "Compute the function A r b"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _calc_c(self, a1, a2, r1, r2):\n '''\n Compute the functions C1 and C2\n '''\n if r1 == 0.0 and r2 == 0.0:\n # Find the limits of C1 and C2 as r1 -> 0 and r2 -> 0\n # Since the b-value must be the same and r1 = r2,\n # we have A1(r1, b1) = A2(r2, b2) = A,\n # then the limits for both C1 and C2 are A.\n return a1, a2\n div = 1 / (r1 + r2)\n c1 = (a1 * r2 + a2 * r1) * div\n c2 = (a1 * r1 + a2 * r2) * div\n return c1, c2", "response": "Compute the functions C1 and C2 and return the c1 and c2."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _initialize_slots(self, seed, hashvalues):\n '''Initialize the slots of the LeanMinHash.\n\n Args:\n seed (int): The random seed controls the set of random\n permutation functions generated for this LeanMinHash.\n hashvalues: The hash values is the internal state of the LeanMinHash.\n '''\n self.seed = seed\n self.hashvalues = self._parse_hashvalues(hashvalues)", "response": 
"Initialize the slots of the LeanMinHash."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncomputes the byte size after serialization.", "response": "def bytesize(self, byteorder='@'):\n '''Compute the byte size after serialization.\n\n Args:\n byteorder (str, optional): This is byte order of the serialized data. Use one\n of the `byte order characters\n `_:\n ``@``, ``=``, ``<``, ``>``, and ``!``.\n Default is ``@`` -- the native order.\n\n Returns:\n int: Size in number of bytes after serialization.\n '''\n # Use 8 bytes to store the seed integer\n seed_size = struct.calcsize(byteorder+'q')\n # Use 4 bytes to store the number of hash values\n length_size = struct.calcsize(byteorder+'i')\n # Use 4 bytes to store each hash value as we are using the lower 32 bit\n hashvalue_size = struct.calcsize(byteorder+'I')\n return seed_size + length_size + len(self) * hashvalue_size"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef serialize(self, buf, byteorder='@'):\n '''\n Serialize this lean MinHash and store the result in an allocated buffer.\n\n Args:\n buf (buffer): `buf` must implement the `buffer`_ interface.\n One such example is the built-in `bytearray`_ class.\n byteorder (str, optional): This is byte order of the serialized data. Use one\n of the `byte order characters\n `_:\n ``@``, ``=``, ``<``, ``>``, and ``!``.\n Default is ``@`` -- the native order.\n\n This is preferred over using `pickle`_ if the serialized lean MinHash needs\n to be used by another program in a different programming language.\n\n The serialization schema:\n 1. The first 8 bytes is the seed integer\n 2. The next 4 bytes is the number of hash values\n 3. The rest is the serialized hash values, each uses 4 bytes\n\n Example:\n To serialize a single lean MinHash into a `bytearray`_ buffer.\n\n .. 
code-block:: python\n\n buf = bytearray(lean_minhash.bytesize())\n lean_minhash.serialize(buf)\n\n To serialize multiple lean MinHash into a `bytearray`_ buffer.\n\n .. code-block:: python\n\n # assuming lean_minhashs is a list of LeanMinHash with the same size\n size = lean_minhashs[0].bytesize()\n buf = bytearray(size*len(lean_minhashs))\n for i, lean_minhash in enumerate(lean_minhashs):\n lean_minhash.serialize(buf[i*size:])\n\n .. _`buffer`: https://docs.python.org/3/c-api/buffer.html\n .. _`bytearray`: https://docs.python.org/3.6/library/functions.html#bytearray\n .. _`byteorder`: https://docs.python.org/3/library/struct.html\n '''\n if len(buf) < self.bytesize():\n raise ValueError(\"The buffer does not have enough space\\\n for holding this MinHash.\")\n fmt = \"%sqi%dI\" % (byteorder, len(self))\n struct.pack_into(fmt, buf, 0,\n self.seed, len(self), *self.hashvalues)", "response": "Serialize this lean MinHash and store the result in a buffer."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef update(self, b):\n '''Update this MinHash with a new value.\n The value will be hashed using the hash function specified by\n the `hashfunc` argument in the constructor.\n\n Args:\n b: The value to be hashed using the hash function specified.\n\n Example:\n To update with a new string value (using the default SHA1 hash\n function, which requires bytes as input):\n\n .. code-block:: python\n\n minhash = Minhash()\n minhash.update(\"new value\".encode('utf-8'))\n\n We can also use a different hash function, for example, `pyfarmhash`:\n\n .. 
code-block:: python\n\n import farmhash\n def _hash_32(b):\n return farmhash.hash32(b)\n minhash = MinHash(hashfunc=_hash_32)\n minhash.update(\"new value\")\n '''\n hv = self.hashfunc(b)\n a, b = self.permutations\n phv = np.bitwise_and((a * hv + b) % _mersenne_prime, np.uint64(_max_hash))\n self.hashvalues = np.minimum(phv, self.hashvalues)", "response": "Update this MinHash with a new value."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef jaccard(self, other):\n '''Estimate the `Jaccard similarity`_ (resemblance) between the sets\n represented by this MinHash and the other.\n\n Args:\n other (datasketch.MinHash): The other MinHash.\n\n Returns:\n float: The Jaccard similarity, which is between 0.0 and 1.0.\n '''\n if other.seed != self.seed:\n raise ValueError(\"Cannot compute Jaccard given MinHash with\\\n different seeds\")\n if len(self) != len(other):\n raise ValueError(\"Cannot compute Jaccard given MinHash with\\\n different numbers of permutation functions\")\n return np.float(np.count_nonzero(self.hashvalues==other.hashvalues)) /\\\n np.float(len(self))", "response": "Estimate the Jaccard similarity between this MinHash and the other MinHash."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef count(self):\n '''Estimate the cardinality count based on the technique described in\n `this paper `_.\n\n Returns:\n int: The estimated cardinality of the set represented by this MinHash.\n '''\n k = len(self)\n return np.float(k) / np.sum(self.hashvalues / np.float(_max_hash)) - 1.0", "response": "Estimate the cardinality count based on the technique described in\n this paper _"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef merge(self, other):\n '''Merge the other MinHash with this one, making this one the union\n of both.\n\n Args:\n other (datasketch.MinHash): The other MinHash.\n 
'''\n if other.seed != self.seed:\n raise ValueError(\"Cannot merge MinHash with\\\n different seeds\")\n if len(self) != len(other):\n raise ValueError(\"Cannot merge MinHash with\\\n different numbers of permutation functions\")\n self.hashvalues = np.minimum(other.hashvalues, self.hashvalues)", "response": "Merge the other MinHash with this one making this one the union\n of both."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning a copy of this MinHash by exporting its state.", "response": "def copy(self):\n '''\n :returns: datasketch.MinHash -- A copy of this MinHash by exporting its state.\n '''\n return MinHash(seed=self.seed, hashfunc=self.hashfunc,\n hashvalues=self.digest(),\n permutations=self.permutations)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a MinHash which is the union of the passed MinHash objects.", "response": "def union(cls, *mhs):\n '''Create a MinHash which is the union of the MinHash objects passed as arguments.\n\n Args:\n *mhs: The MinHash objects to be united. 
The argument list length is variable,\n but must be at least 2.\n\n Returns:\n datasketch.MinHash: A new union MinHash.\n '''\n if len(mhs) < 2:\n raise ValueError(\"Cannot union less than 2 MinHash\")\n num_perm = len(mhs[0])\n seed = mhs[0].seed\n if any((seed != m.seed or num_perm != len(m)) for m in mhs):\n raise ValueError(\"The unioning MinHash must have the\\\n same seed and number of permutation functions\")\n hashvalues = np.minimum.reduce([m.hashvalues for m in mhs])\n permutations = mhs[0].permutations\n return cls(num_perm=num_perm, seed=seed, hashvalues=hashvalues,\n permutations=permutations)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncompute the false positive probability given the containment threshold b r and xq.", "response": "def _false_positive_probability(threshold, b, r, xq):\n '''\n Compute the false positive probability given the containment threshold.\n xq is the ratio of x/q.\n '''\n _probability = lambda t : 1 - (1 - (t/(1 + xq - t))**float(r))**float(b)\n if xq >= threshold:\n a, err = integrate(_probability, 0.0, threshold)\n return a\n a, err = integrate(_probability, 0.0, xq)\n return a"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _optimal_param(threshold, num_perm, max_r, xq, false_positive_weight,\n false_negative_weight):\n '''\n Compute the optimal parameters that minimizes the weighted sum\n of probabilities of false positive and false negative.\n xq is the ratio of x/q.\n '''\n min_error = float(\"inf\")\n opt = (0, 0)\n for b in range(1, num_perm+1):\n for r in range(1, max_r+1):\n if b*r > num_perm:\n continue\n fp = _false_positive_probability(threshold, b, r, xq)\n fn = _false_negative_probability(threshold, b, r, xq)\n error = fp*false_positive_weight + fn*false_negative_weight\n if error < min_error:\n min_error = error\n opt = (b, r)\n return opt", "response": "Compute the optimal parameter that 
minimizes the weighted sum of probabilities of false positive and false negative."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nindexing all sets given their keys MinHashes and sizes.", "response": "def index(self, entries):\n '''\n Index all sets given their keys, MinHashes, and sizes.\n It can be called only once after the index is created.\n\n Args:\n entries (`iterable` of `tuple`): An iterable of tuples, each must be\n in the form of `(key, minhash, size)`, where `key` is the unique\n identifier of a set, `minhash` is the MinHash of the set,\n and `size` is the size or number of unique items in the set.\n\n Note:\n `size` must be positive.\n '''\n if not self.is_empty():\n raise ValueError(\"Cannot call index again on a non-empty index\")\n if not isinstance(entries, list):\n queue = deque([])\n for key, minhash, size in entries:\n if size <= 0:\n raise ValueError(\"Set size must be positive\")\n queue.append((key, minhash, size))\n entries = list(queue)\n if len(entries) == 0:\n raise ValueError(\"entries is empty\")\n # Create optimal partitions.\n sizes, counts = np.array(sorted(\n Counter(e[2] for e in entries).most_common())).T\n partitions = optimal_partitions(sizes, counts, len(self.indexes))\n for i, (lower, upper) in enumerate(partitions):\n self.lowers[i], self.uppers[i] = lower, upper\n # Insert into partitions.\n entries.sort(key=lambda e : e[2])\n curr_part = 0\n for key, minhash, size in entries:\n if size > self.uppers[curr_part]:\n curr_part += 1\n for r in self.indexes[curr_part]:\n self.indexes[curr_part][r].insert(key, minhash)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef query(self, minhash, size):\n '''\n Giving the MinHash and size of the query set, retrieve\n keys that references sets with containment with respect to\n the query set greater than the threshold.\n\n Args:\n minhash (datasketch.MinHash): The MinHash of the query set.\n size 
(int): The size (number of unique items) of the query set.\n\n Returns:\n `iterator` of keys.\n '''\n for i, index in enumerate(self.indexes):\n u = self.uppers[i]\n if u is None:\n continue\n b, r = self._get_optimal_param(u, size)\n for key in index[r]._query_b(minhash, b):\n yield key", "response": "Retrieve the keys that references sets with respect to\n the query set greater than the threshold."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef is_empty(self):\n '''\n Returns:\n bool: Check if the index is empty.\n '''\n return all(all(index[r].is_empty() for r in index)\n for index in self.indexes)", "response": "Checks if the index is empty."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef jaccard(self, other):\n '''Estimate the `weighted Jaccard similarity`_ between the\n multi-sets represented by this weighted MinHash and the other.\n \n Args:\n other (datasketch.WeightedMinHash): The other weighted MinHash.\n\n Returns:\n float: The weighted Jaccard similarity between 0.0 and 1.0.\n\n .. 
_`weighted Jaccard similarity`: http://mathoverflow.net/questions/123339/weighted-jaccard-similarity\n '''\n if other.seed != self.seed:\n raise ValueError(\"Cannot compute Jaccard given WeightedMinHash objects with\\\n different seeds\")\n if len(self) != len(other):\n raise ValueError(\"Cannot compute Jaccard given WeightedMinHash objects with\\\n different numbers of hash values\")\n # Check how many pairs of (k, t) hashvalues are equal\n intersection = 0\n for this, that in zip(self.hashvalues, other.hashvalues):\n if np.array_equal(this, that):\n intersection += 1\n return float(intersection) / float(len(self))", "response": "Estimate the weighted Jaccard similarity between this weighted MinHash and the other."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef minhash(self, v):\n '''Create a new weighted MinHash given a weighted Jaccard vector.\n Each dimension is an integer \n frequency of the corresponding element in the multi-set represented\n by the vector.\n\n Args:\n v (numpy.array): The Jaccard vector. 
\n '''\n if not isinstance(v, collections.Iterable):\n raise TypeError(\"Input vector must be an iterable\")\n if not len(v) == self.dim:\n raise ValueError(\"Input dimension mismatch, expecting %d\" % self.dim)\n if not isinstance(v, np.ndarray):\n v = np.array(v, dtype=np.float32)\n elif v.dtype != np.float32:\n v = v.astype(np.float32)\n hashvalues = np.zeros((self.sample_size, 2), dtype=np.int)\n vzeros = (v == 0)\n if vzeros.all():\n raise ValueError(\"Input is all zeros\")\n v[vzeros] = np.nan\n vlog = np.log(v)\n for i in range(self.sample_size):\n t = np.floor((vlog / self.rs[i]) + self.betas[i])\n ln_y = (t - self.betas[i]) * self.rs[i]\n ln_a = self.ln_cs[i] - ln_y - self.rs[i]\n k = np.nanargmin(ln_a)\n hashvalues[i][0], hashvalues[i][1] = k, int(t[k])\n return WeightedMinHash(self.seed, hashvalues)", "response": "Create a weighted MinHash given a weighted Jaccard vector."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ninserts a key into the index together with a MinHash.", "response": "def insert(self, key, minhash, check_duplication=True):\n '''\n Insert a key to the index, together\n with a MinHash (or weighted MinHash) of the set referenced by \n the key.\n\n :param str key: The identifier of the set.\n :param datasketch.MinHash minhash: The MinHash of the set.\n :param bool check_duplication: To avoid duplicate keys in the storage (`default=True`).\n It's recommended to not change the default, but\n if you want to avoid the overhead during insert\n you can set `check_duplication = False`.\n '''\n self._insert(key, minhash, check_duplication=check_duplication, buffer=False)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nremoving the key from the index.", "response": "def remove(self, key):\n '''\n Remove the key from the index.\n\n Args:\n key (hashable): The unique identifier of a set.\n\n '''\n if self.prepickle:\n key = pickle.dumps(key)\n if key not in self.keys:\n raise 
ValueError(\"The given key does not exist\")\n for H, hashtable in zip(self.keys[key], self.hashtables):\n hashtable.remove_val(H, key)\n if not hashtable.get(H):\n hashtable.remove(H)\n self.keys.remove(key)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the counts for all the subset of the given keys.", "response": "def get_subset_counts(self, *keys):\n '''\n Returns the bucket allocation counts (see :func:`~datasketch.MinHashLSH.get_counts` above)\n restricted to the list of keys given.\n\n Args:\n keys (hashable) : the keys for which to get the bucket allocation\n counts\n '''\n if self.prepickle:\n key_set = [pickle.dumps(key) for key in set(keys)]\n else:\n key_set = list(set(keys))\n hashtables = [unordered_storage({'type': 'dict'}) for _ in\n range(self.b)]\n Hss = self.keys.getmany(*key_set)\n for key, Hs in zip(key_set, Hss):\n for H, hashtable in zip(Hs, hashtables):\n hashtable.insert(H, key)\n return [hashtable.itemcounts() for hashtable in hashtables]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef update(self, b):\n '''\n Update the HyperLogLog with a new data value in bytes.\n The value will be hashed using the hash function specified by\n the `hashfunc` argument in the constructor.\n\n Args:\n b: The value to be hashed using the hash function specified.\n\n Example:\n To update with a new string value (using the default SHA1 hash\n function, which requires bytes as input):\n\n .. code-block:: python\n\n hll = HyperLogLog()\n hll.update(\"new value\".encode('utf-8'))\n\n We can also use a different hash function, for example, `pyfarmhash`:\n\n .. 
code-block:: python\n\n import farmhash\n def _hash_32(b):\n return farmhash.hash32(b)\n hll = HyperLogLog(hashfunc=_hash_32)\n hll.update(\"new value\")\n '''\n # Digest the hash object to get the hash value\n hv = self.hashfunc(b)\n # Get the index of the register using the first p bits of the hash\n reg_index = hv & (self.m - 1)\n # Get the rest of the hash\n bits = hv >> self.p\n # Update the register\n self.reg[reg_index] = max(self.reg[reg_index], self._get_rank(bits))", "response": "Update the HyperLogLog with a new data value in bytes."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nestimates the cardinality of the data values seen so far.", "response": "def count(self):\n '''\n Estimate the cardinality of the data values seen so far.\n\n Returns:\n int: The estimated cardinality.\n '''\n # Use HyperLogLog estimation function\n e = self.alpha * float(self.m ** 2) / np.sum(2.0**(-self.reg))\n # Small range correction\n if e <= (5.0 / 2.0) * self.m:\n num_zero = self.m - np.count_nonzero(self.reg)\n return self._linearcounting(num_zero)\n # Normal range, no correction\n if e <= (1.0 / 30.0) * (1 << 32):\n return e\n # Large range correction\n return self._largerange_correction(e)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef merge(self, other):\n '''\n Merge the other HyperLogLog with this one, making this the union of the\n two.\n\n Args:\n other (datasketch.HyperLogLog):\n '''\n if self.m != other.m or self.p != other.p:\n raise ValueError(\"Cannot merge HyperLogLog with different\\\n precisions.\")\n self.reg = np.maximum(self.reg, other.reg)", "response": "Merge the other HyperLogLog with this one making this the union of the two HyperLogLog s Registers."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreset the current HyperLogLog to empty.", "response": "def clear(self):\n '''\n Reset the current HyperLogLog to 
empty.\n        '''\n        self.reg = np.zeros((self.m,), dtype=np.int8)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncompute the average precision at k. This function computes the average precision at k between two lists of items. Parameters ---------- actual : list A list of elements that are to be predicted (order doesn't matter) predicted : list A list of predicted elements (order does matter) k : int, optional The maximum number of predicted elements Returns ------- score : double The average precision at k over the input lists", "response": "def apk(actual, predicted, k=10):\n    \"\"\"\n    Computes the average precision at k.\n\n    This function computes the average precision at k between two lists of\n    items.\n\n    Parameters\n    ----------\n    actual : list\n        A list of elements that are to be predicted (order doesn't matter)\n    predicted : list\n        A list of predicted elements (order does matter)\n    k : int, optional\n        The maximum number of predicted elements\n\n    Returns\n    -------\n    score : double\n        The average precision at k over the input lists\n\n    \"\"\"\n    if len(predicted)>k:\n        predicted = predicted[:k]\n\n    score = 0.0\n    num_hits = 0.0\n\n    for i,p in enumerate(predicted):\n        if p in actual and p not in predicted[:i]:\n            num_hits += 1.0\n            score += num_hits / (i+1.0)\n\n    if len(actual) == 0:\n        return 0.0\n\n    return score / min(len(actual), k)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef mapk(actual, predicted, k=10):\n    return np.mean([apk(a,p,k) for a,p in zip(actual, predicted)])", "response": "This function computes the mean average precision at k between two lists of items that are to be predicted \n"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nadding a unique key together with a MinHash.", "response": "def add(self, key, minhash):\n        '''\n        Add a unique key, together\n        with a MinHash (or weighted MinHash) of the set referenced by the key.\n\n        Note:\n            
The key won't be searchable until the\n            :func:`datasketch.MinHashLSHForest.index` method is called.\n\n        Args:\n            key (hashable): The unique identifier of the set.\n            minhash (datasketch.MinHash): The MinHash of the set.\n        '''\n        if len(minhash) < self.k*self.l:\n            raise ValueError(\"The num_perm of MinHash out of range\")\n        if key in self.keys:\n            raise ValueError(\"The given key has already been added\")\n        self.keys[key] = [self._H(minhash.hashvalues[start:end])\n                for start, end in self.hashranges]\n        for H, hashtable in zip(self.keys[key], self.hashtables):\n            hashtable[H].append(key)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef index(self):\n        '''\n        Index all the keys added so far and make them searchable.\n        '''\n        for i, hashtable in enumerate(self.hashtables):\n            self.sorted_hashtables[i] = [H for H in hashtable.keys()]\n            self.sorted_hashtables[i].sort()", "response": "Index all the keys added so far and make them searchable."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nquery the most similar keys for a given MinHash.", "response": "def query(self, minhash, k):\n        '''\n        Return the approximate top-k keys that have the highest\n        Jaccard similarities to the query set.\n\n        Args:\n            minhash (datasketch.MinHash): The MinHash of the query set.\n            k (int): The maximum number of keys to return.\n\n        Returns:\n            `list` of at most k keys.\n        '''\n        if k <= 0:\n            raise ValueError(\"k must be positive\")\n        if len(minhash) < self.k*self.l:\n            raise ValueError(\"The num_perm of MinHash out of range\")\n        results = set()\n        r = self.k\n        while r > 0:\n            for key in self._query(minhash, r, self.l):\n                results.add(key)\n                if len(results) >= k:\n                    return list(results)\n            r -= 1\n        return list(results)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nclose all resources and disconnect from AsyncMinHashLSH storage.", "response": "async def close(self):\n        \"\"\"\n        Cleanup client resources and 
disconnect from AsyncMinHashLSH storage.\n \"\"\"\n async with self._lock:\n for t in self.hashtables:\n await t.close()\n\n if self.keys is not None:\n await self.keys.close()\n\n self._initialized = False"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nqueries the hashtables for a given minhash.", "response": "async def query(self, minhash):\n \"\"\"\n see :class:`datasketch.MinHashLSH`.\n \"\"\"\n if len(minhash) != self.h:\n raise ValueError(\"Expecting minhash with length %d, \"\n \"got %d\" % (self.h, len(minhash)))\n\n fs = (hashtable.get(self._H(minhash.hashvalues[start:end]))\n for (start, end), hashtable in zip(self.hashranges, self.hashtables))\n candidates = frozenset(chain.from_iterable(await asyncio.gather(*fs)))\n\n return list(candidates)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\nasync def get_counts(self):\n fs = (hashtable.itemcounts() for hashtable in self.hashtables)\n return await asyncio.gather(*fs)", "response": "Get the counts of all hashtables."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef ordered_storage(config, name=None):\n '''Return ordered storage system based on the specified config.\n\n The canonical example of such a storage container is\n ``defaultdict(list)``. Thus, the return value of this method contains\n keys and values. The values are ordered lists with the last added\n item at the end.\n\n Args:\n config (dict): Defines the configurations for the storage.\n For in-memory storage, the config ``{'type': 'dict'}`` will\n suffice. For Redis storage, the type should be ``'redis'`` and\n the configurations for the Redis database should be supplied\n under the key ``'redis'``. These parameters should be in a form\n suitable for `redis.Redis`. 
The parameters may alternatively\n contain references to environment variables, in which case\n literal configuration values should be replaced by dicts of\n the form::\n\n {'env': 'REDIS_HOSTNAME',\n 'default': 'localhost'}\n\n For a full example, see :ref:`minhash_lsh_at_scale`\n\n name (bytes, optional): A reference name for this storage container.\n For dict-type containers, this is ignored. For Redis containers,\n this name is used to prefix keys pertaining to this storage\n container within the database.\n '''\n tp = config['type']\n if tp == 'dict':\n return DictListStorage(config)\n if tp == 'redis':\n return RedisListStorage(config, name=name)", "response": "Return an ordered storage system based on the specified config."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning an unordered storage system based on the specified config.", "response": "def unordered_storage(config, name=None):\n '''Return an unordered storage system based on the specified config.\n\n The canonical example of such a storage container is\n ``defaultdict(set)``. Thus, the return value of this method contains\n keys and values. The values are unordered sets.\n\n Args:\n config (dict): Defines the configurations for the storage.\n For in-memory storage, the config ``{'type': 'dict'}`` will\n suffice. For Redis storage, the type should be ``'redis'`` and\n the configurations for the Redis database should be supplied\n under the key ``'redis'``. These parameters should be in a form\n suitable for `redis.Redis`. The parameters may alternatively\n contain references to environment variables, in which case\n literal configuration values should be replaced by dicts of\n the form::\n\n {'env': 'REDIS_HOSTNAME',\n 'default': 'localhost'}\n\n For a full example, see :ref:`minhash_lsh_at_scale`\n\n name (bytes, optional): A reference name for this storage container.\n For dict-type containers, this is ignored. 
For Redis containers,\n this name is used to prefix keys pertaining to this storage\n container within the database.\n '''\n tp = config['type']\n if tp == 'dict':\n return DictSetStorage(config)\n if tp == 'redis':\n return RedisSetStorage(config, name=name)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a dict where the keys are the keys of the container. The values are the lengths of the value sequences stored in this container.", "response": "def itemcounts(self, **kwargs):\n '''Returns a dict where the keys are the keys of the container.\n The values are the *lengths* of the value sequences stored\n in this container.\n '''\n return {k: len(v) for k, v in self._dict.items()}"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn a populated instance of the the class.", "response": "def get_social_login(self, adapter, app, token, response):\n \"\"\"\n :param adapter: allauth.socialaccount Adapter subclass.\n Usually OAuthAdapter or Auth2Adapter\n :param app: `allauth.socialaccount.SocialApp` instance\n :param token: `allauth.socialaccount.SocialToken` instance\n :param response: Provider's response for OAuth1. Not used in the\n :returns: A populated instance of the\n `allauth.socialaccount.SocialLoginView` instance\n \"\"\"\n request = self._get_request()\n social_login = adapter.complete_login(request, app, token,\n response=response)\n social_login.token = token\n return social_login"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn the user data for the user in the object obj.", "response": "def get_user(self, obj):\n \"\"\"\n Required to allow using custom USER_DETAILS_SERIALIZER in\n JWTSerializer. 
Defining it here to avoid circular imports\n        \"\"\"\n        rest_auth_serializers = getattr(settings, 'REST_AUTH_SERIALIZERS', {})\n        JWTUserDetailsSerializer = import_callable(\n            rest_auth_serializers.get('USER_DETAILS_SERIALIZER', UserDetailsSerializer)\n        )\n        user_data = JWTUserDetailsSerializer(obj['user'], context=self.context).data\n        return user_data"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_social_login(self, *args, **kwargs):\n        social_login = super(SocialConnectMixin, self).get_social_login(*args, **kwargs)\n        social_login.state['process'] = AuthProcess.CONNECT\n        return social_login", "response": "Override get_social_login to set the social login process to connect rather than login\n        "} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef select_text(text, reading=False, prefer=None):\n    # select kanji number or kana reading\n    if reading:\n        text = text[1]\n    else:\n        text = text[0]\n\n    # select the preferred one or the first one from multiple alternatives\n    if not isinstance(text, strtype):\n        common = set(text) & set(prefer or set())\n        if len(common) == 1:\n            text = common.pop()\n        else:\n            text = text[0]\n\n    return text", "response": "Select the correct text from the Japanese number reading and the Kanji number or kana alternatives"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nmerge two number pairs while applying semi-irregular rendaku rules.", "response": "def rendaku_merge_pairs(lpair, rpair):\n    \"\"\"Merge lpair < rpair while applying semi-irregular rendaku rules\"\"\"\n    ltext, lnum = lpair\n    rtext, rnum = rpair\n    if lnum > rnum:\n        raise ValueError\n\n    if rpair == (\"\u3072\u3083\u304f\", 100):\n        if lpair == (\"\u3055\u3093\", 3):\n            rtext = \"\u3073\u3083\u304f\"\n        elif lpair == (\"\u308d\u304f\", 6):\n            ltext = \"\u308d\u3063\"\n            rtext = \"\u3074\u3083\u304f\"\n        elif lpair == (\"\u306f\u3061\", 8):\n            ltext = \"\u306f\u3063\"\n            
rtext = \"\u3074\u3083\u304f\"\n elif rpair == (\"\u305b\u3093\", 1000):\n if lpair == (\"\u3055\u3093\", 3):\n rtext = \"\u305c\u3093\"\n elif lpair == (\"\u306f\u3061\", 8):\n ltext = \"\u306f\u3063\"\n elif rpair == (\"\u3061\u3087\u3046\", 10**12):\n if lpair == (\"\u3044\u3061\", 1):\n ltext = \"\u3044\u3063\"\n elif lpair == (\"\u306f\u3061\", 8):\n ltext = \"\u306f\u3063\"\n elif lpair == (\"\u3058\u3085\u3046\", 10):\n ltext = \"\u3058\u3085\u3063\"\n elif rpair == (\"\u3051\u3044\", 10**16):\n if lpair == (\"\u3044\u3061\", 1):\n ltext = \"\u3044\u3063\"\n elif lpair == (\"\u308d\u304f\", 6):\n ltext = \"\u308d\u3063\"\n elif lpair == (\"\u306f\u3061\", 8):\n ltext = \"\u306f\u3063\"\n elif lpair == (\"\u3058\u3085\u3046\", 10):\n ltext = \"\u3058\u3085\u3063\"\n elif lpair == (\"\u3072\u3083\u304f\", 100):\n ltext = \"\u3072\u3083\u3063\"\n\n return (\"%s%s\" % (ltext, rtext), lnum * rnum)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nstarting here, it groups the number by three from the tail '1234567' -> (('1',),('234',),('567',)) :param number:str :rtype:tuple", "response": "def split_by_3(self, number):\n \"\"\"\n starting here, it groups the number by three from the tail\n '1234567' -> (('1',),('234',),('567',))\n :param number:str\n :rtype:tuple\n \"\"\"\n blocks = ()\n length = len(number)\n\n if length < 3:\n blocks += ((number,),)\n else:\n len_of_first_block = length % 3\n\n if len_of_first_block > 0:\n first_block = number[0:len_of_first_block],\n blocks += first_block,\n\n for i in range(len_of_first_block, length, 3):\n next_block = (number[i:i + 3],),\n blocks += next_block\n\n return blocks"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\njoining the words by first join lists in the tuple", "response": "def join(self, word_blocks, float_part):\n \"\"\"\n join the words by first join lists in the tuple\n :param word_blocks: tuple\n :rtype: str\n \"\"\"\n word_list = []\n 
length = len(word_blocks) - 1\n        first_block = word_blocks[0],\n        start = 0\n\n        if length == 1 and first_block[0][0] == '1':\n            word_list += ['seribu']\n            start = 1\n\n        for i in range(start, length + 1, 1):\n            word_list += word_blocks[i][1]\n            if not word_blocks[i][1]:\n                continue\n            if i == length:\n                break\n            word_list += [self.TENS_TO[(length - i) * 3]]\n\n        return ' '.join(word_list) + float_part"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef to_currency(self, val, currency='EUR', cents=True, separator=',',\n                    adjective=False):\n        \"\"\"\n        Args:\n            val: Numeric value\n            currency (str): Currency code\n            cents (bool): Verbose cents\n            separator (str): Cent separator\n            adjective (bool): Prefix currency name with adjective\n        Returns:\n            str: Formatted string\n\n        \"\"\"\n        left, right, is_negative = parse_currency_parts(val)\n\n        try:\n            cr1, cr2 = self.CURRENCY_FORMS[currency]\n\n        except KeyError:\n            raise NotImplementedError(\n                'Currency code \"%s\" not implemented for \"%s\"' %\n                (currency, self.__class__.__name__))\n\n        if adjective and currency in self.CURRENCY_ADJECTIVES:\n            cr1 = prefix_currency(self.CURRENCY_ADJECTIVES[currency], cr1)\n\n        minus_str = \"%s \" % self.negword if is_negative else \"\"\n        cents_str = self._cents_verbose(right, currency) \\\n            if cents else self._cents_terse(right, currency)\n\n        return u'%s%s %s%s %s %s' % (\n            minus_str,\n            self.to_cardinal(left),\n            self.pluralize(left, cr1),\n            separator,\n            cents_str,\n            self.pluralize(right, cr2)\n        )", "response": "Returns a string representation of the value in the specified currency."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef parse_scoped_selector(scoped_selector):\n  # Convert Macro (%scope/name) to (scope/name/macro.value)\n  if scoped_selector[0] == '%':\n    if scoped_selector.endswith('.value'):\n      err_str = '{} is invalid cannot use % and end with .value'\n      raise 
ValueError(err_str.format(scoped_selector))\n scoped_selector = scoped_selector[1:] + '/macro.value'\n scope_selector_list = scoped_selector.rsplit('/', 1)\n scope = ''.join(scope_selector_list[:-1])\n selector = scope_selector_list[-1]\n return scope, selector", "response": "Parse a scoped selector."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nparse a single statement.", "response": "def parse_statement(self):\n \"\"\"Parse a single statement.\n\n Returns:\n Either a `BindingStatement`, `ImportStatement`, `IncludeStatement`, or\n `None` if no more statements can be parsed (EOF reached).\n \"\"\"\n self._skip_whitespace_and_comments()\n if self._current_token.kind == tokenize.ENDMARKER:\n return None\n\n # Save off location, but ignore char_num for any statement-level errors.\n stmt_loc = self._current_location(ignore_char_num=True)\n binding_key_or_keyword = self._parse_selector()\n statement = None\n if self._current_token.value != '=':\n if binding_key_or_keyword == 'import':\n module = self._parse_selector(scoped=False)\n statement = ImportStatement(module, stmt_loc)\n elif binding_key_or_keyword == 'include':\n str_loc = self._current_location()\n success, filename = self._maybe_parse_basic_type()\n if not success or not isinstance(filename, str):\n self._raise_syntax_error('Expected file path as string.', str_loc)\n statement = IncludeStatement(filename, stmt_loc)\n else:\n self._raise_syntax_error(\"Expected '='.\")\n else: # We saw an '='.\n self._advance_one_token()\n value = self.parse_value()\n scope, selector, arg_name = parse_binding_key(binding_key_or_keyword)\n statement = BindingStatement(scope, selector, arg_name, value, stmt_loc)\n\n assert statement, 'Internal parsing error.'\n\n if (self._current_token.kind != tokenize.NEWLINE and\n self._current_token.kind != tokenize.ENDMARKER):\n self._raise_syntax_error('Expected newline.')\n elif self._current_token.kind == tokenize.NEWLINE:\n 
self._advance_one_token()\n\n return statement"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing a single literal value.", "response": "def parse_value(self):\n \"\"\"Parse a single literal value.\n\n Returns:\n The parsed value.\n \"\"\"\n parsers = [\n self._maybe_parse_container, self._maybe_parse_basic_type,\n self._maybe_parse_configurable_reference, self._maybe_parse_macro\n ]\n for parser in parsers:\n success, value = parser()\n if success:\n return value\n self._raise_syntax_error('Unable to parse value.')"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nadvances to next line.", "response": "def advance_one_line(self):\n \"\"\"Advances to next line.\"\"\"\n\n current_line = self._current_token.line_number\n while current_line == self._current_token.line_number:\n self._current_token = ConfigParser.Token(*next(self._token_generator))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _parse_selector(self, scoped=True, allow_periods_in_scope=False):\n if self._current_token.kind != tokenize.NAME:\n self._raise_syntax_error('Unexpected token.')\n\n begin_line_num = self._current_token.begin[0]\n begin_char_num = self._current_token.begin[1]\n end_char_num = self._current_token.end[1]\n line = self._current_token.line\n\n selector_parts = []\n # This accepts an alternating sequence of NAME and '/' or '.' tokens.\n step_parity = 0\n while (step_parity == 0 and self._current_token.kind == tokenize.NAME or\n step_parity == 1 and self._current_token.value in ('/', '.')):\n selector_parts.append(self._current_token.value)\n step_parity = not step_parity\n end_char_num = self._current_token.end[1]\n self._advance_one_token()\n self._skip_whitespace_and_comments()\n\n # Due to tokenization, most whitespace has been stripped already. 
To prevent\n # whitespace inside the scoped selector, we verify that it matches an\n # untokenized version of the selector obtained from the first through last\n # character positions of the consumed tokens in the line being parsed.\n scoped_selector = ''.join(selector_parts)\n untokenized_scoped_selector = line[begin_char_num:end_char_num]\n # Also check that it's properly formatted (e.g., no consecutive slashes).\n scope_re = IDENTIFIER_RE\n if allow_periods_in_scope:\n scope_re = MODULE_RE\n selector_re = MODULE_RE\n\n scope_parts = scoped_selector.split('/')\n valid_format = all(scope_re.match(scope) for scope in scope_parts[:-1])\n valid_format &= bool(selector_re.match(scope_parts[-1]))\n valid_format &= bool(scoped or len(scope_parts) == 1)\n if untokenized_scoped_selector != scoped_selector or not valid_format:\n location = (self._filename, begin_line_num, begin_char_num + 1, line)\n self._raise_syntax_error('Malformatted scope or selector.', location)\n\n return scoped_selector", "response": "Parses a selector from the current token and returns it as a string."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _maybe_parse_container(self):\n bracket_types = {\n '{': ('}', dict, self._parse_dict_item),\n '(': (')', tuple, self.parse_value),\n '[': (']', list, self.parse_value)\n }\n if self._current_token.value in bracket_types:\n open_bracket = self._current_token.value\n close_bracket, type_fn, parse_item = bracket_types[open_bracket]\n self._advance()\n\n values = []\n saw_comma = False\n while self._current_token.value != close_bracket:\n values.append(parse_item())\n if self._current_token.value == ',':\n saw_comma = True\n self._advance()\n elif self._current_token.value != close_bracket:\n self._raise_syntax_error(\"Expected ',' or '%s'.\" % close_bracket)\n\n # If it's just a single value enclosed in parentheses without a trailing\n # comma, it's not a tuple, so just grab the 
value.\n if type_fn is tuple and len(values) == 1 and not saw_comma:\n type_fn = lambda x: x[0]\n\n self._advance()\n return True, type_fn(values)\n\n return False, None", "response": "Try to parse a container type."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _maybe_parse_basic_type(self):\n token_value = ''\n # Allow a leading dash to handle negative numbers.\n if self._current_token.value == '-':\n token_value += self._current_token.value\n self._advance()\n\n basic_type_tokens = [tokenize.NAME, tokenize.NUMBER, tokenize.STRING]\n continue_parsing = self._current_token.kind in basic_type_tokens\n if not continue_parsing:\n return False, None\n\n while continue_parsing:\n token_value += self._current_token.value\n\n try:\n value = ast.literal_eval(token_value)\n except Exception as e: # pylint: disable=broad-except\n err_str = \"{}\\n Failed to parse token '{}'\"\n self._raise_syntax_error(err_str.format(e, token_value))\n\n was_string = self._current_token.kind == tokenize.STRING\n self._advance()\n is_string = self._current_token.kind == tokenize.STRING\n continue_parsing = was_string and is_string\n\n return True, value", "response": "Try to parse a basic type."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ntries to parse a configurable reference.", "response": "def _maybe_parse_configurable_reference(self):\n \"\"\"Try to parse a configurable reference (@[scope/name/]fn_name[()]).\"\"\"\n if self._current_token.value != '@':\n return False, None\n\n location = self._current_location()\n self._advance_one_token()\n scoped_name = self._parse_selector(allow_periods_in_scope=True)\n\n evaluate = False\n if self._current_token.value == '(':\n evaluate = True\n self._advance()\n if self._current_token.value != ')':\n self._raise_syntax_error(\"Expected ')'.\")\n self._advance_one_token()\n self._skip_whitespace_and_comments()\n\n with 
utils.try_with_location(location):\n reference = self._delegate.configurable_reference(scoped_name, evaluate)\n\n return True, reference"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ntrying to parse a macro.", "response": "def _maybe_parse_macro(self):\n \"\"\"Try to parse an macro (%scope/name).\"\"\"\n if self._current_token.value != '%':\n return False, None\n\n location = self._current_location()\n self._advance_one_token()\n scoped_name = self._parse_selector(allow_periods_in_scope=True)\n\n with utils.try_with_location(location):\n macro = self._delegate.macro(scoped_name)\n\n return True, macro"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef augment_exception_message_and_reraise(exception, message):\n\n class ExceptionProxy(type(exception)):\n \"\"\"Acts as a proxy for an exception with an augmented message.\"\"\"\n __module__ = type(exception).__module__\n\n def __init__(self):\n pass\n\n def __getattr__(self, attr_name):\n return getattr(exception, attr_name)\n\n def __str__(self):\n return str(exception) + message\n\n ExceptionProxy.__name__ = type(exception).__name__\n\n proxy = ExceptionProxy()\n if six.PY3:\n ExceptionProxy.__qualname__ = type(exception).__qualname__\n six.raise_from(proxy.with_traceback(exception.__traceback__), None)\n else:\n six.reraise(proxy, None, sys.exc_info()[2])", "response": "Augment an exception with a message."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _markdownify_operative_config_str(self, string):\n\n # TODO: Total hack below. 
Implement more principled formatting.\n def process(line):\n \"\"\"Convert a single line to markdown format.\"\"\"\n if not line.startswith('#'):\n return ' ' + line\n\n line = line[2:]\n if line.startswith('===='):\n return ''\n if line.startswith('None'):\n return ' # None.'\n if line.endswith(':'):\n return '#### ' + line\n return line\n\n output_lines = []\n for line in string.splitlines():\n procd_line = process(line)\n if procd_line is not None:\n output_lines.append(procd_line)\n\n return '\\n'.join(output_lines)", "response": "Convert an operative config string to markdown format."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef after_create_session(self, session=None, coord=None):\n config_str = config.operative_config_str()\n if not tf.gfile.IsDirectory(self._output_dir):\n tf.gfile.MakeDirs(self._output_dir)\n global_step_val = 0\n if session is not None:\n global_step = tf.train.get_global_step()\n if global_step is not None:\n global_step_val = session.run(global_step)\n filename = '%s-%s.gin' % (self._base_name, global_step_val)\n config_path = os.path.join(self._output_dir, filename)\n with tf.gfile.GFile(config_path, 'w') as f:\n f.write(config_str)\n\n if self._summarize_config:\n md_config_str = self._markdownify_operative_config_str(config_str)\n summary_metadata = summary_pb2.SummaryMetadata()\n summary_metadata.plugin_data.plugin_name = 'text'\n summary_metadata.plugin_data.content = b'{}'\n text_tensor = tf.make_tensor_proto(md_config_str)\n summary = summary_pb2.Summary()\n summary.value.add(\n tag='gin/' + self._base_name,\n tensor=text_tensor,\n metadata=summary_metadata)\n if not self._summary_writer:\n # Creating the FileWriter also creates the events file, so it should be\n # done here (where it is most likely to only occur on chief workers), as\n # opposed to in the constructor.\n self._summary_writer = tf.summary.FileWriterCache.get(self._output_dir)\n 
self._summary_writer.add_summary(summary, global_step_val)\n    self._summary_writer.flush()", "response": "Writes out Gin's operative config and maybe adds a summary of it."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _find_class_construction_fn(cls):\n  for base in type.mro(cls):\n    if '__init__' in base.__dict__:\n      return base.__init__\n    if '__new__' in base.__dict__:\n      return base.__new__", "response": "Find the first __init__ or __new__ method in the given class's MRO."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _ensure_wrappability(fn):\n  # Handle \"wrapped_descriptor\" and \"method-wrapper\" types.\n  if isinstance(fn, (type(object.__init__), type(object.__call__))):\n    # pylint: disable=unnecessary-lambda\n    wrappable_fn = lambda *args, **kwargs: fn(*args, **kwargs)\n    wrappable_fn.__name__ = fn.__name__\n    wrappable_fn.__doc__ = fn.__doc__\n    wrappable_fn.__module__ = ''  # These types have no __module__, sigh.\n    wrappable_fn.__wrapped__ = fn\n    return wrappable_fn\n\n  # Otherwise we're good to go...\n  return fn", "response": "Make sure fn can be wrapped cleanly by functools.wraps."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _decorate_fn_or_cls(decorator, fn_or_cls, subclass=False):\n  if not inspect.isclass(fn_or_cls):\n    return decorator(_ensure_wrappability(fn_or_cls))\n\n  construction_fn = _find_class_construction_fn(fn_or_cls)\n\n  if subclass:\n    class DecoratedClass(fn_or_cls):\n      __doc__ = fn_or_cls.__doc__\n      __module__ = fn_or_cls.__module__\n    DecoratedClass.__name__ = fn_or_cls.__name__\n    if six.PY3:\n      DecoratedClass.__qualname__ = fn_or_cls.__qualname__\n    cls = DecoratedClass\n  else:\n    cls = fn_or_cls\n\n  decorated_fn = decorator(_ensure_wrappability(construction_fn))\n  if construction_fn.__name__ == '__new__':\n    decorated_fn = staticmethod(decorated_fn)\n  setattr(cls, construction_fn.__name__, decorated_fn)\n  return cls", "response": "Decorate a function or class with the given decorator."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _should_skip(selector, skip_unknown):\n  _validate_skip_unknown(skip_unknown)\n  if _REGISTRY.matching_selectors(selector):\n    return False  # Never skip known configurables.\n  if isinstance(skip_unknown, (list, tuple, set)):\n    return selector in skip_unknown\n  return skip_unknown", "response": "Checks whether selector should be skipped."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _format_value(value):\n  literal = repr(value)\n  try:\n    if parse_value(literal) == value:\n      return literal\n  except SyntaxError:\n    pass\n  return None", "response": "Returns value in a format parseable by parse_value or None."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef clear_config(clear_constants=False):\n  _set_config_is_locked(False)\n  _CONFIG.clear()\n  _SINGLETONS.clear()\n  if clear_constants:\n    _CONSTANTS.clear()\n  else:\n    saved_constants = _CONSTANTS.copy()\n    
_CONSTANTS.clear() # Clear then redefine constants (re-adding bindings).\n for name, value in six.iteritems(saved_constants):\n constant(name, value)\n _IMPORTED_MODULES.clear()\n _OPERATIVE_CONFIG.clear()", "response": "Clears the global configuration."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nbind the value specified by binding_key to value.", "response": "def bind_parameter(binding_key, value):\n \"\"\"Binds the parameter value specified by `binding_key` to `value`.\n\n The `binding_key` argument should either be a string of the form\n `maybe/scope/optional.module.names.configurable_name.parameter_name`, or a\n list or tuple of `(scope, selector, parameter_name)`, where `selector`\n corresponds to `optional.module.names.configurable_name`. Once this function\n has been called, subsequent calls (in the specified scope) to the specified\n configurable function will have `value` supplied to their `parameter_name`\n parameter.\n\n Example:\n\n @configurable('fully_connected_network')\n def network_fn(num_layers=5, units_per_layer=1024):\n ...\n\n def main(_):\n config.bind_parameter('fully_connected_network.num_layers', 3)\n network_fn() # Called with num_layers == 3, not the default of 5.\n\n Args:\n binding_key: The parameter whose value should be set. 
This can either be a\n string, or a tuple of the form `(scope, selector, parameter)`.\n value: The desired value.\n\n Raises:\n RuntimeError: If the config is locked.\n ValueError: If no function can be found matching the configurable name\n specified by `binding_key`, or if the specified parameter name is\n blacklisted or not in the function's whitelist (if present).\n \"\"\"\n if config_is_locked():\n raise RuntimeError('Attempted to modify locked Gin config.')\n\n pbk = ParsedBindingKey(binding_key)\n fn_dict = _CONFIG.setdefault(pbk.config_key, {})\n fn_dict[pbk.arg_name] = value"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nquerying the value of a parameter in the current configurable name.", "response": "def query_parameter(binding_key):\n \"\"\"Returns the currently bound value to the specified `binding_key`.\n\n The `binding_key` argument should look like\n 'maybe/some/scope/maybe.moduels.configurable_name.parameter_name'. Note that\n this will not include default parameters.\n\n Args:\n binding_key: The parameter whose value should be set.\n\n Returns:\n The value bound to the configurable/parameter combination given in\n `binding_key`.\n\n Raises:\n ValueError: If no function can be found matching the configurable name\n specified by `biding_key`, or if the specified parameter name is\n blacklisted or not in the function's whitelist (if present) or if there is\n no value bound for the queried parameter or configurable.\n \"\"\"\n pbk = ParsedBindingKey(binding_key)\n if pbk.config_key not in _CONFIG:\n err_str = \"Configurable '{}' has no bound parameters.\"\n raise ValueError(err_str.format(pbk.given_selector))\n if pbk.arg_name not in _CONFIG[pbk.config_key]:\n err_str = \"Configurable '{}' has no value bound for parameter '{}'.\"\n raise ValueError(err_str.format(pbk.given_selector, pbk.arg_name))\n return _CONFIG[pbk.config_key][pbk.arg_name]"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in 
Python 3 where it\nreturns True if fn_or_cls has a parameter named by arg_name.", "response": "def _might_have_parameter(fn_or_cls, arg_name):\n \"\"\"Returns True if `arg_name` might be a valid parameter for `fn_or_cls`.\n\n Specifically, this means that `fn_or_cls` either has a parameter named\n `arg_name`, or has a `**kwargs` parameter.\n\n Args:\n fn_or_cls: The function or class to check.\n arg_name: The name fo the parameter.\n\n Returns:\n Whether `arg_name` might be a valid argument of `fn`.\n \"\"\"\n if inspect.isclass(fn_or_cls):\n fn = _find_class_construction_fn(fn_or_cls)\n else:\n fn = fn_or_cls\n\n while hasattr(fn, '__wrapped__'):\n fn = fn.__wrapped__\n arg_spec = _get_cached_arg_spec(fn)\n if six.PY3:\n if arg_spec.varkw:\n return True\n return arg_name in arg_spec.args or arg_name in arg_spec.kwonlyargs\n else:\n if arg_spec.keywords:\n return True\n return arg_name in arg_spec.args"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget cached argspec for fn.", "response": "def _get_cached_arg_spec(fn):\n \"\"\"Gets cached argspec for `fn`.\"\"\"\n\n arg_spec = _ARG_SPEC_CACHE.get(fn)\n if arg_spec is None:\n arg_spec_fn = inspect.getfullargspec if six.PY3 else inspect.getargspec\n try:\n arg_spec = arg_spec_fn(fn)\n except TypeError:\n # `fn` might be a callable object.\n arg_spec = arg_spec_fn(fn.__call__)\n _ARG_SPEC_CACHE[fn] = arg_spec\n return arg_spec"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the names of the supplied positional arguments to the given function.", "response": "def _get_supplied_positional_parameter_names(fn, args):\n \"\"\"Returns the names of the supplied arguments to the given function.\"\"\"\n arg_spec = _get_cached_arg_spec(fn)\n # May be shorter than len(args) if args contains vararg (*args) arguments.\n return arg_spec.args[:len(args)]"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write 
the documentation\ndef _get_all_positional_parameter_names(fn):\n arg_spec = _get_cached_arg_spec(fn)\n args = arg_spec.args\n if arg_spec.defaults:\n args = args[:-len(arg_spec.defaults)]\n return args", "response": "Returns the names of all positional arguments to the given function."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nretrieves all default values for configurable parameters of a function.", "response": "def _get_default_configurable_parameter_values(fn, whitelist, blacklist):\n \"\"\"Retrieve all default values for configurable parameters of a function.\n\n Any parameters included in the supplied blacklist, or not included in the\n supplied whitelist, are excluded.\n\n Args:\n fn: The function whose parameter values should be retrieved.\n whitelist: The whitelist (or `None`) associated with the function.\n blacklist: The blacklist (or `None`) associated with the function.\n\n Returns:\n A dictionary mapping configurable parameter names to their default values.\n \"\"\"\n arg_vals = _ARG_DEFAULTS_CACHE.get(fn)\n if arg_vals is not None:\n return arg_vals.copy()\n\n # First, grab any default values not captured in the kwargs var.\n arg_spec = _get_cached_arg_spec(fn)\n if arg_spec.defaults:\n default_kwarg_names = arg_spec.args[-len(arg_spec.defaults):]\n arg_vals = dict(zip(default_kwarg_names, arg_spec.defaults))\n else:\n arg_vals = {}\n\n if six.PY3 and arg_spec.kwonlydefaults:\n arg_vals.update(arg_spec.kwonlydefaults)\n\n # Now, eliminate keywords that are blacklisted, or aren't whitelisted (if\n # there's a whitelist), or aren't representable as a literal value.\n for k in list(six.iterkeys(arg_vals)):\n whitelist_fail = whitelist and k not in whitelist\n blacklist_fail = blacklist and k in blacklist\n representable = _is_literally_representable(arg_vals[k])\n if whitelist_fail or blacklist_fail or not representable:\n del arg_vals[k]\n\n _ARG_DEFAULTS_CACHE[fn] = arg_vals\n return arg_vals.copy()"} 
{"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nopen a new explicit configuration scope.", "response": "def config_scope(name_or_scope):\n \"\"\"Opens a new configuration scope.\n\n Provides a context manager that opens a new explicit configuration\n scope. Explicit configuration scopes restrict parameter bindings to only\n certain sections of code that run within the scope. Scopes can be nested to\n arbitrary depth; any configurable functions called within a scope inherit\n parameters defined by higher level scopes.\n\n For example, suppose a function named `preprocess_images` is called in two\n places in a codebase: Once when loading data for a training task, and once\n when loading data for an evaluation task:\n\n def load_training_data():\n ...\n with gin.config_scope('train'):\n images = preprocess_images(images)\n ...\n\n\n def load_eval_data():\n ...\n with gin.config_scope('eval'):\n images = preprocess_images(images)\n ...\n\n By using a `config_scope` to wrap each invocation of `preprocess_images` as\n above, it is possible to use Gin to supply specific parameters to each. Here\n is a possible configuration for the above example:\n\n preprocess_images.crop_size = [64, 64]\n preprocess_images.normalize_image = True\n\n train/preprocess_images.crop_location = 'random'\n train/preprocess_images.random_flip_lr = True\n\n eval/preprocess_images.crop_location = 'center'\n\n The `crop_size` and `normalize_image` parameters above will be shared by both\n the `train` and `eval` invocations; only `train` will receive\n `random_flip_lr`, and the two invocations receive different values for\n `crop_location`.\n\n Passing `None` or `''` to `config_scope` will temporarily clear all currently\n active scopes (within the `with` block; they will be restored afterwards).\n\n Args:\n name_or_scope: A name for the config scope, or an existing scope (e.g.,\n captured from `with gin.config_scope(...) 
as scope`), or `None` to clear\n currently active scopes.\n\n Raises:\n ValueError: If `name_or_scope` is not a list, string, or None.\n\n Yields:\n The resulting config scope (a list of all active scope names, ordered from\n outermost to innermost).\n \"\"\"\n try:\n valid_value = True\n if isinstance(name_or_scope, list):\n new_scope = name_or_scope\n elif name_or_scope and isinstance(name_or_scope, six.string_types):\n new_scope = current_scope() # Returns a copy.\n new_scope.extend(name_or_scope.split('/'))\n else:\n valid_value = name_or_scope in (None, '')\n new_scope = []\n\n # Append new_scope first. It will be popped in the finally block if an\n # exception is raised below.\n _ACTIVE_SCOPES.append(new_scope)\n\n scopes_are_valid = map(config_parser.MODULE_RE.match, new_scope)\n if not valid_value or not all(scopes_are_valid):\n err_str = 'Invalid value for `name_or_scope`: {}.'\n raise ValueError(err_str.format(name_or_scope))\n\n yield new_scope\n finally:\n _ACTIVE_SCOPES.pop()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _make_configurable(fn_or_cls,\n name=None,\n module=None,\n whitelist=None,\n blacklist=None,\n subclass=False):\n \"\"\"Wraps `fn_or_cls` to make it configurable.\n\n Infers the configurable name from `fn_or_cls.__name__` if necessary, and\n updates global state to keep track of configurable name <-> function\n mappings, as well as whitelisted and blacklisted parameters.\n\n Args:\n fn_or_cls: The function or class to decorate.\n name: A name for the configurable. If `None`, the name will be inferred from\n from `fn_or_cls`. The `name` may also include module components to be used\n for disambiguation (these will be appended to any components explicitly\n specified by `module`).\n module: The module to associate with the configurable, to help handle naming\n collisions. 
If `None`, `fn_or_cls.__module__` will be used (if no module\n is specified as part of `name`).\n whitelist: A whitelisted set of parameter names to supply values for.\n blacklist: A blacklisted set of parameter names not to supply values for.\n subclass: If `fn_or_cls` is a class and `subclass` is `True`, decorate by\n subclassing `fn_or_cls` and overriding its `__init__` method. If `False`,\n replace the existing `__init__` with a decorated version.\n\n Returns:\n A wrapped version of `fn_or_cls` that will take parameter values from the\n global configuration.\n\n Raises:\n RuntimeError: If the config is locked.\n ValueError: If a configurable with `name` (or the name of `fn_or_cls`)\n already exists, or if both a whitelist and blacklist are specified.\n \"\"\"\n if config_is_locked():\n err_str = 'Attempted to add a new configurable after the config was locked.'\n raise RuntimeError(err_str)\n\n name = fn_or_cls.__name__ if name is None else name\n if config_parser.IDENTIFIER_RE.match(name):\n default_module = getattr(fn_or_cls, '__module__', None)\n module = default_module if module is None else module\n elif not config_parser.MODULE_RE.match(name):\n raise ValueError(\"Configurable name '{}' is invalid.\".format(name))\n\n if module is not None and not config_parser.MODULE_RE.match(module):\n raise ValueError(\"Module '{}' is invalid.\".format(module))\n\n selector = module + '.' 
+ name if module else name\n if not _INTERACTIVE_MODE and selector in _REGISTRY:\n err_str = \"A configurable matching '{}' already exists.\"\n raise ValueError(err_str.format(selector))\n\n if whitelist and blacklist:\n err_str = 'A whitelist or a blacklist can be specified, but not both.'\n raise ValueError(err_str)\n\n if whitelist and not isinstance(whitelist, (list, tuple)):\n raise TypeError('Whitelist should be a list or tuple.')\n\n if blacklist and not isinstance(blacklist, (list, tuple)):\n raise TypeError('Blacklist should be a list or tuple.')\n\n _validate_parameters(fn_or_cls, whitelist, 'whitelist')\n _validate_parameters(fn_or_cls, blacklist, 'blacklist')\n\n def apply_config(fn):\n \"\"\"Wraps `fn` so that it obtains parameters from the configuration.\"\"\"\n\n @six.wraps(fn)\n def wrapper(*args, **kwargs):\n \"\"\"Supplies fn with parameter values from the configuration.\"\"\"\n scope_components = current_scope()\n new_kwargs = {}\n for i in range(len(scope_components) + 1):\n partial_scope_str = '/'.join(scope_components[:i])\n new_kwargs.update(_CONFIG.get((partial_scope_str, selector), {}))\n gin_bound_args = list(new_kwargs.keys())\n scope_str = partial_scope_str\n\n arg_names = _get_supplied_positional_parameter_names(fn, args)\n\n for arg in args[len(arg_names):]:\n if arg is REQUIRED:\n raise ValueError(\n 'gin.REQUIRED is not allowed for unnamed (vararg) parameters. 
If '\n 'the function being called is wrapped by a non-Gin decorator, '\n 'try explicitly providing argument names for positional '\n 'parameters.')\n\n required_arg_names = []\n required_arg_indexes = []\n for i, arg in enumerate(args[:len(arg_names)]):\n if arg is REQUIRED:\n required_arg_names.append(arg_names[i])\n required_arg_indexes.append(i)\n\n required_kwargs = []\n for kwarg, value in six.iteritems(kwargs):\n if value is REQUIRED:\n required_kwargs.append(kwarg)\n\n # If the caller passed arguments as positional arguments that correspond\n # to a keyword arg in new_kwargs, remove the keyword argument from\n # new_kwargs to let the caller win and avoid throwing an error. Unless it\n # is an arg marked as REQUIRED.\n for arg_name in arg_names:\n if arg_name not in required_arg_names:\n new_kwargs.pop(arg_name, None)\n\n # Get default values for configurable parameters.\n operative_parameter_values = _get_default_configurable_parameter_values(\n fn, whitelist, blacklist)\n # Update with the values supplied via configuration.\n operative_parameter_values.update(new_kwargs)\n\n # Remove any values from the operative config that are overridden by the\n # caller. These can't be configured, so they won't be logged. We skip\n # values that are marked as REQUIRED.\n for k in arg_names:\n if k not in required_arg_names:\n operative_parameter_values.pop(k, None)\n for k in kwargs:\n if k not in required_kwargs:\n operative_parameter_values.pop(k, None)\n\n # An update is performed in case another caller of this same configurable\n # object has supplied a different set of arguments. 
By doing an update, a\n # Gin-supplied or default value will be present if it was used (not\n # overridden by the caller) at least once.\n _OPERATIVE_CONFIG.setdefault((scope_str, selector), {}).update(\n operative_parameter_values)\n\n # We call deepcopy for two reasons: First, to prevent the called function\n # from modifying any of the values in `_CONFIG` through references passed\n # in via `new_kwargs`; Second, to facilitate evaluation of any\n # `ConfigurableReference` instances buried somewhere inside\n # `new_kwargs`. See the docstring on `ConfigurableReference.__deepcopy__`\n # above for more details on the dark magic happening here.\n new_kwargs = copy.deepcopy(new_kwargs)\n\n # Validate args marked as REQUIRED have been bound in the Gin config.\n missing_required_params = []\n new_args = list(args)\n for i, arg_name in zip(required_arg_indexes, required_arg_names):\n if arg_name not in new_kwargs:\n missing_required_params.append(arg_name)\n else:\n new_args[i] = new_kwargs.pop(arg_name)\n\n # Validate kwargs marked as REQUIRED have been bound in the Gin config.\n for kwarg in required_kwargs:\n if kwarg not in new_kwargs:\n missing_required_params.append(kwarg)\n else:\n # Remove from kwargs and let the new_kwargs value be used.\n kwargs.pop(kwarg)\n\n if missing_required_params:\n err_str = 'Required bindings for `{}` not provided in config: {}'\n minimal_selector = _REGISTRY.minimal_selector(selector)\n err_str = err_str.format(minimal_selector, missing_required_params)\n raise RuntimeError(err_str)\n\n # Now, update with the caller-supplied `kwargs`, allowing the caller to\n # have the final say on keyword argument values.\n new_kwargs.update(kwargs)\n\n try:\n return fn(*new_args, **new_kwargs)\n except Exception as e: # pylint: disable=broad-except\n err_str = ''\n if isinstance(e, TypeError):\n all_arg_names = _get_all_positional_parameter_names(fn)\n if len(new_args) < len(all_arg_names):\n unbound_positional_args = list(\n 
set(all_arg_names[len(new_args):]) - set(new_kwargs))\n if unbound_positional_args:\n caller_supplied_args = list(\n set(arg_names + list(kwargs)) -\n set(required_arg_names + list(required_kwargs)))\n fmt = ('\\n No values supplied by Gin or caller for arguments: {}'\n '\\n Gin had values bound for: {gin_bound_args}'\n '\\n Caller supplied values for: {caller_supplied_args}')\n canonicalize = lambda x: list(map(str, sorted(x)))\n err_str += fmt.format(\n canonicalize(unbound_positional_args),\n gin_bound_args=canonicalize(gin_bound_args),\n caller_supplied_args=canonicalize(caller_supplied_args))\n err_str += \"\\n In call to configurable '{}' ({}){}\"\n scope_info = \" in scope '{}'\".format(scope_str) if scope_str else ''\n err_str = err_str.format(name, fn, scope_info)\n utils.augment_exception_message_and_reraise(e, err_str)\n\n return wrapper\n\n decorated_fn_or_cls = _decorate_fn_or_cls(\n apply_config, fn_or_cls, subclass=subclass)\n\n _REGISTRY[selector] = Configurable(\n decorated_fn_or_cls,\n name=name,\n module=module,\n whitelist=whitelist,\n blacklist=blacklist,\n selector=selector)\n return decorated_fn_or_cls", "response": "Wraps fn_or_cls to make a configurable."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef external_configurable(fn_or_cls,\n name=None,\n module=None,\n whitelist=None,\n blacklist=None):\n \"\"\"Allow referencing/configuring an external class or function.\n\n This alerts Gin to the existence of the class or function `fn_or_cls` in the\n event that it can't be easily annotated with `@configurable` (for instance, if\n it is from another project). 
This allows `fn_or_cls` to be configured and\n referenced (using the `@name` notation) via parameter binding strings.\n\n Note that only calls to the return value of this function or resulting from\n references to `fn_or_cls` made through binding strings (configurations) will\n have their parameters injected by Gin---explicit calls to `fn_or_cls` directly\n won't have any parameter bindings applied.\n\n Args:\n fn_or_cls: The external function or class that should be made configurable.\n name: The configurable name to be associated with `fn_or_cls`. The name may\n also include module components to be used for disambiguation (these will\n be appended to any components explicitly specified by `module`).\n module: The module to associate with the configurable, to help handle naming\n collisions. By default, `fn_or_cls.__module__` will be used (if no\n module is specified as part of the name).\n whitelist: A whitelist of parameter names to allow configuration for.\n blacklist: A blacklist of parameter names not to allow configuration for.\n\n Returns:\n A decorated version of `fn_or_cls` that permits parameter binding. For\n functions, this is just a wrapped version of the function. 
For classes, this\n is a carefully constructed subclass of `fn_or_cls` designed to behave nearly\n identically (even under many type inspection operations) save for the\n addition of parameter binding.\n \"\"\"\n return _make_configurable(\n fn_or_cls,\n name=name,\n module=module,\n whitelist=whitelist,\n blacklist=blacklist,\n subclass=True)", "response": "Allow referencing or configuring an external function or class or function."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef operative_config_str(max_line_length=80, continuation_indent=4):\n def format_binding(key, value):\n \"\"\"Pretty print the given key/value pair.\"\"\"\n formatted_val = pprint.pformat(\n value, width=(max_line_length - continuation_indent))\n formatted_val_lines = formatted_val.split('\\n')\n if (len(formatted_val_lines) == 1 and\n len(key + formatted_val) <= max_line_length):\n output = '{} = {}'.format(key, formatted_val)\n else:\n indented_formatted_val = '\\n'.join(\n [' ' * continuation_indent + line for line in formatted_val_lines])\n output = '{} = \\\\\\n{}'.format(key, indented_formatted_val)\n return output\n\n def sort_key(key_tuple):\n \"\"\"Sort configurable selector/innermost scopes, ignoring case.\"\"\"\n scope, selector = key_tuple[0]\n parts = selector.lower().split('.')[::-1] + scope.lower().split('/')[::-1]\n return '/'.join(parts)\n\n # Build the output as an array of formatted Gin statements. Each statement may\n # span multiple lines. 
Imports are first, followed by macros, and finally all\n # other bindings sorted in alphabetical order by configurable name.\n formatted_statements = [\n 'import {}'.format(module) for module in sorted(_IMPORTED_MODULES)\n ]\n if formatted_statements:\n formatted_statements.append('')\n\n macros = {}\n for (scope, selector), config in six.iteritems(_OPERATIVE_CONFIG):\n if _REGISTRY[selector].fn_or_cls == macro:\n macros[scope, selector] = config\n if macros:\n formatted_statements.append('# Macros:')\n formatted_statements.append('# ' + '=' * (max_line_length - 2))\n for (name, _), config in sorted(macros.items(), key=sort_key):\n binding = format_binding(name, config['value'])\n formatted_statements.append(binding)\n if macros:\n formatted_statements.append('')\n\n sorted_items = sorted(_OPERATIVE_CONFIG.items(), key=sort_key)\n for (scope, selector), config in sorted_items:\n configurable_ = _REGISTRY[selector]\n\n fn = configurable_.fn_or_cls\n if fn == macro or fn == _retrieve_constant:\n continue\n\n minimal_selector = _REGISTRY.minimal_selector(configurable_.selector)\n scoped_selector = (scope + '/' if scope else '') + minimal_selector\n parameters = [(k, v) for k, v in six.iteritems(config)\n if _is_literally_representable(v)]\n formatted_statements.append('# Parameters for {}:'.format(scoped_selector))\n formatted_statements.append('# ' + '=' * (max_line_length - 2))\n for arg, val in sorted(parameters):\n binding = format_binding('{}.{}'.format(scoped_selector, arg), val)\n formatted_statements.append(binding)\n if not parameters:\n formatted_statements.append('# None.')\n formatted_statements.append('')\n\n return '\\n'.join(formatted_statements)", "response": "Return the operative configuration string that can be used by a program."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse_config(bindings, skip_unknown=False):\n if isinstance(bindings, (list, tuple)):\n bindings = 
'\\n'.join(bindings)\n\n _validate_skip_unknown(skip_unknown)\n if isinstance(skip_unknown, (list, tuple)):\n skip_unknown = set(skip_unknown)\n\n parser = config_parser.ConfigParser(bindings, ParserDelegate(skip_unknown))\n for statement in parser:\n if isinstance(statement, config_parser.BindingStatement):\n scope, selector, arg_name, value, location = statement\n if not arg_name:\n macro_name = '{}/{}'.format(scope, selector) if scope else selector\n with utils.try_with_location(location):\n bind_parameter((macro_name, 'gin.macro', 'value'), value)\n continue\n if not _should_skip(selector, skip_unknown):\n with utils.try_with_location(location):\n bind_parameter((scope, selector, arg_name), value)\n elif isinstance(statement, config_parser.ImportStatement):\n if skip_unknown:\n try:\n __import__(statement.module)\n _IMPORTED_MODULES.add(statement.module)\n except ImportError:\n log_str = 'Skipping import of unknown module `%s` (skip_unknown=%r).'\n logging.info(log_str, statement.module, skip_unknown)\n else:\n with utils.try_with_location(statement.location):\n __import__(statement.module)\n _IMPORTED_MODULES.add(statement.module)\n elif isinstance(statement, config_parser.IncludeStatement):\n with utils.try_with_location(statement.location):\n parse_config_file(statement.filename, skip_unknown)\n else:\n raise AssertionError('Unrecognized statement type {}.'.format(statement))", "response": "Parses a file string or list of parameter bindings and sets the global configuration of the configurable functions."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef register_file_reader(*args):\n def do_registration(file_reader_fn, is_readable_fn):\n if file_reader_fn not in list(zip(*_FILE_READERS))[0]:\n _FILE_READERS.append((file_reader_fn, is_readable_fn))\n\n if len(args) == 1: # It's a decorator.\n return functools.partial(do_registration, is_readable_fn=args[0])\n elif len(args) == 2:\n 
do_registration(*args)\n else: # 0 or > 2 arguments supplied.\n err_str = 'register_file_reader() takes 1 or 2 arguments ({} given)'\n raise TypeError(err_str.format(len(args)))", "response": "Register a file reader for use in parse_config_file."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nparse a Gin config file.", "response": "def parse_config_file(config_file, skip_unknown=False):\n \"\"\"Parse a Gin config file.\n\n Args:\n config_file: The path to a Gin config file.\n skip_unknown: A boolean indicating whether unknown configurables and imports\n should be skipped instead of causing errors (alternatively a list of\n configurable names to skip if unknown). See `parse_config` for additional\n details.\n\n Raises:\n IOError: If `config_file` cannot be read using any register file reader.\n \"\"\"\n for reader, existence_check in _FILE_READERS:\n if existence_check(config_file):\n with reader(config_file) as f:\n parse_config(f, skip_unknown=skip_unknown)\n return\n raise IOError('Unable to open file: {}'.format(config_file))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nparsing a list of config files followed by extra Gin bindings.", "response": "def parse_config_files_and_bindings(config_files,\n bindings,\n finalize_config=True,\n skip_unknown=False):\n \"\"\"Parse a list of config files followed by extra Gin bindings.\n\n This function is equivalent to:\n\n for config_file in config_files:\n gin.parse_config_file(config_file, skip_configurables)\n gin.parse_config(bindings, skip_configurables)\n if finalize_config:\n gin.finalize()\n\n Args:\n config_files: A list of paths to the Gin config files.\n bindings: A list of individual parameter binding strings.\n finalize_config: Whether to finalize the config after parsing and binding\n (defaults to True).\n skip_unknown: A boolean indicating whether unknown configurables and imports\n should be skipped instead of causing errors 
(alternatively a list of\n configurable names to skip if unknown). See `parse_config` for additional\n details.\n \"\"\"\n if config_files is None:\n config_files = []\n if bindings is None:\n bindings = ''\n for config_file in config_files:\n parse_config_file(config_file, skip_unknown)\n parse_config(bindings, skip_unknown)\n if finalize_config:\n finalize()"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nparses and return a single Gin value.", "response": "def parse_value(value):\n \"\"\"Parse and return a single Gin value.\"\"\"\n if not isinstance(value, six.string_types):\n raise ValueError('value ({}) should be a string type.'.format(value))\n return config_parser.ConfigParser(value, ParserDelegate()).parse_value()"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef finalize():\n if config_is_locked():\n raise RuntimeError('Finalize called twice (config already locked).')\n\n bindings = {}\n for hook in _FINALIZE_HOOKS:\n new_bindings = hook(_CONFIG)\n if new_bindings is not None:\n for key, value in six.iteritems(new_bindings):\n pbk = ParsedBindingKey(key)\n if pbk in bindings:\n err_str = 'Received conflicting updates when running {}.'\n raise ValueError(err_str.format(hook))\n bindings[pbk] = value\n\n for pbk, value in six.iteritems(bindings):\n bind_parameter(pbk, value)\n\n _set_config_is_locked(True)", "response": "A function that should be called after parsing all Gin config files."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nprovides an iterator over all values in a nested structure.", "response": "def _iterate_flattened_values(value):\n \"\"\"Provides an iterator over all values in a nested structure.\"\"\"\n if isinstance(value, six.string_types):\n yield value\n return\n\n if isinstance(value, collections.Mapping):\n value = collections.ValuesView(value)\n\n if isinstance(value, 
collections.Iterable):\n for nested_value in value:\n for nested_nested_value in _iterate_flattened_values(nested_value):\n yield nested_nested_value\n\n yield value"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns an iterator over all the references in the given config.", "response": "def iterate_references(config, to=None):\n \"\"\"Provides an iterator over references in the given config.\n\n Args:\n config: A dictionary mapping scoped configurable names to argument bindings.\n to: If supplied, only yield references whose `configurable_fn` matches `to`.\n\n Yields:\n `ConfigurableReference` instances within `config`, maybe restricted to those\n matching the `to` parameter if it is supplied.\n \"\"\"\n for value in _iterate_flattened_values(config):\n if isinstance(value, ConfigurableReference):\n if to is None or value.configurable.fn_or_cls == to:\n yield value"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating a constant that can be referenced from gin config files.", "response": "def constant(name, value):\n \"\"\"Creates a constant that can be referenced from gin config files.\n\n After calling this function in Python, the constant can be referenced from\n within a Gin config file using the macro syntax. For example, in Python:\n\n gin.constant('THE_ANSWER', 42)\n\n Then, in a Gin config file:\n\n meaning.of_life = %THE_ANSWER\n\n Note that any Python object can be used as the value of a constant (including\n objects not representable as Gin literals). Values will be stored until\n program termination in a Gin-internal dictionary, so avoid creating constants\n with values that should have a limited lifetime.\n\n Optionally, a disambiguating module may be prefixed onto the constant\n name. 
For instance:\n\n gin.constant('some.modules.PI', 3.14159)\n\n Args:\n name: The name of the constant, possibly prepended by one or more\n disambiguating module components separated by periods. An macro with this\n name (including the modules) will be created.\n value: The value of the constant. This can be anything (including objects\n not representable as Gin literals). The value will be stored and returned\n whenever the constant is referenced.\n\n Raises:\n ValueError: If the constant's selector is invalid, or a constant with the\n given selector already exists.\n \"\"\"\n if not config_parser.MODULE_RE.match(name):\n raise ValueError(\"Invalid constant selector '{}'.\".format(name))\n\n if _CONSTANTS.matching_selectors(name):\n err_str = \"Constants matching selector '{}' already exist ({}).\"\n raise ValueError(err_str.format(name, _CONSTANTS.matching_selectors(name)))\n\n _CONSTANTS[name] = value"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef find_unknown_references_hook(config):\n additional_msg_fmt = \" In binding for '{}'.\"\n for (scope, selector), param_bindings in six.iteritems(config):\n for param_name, param_value in six.iteritems(param_bindings):\n for maybe_unknown in _iterate_flattened_values(param_value):\n if isinstance(maybe_unknown, _UnknownConfigurableReference):\n scope_str = scope + '/' if scope else ''\n min_selector = _REGISTRY.minimal_selector(selector)\n binding_key = '{}{}.{}'.format(scope_str, min_selector, param_name)\n additional_msg = additional_msg_fmt.format(binding_key)\n _raise_unknown_reference_error(maybe_unknown, additional_msg)", "response": "Hook to find unknown configurables."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nretrieve all selectors matching the given partial_selector.", "response": "def matching_selectors(self, partial_selector):\n \"\"\"Retrieves all selectors matching `partial_selector`.\n\n For 
instance, if \"one.a.b\" and \"two.a.b\" are stored in a `SelectorMap`, both\n `matching_selectors('b')` and `matching_selectors('a.b')` will return them.\n\n In the event that `partial_selector` exactly matches an existing complete\n selector, only that complete selector is returned. For instance, if\n \"a.b.c.d\" and \"c.d\" are stored, `matching_selectors('c.d')` will return only\n `['c.d']`, while `matching_selectors('d')` will return both.\n\n Args:\n partial_selector: The partial selector to find matches for.\n\n Returns:\n A list of selectors matching `partial_selector`.\n \"\"\"\n if partial_selector in self._selector_map:\n return [partial_selector]\n\n selector_components = partial_selector.split('.')\n node = self._selector_tree\n\n for component in reversed(selector_components):\n if component not in node:\n return []\n node = node[component]\n\n selectors = []\n dfs_stack = [node]\n while dfs_stack:\n node = dfs_stack.pop().copy()\n selector = node.pop(_TERMINAL_KEY, None)\n dfs_stack.extend(node.values())\n if selector:\n selectors.append(selector)\n\n return selectors"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget a value matching the given partial_selector.", "response": "def get_match(self, partial_selector, default=None):\n \"\"\"Gets a (single) value matching `partial_selector`.\n\n If the partial_selector exactly matches a complete selector, the value\n associated with the complete selector is returned.\n\n Args:\n partial_selector: The partial selector to find values for.\n default: A default value to return if nothing matches `partial_selector`.\n\n Returns:\n The value associated with `partial_selector` if it exists, else `default`.\n\n Raises:\n KeyError: If `partial_selector` matches more than one selector in the map.\n \"\"\"\n matching_selectors = self.matching_selectors(partial_selector)\n if not matching_selectors:\n return default\n if len(matching_selectors) > 1:\n err_str = \"Ambiguous selector '{}', 
matches {}.\"\n raise KeyError(err_str.format(partial_selector, matching_selectors))\n return self._selector_map[matching_selectors[0]]"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_all_matches(self, partial_selector):\n matching_selectors = self.matching_selectors(partial_selector)\n return [self._selector_map[selector] for selector in matching_selectors]", "response": "Returns all values matching partial_selector as a list."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the minimal selector that uniquely matches the given complete_selector.", "response": "def minimal_selector(self, complete_selector):\n \"\"\"Returns the minimal selector that uniquely matches `complete_selector`.\n\n Args:\n complete_selector: A complete selector stored in the map.\n\n Returns:\n A partial selector that unambiguously matches `complete_selector`.\n\n Raises:\n KeyError: If `complete_selector` is not in the map.\n \"\"\"\n if complete_selector not in self._selector_map:\n raise KeyError(\"No value with selector '{}'.\".format(complete_selector))\n\n selector_components = complete_selector.split('.')\n node = self._selector_tree\n\n start = None\n for i, component in enumerate(reversed(selector_components)):\n if len(node) == 1:\n if start is None:\n start = -i # Negative index, since we're iterating in reverse.\n else:\n start = None\n node = node[component]\n\n if len(node) > 1: # The selector is a substring of another selector.\n return complete_selector\n return '.'.join(selector_components[start:])"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef sp_search_query(query):\n\n result = []\n\n for (field, values) in query.items():\n field = SEARCH_FIELD_MAP.get(field, field)\n if field is None:\n continue\n\n for value in values:\n if field == 'year':\n value = _transform_year(value)\n if value is not None:\n 
result.append('%s:%d' % (field, value))\n elif field == 'any':\n result.append('\"%s\"' % value)\n else:\n result.append('%s:\"%s\"' % (field, value))\n\n return ' '.join(result)", "response": "Translate a Mopidy search query to a Spotify search query"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _parse_retry_after(self, response):\n value = response.headers.get('Retry-After')\n\n if not value:\n seconds = 0\n elif re.match(r'^\\s*[0-9]+\\s*$', value):\n seconds = int(value)\n else:\n date_tuple = email.utils.parsedate(value)\n if date_tuple is None:\n seconds = 0\n else:\n seconds = time.mktime(date_tuple) - time.time()\n return max(0, seconds)", "response": "Parse Retry - After header from response."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nvalidate new property value before setting it.", "response": "def validate_value(self, value):\n \"\"\"\n Validate new property value before setting it.\n\n value -- New value\n \"\"\"\n if 'readOnly' in self.metadata and self.metadata['readOnly']:\n raise PropertyError('Read-only property')\n\n try:\n validate(value, self.metadata)\n except ValidationError:\n raise PropertyError('Invalid property value')"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef as_property_description(self):\n description = deepcopy(self.metadata)\n\n if 'links' not in description:\n description['links'] = []\n\n description['links'].append(\n {\n 'rel': 'property',\n 'href': self.href_prefix + self.href,\n }\n )\n return description", "response": "Returns the property description as a dictionary describing the property."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef set_value(self, value):\n self.validate_value(value)\n self.value.set(value)", "response": "Set the current value of the property."} {"SOURCE": "codesearchnet", 
"instruction": "Implement a Python 3 function for\ngetting the thing at the given index.", "response": "def get_thing(self, idx):\n \"\"\"\n Get the thing at the given index.\n\n idx -- the index\n \"\"\"\n try:\n idx = int(idx)\n except ValueError:\n return None\n\n if idx < 0 or idx >= len(self.things):\n return None\n\n return self.things[idx]"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ninitializes the object with the given list of Things managed by this server.", "response": "def initialize(self, things, hosts):\n \"\"\"\n Initialize the handler.\n\n things -- list of Things managed by this server\n hosts -- list of allowed hostnames\n \"\"\"\n self.things = things\n self.hosts = hosts"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nset the default headers for all requests.", "response": "def set_default_headers(self, *args, **kwargs):\n \"\"\"Set the default headers for all requests.\"\"\"\n self.set_header('Access-Control-Allow-Origin', '*')\n self.set_header('Access-Control-Allow-Headers',\n 'Origin, X-Requested-With, Content-Type, Accept')\n self.set_header('Access-Control-Allow-Methods',\n 'GET, HEAD, PUT, POST, DELETE')"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nhandling a GET request.", "response": "def get(self):\n \"\"\"\n Handle a GET request.\n\n property_name -- the name of the property from the URL path\n \"\"\"\n self.set_header('Content-Type', 'application/json')\n ws_href = '{}://{}'.format(\n 'wss' if self.request.protocol == 'https' else 'ws',\n self.request.headers.get('Host', '')\n )\n\n descriptions = []\n for thing in self.things.get_things():\n description = thing.as_thing_description()\n description['links'].append({\n 'rel': 'alternate',\n 'href': '{}{}'.format(ws_href, thing.get_href()),\n })\n descriptions.append(description)\n\n self.write(json.dumps(descriptions))"} {"SOURCE": "codesearchnet", "instruction": "Can you 
implement a function in Python 3 that\nhandles a GET request including websocket requests.", "response": "def get(self, thing_id='0'):\n \"\"\"\n Handle a GET request, including websocket requests.\n\n thing_id -- ID of the thing this request is for\n \"\"\"\n self.thing = self.get_thing(thing_id)\n if self.thing is None:\n self.set_status(404)\n self.finish()\n return\n\n if self.request.headers.get('Upgrade', '').lower() == 'websocket':\n yield tornado.websocket.WebSocketHandler.get(self)\n return\n\n self.set_header('Content-Type', 'application/json')\n ws_href = '{}://{}'.format(\n 'wss' if self.request.protocol == 'https' else 'ws',\n self.request.headers.get('Host', '')\n )\n\n description = self.thing.as_thing_description()\n description['links'].append({\n 'rel': 'alternate',\n 'href': '{}{}'.format(ws_href, self.thing.get_href()),\n })\n\n self.write(json.dumps(description))\n self.finish()"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nhandles an incoming message.", "response": "def on_message(self, message):\n \"\"\"\n Handle an incoming message.\n\n message -- message to handle\n \"\"\"\n try:\n message = json.loads(message)\n except ValueError:\n try:\n self.write_message(json.dumps({\n 'messageType': 'error',\n 'data': {\n 'status': '400 Bad Request',\n 'message': 'Parsing request failed',\n },\n }))\n except tornado.websocket.WebSocketClosedError:\n pass\n\n return\n\n if 'messageType' not in message or 'data' not in message:\n try:\n self.write_message(json.dumps({\n 'messageType': 'error',\n 'data': {\n 'status': '400 Bad Request',\n 'message': 'Invalid message',\n },\n }))\n except tornado.websocket.WebSocketClosedError:\n pass\n\n return\n\n msg_type = message['messageType']\n if msg_type == 'setProperty':\n for property_name, property_value in message['data'].items():\n try:\n self.thing.set_property(property_name, property_value)\n except PropertyError as e:\n self.write_message(json.dumps({\n 
'messageType': 'error',\n 'data': {\n 'status': '400 Bad Request',\n 'message': str(e),\n },\n }))\n elif msg_type == 'requestAction':\n for action_name, action_params in message['data'].items():\n input_ = None\n if 'input' in action_params:\n input_ = action_params['input']\n\n action = self.thing.perform_action(action_name, input_)\n if action:\n tornado.ioloop.IOLoop.current().spawn_callback(\n perform_action,\n action,\n )\n else:\n self.write_message(json.dumps({\n 'messageType': 'error',\n 'data': {\n 'status': '400 Bad Request',\n 'message': 'Invalid action request',\n 'request': message,\n },\n }))\n elif msg_type == 'addEventSubscription':\n for event_name in message['data'].keys():\n self.thing.add_event_subscriber(event_name, self)\n else:\n try:\n self.write_message(json.dumps({\n 'messageType': 'error',\n 'data': {\n 'status': '400 Bad Request',\n 'message': 'Unknown messageType: ' + msg_type,\n 'request': message,\n },\n }))\n except tornado.websocket.WebSocketClosedError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get(self, thing_id='0', property_name=None):\n thing = self.get_thing(thing_id)\n if thing is None:\n self.set_status(404)\n return\n\n if thing.has_property(property_name):\n self.set_header('Content-Type', 'application/json')\n self.write(json.dumps({\n property_name: thing.get_property(property_name),\n }))\n else:\n self.set_status(404)", "response": "Handle a GET request."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nhandles a PUT request for a specific thing.", "response": "def put(self, thing_id='0', property_name=None):\n \"\"\"\n Handle a PUT request.\n\n thing_id -- ID of the thing this request is for\n property_name -- the name of the property from the URL path\n \"\"\"\n thing = self.get_thing(thing_id)\n if thing is None:\n self.set_status(404)\n return\n\n try:\n args = 
json.loads(self.request.body.decode())\n except ValueError:\n self.set_status(400)\n return\n\n if property_name not in args:\n self.set_status(400)\n return\n\n if thing.has_property(property_name):\n try:\n thing.set_property(property_name, args[property_name])\n except PropertyError:\n self.set_status(400)\n return\n\n self.set_header('Content-Type', 'application/json')\n self.write(json.dumps({\n property_name: thing.get_property(property_name),\n }))\n else:\n self.set_status(404)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nhandling a POST request.", "response": "def post(self, thing_id='0'):\n \"\"\"\n Handle a POST request.\n\n thing_id -- ID of the thing this request is for\n \"\"\"\n thing = self.get_thing(thing_id)\n if thing is None:\n self.set_status(404)\n return\n\n try:\n message = json.loads(self.request.body.decode())\n except ValueError:\n self.set_status(400)\n return\n\n response = {}\n for action_name, action_params in message.items():\n input_ = None\n if 'input' in action_params:\n input_ = action_params['input']\n\n action = thing.perform_action(action_name, input_)\n if action:\n response.update(action.as_action_description())\n\n # Start the action\n tornado.ioloop.IOLoop.current().spawn_callback(\n perform_action,\n action,\n )\n\n self.set_status(201)\n self.write(json.dumps(response))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get(self, thing_id='0', action_name=None, action_id=None):\n thing = self.get_thing(thing_id)\n if thing is None:\n self.set_status(404)\n return\n\n action = thing.get_action(action_name, action_id)\n if action is None:\n self.set_status(404)\n return\n\n self.set_header('Content-Type', 'application/json')\n self.write(json.dumps(action.as_action_description()))", "response": "Handle a GET request."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef 
put(self, thing_id='0', action_name=None, action_id=None):\n thing = self.get_thing(thing_id)\n if thing is None:\n self.set_status(404)\n return\n\n self.set_status(200)", "response": "Handle a PUT request."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef delete(self, thing_id='0', action_name=None, action_id=None):\n thing = self.get_thing(thing_id)\n if thing is None:\n self.set_status(404)\n return\n\n if thing.remove_action(action_name, action_id):\n self.set_status(204)\n else:\n self.set_status(404)", "response": "Handle a DELETE request."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get(self, thing_id='0'):\n thing = self.get_thing(thing_id)\n if thing is None:\n self.set_status(404)\n return\n\n self.set_header('Content-Type', 'application/json')\n self.write(json.dumps(thing.get_event_descriptions()))", "response": "Handle a GET request."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef start(self):\n self.service_info = ServiceInfo(\n '_webthing._tcp.local.',\n '{}._webthing._tcp.local.'.format(self.name),\n address=socket.inet_aton(get_ip()),\n port=self.port,\n properties={\n 'path': '/',\n },\n server='{}.local.'.format(socket.gethostname()))\n self.zeroconf = Zeroconf()\n self.zeroconf.register_service(self.service_info)\n\n self.server.listen(self.port)\n tornado.ioloop.IOLoop.current().start()", "response": "Start listening for incoming connections."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef stop(self):\n self.zeroconf.unregister_service(self.service_info)\n self.zeroconf.close()\n self.server.stop()", "response": "Stop listening for new services."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef as_action_description(self):\n description = 
{\n self.name: {\n 'href': self.href_prefix + self.href,\n 'timeRequested': self.time_requested,\n 'status': self.status,\n },\n }\n\n if self.input is not None:\n description[self.name]['input'] = self.input\n\n if self.time_completed is not None:\n description[self.name]['timeCompleted'] = self.time_completed\n\n return description", "response": "Returns the action description as a dictionary."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef start(self):\n self.status = 'pending'\n self.thing.action_notify(self)\n self.perform_action()\n self.finish()", "response": "Start performing the action."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef finish(self):\n self.status = 'completed'\n self.time_completed = timestamp()\n self.thing.action_notify(self)", "response": "Finish performing the action."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the event description as a dictionary.", "response": "def as_event_description(self):\n \"\"\"\n Get the event description.\n\n Returns a dictionary describing the event.\n \"\"\"\n description = {\n self.name: {\n 'timestamp': self.time,\n },\n }\n\n if self.data is not None:\n description[self.name]['data'] = self.data\n\n return description"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_ip():\n s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)\n try:\n s.connect(('10.255.255.255', 1))\n ip = s.getsockname()[0]\n except (socket.error, IndexError):\n ip = '127.0.0.1'\n finally:\n s.close()\n\n return ip", "response": "Get the default IP address."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget all IP addresses.", "response": "def get_addresses():\n \"\"\"\n Get all IP addresses.\n\n Returns list of addresses.\n \"\"\"\n addresses = set()\n\n for iface 
in ifaddr.get_adapters():\n for addr in iface.ips:\n # Filter out link-local addresses.\n if addr.is_IPv4:\n ip = addr.ip\n\n if not ip.startswith('169.254.'):\n addresses.add(ip)\n elif addr.is_IPv6:\n # Sometimes, IPv6 addresses will have the interface name\n # appended, e.g. %eth0. Handle that.\n ip = addr.ip[0].split('%')[0].lower()\n\n if not ip.startswith('fe80:'):\n addresses.add('[{}]'.format(ip))\n\n return sorted(list(addresses))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef set(self, value):\n if self.value_forwarder is not None:\n self.value_forwarder(value)\n\n self.notify_of_external_update(value)", "response": "Set a new value for this thing."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef notify_of_external_update(self, value):\n if value is not None and value != self.last_value:\n self.last_value = value\n self.emit('update', value)", "response": "Notify observers of a new value."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef as_thing_description(self):\n thing = {\n 'name': self.name,\n 'href': self.href_prefix if self.href_prefix else '/',\n '@context': self.context,\n '@type': self.type,\n 'properties': self.get_property_descriptions(),\n 'actions': {},\n 'events': {},\n 'links': [\n {\n 'rel': 'properties',\n 'href': '{}/properties'.format(self.href_prefix),\n },\n {\n 'rel': 'actions',\n 'href': '{}/actions'.format(self.href_prefix),\n },\n {\n 'rel': 'events',\n 'href': '{}/events'.format(self.href_prefix),\n },\n ],\n }\n\n for name, action in self.available_actions.items():\n thing['actions'][name] = action['metadata']\n thing['actions'][name]['links'] = [\n {\n 'rel': 'action',\n 'href': '{}/actions/{}'.format(self.href_prefix, name),\n },\n ]\n\n for name, event in self.available_events.items():\n thing['events'][name] = event['metadata']\n 
thing['events'][name]['links'] = [\n {\n 'rel': 'event',\n 'href': '{}/events/{}'.format(self.href_prefix, name),\n },\n ]\n\n if self.ui_href is not None:\n thing['links'].append({\n 'rel': 'alternate',\n 'mediaType': 'text/html',\n 'href': self.ui_href,\n })\n\n if self.description:\n thing['description'] = self.description\n\n return thing", "response": "Returns the state as a Thing Description."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef set_href_prefix(self, prefix):\n self.href_prefix = prefix\n\n for property_ in self.properties.values():\n property_.set_href_prefix(prefix)\n\n for action_name in self.actions.keys():\n for action in self.actions[action_name]:\n action.set_href_prefix(prefix)", "response": "Sets the prefix of any hrefs associated with this thing."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget the properties as a dictionary.", "response": "def get_property_descriptions(self):\n \"\"\"\n Get the thing's properties as a dictionary.\n\n Returns the properties as a dictionary, i.e. 
name -> description.\n \"\"\"\n return {k: v.as_property_description()\n for k, v in self.properties.items()}"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_action_descriptions(self, action_name=None):\n descriptions = []\n\n if action_name is None:\n for name in self.actions:\n for action in self.actions[name]:\n descriptions.append(action.as_action_description())\n elif action_name in self.actions:\n for action in self.actions[action_name]:\n descriptions.append(action.as_action_description())\n\n return descriptions", "response": "Returns the list of action descriptions for the thing."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_event_descriptions(self, event_name=None):\n if event_name is None:\n return [e.as_event_description() for e in self.events]\n else:\n return [e.as_event_description()\n for e in self.events if e.get_name() == event_name]", "response": "Returns the event descriptions for the given event name."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nadd a property to this thing.", "response": "def add_property(self, property_):\n \"\"\"\n Add a property to this thing.\n\n property_ -- property to add\n \"\"\"\n property_.set_href_prefix(self.href_prefix)\n self.properties[property_.name] = property_"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef remove_property(self, property_):\n if property_.name in self.properties:\n del self.properties[property_.name]", "response": "Removes a property from this thing."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_property(self, property_name):\n prop = self.find_property(property_name)\n if prop:\n return prop.get_value()\n\n return None", "response": "Get a property s value."} {"SOURCE": 
"codesearchnet", "instruction": "Write a Python 3 script to\nget a mapping of all properties and their values.", "response": "def get_properties(self):\n \"\"\"\n Get a mapping of all properties and their values.\n\n Returns a dictionary of property_name -> value.\n \"\"\"\n return {prop.get_name(): prop.get_value()\n for prop in self.properties.values()}"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsetting a property value.", "response": "def set_property(self, property_name, value):\n \"\"\"\n Set a property value.\n\n property_name -- name of the property to set\n value -- value to set\n \"\"\"\n prop = self.find_property(property_name)\n if not prop:\n return\n\n prop.set_value(value)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_action(self, action_name, action_id):\n if action_name not in self.actions:\n return None\n\n for action in self.actions[action_name]:\n if action.id == action_id:\n return action\n\n return None", "response": "Get an action from the list of available actions."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef add_event(self, event):\n self.events.append(event)\n self.event_notify(event)", "response": "Add a new event and notify subscribers."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef add_available_event(self, name, metadata):\n if metadata is None:\n metadata = {}\n\n self.available_events[name] = {\n 'metadata': metadata,\n 'subscribers': set(),\n }", "response": "Add an available event."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef perform_action(self, action_name, input_=None):\n if action_name not in self.available_actions:\n return None\n\n action_type = self.available_actions[action_name]\n\n if 'input' in 
action_type['metadata']:\n try:\n validate(input_, action_type['metadata']['input'])\n except ValidationError:\n return None\n\n action = action_type['class'](self, input_=input_)\n action.set_href_prefix(self.href_prefix)\n self.action_notify(action)\n self.actions[action_name].append(action)\n return action", "response": "Perform an action on the thing."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nremove an existing action.", "response": "def remove_action(self, action_name, action_id):\n \"\"\"\n Remove an existing action.\n\n action_name -- name of the action\n action_id -- ID of the action\n\n Returns a boolean indicating the presence of the action.\n \"\"\"\n action = self.get_action(action_name, action_id)\n if action is None:\n return False\n\n action.cancel()\n self.actions[action_name].remove(action)\n return True"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nadding an available action.", "response": "def add_available_action(self, name, metadata, cls):\n \"\"\"\n Add an available action.\n\n name -- name of the action\n metadata -- action metadata, i.e. 
type, description, etc., as a dict\n cls -- class to instantiate for this action\n \"\"\"\n if metadata is None:\n metadata = {}\n\n self.available_actions[name] = {\n 'metadata': metadata,\n 'class': cls,\n }\n self.actions[name] = []"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef remove_subscriber(self, ws):\n if ws in self.subscribers:\n self.subscribers.remove(ws)\n\n for name in self.available_events:\n self.remove_event_subscriber(name, ws)", "response": "Remove a websocket subscriber.\n\n ws -- the websocket"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nadds a new websocket subscriber to an event.", "response": "def add_event_subscriber(self, name, ws):\n \"\"\"\n Add a new websocket subscriber to an event.\n\n name -- name of the event\n ws -- the websocket\n \"\"\"\n if name in self.available_events:\n self.available_events[name]['subscribers'].add(ws)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nremoving a websocket subscriber from an event.", "response": "def remove_event_subscriber(self, name, ws):\n \"\"\"\n Remove a websocket subscriber from an event.\n\n name -- name of the event\n ws -- the websocket\n \"\"\"\n if name in self.available_events and \\\n ws in self.available_events[name]['subscribers']:\n self.available_events[name]['subscribers'].remove(ws)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef property_notify(self, property_):\n message = json.dumps({\n 'messageType': 'propertyStatus',\n 'data': {\n property_.name: property_.get_value(),\n }\n })\n\n for subscriber in list(self.subscribers):\n try:\n subscriber.write_message(message)\n except tornado.websocket.WebSocketClosedError:\n pass", "response": "Notify all subscribers of a property change."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nnotify all subscribers of 
an action status change.", "response": "def action_notify(self, action):\n \"\"\"\n Notify all subscribers of an action status change.\n\n action -- the action whose status changed\n \"\"\"\n message = json.dumps({\n 'messageType': 'actionStatus',\n 'data': action.as_action_description(),\n })\n\n for subscriber in list(self.subscribers):\n try:\n subscriber.write_message(message)\n except tornado.websocket.WebSocketClosedError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nnotify all subscribers of an event.", "response": "def event_notify(self, event):\n \"\"\"\n Notify all subscribers of an event.\n\n event -- the event that occurred\n \"\"\"\n if event.name not in self.available_events:\n return\n\n message = json.dumps({\n 'messageType': 'event',\n 'data': event.as_event_description(),\n })\n\n for subscriber in self.available_events[event.name]['subscribers']:\n try:\n subscriber.write_message(message)\n except tornado.websocket.WebSocketClosedError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef update(self, **fields):\n\n # build up the query to execute\n self._for_write = True\n if django.VERSION >= (2, 0):\n query = self.query.chain(UpdateQuery)\n else:\n query = self.query.clone(UpdateQuery)\n query._annotations = None\n query.add_update_values(fields)\n\n # build the compiler for for the query\n connection = django.db.connections[self.db]\n compiler = PostgresReturningUpdateCompiler(query, connection, self.db)\n\n # execute the query\n with transaction.atomic(using=self.db, savepoint=False):\n rows = compiler.execute_sql(CURSOR)\n self._result_cache = None\n\n # send out a signal for each row\n for row in rows:\n signals.update.send(self.model, pk=row[0])\n\n # the original update(..) returns the amount of rows\n # affected, let's do the same\n return len(rows)", "response": "Updates all rows that match the filter. 
Returns the number of rows affected."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nset the action to take when attempting to insert or create a new row.", "response": "def on_conflict(self, fields: List[Union[str, Tuple[str]]], action, index_predicate: str=None):\n \"\"\"Sets the action to take when conflicts arise when attempting\n to insert/create a new row.\n\n Arguments:\n fields:\n The fields the conflicts can occur in.\n\n action:\n The action to take when the conflict occurs.\n\n index_predicate:\n The index predicate to satisfy an arbiter partial index (i.e. what partial index to use for checking\n conflicts)\n \"\"\"\n\n self.conflict_target = fields\n self.conflict_action = action\n self.index_predicate = index_predicate\n\n return self"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating multiple new records in the database.", "response": "def bulk_insert(self, rows, return_model=False):\n \"\"\"Creates multiple new records in the database.\n\n This allows specifying custom conflict behavior using .on_conflict().\n If no special behavior was specified, this uses the normal Django create(..)\n\n Arguments:\n rows:\n An array of dictionaries, where each dictionary\n describes the fields to insert.\n\n return_model (default: False):\n If model instances should be returned rather than\n just dicts.\n\n Returns:\n A list of either the dicts of the rows inserted, including the pk or\n the models of the rows inserted with defaults for any fields not specified\n \"\"\"\n\n if self.conflict_target or self.conflict_action:\n compiler = self._build_insert_compiler(rows)\n objs = compiler.execute_sql(return_id=True)\n if return_model:\n return [self.model(**dict(r, **k)) for r, k in zip(rows, objs)]\n else:\n return [dict(r, **k) for r, k in zip(rows, objs)]\n\n # no special action required, use the standard Django bulk_create(..)\n return super().bulk_create([self.model(**fields) for 
fields in rows])"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef insert(self, **fields):\n\n if self.conflict_target or self.conflict_action:\n compiler = self._build_insert_compiler([fields])\n rows = compiler.execute_sql(return_id=True)\n\n pk_field_name = self.model._meta.pk.name\n return rows[0][pk_field_name]\n\n # no special action required, use the standard Django create(..)\n return super().create(**fields).pk", "response": "Creates a new record in the database."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef insert_and_get(self, **fields):\n\n if not self.conflict_target and not self.conflict_action:\n # no special action required, use the standard Django create(..)\n return super().create(**fields)\n\n compiler = self._build_insert_compiler([fields])\n rows = compiler.execute_sql(return_id=False)\n\n columns = rows[0]\n\n # get a list of columns that are officially part of the model and preserve the fact that the attribute name\n # might be different than the database column name\n model_columns = {}\n for field in self.model._meta.local_concrete_fields:\n model_columns[field.column] = field.attname\n\n # strip out any columns/fields returned by the db that\n # are not present in the model\n model_init_fields = {}\n for column_name, column_value in columns.items():\n try:\n model_init_fields[model_columns[column_name]] = column_value\n except KeyError:\n pass\n\n return self.model(**model_init_fields)", "response": "Creates a new record in the database and then gets the entire record."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating a new record or updates an existing one.", "response": "def upsert(self, conflict_target: List, fields: Dict, index_predicate: str=None) -> int:\n \"\"\"Creates a new record or updates the existing one\n with the specified data.\n\n Arguments:\n 
conflict_target:\n Fields to pass into the ON CONFLICT clause.\n\n fields:\n Fields to insert/update.\n\n index_predicate:\n The index predicate to satisfy an arbiter partial index (i.e. what partial index to use for checking\n conflicts)\n\n Returns:\n The primary key of the row that was created/updated.\n \"\"\"\n\n self.on_conflict(conflict_target, ConflictAction.UPDATE, index_predicate)\n return self.insert(**fields)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncreating a new record or updates the existing one and then gets the row.", "response": "def upsert_and_get(self, conflict_target: List, fields: Dict, index_predicate: str=None):\n \"\"\"Creates a new record or updates the existing one\n with the specified data and then gets the row.\n\n Arguments:\n conflict_target:\n Fields to pass into the ON CONFLICT clause.\n\n fields:\n Fields to insert/update.\n\n index_predicate:\n The index predicate to satisfy an arbiter partial index (i.e. what partial index to use for checking\n conflicts)\n\n Returns:\n The model instance representing the row\n that was created/updated.\n \"\"\"\n\n self.on_conflict(conflict_target, ConflictAction.UPDATE, index_predicate)\n return self.insert_and_get(**fields)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef bulk_upsert(self, conflict_target: List, rows: List[Dict], index_predicate: str=None):\n\n if not rows or len(rows) <= 0:\n return\n\n self.on_conflict(conflict_target, ConflictAction.UPDATE, index_predicate)\n return self.bulk_insert(rows)", "response": "Bulk upserts the specified data, creating new records or updating the existing ones."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _build_insert_compiler(self, rows: List[Dict]):\n\n # create model objects, we also have to detect cases\n # such as:\n # [dict(first_name='swen'), dict(first_name='swen', last_name='kooij')]\n # 
we need to be certain that each row specifies the exact same\n # number of fields/columns\n objs = []\n field_count = len(rows[0])\n for index, row in enumerate(rows):\n if field_count != len(row):\n raise SuspiciousOperation((\n 'In bulk upserts, you cannot have rows with different field '\n 'configurations. Row {0} has a different field config than '\n 'the first row.'\n ).format(index))\n\n objs.append(self.model(**row))\n\n # indicate this query is going to perform a write\n self._for_write = True\n\n # get the fields to be used during update/insert\n insert_fields, update_fields = self._get_upsert_fields(rows[0])\n\n # build a normal insert query\n query = PostgresInsertQuery(self.model)\n query.conflict_action = self.conflict_action\n query.conflict_target = self.conflict_target\n query.index_predicate = self.index_predicate\n query.values(objs, insert_fields, update_fields)\n\n # use the postgresql insert query compiler to transform the insert\n # into a special postgresql insert\n connection = django.db.connections[self.db]\n compiler = PostgresInsertCompiler(query, connection, self.db)\n\n return compiler", "response": "Builds the SQL compiler for an insert query."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _is_magical_field(self, model_instance, field, is_insert: bool):\n\n # does this field modify something upon insert?\n old_value = getattr(model_instance, field.name, None)\n field.pre_save(model_instance, is_insert)\n new_value = getattr(model_instance, field.name, None)\n\n return old_value != new_value", "response": "Verifies whether this field is magical."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget the fields to use in an upsert. This is some nice magic. 
We'll split the fields into a group of \"insert fields\" and \"update fields\": INSERT INTO bla (\"val1\", \"val2\") ON CONFLICT DO UPDATE SET val1 = EXCLUDED.val1 ^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^ insert_fields update_fields Often, fields appear in both lists. But, for example, a :see:DateTime field with `auto_now_add=True` set, will only appear in \"insert_fields\", since it won't be set on existing rows. Other than that, the user specifies a list of fields in the upsert() call. That might not be all fields. The user could decide to leave out optional fields. If we end up doing an update, we don't want to overwrite those non-specified fields. We cannot just take the list of fields the user specifies, because as mentioned, some fields make modifications to the model on their own. We'll have to detect which fields make modifications and include them in the list of insert/update fields.", "response": "def _get_upsert_fields(self, kwargs):\n \"\"\"Gets the fields to use in an upsert.\n\n This is some nice magic. We'll split the fields into\n a group of \"insert fields\" and \"update fields\":\n\n INSERT INTO bla (\"val1\", \"val2\") ON CONFLICT DO UPDATE SET val1 = EXCLUDED.val1\n\n ^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^^^^^^\n insert_fields update_fields\n\n Often, fields appear in both lists. But, for example,\n a :see:DateTime field with `auto_now_add=True` set, will\n only appear in \"insert_fields\", since it won't be set\n on existing rows.\n\n Other than that, the user specifies a list of fields\n in the upsert() call. That might not be all fields. The\n user could decide to leave out optional fields. 
If we\n end up doing an update, we don't want to overwrite\n those non-specified fields.\n\n We cannot just take the list of fields the user\n specifies, because as mentioned, some fields\n make modifications to the model on their own.\n\n We'll have to detect which fields make modifications\n and include them in the list of insert/update fields.\n \"\"\"\n\n model_instance = self.model(**kwargs)\n insert_fields = []\n update_fields = []\n\n for field in model_instance._meta.local_concrete_fields:\n has_default = field.default != NOT_PROVIDED\n if (field.name in kwargs or field.column in kwargs):\n insert_fields.append(field)\n update_fields.append(field)\n continue\n elif has_default:\n insert_fields.append(field)\n continue\n\n # special handling for 'pk' which always refers to\n # the primary key, so if the user specifies `pk`\n # instead of a concrete field, we have to handle that\n if field.primary_key is True and 'pk' in kwargs:\n insert_fields.append(field)\n update_fields.append(field)\n continue\n\n if self._is_magical_field(model_instance, field, is_insert=True):\n insert_fields.append(field)\n\n if self._is_magical_field(model_instance, field, is_insert=False):\n update_fields.append(field)\n\n return insert_fields, update_fields"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsets the action to take when attempting to insert or create a new row.", "response": "def on_conflict(self, fields: List[Union[str, Tuple[str]]], action, index_predicate: str=None):\n \"\"\"Sets the action to take when conflicts arise when attempting\n to insert/create a new row.\n\n Arguments:\n fields:\n The fields the conflicts can occur in.\n\n action:\n The action to take when the conflict occurs.\n\n index_predicate:\n The index predicate to satisfy an arbiter partial index.\n \"\"\"\n return self.get_queryset().on_conflict(fields, action, index_predicate)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function 
to\ncreate a new record or updates the existing one with the specified data.", "response": "def upsert(self, conflict_target: List, fields: Dict, index_predicate: str=None) -> int:\n \"\"\"Creates a new record or updates the existing one\n with the specified data.\n\n Arguments:\n conflict_target:\n Fields to pass into the ON CONFLICT clause.\n\n fields:\n Fields to insert/update.\n\n index_predicate:\n The index predicate to satisfy an arbiter partial index.\n\n Returns:\n The primary key of the row that was created/updated.\n \"\"\"\n\n return self.get_queryset().upsert(conflict_target, fields, index_predicate)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef upsert_and_get(self, conflict_target: List, fields: Dict, index_predicate: str=None):\n\n return self.get_queryset().upsert_and_get(conflict_target, fields, index_predicate)", "response": "Creates a new record or updates the existing one\n with the specified data and then gets the row."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef bulk_upsert(self, conflict_target: List, rows: List[Dict], index_predicate: str=None):\n\n return self.get_queryset().bulk_upsert(conflict_target, rows, index_predicate)", "response": "Bulk upserts the specified data into the specified set of records or updates the existing records."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _on_model_save(sender, **kwargs):\n\n created, instance = kwargs['created'], kwargs['instance']\n\n if created:\n signals.create.send(sender, pk=instance.pk)\n else:\n signals.update.send(sender, pk=instance.pk)", "response": "When a model gets created or updated send signals."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _on_model_delete(sender, **kwargs):\n\n instance = kwargs['instance']\n 
signals.delete.send(sender, pk=instance.pk)", "response": "When a model gets deleted send a signal to the user that the related object is deleted."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef IsNotNone(*fields, default=None):\n\n when_clauses = [\n expressions.When(\n ~expressions.Q(**{field: None}),\n then=expressions.F(field)\n )\n for field in reversed(fields)\n ]\n\n return expressions.Case(\n *when_clauses,\n default=expressions.Value(default),\n output_field=CharField()\n )", "response": "Selects whichever field is not None."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nresolves expressions inside the dictionary.", "response": "def resolve_expression(self, *args, **kwargs):\n \"\"\"Resolves expressions inside the dictionary.\"\"\"\n\n result = dict()\n for key, value in self.value.items():\n if hasattr(value, 'resolve_expression'):\n result[key] = value.resolve_expression(\n *args, **kwargs)\n else:\n result[key] = value\n\n return HStoreValue(result)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef as_sql(self, compiler, connection):\n\n result = []\n for key, value in self.value.items():\n if hasattr(value, 'as_sql'):\n sql, params = value.as_sql(compiler, connection)\n result.append('hstore(\\'%s\\', %s)' % (\n key, sql % params))\n elif value is not None:\n result.append('hstore(\\'%s\\', \\'%s\\')' % ((\n key, value)))\n else:\n result.append('hstore(\\'%s\\', NULL)' % key)\n\n return '%s' % ' || '.join(result), []", "response": "Compiles the HStore value into SQL."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncompiles this expression into SQL.", "response": "def as_sql(self, compiler, connection):\n \"\"\"Compiles this expression into SQL.\"\"\"\n\n qn = compiler.quote_name_unless_alias\n return \"%s.%s->'%s'\" % (qn(self.alias), 
qn(self.target.column), self.hstore_key), []"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef relabeled_clone(self, relabels):\n\n return self.__class__(\n relabels.get(self.alias, self.alias),\n self.target,\n self.hstore_key,\n self.output_field\n )", "response": "Gets a re - labeled clone of this expression."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nresolve the expression into a HStoreColumn expression.", "response": "def resolve_expression(self, *args, **kwargs) -> HStoreColumn:\n \"\"\"Resolves the expression into a :see:HStoreColumn expression.\"\"\"\n\n original_expression = super().resolve_expression(*args, **kwargs)\n expression = HStoreColumn(\n original_expression.alias,\n original_expression.target,\n self.key\n )\n return expression"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef as_sql(self, compiler, connection):\n\n sql, params = super().as_sql(compiler, connection)\n return 'EXTRACT(epoch FROM {})'.format(sql), params", "response": "Compiles this expression into SQL."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef rename_annotations(self, annotations) -> None:\n\n for old_name, new_name in annotations.items():\n annotation = self.annotations.get(old_name)\n\n if not annotation:\n raise SuspiciousOperation((\n 'Cannot rename annotation \"{old_name}\" to \"{new_name}\", because there'\n ' is no annotation named \"{old_name}\".'\n ).format(old_name=old_name, new_name=new_name))\n\n self._annotations = OrderedDict(\n [(new_name, v) if k == old_name else (k, v) for k, v in self._annotations.items()])\n\n if django.VERSION < (2, 0):\n self.set_annotation_mask(\n (new_name if v == old_name else v for v in (self.annotation_select_mask or [])))", "response": "Renames the aliases for the specified annotations."} {"SOURCE": "codesearchnet", 
"instruction": "Create a Python 3 function for\nadding an extra condition to an existing JOIN.", "response": "def add_join_conditions(self, conditions: Dict[str, Any]) -> None:\n \"\"\"Adds an extra condition to an existing JOIN.\n\n This allows you to, for example, do:\n\n INNER JOIN othertable ON (mytable.id = othertable.other_id AND [extra conditions])\n\n This does not work unless something else in your query already generates the\n initial join in the first place.\n \"\"\"\n\n alias = self.get_initial_alias()\n opts = self.get_meta()\n\n for name, value in conditions.items():\n parts = name.split(LOOKUP_SEP)\n join_info = self.setup_joins(parts, opts, alias, allow_many=True)\n self.trim_joins(join_info[1], join_info[3], join_info[4])\n\n target_table = join_info[3][-1]\n field = join_info[1][-1]\n join = self.alias_map.get(target_table)\n\n if not join:\n raise SuspiciousOperation((\n 'Cannot add an extra join condition for \"%s\", there\\'s no'\n ' existing join to add it to.'\n ) % target_table)\n\n # convert the Join object into a ConditionalJoin object, which\n # allows us to add the extra condition\n if not isinstance(join, ConditionalJoin):\n self.alias_map[target_table] = ConditionalJoin.from_join(join)\n join = self.alias_map[target_table]\n\n join.add_condition(field, value)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nadd the given field names to the select set.", "response": "def add_fields(self, field_names: List[str], allow_m2m: bool=True) -> bool:\n \"\"\"\n Adds the given (model) fields to the select set. The field names are\n added in the order specified.\n\n This overrides the base class's add_fields method. This is called by\n the .values() or .values_list() method of the query set. It instructs\n the ORM to only select certain values. A lot of processing is necessary\n because it can be used to easily do joins. 
For example, `my_fk__name` pulls\n in the `name` field in foreign key `my_fk`.\n\n In our case, we want to be able to do `title__en`, where `title` is a HStoreField\n and `en` a key. This doesn't really involve a join. We iterate over the specified\n field names and filter out the ones that refer to HStoreField and compile it into\n an expression which is added to the list of to be selected fields using `self.add_select`.\n \"\"\"\n\n alias = self.get_initial_alias()\n opts = self.get_meta()\n cols = []\n for name in field_names:\n parts = name.split(LOOKUP_SEP)\n\n # it cannot be a special hstore thing if there's no __ in it\n if len(parts) > 1:\n column_name, hstore_key = parts[:2]\n is_hstore, field = self._is_hstore_field(column_name)\n if is_hstore:\n cols.append(\n HStoreColumn(self.model._meta.db_table or self.model.name, field, hstore_key)\n )\n continue\n\n join_info = self.setup_joins(parts, opts, alias, allow_many=allow_m2m)\n targets, final_alias, joins = self.trim_joins(\n join_info[1], join_info[3], join_info[4]\n )\n\n for target in targets:\n cols.append(target.get_col(final_alias))\n if cols:\n self.set_select(cols)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _is_hstore_field(self, field_name: str) -> Tuple[bool, Optional[models.Field]]:\n\n field_instance = None\n for field in self.model._meta.local_concrete_fields:\n if field.name == field_name or field.column == field_name:\n field_instance = field\n break\n\n return isinstance(field_instance, HStoreField), field_instance", "response": "Returns whether the field with the specified name is a HStoreField and the HStoreField instance if it is."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef values(self, objs: List, insert_fields: List, update_fields: List=[]):\n\n self.insert_values(insert_fields, objs, raw=False)\n self.update_fields = update_fields", "response": "Sets the values to 
be used in this query."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nruns when a new model is created.", "response": "def create_model(self, model):\n \"\"\"Ran when a new model is created.\"\"\"\n\n for field in model._meta.local_fields:\n if not isinstance(field, HStoreField):\n continue\n\n self.add_field(model, field)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef delete_model(self, model):\n\n for field in model._meta.local_fields:\n if not isinstance(field, HStoreField):\n continue\n\n self.remove_field(model, field)", "response": "Ran when a model is being deleted."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nrun when the name of a model is changed.", "response": "def alter_db_table(self, model, old_db_table, new_db_table):\n \"\"\"Ran when the name of a model is changed.\"\"\"\n\n for field in model._meta.local_fields:\n if not isinstance(field, HStoreField):\n continue\n\n for key in self._iterate_required_keys(field):\n self._rename_hstore_required(\n old_db_table,\n new_db_table,\n field,\n field,\n key\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef add_field(self, model, field):\n\n for key in self._iterate_required_keys(field):\n self._create_hstore_required(\n model._meta.db_table,\n field,\n key\n )", "response": "Ran when a field is added to a model."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nrunning when a field is removed from a model.", "response": "def remove_field(self, model, field):\n \"\"\"Ran when a field is removed from a model.\"\"\"\n\n for key in self._iterate_required_keys(field):\n self._drop_hstore_required(\n model._meta.db_table,\n field,\n key\n )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nrun when the configuration on a field has 
changed.", "response": "def alter_field(self, model, old_field, new_field, strict=False):\n \"\"\"Ran when the configuration on a field changed.\"\"\"\n\n is_old_field_hstore = isinstance(old_field, HStoreField)\n is_new_field_hstore = isinstance(new_field, HStoreField)\n\n if not is_old_field_hstore and not is_new_field_hstore:\n return\n\n old_required = getattr(old_field, 'required', []) or []\n new_required = getattr(new_field, 'required', []) or []\n\n # handle field renames before moving on\n if str(old_field.column) != str(new_field.column):\n for key in self._iterate_required_keys(old_field):\n self._rename_hstore_required(\n model._meta.db_table,\n model._meta.db_table,\n old_field,\n new_field,\n key\n )\n\n # drop the constraints for keys that have been removed\n for key in old_required:\n if key not in new_required:\n self._drop_hstore_required(\n model._meta.db_table,\n old_field,\n key\n )\n\n # create new constraints for keys that have been added\n for key in new_required:\n if key not in old_required:\n self._create_hstore_required(\n model._meta.db_table,\n new_field,\n key\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncreate a REQUIRED CONSTRAINT for the specified hstore key.", "response": "def _create_hstore_required(self, table_name, field, key):\n \"\"\"Creates a REQUIRED CONSTRAINT for the specified hstore key.\"\"\"\n\n name = self._required_constraint_name(\n table_name, field, key)\n\n sql = self.sql_hstore_required_create.format(\n name=self.quote_name(name),\n table=self.quote_name(table_name),\n field=self.quote_name(field.column),\n key=key\n )\n self.execute(sql)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nrenames an existing REQUIRED CONSTRAINT for the specified hstore key.", "response": "def _rename_hstore_required(self, old_table_name, new_table_name,\n old_field, new_field, key):\n \"\"\"Renames an existing REQUIRED CONSTRAINT 
for the specified\n hstore key.\"\"\"\n\n old_name = self._required_constraint_name(\n old_table_name, old_field, key)\n new_name = self._required_constraint_name(\n new_table_name, new_field, key)\n\n sql = self.sql_hstore_required_rename.format(\n table=self.quote_name(new_table_name),\n old_name=self.quote_name(old_name),\n new_name=self.quote_name(new_name)\n )\n self.execute(sql)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ndropping a REQUIRED CONSTRAINT for the specified hstore key.", "response": "def _drop_hstore_required(self, table_name, field, key):\n \"\"\"Drops a REQUIRED CONSTRAINT for the specified hstore key.\"\"\"\n\n name = self._required_constraint_name(\n table_name, field, key)\n\n sql = self.sql_hstore_required_drop.format(\n table=self.quote_name(table_name),\n name=self.quote_name(name)\n )\n self.execute(sql)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _required_constraint_name(table: str, field, key):\n\n return '{table}_{field}_required_{postfix}'.format(\n table=table,\n field=field.column,\n postfix=key\n )", "response": "Gets the name for a CONSTRAINT that applies\n to a single hstore key."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate the actual SQL used when applying the migration.", "response": "def create_sql(self, model, schema_editor, using=''):\n \"\"\"Creates the actual SQL used when applying the migration.\"\"\"\n if django.VERSION >= (2, 0):\n statement = super().create_sql(model, schema_editor, using)\n statement.template = self.sql_create_index\n statement.parts['condition'] = self.condition\n return statement\n else:\n sql_create_index = self.sql_create_index\n sql_parameters = {\n **Index.get_sql_create_template_values(self, model, schema_editor, using),\n 'condition': self.condition\n }\n return sql_create_index % sql_parameters"} {"SOURCE": "codesearchnet", 
"instruction": "How would you explain what the following Python 3 function does\ndef deconstruct(self):\n path = '%s.%s' % (self.__class__.__module__, self.__class__.__name__)\n path = path.replace('django.db.models.indexes', 'django.db.models')\n return path, (), {'fields': self.fields, 'name': self.name, 'condition': self.condition}", "response": "Serializes the : see : ConditionalUniqueIndex for the migrations file."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating a custom setup. py command.", "response": "def create_command(text, commands):\n \"\"\"Creates a custom setup.py command.\"\"\"\n\n class CustomCommand(BaseCommand):\n description = text\n\n def run(self):\n for cmd in commands:\n subprocess.check_call(cmd)\n\n return CustomCommand"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn the base class for the custom database back - end.", "response": "def _get_backend_base():\n \"\"\"Gets the base class for the custom database back-end.\n\n This should be the Django PostgreSQL back-end. However,\n some people are already using a custom back-end from\n another package. 
We are nice people and expose an option\n that allows them to configure the back-end we base upon.\n\n As long as the specified base eventually also has\n the PostgreSQL back-end as a base, then everything should\n work as intended.\n \"\"\"\n base_class_name = getattr(\n settings,\n 'POSTGRES_EXTRA_DB_BACKEND_BASE',\n 'django.db.backends.postgresql'\n )\n\n base_class_module = importlib.import_module(base_class_name + '.base')\n base_class = getattr(base_class_module, 'DatabaseWrapper', None)\n\n if not base_class:\n raise ImproperlyConfigured((\n '\\'%s\\' is not a valid database back-end.'\n ' The module does not define a DatabaseWrapper class.'\n ' Check the value of POSTGRES_EXTRA_DB_BACKEND_BASE.'\n ) % base_class_name)\n\n if not issubclass(base_class, Psycopg2DatabaseWrapper):\n raise ImproperlyConfigured((\n '\\'%s\\' is not a valid database back-end.'\n ' It does not inherit from the PostgreSQL back-end.'\n ' Check the value of POSTGRES_EXTRA_DB_BACKEND_BASE.'\n ) % base_class_name)\n\n return base_class"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nruns when a new model is created.", "response": "def create_model(self, model):\n \"\"\"Ran when a new model is created.\"\"\"\n\n super().create_model(model)\n\n for mixin in self.post_processing_mixins:\n mixin.create_model(model)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nruns when a model is being deleted.", "response": "def delete_model(self, model):\n \"\"\"Ran when a model is being deleted.\"\"\"\n\n for mixin in self.post_processing_mixins:\n mixin.delete_model(model)\n\n super().delete_model(model)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef alter_db_table(self, model, old_db_table, new_db_table):\n\n super(SchemaEditor, self).alter_db_table(\n model, old_db_table, new_db_table\n )\n\n for mixin in self.post_processing_mixins:\n 
mixin.alter_db_table(\n model,\n old_db_table,\n new_db_table\n )", "response": "Ran when the name of a model is changed."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef add_field(self, model, field):\n\n super(SchemaEditor, self).add_field(model, field)\n\n for mixin in self.post_processing_mixins:\n mixin.add_field(model, field)", "response": "Ran when a field is added to a model."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef remove_field(self, model, field):\n\n for mixin in self.post_processing_mixins:\n mixin.remove_field(model, field)\n\n super(SchemaEditor, self).remove_field(model, field)", "response": "Ran when a field is removed from a model."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef alter_field(self, model, old_field, new_field, strict=False):\n\n super(SchemaEditor, self).alter_field(\n model, old_field, new_field, strict\n )\n\n for mixin in self.post_processing_mixins:\n mixin.alter_field(\n model, old_field, new_field, strict\n )", "response": "Ran when the configuration on a field changed."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nrunning to prepare the configured database.", "response": "def prepare_database(self):\n \"\"\"Ran to prepare the configured database.\n\n This is where we enable the `hstore` extension\n if it wasn't enabled yet.\"\"\"\n\n super().prepare_database()\n with self.cursor() as cursor:\n try:\n cursor.execute('CREATE EXTENSION IF NOT EXISTS hstore')\n except ProgrammingError: # permission denied\n logger.warning(\n 'Failed to create \"hstore\" extension. '\n 'Tables with hstore columns may fail to migrate. 
'\n 'If hstore is needed, make sure you are connected '\n 'to the database as a superuser '\n 'or add the extension manually.',\n exc_info=True)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\noverriding the base class so it doesn t cast all values to strings.", "response": "def get_prep_value(self, value):\n \"\"\"Override the base class so it doesn't cast all values\n to strings.\n\n psqlextra supports expressions in hstore fields, so casting\n all values to strings is a bad idea.\"\"\"\n\n value = Field.get_prep_value(self, value)\n\n if isinstance(value, dict):\n prep_value = {}\n for key, val in value.items():\n if isinstance(val, Expression):\n prep_value[key] = val\n elif val is not None:\n prep_value[key] = str(val)\n else:\n prep_value[key] = val\n\n value = prep_value\n\n if isinstance(value, list):\n value = [str(item) for item in value]\n\n return value"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets the values to pass to : see : __init__ when re - creating this object.", "response": "def deconstruct(self):\n \"\"\"Gets the values to pass to :see:__init__ when\n re-creating this object.\"\"\"\n\n name, path, args, kwargs = super(\n HStoreField, self).deconstruct()\n\n if self.uniqueness is not None:\n kwargs['uniqueness'] = self.uniqueness\n\n if self.required is not None:\n kwargs['required'] = self.required\n\n return name, path, args, kwargs"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _prepare_query_values(self):\n\n new_query_values = []\n for field, model, val in self.query.values:\n if isinstance(val, dict):\n val = HStoreValue(val)\n\n new_query_values.append((\n field,\n model,\n val\n ))\n\n self.query.values = new_query_values", "response": "Extra prep on query values by converting the dictionary into HStoreValue expressions."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a 
brief explanation for the following Python 3 code\ndef _form_returning(self):\n\n qn = self.connection.ops.quote_name\n return ' RETURNING %s' % qn(self.query.model._meta.pk.attname)", "response": "Builds the RETURNING part of the query."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef as_sql(self, return_id=False):\n\n queries = [\n self._rewrite_insert(sql, params, return_id)\n for sql, params in super().as_sql()\n ]\n\n return queries", "response": "Builds the SQL INSERT statement."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nrewriting a formed SQL INSERT query to include the ON CONFLICT clause.", "response": "def _rewrite_insert(self, sql, params, return_id=False):\n \"\"\"Rewrites a formed SQL INSERT query to include\n the ON CONFLICT clause.\n\n Arguments:\n sql:\n The SQL INSERT query to rewrite.\n\n params:\n The parameters passed to the query.\n\n returning:\n What to put in the `RETURNING` clause\n of the resulting query.\n\n Returns:\n A tuple of the rewritten SQL query and new params.\n \"\"\"\n\n returning = self.qn(self.query.model._meta.pk.attname) if return_id else '*'\n\n if self.query.conflict_action.value == 'UPDATE':\n return self._rewrite_insert_update(sql, params, returning)\n elif self.query.conflict_action.value == 'NOTHING':\n return self._rewrite_insert_nothing(sql, params, returning)\n\n raise SuspiciousOperation((\n '%s is not a valid conflict action, specify '\n 'ConflictAction.UPDATE or ConflictAction.NOTHING.'\n ) % str(self.query.conflict_action))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _rewrite_insert_update(self, sql, params, returning):\n\n update_columns = ', '.join([\n '{0} = EXCLUDED.{0}'.format(self.qn(field.column))\n for field in self.query.update_fields\n ])\n\n # build the conflict target, the columns to watch\n # for conflicts\n conflict_target = 
self._build_conflict_target()\n\n index_predicate = self.query.index_predicate\n\n sql_template = (\n '{insert} ON CONFLICT {conflict_target} DO UPDATE '\n 'SET {update_columns} RETURNING {returning}'\n )\n\n if index_predicate:\n sql_template = (\n '{insert} ON CONFLICT {conflict_target} WHERE {index_predicate} DO UPDATE '\n 'SET {update_columns} RETURNING {returning}'\n )\n\n return (\n sql_template.format(\n insert=sql,\n conflict_target=conflict_target,\n update_columns=update_columns,\n returning=returning,\n index_predicate=index_predicate,\n ),\n params\n )", "response": "Rewrites a formed SQL INSERT query to include\n the ON CONFLICT DO UPDATE clause."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _rewrite_insert_nothing(self, sql, params, returning):\n\n # build the conflict target, the columns to watch\n # for conflicts\n conflict_target = self._build_conflict_target()\n\n where_clause = ' AND '.join([\n '{0} = %s'.format(self._format_field_name(field_name))\n for field_name in self.query.conflict_target\n ])\n\n where_clause_params = [\n self._format_field_value(field_name)\n for field_name in self.query.conflict_target\n ]\n\n params = params + tuple(where_clause_params)\n\n # this looks complicated, and it is, but it is for a reason... 
a normal\n # ON CONFLICT DO NOTHING doesn't return anything if the row already exists\n # so we do DO UPDATE instead that never executes to lock the row, and then\n # select from the table in case we're dealing with an existing row..\n return (\n (\n 'WITH insdata AS ('\n '{insert} ON CONFLICT {conflict_target} DO UPDATE'\n ' SET {pk_column} = NULL WHERE FALSE RETURNING {returning})'\n ' SELECT * FROM insdata UNION ALL'\n ' SELECT {returning} FROM {table} WHERE {where_clause} LIMIT 1;'\n ).format(\n insert=sql,\n conflict_target=conflict_target,\n pk_column=self.qn(self.query.model._meta.pk.column),\n returning=returning,\n table=self.query.objs[0]._meta.db_table,\n where_clause=where_clause\n ),\n params\n )", "response": "Rewrites a formed SQL INSERT query to include\n the ON CONFLICT DO NOTHING clause."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nbuilding the conflict_target for the ON CONFLICT clause.", "response": "def _build_conflict_target(self):\n \"\"\"Builds the `conflict_target` for the ON CONFLICT\n clause.\"\"\"\n\n conflict_target = []\n\n if not isinstance(self.query.conflict_target, list):\n raise SuspiciousOperation((\n '%s is not a valid conflict target, specify '\n 'a list of column names, or tuples with column '\n 'names and hstore key.'\n ) % str(self.query.conflict_target))\n\n def _assert_valid_field(field_name):\n field_name = self._normalize_field_name(field_name)\n if self._get_model_field(field_name):\n return\n\n raise SuspiciousOperation((\n '%s is not a valid conflict target, specify '\n 'a list of column names, or tuples with column '\n 'names and hstore key.'\n ) % str(field_name))\n\n for field_name in self.query.conflict_target:\n _assert_valid_field(field_name)\n\n # special handling for hstore keys\n if isinstance(field_name, tuple):\n conflict_target.append(\n '(%s->\\'%s\\')' % (\n self._format_field_name(field_name),\n field_name[1]\n )\n )\n else:\n conflict_target.append(\n 
self._format_field_name(field_name))\n\n return '(%s)' % ','.join(conflict_target)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_model_field(self, name: str):\n\n field_name = self._normalize_field_name(name)\n\n # 'pk' has special meaning and always refers to the primary\n # key of a model, we have to respect this de-facto standard behaviour\n if field_name == 'pk' and self.query.model._meta.pk:\n return self.query.model._meta.pk\n\n for field in self.query.model._meta.local_concrete_fields:\n if field.name == field_name or field.column == field_name:\n return field\n\n return None", "response": "Gets the field on a model with the specified name."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nformat a field s name for usage in SQL.", "response": "def _format_field_name(self, field_name) -> str:\n \"\"\"Formats a field's name for usage in SQL.\n\n Arguments:\n field_name:\n The field name to format.\n\n Returns:\n The specified field name formatted for\n usage in SQL.\n \"\"\"\n\n field = self._get_model_field(field_name)\n return self.qn(field.column)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _format_field_value(self, field_name) -> str:\n\n field_name = self._normalize_field_name(field_name)\n field = self._get_model_field(field_name)\n\n return SQLInsertCompiler.prepare_value(\n self,\n field,\n # Note: this deliberately doesn't use `pre_save_val` as we don't\n # want things like auto_now on DateTimeField (etc.) to change the\n # value. 
We rely on pre_save having already been done by the\n # underlying compiler so that things like FileField have already had\n # the opportunity to save out their data.\n getattr(self.query.objs[0], field.attname)\n )", "response": "Formats a field s value for usage in SQL."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _normalize_field_name(self, field_name) -> str:\n\n if isinstance(field_name, tuple):\n field_name, _ = field_name\n\n return field_name", "response": "Normalizes a field name into a string by extracting the field name if it was specified by the HStore key."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nruns when the name of a model is changed.", "response": "def alter_db_table(self, model, old_db_table, new_db_table):\n \"\"\"Ran when the name of a model is changed.\"\"\"\n\n for field in model._meta.local_fields:\n if not isinstance(field, HStoreField):\n continue\n\n for keys in self._iterate_uniqueness_keys(field):\n self._rename_hstore_unique(\n old_db_table,\n new_db_table,\n field,\n field,\n keys\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_field(self, model, field):\n\n for keys in self._iterate_uniqueness_keys(field):\n self._create_hstore_unique(\n model,\n field,\n keys\n )", "response": "Ran when a field is added to a model."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nruns when a field is removed from a model.", "response": "def remove_field(self, model, field):\n \"\"\"Ran when a field is removed from a model.\"\"\"\n\n for keys in self._iterate_uniqueness_keys(field):\n self._drop_hstore_unique(\n model,\n field,\n keys\n )"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef alter_field(self, model, old_field, new_field, strict=False):\n\n is_old_field_hstore = 
isinstance(old_field, HStoreField)\n is_new_field_hstore = isinstance(new_field, HStoreField)\n\n if not is_old_field_hstore and not is_new_field_hstore:\n return\n\n old_uniqueness = getattr(old_field, 'uniqueness', []) or []\n new_uniqueness = getattr(new_field, 'uniqueness', []) or []\n\n # handle field renames before moving on\n if str(old_field.column) != str(new_field.column):\n for keys in self._iterate_uniqueness_keys(old_field):\n self._rename_hstore_unique(\n model._meta.db_table,\n model._meta.db_table,\n old_field,\n new_field,\n keys\n )\n\n # drop the indexes for keys that have been removed\n for keys in old_uniqueness:\n if keys not in new_uniqueness:\n self._drop_hstore_unique(\n model,\n old_field,\n self._compose_keys(keys)\n )\n\n # create new indexes for keys that have been added\n for keys in new_uniqueness:\n if keys not in old_uniqueness:\n self._create_hstore_unique(\n model,\n new_field,\n self._compose_keys(keys)\n )", "response": "Ran when the configuration on a field has changed."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _create_hstore_unique(self, model, field, keys):\n\n name = self._unique_constraint_name(\n model._meta.db_table, field, keys)\n columns = [\n '(%s->\\'%s\\')' % (field.column, key)\n for key in keys\n ]\n sql = self.sql_hstore_unique_create.format(\n name=self.quote_name(name),\n table=self.quote_name(model._meta.db_table),\n columns=','.join(columns)\n )\n self.execute(sql)", "response": "Creates a UNIQUE constraint for the specified hstore keys."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nrenaming an existing UNIQUE constraint for the specified hstore keys.", "response": "def _rename_hstore_unique(self, old_table_name, new_table_name,\n old_field, new_field, keys):\n \"\"\"Renames an existing UNIQUE constraint for the specified\n hstore keys.\"\"\"\n\n old_name = self._unique_constraint_name(\n old_table_name, old_field, 
keys)\n new_name = self._unique_constraint_name(\n new_table_name, new_field, keys)\n\n sql = self.sql_hstore_unique_rename.format(\n old_name=self.quote_name(old_name),\n new_name=self.quote_name(new_name)\n )\n self.execute(sql)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _drop_hstore_unique(self, model, field, keys):\n\n name = self._unique_constraint_name(\n model._meta.db_table, field, keys)\n sql = self.sql_hstore_unique_drop.format(name=self.quote_name(name))\n self.execute(sql)", "response": "Drops a UNIQUE constraint for the specified hstore keys."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _unique_constraint_name(table: str, field, keys):\n postfix = '_'.join(keys)\n return '{table}_{field}_unique_{postfix}'.format(\n table=table,\n field=field.column,\n postfix=postfix\n )", "response": "Gets the name for a UNIQUE INDEX that applies\n to one or more keys in a hstore field."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _iterate_uniqueness_keys(self, field):\n\n uniqueness = getattr(field, 'uniqueness', None)\n if not uniqueness:\n return\n\n for keys in uniqueness:\n composed_keys = self._compose_keys(keys)\n yield composed_keys", "response": "Iterates over the keys marked as unique in the specified field."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef add_condition(self, field, value: Any) -> None:\n\n self.extra_conditions.append((field, value))", "response": "Adds an extra condition to this join."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef as_sql(self, compiler, connection) -> Tuple[str, List[Any]]:\n\n sql, params = super().as_sql(compiler, connection)\n qn = compiler.quote_name_unless_alias\n\n # generate the extra 
conditions\n extra_conditions = ' AND '.join([\n '{}.{} = %s'.format(\n qn(self.table_name),\n qn(field.column)\n )\n for field, value in self.extra_conditions\n ])\n\n # add to the existing params, so the connector will\n # actually nicely format the value for us\n for _, value in self.extra_conditions:\n params.append(value)\n\n # rewrite the sql to include the extra conditions\n rewritten_sql = sql.replace(')', ' AND {})'.format(extra_conditions))\n return rewritten_sql, params", "response": "Compiles this JOIN into a SQL string."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef from_join(cls, join: Join) -> 'ConditionalJoin':\n\n return cls(\n join.table_name,\n join.parent_alias,\n join.table_alias,\n join.join_type,\n join.join_field,\n join.nullable\n )", "response": "Creates a new : see : ConditionalJoin object from the specified join object."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef tdist95conf_level(df):\n df = int(round(df))\n highest_table_df = len(_T_DIST_95_CONF_LEVELS)\n if df >= 200:\n return 1.960\n if df >= 100:\n return 1.984\n if df >= 80:\n return 1.990\n if df >= 60:\n return 2.000\n if df >= 50:\n return 2.009\n if df >= 40:\n return 2.021\n if df >= highest_table_df:\n return _T_DIST_95_CONF_LEVELS[highest_table_df - 1]\n return _T_DIST_95_CONF_LEVELS[df]", "response": "Approximate the 95% confidence interval for the Student s T distribution. 
Given the degrees of freedom, returns an approximation to the 95% confidence interval for Student's t distribution."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef pooled_sample_variance(sample1, sample2):\n    deg_freedom = len(sample1) + len(sample2) - 2\n    mean1 = statistics.mean(sample1)\n    squares1 = ((x - mean1) ** 2 for x in sample1)\n    mean2 = statistics.mean(sample2)\n    squares2 = ((x - mean2) ** 2 for x in sample2)\n\n    return (math.fsum(squares1) + math.fsum(squares2)) / float(deg_freedom)", "response": "Find the pooled sample variance for two samples."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncalculates a t-test score for the difference between two samples.", "response": "def tscore(sample1, sample2):\n    \"\"\"Calculate a t-test score for the difference between two samples.\n\n    Args:\n        sample1: one sample.\n        sample2: the other sample.\n\n    Returns:\n        The t-test score, as a float.\n    \"\"\"\n    if len(sample1) != len(sample2):\n        raise ValueError(\"different number of values\")\n    error = pooled_sample_variance(sample1, sample2) / len(sample1)\n    diff = statistics.mean(sample1) - statistics.mean(sample2)\n    return diff / math.sqrt(error * 2)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndetermines if two samples differ significantly.", "response": "def is_significant(sample1, sample2):\n    \"\"\"Determine whether two samples differ significantly.\n\n    This uses a Student's two-sample, two-tailed t-test with alpha=0.95.\n\n    Args:\n        sample1: one sample.\n        sample2: the other sample.\n\n    Returns:\n        (significant, t_score) where significant is a bool indicating whether\n        the two samples differ significantly; t_score is the score from the\n        two-sample T test.\n    \"\"\"\n    deg_freedom = len(sample1) + len(sample2) - 2\n    critical_value = tdist95conf_level(deg_freedom)\n    t_score = tscore(sample1, sample2)\n    return 
(abs(t_score) >= critical_value, t_score)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef topoSort(roots, getParents):\n\n results = []\n visited = set()\n\n # Use iterative version to avoid stack limits for large datasets\n stack = [(node, 0) for node in roots]\n while stack:\n current, state = stack.pop()\n if state == 0:\n # before recursing\n if current not in visited:\n visited.add(current)\n stack.append((current, 1))\n stack.extend((parent, 0) for parent in getParents(current))\n else:\n # after recursing\n assert(current in visited)\n results.append(current)\n return results", "response": "Return a topological sorting of nodes in a graph."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef permutations(iterable, r=None):\n pool = tuple(iterable)\n n = len(pool)\n if r is None:\n r = n\n indices = list(range(n))\n cycles = list(range(n - r + 1, n + 1))[::-1]\n yield tuple(pool[i] for i in indices[:r])\n while n:\n for i in reversed(range(r)):\n cycles[i] -= 1\n if cycles[i] == 0:\n indices[i:] = indices[i + 1:] + indices[i:i + 1]\n cycles[i] = n - i\n else:\n j = cycles[i]\n indices[i], indices[-j] = indices[-j], indices[i]\n yield tuple(pool[i] for i in indices[:r])\n break\n else:\n return", "response": "Generator that yields the permutations of the given iterable."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef n_queens(queen_count):\n cols = range(queen_count)\n for vec in permutations(cols):\n if (queen_count == len(set(vec[i] + i for i in cols))\n == len(set(vec[i] - i for i in cols))):\n yield vec", "response": "N - Queens solver."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nselecting move ; unexplored children first then according to uct value", "response": "def select(self, board):\n \"\"\" select move; unexplored 
children first, then according to uct value \"\"\"\n        if self.unexplored:\n            i = random.randrange(len(self.unexplored))\n            pos = self.unexplored[i]\n            self.unexplored[i] = self.unexplored[len(self.unexplored) - 1]\n            self.unexplored.pop()\n            return pos\n        elif self.bestchild:\n            return self.bestchild.pos\n        else:\n            return PASS"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nupdates win/loss count along path", "response": "def update_path(self, board, color, path):\n        \"\"\" update win/loss count along path \"\"\"\n        wins = board.score(BLACK) >= board.score(WHITE)\n        for node in path:\n            if color == BLACK:\n                color = WHITE\n            else:\n                color = BLACK\n            if wins == (color == BLACK):\n                node.wins += 1\n            else:\n                node.losses += 1\n            if node.parent:\n                node.parent.bestchild = node.parent.best_child()"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfilter out benchmarks not supported by both Pythons.", "response": "def filter_benchmarks(benchmarks, bench_funcs, base_ver):\n    \"\"\"Filters out benchmarks not supported by both Pythons.\n\n    Args:\n        benchmarks: a set() of benchmark names\n        bench_funcs: dict mapping benchmark names to functions\n        python: the interpreter commands (as lists)\n\n    Returns:\n        The filtered set of benchmark names\n    \"\"\"\n    for bm in list(benchmarks):\n        func = bench_funcs[bm]\n        if getattr(func, '_python2_only', False) and (3, 0) <= base_ver:\n            benchmarks.discard(bm)\n            logging.info(\"Skipping Python2-only benchmark %s; \"\n                         \"not compatible with Python %s\" % (bm, base_ver))\n            continue\n    return benchmarks"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef expand_benchmark_name(bm_name, bench_groups):\n    expansion = bench_groups.get(bm_name)\n    if expansion:\n        for name in expansion:\n            for name in expand_benchmark_name(name, bench_groups):\n                yield name\n    else:\n        yield bm_name", "response": "Recursively expand a benchmark name."} 
{"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef gen_string_table(n):\n strings = []\n\n def append(s):\n if USE_BYTES_IN_PY3K:\n strings.append(s.encode('latin1'))\n else:\n strings.append(s)\n append('-' * n + 'Perl' + '-' * n)\n append('P' * n + 'Perl' + 'P' * n)\n append('-' * n + 'Perl' + '-' * n)\n append('-' * n + 'Perl' + '-' * n)\n append('-' * n + 'Python' + '-' * n)\n append('P' * n + 'Python' + 'P' * n)\n append('-' * n + 'Python' + '-' * n)\n append('-' * n + 'Python' + '-' * n)\n append('-' * n + 'Python' + '-' * n)\n append('-' * n + 'Python' + '-' * n)\n append('-' * n + 'Perl' + '-' * n)\n append('P' * n + 'Perl' + 'P' * n)\n append('-' * n + 'Perl' + '-' * n)\n append('-' * n + 'Perl' + '-' * n)\n append('-' * n + 'PythonPython' + '-' * n)\n append('P' * n + 'PythonPython' + 'P' * n)\n append('-' * n + 'a5,b7,c9,' + '-' * n)\n append('-' * n + 'a5,b7,c9,' + '-' * n)\n append('-' * n + 'a5,b7,c9,' + '-' * n)\n append('-' * n + 'a5,b7,c9,' + '-' * n)\n append('-' * n + 'Python' + '-' * n)\n return strings", "response": "Generates the list of strings that will be used in the benchmarks."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef init_benchmarks(n_values=None):\n\n if n_values is None:\n n_values = (0, 5, 50, 250, 1000, 5000, 10000)\n\n string_tables = {n: gen_string_table(n) for n in n_values}\n regexs = gen_regex_table()\n\n data = []\n for n in n_values:\n for id in xrange(len(regexs)):\n regex = regexs[id]\n string = string_tables[n][id]\n data.append((regex, string))\n return data", "response": "Initialize the strings we ll run the regexes against."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef combinations(l):\n result = []\n for x in xrange(len(l) - 1):\n ls = l[x + 1:]\n for y in ls:\n result.append((l[x], y))\n return result", "response": "Pure - 
Python implementation of itertools. combinations."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the domain of the B - Spline.", "response": "def GetDomain(self):\n \"\"\"Returns the domain of the B-Spline\"\"\"\n return (self.knots[self.degree - 1],\n self.knots[len(self.knots) - self.degree])"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfetching the items of a specific category from the backend.", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the messages.\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n from_date = kwargs['from_date']\n\n logger.info(\"Fetching messages of '%s' - '%s' channel from %s\",\n self.url, self.channel, str(from_date))\n\n fetching = True\n page = 0\n nposts = 0\n\n # Convert timestamp to integer for comparing\n since = int(from_date.timestamp() * 1000)\n\n while fetching:\n raw_posts = self.client.posts(self.channel, page=page)\n\n posts_before = nposts\n\n for post in self._parse_posts(raw_posts):\n if post['update_at'] < since:\n fetching = False\n break\n\n # Fetch user data\n user_id = post['user_id']\n user = self._get_or_fetch_user(user_id)\n post['user_data'] = user\n\n yield post\n nposts += 1\n\n if fetching:\n # If no new posts were fetched; stop the process\n if posts_before == nposts:\n fetching = False\n else:\n page += 1\n\n logger.info(\"Fetch process completed: %s posts fetched\", nposts)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nparses posts and returns in order.", "response": "def _parse_posts(self, raw_posts):\n \"\"\"Parse posts and returns in order.\"\"\"\n\n parsed_posts = self.parse_json(raw_posts)\n\n # Posts are not sorted. 
The order is provided by\n # 'order' key.\n for post_id in parsed_posts['order']:\n yield parsed_posts['posts'][post_id]"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef posts(self, channel, page=None):\n\n entrypoint = self.RCHANNELS + '/' + channel + '/' + self.RPOSTS\n\n params = {\n self.PPER_PAGE: self.max_items\n }\n\n if page is not None:\n params[self.PPAGE] = page\n\n response = self._fetch(entrypoint, params)\n\n return response", "response": "Fetch the history of a channel."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfetches a resource. the is the ID of the resource in the entry point.", "response": "def _fetch(self, entry_point, params):\n \"\"\"Fetch a resource.\n\n :param entrypoint: entrypoint to access\n :param params: dict with the HTTP parameters needed to access the\n given entry point\n \"\"\"\n url = self.API_URL % {'base_url': self.base_url, 'entrypoint': entry_point}\n\n logger.debug(\"Mattermost client requests: %s params: %s\",\n entry_point, str(params))\n\n r = self.fetch(url, payload=params)\n\n return r.text"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _pre_init(self):\n\n if not self.parsed_args.mboxes_path:\n base_path = os.path.expanduser('~/.perceval/mailinglists/')\n dirpath = os.path.join(base_path, self.parsed_args.url)\n else:\n dirpath = self.parsed_args.mboxes_path\n\n setattr(self.parsed_args, 'dirpath', dirpath)", "response": "Initialize mailing lists directory path"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fetch(self, from_date=DEFAULT_DATETIME):\n logger.info(\"Downloading mboxes from '%s' to since %s\",\n self.url, str(from_date))\n logger.debug(\"Storing mboxes in '%s'\", self.dirpath)\n\n from_date = datetime_to_utc(from_date)\n\n r = requests.get(self.url, verify=self.verify)\n 
r.raise_for_status()\n\n links = self._parse_archive_links(r.text)\n\n fetched = []\n\n if not os.path.exists(self.dirpath):\n os.makedirs(self.dirpath)\n\n for l in links:\n filename = os.path.basename(l)\n\n mbox_dt = self._parse_date_from_filepath(filename)\n\n if ((from_date.year == mbox_dt.year and\n from_date.month == mbox_dt.month) or\n from_date < mbox_dt):\n\n filepath = os.path.join(self.dirpath, filename)\n success = self._download_archive(l, filepath)\n\n if success:\n fetched.append((l, filepath))\n\n logger.info(\"%s/%s MBoxes downloaded\", len(fetched), len(links))\n\n return fetched", "response": "Fetch the mbox files from the remote archiver."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the mboxes managed by this mailing list.", "response": "def mboxes(self):\n \"\"\"Get the mboxes managed by this mailing list.\n\n Returns the archives sorted by date in ascending order.\n\n :returns: a list of `.MBoxArchive` objects\n \"\"\"\n archives = []\n\n for mbox in super().mboxes:\n dt = self._parse_date_from_filepath(mbox.filepath)\n archives.append((dt, mbox))\n\n archives.sort(key=lambda x: x[0])\n\n return [a[1] for a in archives]"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches the entries from the url.", "response": "def fetch(self, category=CATEGORY_ENTRY):\n \"\"\"Fetch the entries from the url.\n\n The method retrieves all entries from a RSS url\n\n :param category: the category of items to fetch\n\n :returns: a generator of entries\n \"\"\"\n kwargs = {}\n items = super().fetch(category, **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fetch_items(self, category, **kwargs):\n logger.info(\"Looking for rss entries at feed '%s'\", self.url)\n\n nentries = 0 # number of entries\n\n raw_entries = self.client.get_entries()\n entries = 
self.parse_feed(raw_entries)['entries']\n for item in entries:\n yield item\n nentries += 1\n\n logger.info(\"Total number of entries: %i\", nentries)", "response": "Fetch the items from the feed"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the RSS argument parser.", "response": "def setup_cmd_parser(cls):\n \"\"\"Returns the RSS argument parser.\"\"\"\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n archive=True)\n\n # Required arguments\n parser.parser.add_argument('url',\n help=\"URL of the RSS feed\")\n\n return parser"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfetches the bugs from a Bugzilla repository.", "response": "def fetch(self, category=CATEGORY_BUG, from_date=DEFAULT_DATETIME):\n \"\"\"Fetch the bugs from the repository.\n\n The method retrieves, from a Bugzilla repository, the bugs\n updated since the given date.\n\n :param category: the category of items to fetch\n :param from_date: obtain bugs updated since this date\n\n :returns: a generator of bugs\n \"\"\"\n if not from_date:\n from_date = DEFAULT_DATETIME\n\n kwargs = {'from_date': from_date}\n items = super().fetch(category, **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfetching the bugs from the backend and return a generator of items", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the bugs\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n\n from_date = kwargs['from_date']\n\n logger.info(\"Looking for bugs: '%s' updated from '%s'\",\n self.url, str(from_date))\n\n nbugs = 0\n for bug in self.__fetch_and_parse_bugs(from_date):\n nbugs += 1\n yield bug\n\n logger.info(\"Fetch process completed: %s bugs fetched\", nbugs)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nauthenticate 
a user in the server.", "response": "def login(self, user, password):\n \"\"\"Authenticate a user in the server.\n\n :param user: Bugzilla user\n :param password: user password\n \"\"\"\n params = {\n self.PBUGZILLA_LOGIN: user,\n self.PBUGZILLA_PASSWORD: password\n }\n\n try:\n r = self.call(self.RLOGIN, params)\n except requests.exceptions.HTTPError as e:\n cause = (\"Bugzilla REST client could not authenticate user %s. \"\n \"See exception: %s\") % (user, str(e))\n raise BackendError(cause=cause)\n\n data = json.loads(r)\n self.api_token = data['token']"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef bugs(self, from_date=DEFAULT_DATETIME, offset=None, max_bugs=MAX_BUGS):\n date = datetime_to_utc(from_date)\n date = date.strftime(\"%Y-%m-%dT%H:%M:%SZ\")\n\n params = {\n self.PLAST_CHANGE_TIME: date,\n self.PLIMIT: max_bugs,\n self.PORDER: self.VCHANGE_DATE_ORDER,\n self.PINCLUDE_FIELDS: self.VINCLUDE_ALL\n }\n\n if offset:\n params[self.POFFSET] = offset\n\n response = self.call(self.RBUG, params)\n\n return response", "response": "Get the information of bugs in a list of modules."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef comments(self, *bug_ids):\n # Hack. 
The first value must be a valid bug id\n resource = urijoin(self.RBUG, bug_ids[0], self.RCOMMENT)\n\n params = {\n self.PIDS: bug_ids\n }\n\n response = self.call(resource, params)\n\n return response", "response": "Get the comments of the given bugs."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget the history of the given bugs.", "response": "def history(self, *bug_ids):\n \"\"\"Get the history of the given bugs.\n\n :param bug_ids: list of bug identifiers\n \"\"\"\n resource = urijoin(self.RBUG, bug_ids[0], self.RHISTORY)\n\n params = {\n self.PIDS: bug_ids\n }\n\n response = self.call(resource, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the attachments of the given bugs.", "response": "def attachments(self, *bug_ids):\n \"\"\"Get the attachments of the given bugs.\n\n :param bug_id: list of bug identifiers\n \"\"\"\n resource = urijoin(self.RBUG, bug_ids[0], self.RATTACHMENT)\n\n params = {\n self.PIDS: bug_ids,\n self.PEXCLUDE_FIELDS: self.VEXCLUDE_ATTCH_DATA\n }\n\n response = self.call(resource, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef call(self, resource, params):\n url = self.URL % {'base': self.base_url, 'resource': resource}\n\n if self.api_token:\n params[self.PBUGZILLA_TOKEN] = self.api_token\n\n logger.debug(\"Bugzilla REST client requests: %s params: %s\",\n resource, str(params))\n\n r = self.fetch(url, payload=params)\n\n # Check for possible Bugzilla API errors\n result = r.json()\n\n if result.get('error', False):\n raise BugzillaRESTError(error=result['message'],\n code=result['code'])\n\n return r.text", "response": "Retrive the given resource."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsanitizing the payload of a HTTP request for storing or retrieving archived items.", "response": "def sanitize_for_archive(url, 
headers, payload):\n \"\"\"Sanitize payload of a HTTP request by removing the login, password and token information\n before storing/retrieving archived items\n\n :param: url: HTTP url request\n :param: headers: HTTP headers request\n :param: payload: HTTP payload request\n\n :returns url, headers and the sanitized payload\n \"\"\"\n if BugzillaRESTClient.PBUGZILLA_LOGIN in payload:\n payload.pop(BugzillaRESTClient.PBUGZILLA_LOGIN)\n\n if BugzillaRESTClient.PBUGZILLA_PASSWORD in payload:\n payload.pop(BugzillaRESTClient.PBUGZILLA_PASSWORD)\n\n if BugzillaRESTClient.PBUGZILLA_TOKEN in payload:\n payload.pop(BugzillaRESTClient.PBUGZILLA_TOKEN)\n\n return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n\n if category == CATEGORY_ISSUE:\n items = self.__fetch_issues(from_date)\n else:\n items = self.__fetch_merge_requests(from_date)\n\n return items", "response": "Fetch the items from the backend"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches the issues from the API.", "response": "def __fetch_issues(self, from_date):\n \"\"\"Fetch the issues\"\"\"\n\n issues_groups = self.client.issues(from_date=from_date)\n\n for raw_issues in issues_groups:\n issues = json.loads(raw_issues)\n for issue in issues:\n issue_id = issue['iid']\n\n if self.blacklist_ids and issue_id in self.blacklist_ids:\n logger.warning(\"Skipping blacklisted issue %s\", issue_id)\n continue\n\n self.__init_issue_extra_fields(issue)\n\n issue['notes_data'] = \\\n self.__get_issue_notes(issue_id)\n issue['award_emoji_data'] = \\\n self.__get_award_emoji(GitLabClient.ISSUES, issue_id)\n\n yield issue"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef __fetch_merge_requests(self, from_date):\n\n merges_groups = 
self.client.merges(from_date=from_date)\n\n for raw_merges in merges_groups:\n merges = json.loads(raw_merges)\n for merge in merges:\n merge_id = merge['iid']\n\n if self.blacklist_ids and merge_id in self.blacklist_ids:\n logger.warning(\"Skipping blacklisted merge request %s\", merge_id)\n continue\n\n # The single merge_request API call returns a more\n # complete merge request, thus we inflate it with\n # other data (e.g., notes, emojis, versions)\n merge_full_raw = self.client.merge(merge_id)\n merge_full = json.loads(merge_full_raw)\n\n self.__init_merge_extra_fields(merge_full)\n\n merge_full['notes_data'] = self.__get_merge_notes(merge_id)\n merge_full['award_emoji_data'] = self.__get_award_emoji(GitLabClient.MERGES, merge_id)\n merge_full['versions_data'] = self.__get_merge_versions(merge_id)\n\n yield merge_full", "response": "Fetch the merge requests from the server"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef __get_merge_notes(self, merge_id):\n\n notes = []\n\n group_notes = self.client.notes(GitLabClient.MERGES, merge_id)\n\n for raw_notes in group_notes:\n for note in json.loads(raw_notes):\n note_id = note['id']\n note['award_emoji_data'] = \\\n self.__get_note_award_emoji(GitLabClient.MERGES, merge_id, note_id)\n notes.append(note)\n\n return notes", "response": "Get the merge notes"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef __get_award_emoji(self, item_type, item_id):\n\n emojis = []\n\n group_emojis = self.client.emojis(item_type, item_id)\n for raw_emojis in group_emojis:\n\n for emoji in json.loads(raw_emojis):\n emojis.append(emoji)\n\n return emojis", "response": "Get award emojis for issue / merge request"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef __get_note_award_emoji(self, item_type, item_id, note_id):\n\n emojis = 
[]\n\n group_emojis = self.client.note_emojis(item_type, item_id, note_id)\n try:\n for raw_emojis in group_emojis:\n\n for emoji in json.loads(raw_emojis):\n emojis.append(emoji)\n except requests.exceptions.HTTPError as error:\n if error.response.status_code == 404:\n logger.warning(\"Emojis not available for %s \",\n urijoin(item_type, str(item_id), GitLabClient.NOTES,\n str(note_id), GitLabClient.EMOJI))\n return emojis\n\n return emojis", "response": "Fetch emojis for a note of an issue / merge request"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef issues(self, from_date=None):\n\n payload = {\n 'state': 'all',\n 'order_by': 'updated_at',\n 'sort': 'asc',\n 'per_page': PER_PAGE\n }\n\n if from_date:\n payload['updated_after'] = from_date.isoformat()\n\n return self.fetch_items(GitLabClient.ISSUES, payload)", "response": "Get the issues from pagination"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the merge requests from pagination", "response": "def merges(self, from_date=None):\n \"\"\"Get the merge requests from pagination\"\"\"\n\n payload = {\n 'state': 'all',\n 'order_by': 'updated_at',\n 'sort': 'asc',\n 'view': 'simple',\n 'per_page': PER_PAGE\n }\n\n if from_date:\n payload['updated_after'] = from_date.isoformat()\n\n return self.fetch_items(GitLabClient.MERGES, payload)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef merge(self, merge_id):\n\n path = urijoin(self.base_url,\n GitLabClient.PROJECTS, self.owner + '%2F' + self.repository,\n GitLabClient.MERGES, merge_id)\n\n response = self.fetch(path)\n\n return response.text", "response": "Get the merge full data"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the merge versions from pagination", "response": "def merge_versions(self, merge_id):\n \"\"\"Get the merge versions from pagination\"\"\"\n\n payload = {\n 
'order_by': 'updated_at',\n 'sort': 'asc',\n 'per_page': PER_PAGE\n }\n\n path = urijoin(GitLabClient.MERGES, str(merge_id), GitLabClient.VERSIONS)\n return self.fetch_items(path, payload)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting merge version detail", "response": "def merge_version(self, merge_id, version_id):\n \"\"\"Get merge version detail\"\"\"\n\n path = urijoin(self.base_url,\n GitLabClient.PROJECTS, self.owner + '%2F' + self.repository,\n GitLabClient.MERGES, merge_id, GitLabClient.VERSIONS, version_id)\n\n response = self.fetch(path)\n\n return response.text"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef notes(self, item_type, item_id):\n\n payload = {\n 'order_by': 'updated_at',\n 'sort': 'asc',\n 'per_page': PER_PAGE\n }\n\n path = urijoin(item_type, str(item_id), GitLabClient.NOTES)\n\n return self.fetch_items(path, payload)", "response": "Get the notes from pagination"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef emojis(self, item_type, item_id):\n\n payload = {\n 'order_by': 'updated_at',\n 'sort': 'asc',\n 'per_page': PER_PAGE\n }\n\n path = urijoin(item_type, str(item_id), GitLabClient.EMOJI)\n\n return self.fetch_items(path, payload)", "response": "Get emojis from pagination"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef note_emojis(self, item_type, item_id, note_id):\n\n payload = {\n 'order_by': 'updated_at',\n 'sort': 'asc',\n 'per_page': PER_PAGE\n }\n\n path = urijoin(item_type, str(item_id), GitLabClient.NOTES,\n str(note_id), GitLabClient.EMOJI)\n\n return self.fetch_items(path, payload)", "response": "Get the emojis of a note"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef calculate_time_to_reset(self):\n\n time_to_reset = self.rate_limit_reset_ts - 
(datetime_utcnow().replace(microsecond=0).timestamp() + 1)\n\n if time_to_reset < 0:\n time_to_reset = 0\n\n return time_to_reset", "response": "Calculate the seconds to reset the token requests by obtaining the difference\n between the current date and the next date when the token is fully regenerated."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef fetch(self, url, payload=None, headers=None, method=HttpClient.GET, stream=False):\n if not self.from_archive:\n self.sleep_for_rate_limit()\n\n response = super().fetch(url, payload, headers, method, stream)\n\n if not self.from_archive:\n self.update_rate_limit(response)\n\n return response", "response": "Fetch the data from a given URL."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef fetch_items(self, path, payload):\n\n page = 0 # current page\n last_page = None # last page\n url_next = urijoin(self.base_url, GitLabClient.PROJECTS, self.owner + '%2F' + self.repository, path)\n\n logger.debug(\"Get GitLab paginated items from \" + url_next)\n\n response = self.fetch(url_next, payload=payload)\n\n items = response.text\n page += 1\n\n if 'last' in response.links:\n last_url = response.links['last']['url']\n last_page = last_url.split('&page=')[1].split('&')[0]\n last_page = int(last_page)\n logger.debug(\"Page: %i/%i\" % (page, last_page))\n\n while items:\n yield items\n\n items = None\n\n if 'next' in response.links:\n url_next = response.links['next']['url'] # Loving requests :)\n response = self.fetch(url_next, payload=payload)\n page += 1\n\n items = response.text\n logger.debug(\"Page: %i/%i\" % (page, last_page))", "response": "Fetch the items from the GitLab API using link pagination"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef sanitize_for_archive(url, headers, payload):\n if headers and 'PRIVATE-TOKEN' in headers:\n 
headers.pop('PRIVATE-TOKEN', None)\n\n return url, headers, payload", "response": "Sanitize the payload of a HTTP request for storing or retrieving archived items."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ninitialize rate limit information", "response": "def _init_rate_limit(self):\n \"\"\"Initialize rate limit information\"\"\"\n\n url = urijoin(self.base_url, 'projects', self.owner + '%2F' + self.repository)\n try:\n response = super().fetch(url)\n self.update_rate_limit(response)\n except requests.exceptions.HTTPError as error:\n if error.response.status_code == 401:\n raise error\n else:\n logger.warning(\"Rate limit not initialized: %s\", error)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef setup_cmd_parser(cls):\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n from_date=True,\n token_auth=True,\n archive=True)\n\n # GitLab options\n group = parser.parser.add_argument_group('GitLab arguments')\n group.add_argument('--enterprise-url', dest='base_url',\n help=\"Base URL for GitLab Enterprise instance\")\n group.add_argument('--sleep-for-rate', dest='sleep_for_rate',\n action='store_true',\n help=\"sleep for getting more rate\")\n group.add_argument('--min-rate-to-sleep', dest='min_rate_to_sleep',\n default=MIN_RATE_LIMIT, type=int,\n help=\"sleep until reset when the rate limit \\\n reaches this value\")\n group.add_argument('--blacklist-ids', dest='blacklist_ids',\n nargs='*', type=int,\n help=\"Ids of items that must not be retrieved.\")\n\n # Generic client options\n group.add_argument('--max-retries', dest='max_retries',\n default=MAX_RETRIES, type=int,\n help=\"number of API call retries\")\n group.add_argument('--sleep-time', dest='sleep_time',\n default=DEFAULT_SLEEP_TIME, type=int,\n help=\"sleeping time between API call retries\")\n\n # Positional arguments\n parser.parser.add_argument('owner',\n help=\"GitLab owner\")\n 
parser.parser.add_argument('repository',\n help=\"GitLab repository\")\n\n return parser", "response": "Returns the GitLab argument parser."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef fetch(self, category=CATEGORY_MESSAGE, from_date=DEFAULT_DATETIME):\n if not from_date:\n from_date = DEFAULT_DATETIME\n\n from_date = datetime_to_utc(from_date)\n latest = datetime_utcnow().timestamp()\n\n kwargs = {'from_date': from_date, 'latest': latest}\n items = super().fetch(category, **kwargs)\n\n return items", "response": "Fetch the messages from the channel."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfetch the items of a specific category from Slack.", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the messages\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n from_date = kwargs['from_date']\n latest = kwargs['latest']\n\n logger.info(\"Fetching messages of '%s' channel from %s\",\n self.channel, str(from_date))\n\n raw_info = self.client.channel_info(self.channel)\n\n channel_info = self.parse_channel_info(raw_info)\n channel_info['num_members'] = self.client.conversation_members(self.channel)\n\n oldest = datetime_to_utc(from_date).timestamp()\n\n # Minimum value supported by Slack is 0 not 0.0\n if oldest == 0.0:\n oldest = 0\n\n # Slack does not include in its result the lower limit\n # of the search if it has the same date as 'oldest'. To get\n # these messages too, we subtract a low value to be sure\n # the dates are not the same. 
To avoid precision problems\n # it is subtracted by five decimals and not by six.\n if oldest > 0.0:\n oldest -= .00001\n\n fetching = True\n nmsgs = 0\n\n while fetching:\n raw_history = self.client.history(self.channel,\n oldest=oldest, latest=latest)\n messages, fetching = self.parse_history(raw_history)\n\n for message in messages:\n # Fetch user data\n user_id = None\n if 'user' in message:\n user_id = message['user']\n elif 'comment' in message:\n user_id = message['comment']['user']\n\n if user_id:\n message['user_data'] = self.__get_or_fetch_user(user_id)\n\n message['channel_info'] = channel_info\n yield message\n\n nmsgs += 1\n\n if fetching:\n latest = float(message['ts'])\n\n logger.info(\"Fetch process completed: %s messages fetched\", nmsgs)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef metadata_id(item):\n if 'user' in item:\n nick = item['user']\n elif 'comment' in item:\n nick = item['comment']['user']\n else:\n nick = item['bot_id']\n\n return item['ts'] + nick", "response": "Extracts the unique identifier from a Slack item."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef conversation_members(self, conversation):\n members = 0\n\n resource = self.RCONVERSATION_INFO\n\n params = {\n self.PCHANNEL: conversation,\n }\n\n raw_response = self._fetch(resource, params)\n response = json.loads(raw_response)\n\n members += len(response[\"members\"])\n while 'next_cursor' in response['response_metadata'] and response['response_metadata']['next_cursor']:\n params['cursor'] = response['response_metadata']['next_cursor']\n raw_response = self._fetch(resource, params)\n response = json.loads(raw_response)\n members += len(response[\"members\"])\n\n return members", "response": "Fetch the number of members in a conversation, which is a supertype for public and\n private channels, DMs and group DMs."} {"SOURCE": "codesearchnet", 
"instruction": "How would you explain what the following Python 3 function does\ndef channel_info(self, channel):\n\n resource = self.RCHANNEL_INFO\n\n params = {\n self.PCHANNEL: channel,\n }\n\n response = self._fetch(resource, params)\n\n return response", "response": "Fetch information about a channel."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfetching the history of a channel.", "response": "def history(self, channel, oldest=None, latest=None):\n \"\"\"Fetch the history of a channel.\"\"\"\n\n resource = self.RCHANNEL_HISTORY\n\n params = {\n self.PCHANNEL: channel,\n self.PCOUNT: self.max_items\n }\n\n if oldest is not None:\n params[self.POLDEST] = oldest\n if latest is not None:\n params[self.PLATEST] = latest\n\n response = self._fetch(resource, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nsanitize the payload of a HTTP request for storing or retrieving archived items.", "response": "def sanitize_for_archive(url, headers, payload):\n \"\"\"Sanitize payload of a HTTP request by removing the token information\n before storing/retrieving archived items\n\n :param: url: HTTP url request\n :param: headers: HTTP headers request\n :param: payload: HTTP payload request\n\n :returns url, headers and the sanitized payload\n \"\"\"\n if SlackClient.PTOKEN in payload:\n payload.pop(SlackClient.PTOKEN)\n\n return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfetching a resource. 
", "response": "def _fetch(self, resource, params):\n \"\"\"Fetch a resource.\n\n :param resource: resource to get\n :param params: dict with the HTTP parameters needed to get\n the given resource\n \"\"\"\n url = self.URL % {'resource': resource}\n params[self.PTOKEN] = self.api_token\n\n logger.debug(\"Slack client requests: %s params: %s\",\n resource, str(params))\n\n r = self.fetch(url, payload=params)\n\n # Check for possible API errors\n result = r.json()\n\n if not result['ok']:\n raise SlackClientError(error=result['error'])\n\n return r.text"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef setup_cmd_parser(cls):\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n from_date=True,\n token_auth=True,\n archive=True)\n\n # Backend token is required\n action = parser.parser._option_string_actions['--api-token']\n action.required = True\n\n # Slack options\n group = parser.parser.add_argument_group('Slack arguments')\n group.add_argument('--max-items', dest='max_items',\n type=int, default=MAX_ITEMS,\n help=\"Maximum number of items requested on the same query\")\n\n # Required arguments\n parser.parser.add_argument('channel',\n help=\"Slack channel identifier\")\n\n return parser", "response": "Returns the Slack argument parser."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfetching the bugs from the backend", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the bugs\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n\n from_date = kwargs['from_date']\n\n logger.info(\"Looking for bugs: '%s' updated from '%s'\",\n self.url, str(from_date))\n\n buglist = [bug for bug in self.__fetch_buglist(from_date)]\n\n nbugs = 0\n tbugs = len(buglist)\n\n for i in range(0, tbugs, self.max_bugs):\n chunk = buglist[i:i + self.max_bugs]\n bugs_ids = [b['bug_id'] for b 
in chunk]\n\n logger.info(\"Fetching bugs: %s/%s\", i, tbugs)\n bugs = self.__fetch_and_parse_bugs_details(bugs_ids)\n\n for bug in bugs:\n bug_id = bug['bug_id'][0]['__text__']\n bug['activity'] = self.__fetch_and_parse_bug_activity(bug_id)\n nbugs += 1\n yield bug\n\n logger.info(\"Fetch process completed: %s/%s bugs fetched\",\n nbugs, tbugs)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef metadata_updated_on(item):\n ts = item['delta_ts'][0]['__text__']\n ts = str_to_datetime(ts)\n ts = ts.replace(tzinfo=dateutil.tz.tzutc())\n\n return ts.timestamp()", "response": "Extracts and converts the update time from a Bugzilla item."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef parse_buglist(raw_csv):\n reader = csv.DictReader(raw_csv.split('\\n'),\n delimiter=',', quotechar='\"')\n for row in reader:\n yield row", "response": "Parse a Bugzilla CSV bug list."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef parse_bugs_details(raw_xml):\n bugs = xml_to_dict(raw_xml)\n\n if 'bug' not in bugs:\n cause = \"No bugs found. 
XML stream seems to be invalid.\"\n raise ParseError(cause=cause)\n\n for bug in bugs['bug']:\n yield bug", "response": "Parse a Bugzilla bugs details XML stream."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nauthenticates a user in a Bugzilla server.", "response": "def login(self, user, password):\n \"\"\"Authenticate a user in the server.\n\n :param user: Bugzilla user\n :param password: user password\n \"\"\"\n url = self.URL % {'base': self.base_url, 'cgi': self.CGI_LOGIN}\n\n payload = {\n self.PBUGZILLA_LOGIN: user,\n self.PBUGZILLA_PASSWORD: password,\n self.PLOGIN: 'Log in'\n }\n\n headers = {'Referer': self.base_url}\n\n req = self.fetch(url, payload=payload, headers=headers, method=HttpClient.POST)\n\n # Check if the authentication went OK. When this string\n # is found, it means that the authentication was successful\n if req.text.find(\"index.cgi?logout=1\") < 0:\n cause = (\"Bugzilla client could not authenticate user %s. \"\n \"Please check user and password parameters. 
\"\n \"URLs may also need a trailing '/'.\") % user\n raise BackendError(cause=cause)\n\n logger.debug(\"Bugzilla user %s authenticated in %s\",\n user, self.base_url)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef logout(self):\n\n params = {\n self.PLOGOUT: '1'\n }\n\n self.call(self.CGI_LOGIN, params)\n self._close_http_session()\n\n logger.debug(\"Bugzilla user logged out from %s\",\n self.base_url)", "response": "Logout from the server."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngets metadata information in XML format.", "response": "def metadata(self):\n \"\"\"Get metadata information in XML format.\"\"\"\n\n params = {\n self.PCTYPE: self.CTYPE_XML\n }\n\n response = self.call(self.CGI_BUG, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef buglist(self, from_date=DEFAULT_DATETIME):\n if not self.version:\n self.version = self.__fetch_version()\n\n if self.version in self.OLD_STYLE_VERSIONS:\n order = 'Last+Changed'\n else:\n order = 'changeddate'\n\n date = from_date.strftime(\"%Y-%m-%d %H:%M:%S\")\n\n params = {\n self.PCHFIELD_FROM: date,\n self.PCTYPE: self.CTYPE_CSV,\n self.PLIMIT: self.max_bugs_csv,\n self.PORDER: order\n }\n\n response = self.call(self.CGI_BUGLIST, params)\n\n return response", "response": "Get a summary of bugs in CSV format."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef bugs(self, *bug_ids):\n params = {\n self.PBUG_ID: bug_ids,\n self.PCTYPE: self.CTYPE_XML,\n self.PEXCLUDE_FIELD: 'attachmentdata'\n }\n\n response = self.call(self.CGI_BUG, params)\n\n return response", "response": "Get the information of a list of bugs in XML format."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets the activity of a bug in HTML format.", 
"response": "def bug_activity(self, bug_id):\n \"\"\"Get the activity of a bug in HTML format.\n\n :param bug_id: bug identifier\n \"\"\"\n params = {\n self.PBUG_ID: bug_id\n }\n\n response = self.call(self.CGI_BUG_ACTIVITY, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef call(self, cgi, params):\n url = self.URL % {'base': self.base_url, 'cgi': cgi}\n\n logger.debug(\"Bugzilla client calls command: %s params: %s\",\n cgi, str(params))\n\n req = self.fetch(url, payload=params)\n\n return req.text", "response": "Run an API command."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsanitize the payload of a HTTP request for storing or retrieving archived items.", "response": "def sanitize_for_archive(url, headers, payload):\n \"\"\"Sanitize payload of a HTTP request by removing the login and password information\n before storing/retrieving archived items\n\n :param: url: HTTP url request\n :param: headers: HTTP headers request\n :param: payload: HTTP payload request\n\n :returns url, headers and the sanitized payload\n \"\"\"\n if BugzillaClient.PBUGZILLA_LOGIN in payload:\n payload.pop(BugzillaClient.PBUGZILLA_LOGIN)\n\n if BugzillaClient.PBUGZILLA_PASSWORD in payload:\n payload.pop(BugzillaClient.PBUGZILLA_PASSWORD)\n\n if BugzillaClient.PLOGIN in payload:\n payload.pop(BugzillaClient.PLOGIN)\n\n return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches the events from the server.", "response": "def fetch(self, category=CATEGORY_EVENT, from_date=DEFAULT_DATETIME, to_date=None,\n filter_classified=False):\n \"\"\"Fetch the events from the server.\n\n This method fetches those events of a group stored on the server\n that were updated since the given date. 
Data comments and rsvps\n are included within each event.\n\n :param category: the category of items to fetch\n :param from_date: obtain events updated since this date\n :param to_date: obtain events updated before this date\n :param filter_classified: remove classified fields from the resulting items\n\n :returns: a generator of events\n \"\"\"\n if not from_date:\n from_date = DEFAULT_DATETIME\n\n from_date = datetime_to_utc(from_date)\n\n kwargs = {\"from_date\": from_date, \"to_date\": to_date}\n items = super().fetch(category,\n filter_classified=filter_classified,\n **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n to_date = kwargs['to_date']\n\n logger.info(\"Fetching events of '%s' group from %s to %s\",\n self.group, str(from_date),\n str(to_date) if to_date else '--')\n\n to_date_ts = datetime_to_utc(to_date).timestamp() if to_date else None\n\n nevents = 0\n stop_fetching = False\n\n ev_pages = self.client.events(self.group, from_date=from_date)\n\n for evp in ev_pages:\n events = [event for event in self.parse_json(evp)]\n\n for event in events:\n event_id = event['id']\n\n event['comments'] = self.__fetch_and_parse_comments(event_id)\n event['rsvps'] = self.__fetch_and_parse_rsvps(event_id)\n\n # Check events updated before 'to_date'\n event_ts = self.metadata_updated_on(event)\n\n if to_date_ts and event_ts >= to_date_ts:\n stop_fetching = True\n continue\n\n yield event\n nevents += 1\n\n if stop_fetching:\n break\n\n logger.info(\"Fetch process completed: %s events fetched\", nevents)", "response": "Fetch the items from the backend"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfetches the events pages of a given group.", "response": "def events(self, group, from_date=DEFAULT_DATETIME):\n \"\"\"Fetch the events pages of a given 
group.\"\"\"\n\n date = datetime_to_utc(from_date)\n date = date.strftime(\"since:%Y-%m-%dT%H:%M:%S.000Z\")\n\n resource = urijoin(group, self.REVENTS)\n\n # Hack required because the Meetup API does not support list\n # values with the format `?param=value1&param=value2`.\n # It only works with `?param=value1,value2`.\n # Moreover, urllib3 encodes comma characters when values\n # are given using the params dict, which doesn't work\n # with Meetup, either.\n fixed_params = '?' + self.PFIELDS + '=' + ','.join(self.VEVENT_FIELDS)\n fixed_params += '&' + self.PSTATUS + '=' + ','.join(self.VSTATUS)\n resource += fixed_params\n\n params = {\n self.PORDER: self.VUPDATED,\n self.PSCROLL: date,\n self.PPAGE: self.max_items\n }\n\n try:\n for page in self._fetch(resource, params):\n yield page\n except requests.exceptions.HTTPError as error:\n if error.response.status_code == 410:\n msg = \"Group is no longer accessible: {}\".format(error)\n raise RepositoryError(cause=msg)\n else:\n raise error"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef comments(self, group, event_id):\n\n resource = urijoin(group, self.REVENTS, event_id, self.RCOMMENTS)\n\n params = {\n self.PPAGE: self.max_items\n }\n\n for page in self._fetch(resource, params):\n yield page", "response": "Fetch the comments of a given event."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfetch the rsvps of a given event.", "response": "def rsvps(self, group, event_id):\n \"\"\"Fetch the rsvps of a given event.\"\"\"\n\n resource = urijoin(group, self.REVENTS, event_id, self.RRSVPS)\n\n # Same hack as in the 'events' method\n fixed_params = '?' 
+ self.PFIELDS + '=' + ','.join(self.VRSVP_FIELDS)\n fixed_params += '&' + self.PRESPONSE + '=' + ','.join(self.VRESPONSE)\n resource += fixed_params\n\n params = {\n self.PPAGE: self.max_items\n }\n\n for page in self._fetch(resource, params):\n yield page"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nsanitize the payload of a HTTP request for storing or retrieving archived items.", "response": "def sanitize_for_archive(url, headers, payload):\n \"\"\"Sanitize payload of a HTTP request by removing the token information\n before storing/retrieving archived items\n\n :param: url: HTTP url request\n :param: headers: HTTP headers request\n :param: payload: HTTP payload request\n\n :returns url, headers and the sanitized payload\n \"\"\"\n if MeetupClient.PKEY in payload:\n payload.pop(MeetupClient.PKEY)\n\n if MeetupClient.PSIGN in payload:\n payload.pop(MeetupClient.PSIGN)\n\n return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _fetch(self, resource, params):\n url = urijoin(self.base_url, resource)\n\n params[self.PKEY] = self.api_key\n params[self.PSIGN] = 'true',\n\n do_fetch = True\n\n while do_fetch:\n logger.debug(\"Meetup client calls resource: %s params: %s\",\n resource, str(params))\n\n if not self.from_archive:\n self.sleep_for_rate_limit()\n\n r = self.fetch(url, payload=params)\n\n if not self.from_archive:\n self.update_rate_limit(r)\n\n yield r.text\n\n if r.links and 'next' in r.links:\n url = r.links['next']['url']\n params = {\n self.PKEY: self.api_key,\n self.PSIGN: 'true'\n }\n else:\n do_fetch = False", "response": "Fetch a resource and return a generator of pages."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fetch_items(self, category, **kwargs):\n\n from_date = datetime_to_utc(kwargs['from_date']).timestamp()\n\n questions_groups = 
self.client.get_api_questions(AskbotClient.API_QUESTIONS)\n for questions in questions_groups:\n\n for question in questions['questions']:\n updated_at = int(question['last_activity_at'])\n if updated_at > from_date:\n html_question = self.__fetch_question(question)\n if not html_question:\n continue\n\n logger.debug(\"Fetching HTML question %s\", question['id'])\n comments = self.__fetch_comments(question)\n question_obj = self.__build_question(html_question, question, comments)\n question.update(question_obj)\n yield question", "response": "Fetch the items from the API."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef __fetch_question(self, question):\n html_question_items = []\n\n npages = 1\n next_request = True\n\n while next_request:\n try:\n html_question = self.client.get_html_question(question['id'], npages)\n html_question_items.append(html_question)\n tpages = self.ab_parser.parse_number_of_html_pages(html_question)\n\n if npages == tpages:\n next_request = False\n\n npages = npages + 1\n except requests.exceptions.TooManyRedirects as e:\n logger.warning(\"%s, data not retrieved for question %s\", e, question['id'])\n next_request = False\n\n return html_question_items", "response": "Fetch an Askbot HTML question body."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef __fetch_comments(self, question):\n comments = {}\n comments[question['id']] = json.loads(self.client.get_comments(question['id']))\n for object_id in question['answer_ids']:\n comments[object_id] = json.loads(self.client.get_comments(object_id))\n return comments", "response": "Fetch all the comments of an Askbot question and answers."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nbuild an Askbot HTML response.", "response": "def __build_question(html_question, question, comments):\n \"\"\"Build an Askbot HTML response.\n\n The 
method puts together all the information regarding a question\n\n :param html_question: array of HTML raw pages\n :param question: question object from the API\n :param comments: list of comments to add\n\n :returns: a dict item with the parsed question information\n \"\"\"\n question_object = {}\n # Parse the user info from the soup container\n question_container = AskbotParser.parse_question_container(html_question[0])\n # Add the info to the question object\n question_object.update(question_container)\n # Add the comments of the question (if any)\n if comments[int(question['id'])]:\n question_object['comments'] = comments[int(question['id'])]\n\n answers = []\n\n for page in html_question:\n answers.extend(AskbotParser.parse_answers(page))\n\n if len(answers) != 0:\n question_object['answers'] = answers\n for answer in question_object['answers']:\n if comments[int(answer['id'])]:\n answer['comments'] = comments[int(answer['id'])]\n\n return question_object"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_api_questions(self, path):\n npages = 1\n next_request = True\n\n path = urijoin(self.base_url, path)\n while next_request:\n\n try:\n params = {\n 'page': npages,\n 'sort': self.ORDER_API\n }\n\n response = self.fetch(path, payload=params)\n\n whole_page = response.text\n\n raw_questions = json.loads(whole_page)\n tpages = raw_questions['pages']\n\n logger.debug(\"Fetching questions from '%s': page %s/%s\",\n self.base_url, npages, tpages)\n\n if npages == tpages:\n next_request = False\n\n npages = npages + 1\n yield raw_questions\n\n except requests.exceptions.TooManyRedirects as e:\n logger.warning(\"%s, data not retrieved for resource %s\", e, path)\n next_request = False", "response": "Retrieve a question page using the API."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nretrieve a raw HTML question and all its information.", "response": "def 
get_html_question(self, question_id, page=1):\n \"\"\"Retrieve a raw HTML question and all its information.\n\n :param question_id: question identifier\n :param page: page to retrieve\n \"\"\"\n path = urijoin(self.base_url, self.HTML_QUESTION, question_id)\n params = {\n 'page': page,\n 'sort': self.ORDER_HTML\n }\n\n response = self.fetch(path, payload=params)\n return response.text"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nretrieve a list of comments by a given id.", "response": "def get_comments(self, post_id):\n \"\"\"Retrieve a list of comments by a given id.\n\n :param post_id: object identifier\n \"\"\"\n path = urijoin(self.base_url, self.COMMENTS if self._use_new_urls else self.COMMENTS_OLD)\n params = {\n 'post_id': post_id,\n 'post_type': 'answer',\n 'avatar_size': 0\n }\n headers = {'X-Requested-With': 'XMLHttpRequest'}\n\n try:\n response = self.fetch(path, payload=params, headers=headers)\n raw = response.text\n except requests.exceptions.HTTPError as ex:\n if ex.response.status_code == 404:\n logger.debug(\"Comments URL did not work. Using old URL schema.\")\n self._use_new_urls = False\n path = urijoin(self.base_url, self.COMMENTS_OLD)\n response = self.fetch(path, payload=params, headers=headers)\n raw = response.text\n elif ex.response.status_code == 500:\n logger.warning(\"Comments not retrieved due to %s\", ex)\n raw = '[]'\n else:\n raise ex\n\n return raw"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing the question info container of a given HTML question.", "response": "def parse_question_container(html_question):\n \"\"\"Parse the question info container of a given HTML question.\n\n The method parses the information available in the question information\n container. The container can have up to 2 elements: the first one\n contains the information related to the user who generated the question\n and the date (if any). 
The second one contains the date of the update,\n and the user who updated it (if not the same user who generated the question).\n\n :param html_question: raw HTML question element\n\n :returns: an object with the parsed information\n \"\"\"\n container_info = {}\n bs_question = bs4.BeautifulSoup(html_question, \"html.parser\")\n question = AskbotParser._find_question_container(bs_question)\n container = question.select(\"div.post-update-info\")\n created = container[0]\n container_info['author'] = AskbotParser.parse_user_info(created)\n try:\n container[1]\n except IndexError:\n pass\n else:\n updated = container[1]\n if AskbotParser.parse_user_info(updated):\n container_info['updated_by'] = AskbotParser.parse_user_info(updated)\n\n return container_info"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef parse_answers(html_question):\n\n def parse_answer_container(update_info):\n \"\"\"Parse the answer info container of a given HTML question.\n\n The method parses the information available in the answer information\n container. The container can have up to 2 elements: the first one\n contains the information related to the user who generated the question\n and the date (if any). 
The second one contains the date of the update,\n and the user who updated it (if not the same user who generated the question).\n\n :param update_info: beautiful soup update_info container element\n\n :returns: an object with the parsed information\n \"\"\"\n container_info = {}\n created = update_info[0]\n answered_at = created.abbr.attrs[\"title\"]\n # Convert date to UNIX timestamp\n container_info['added_at'] = str(str_to_datetime(answered_at).timestamp())\n container_info['answered_by'] = AskbotParser.parse_user_info(created)\n try:\n update_info[1]\n except IndexError:\n pass\n else:\n updated = update_info[1]\n updated_at = updated.abbr.attrs[\"title\"]\n # Convert date to UNIX timestamp\n container_info['updated_at'] = str(str_to_datetime(updated_at).timestamp())\n if AskbotParser.parse_user_info(updated):\n container_info['updated_by'] = AskbotParser.parse_user_info(updated)\n return container_info\n\n answer_list = []\n # Select all the answers\n bs_question = bs4.BeautifulSoup(html_question, \"html.parser\")\n bs_answers = bs_question.select(\"div.answer\")\n for bs_answer in bs_answers:\n answer_id = bs_answer.attrs[\"data-post-id\"]\n votes_element = bs_answer.select(\"div.vote-number\")[0].text\n accepted_answer = bs_answer.select(\"div.answer-img-accept\")[0].get('title').endswith(\"correct\")\n # Select the body of the answer\n body = bs_answer.select(\"div.post-body\")\n # Get the user information container and parse it\n update_info = body[0].select(\"div.post-update-info\")\n answer_container = parse_answer_container(update_info)\n # Remove the update-info-container div to be able to get the body\n body[0].div.extract().select(\"div.post-update-info-container\")\n # Override the body with a clean one\n body = body[0].get_text(strip=True)\n # Generate the answer object\n answer = {'id': answer_id,\n 'score': votes_element,\n 'summary': body,\n 'accepted': accepted_answer\n }\n # Update the object with the information in the answer container\n 
answer.update(answer_container)\n answer_list.append(answer)\n return answer_list", "response": "Parse the answers of a given HTML question and return them as a list."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nparse number of answer pages to paginate over them.", "response": "def parse_number_of_html_pages(html_question):\n \"\"\"Parse number of answer pages to paginate over them.\n\n :param html_question: raw HTML question element\n\n :returns: an integer with the number of pages\n \"\"\"\n bs_question = bs4.BeautifulSoup(html_question, \"html.parser\")\n try:\n bs_question.select('div.paginator')[0]\n except IndexError:\n return 1\n else:\n return int(bs_question.select('div.paginator')[0].attrs['data-num-pages'])"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse_user_info(update_info):\n user_info = {}\n if update_info.select(\"div.user-info\"):\n # Get all the elements in the container. 
First contains the user\n # information, second one (if exists), the website of the user.\n elements = update_info.select(\"div.user-info\")[0].find_all(\"a\")\n href = elements[0].attrs[\"href\"]\n user_info['id'] = re.search(r'\\d+', href).group(0)\n user_info['username'] = elements[0].text\n user_info['reputation'] = update_info.select('span.reputation-score')[0].text\n user_info['badges'] = update_info.select(\"span.badges\")[0].attrs[\"title\"]\n try:\n elements[1]\n except IndexError:\n pass\n else:\n user_info['website'] = elements[1].attrs[\"href\"]\n if update_info.select(\"img.flag\"):\n flag = update_info.select(\"img.flag\")[0].attrs[\"alt\"]\n user_info['country'] = re.sub(\"flag of \", \"\", flag)\n\n return user_info", "response": "Parses the user information of a given HTML container."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n\n if self.client.version[0] == 2 and self.client.version[1] == 8:\n fetcher = self._fetch_gerrit28(from_date)\n else:\n fetcher = self._fetch_gerrit(from_date)\n\n for review in fetcher:\n yield review", "response": "Fetch the items in the given category"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef parse_reviews(raw_data):\n\n # Join isolated reviews in JSON in array for parsing\n items_raw = \"[\" + raw_data.replace(\"\\n\", \",\") + \"]\"\n items_raw = items_raw.replace(\",]\", \"]\")\n items = json.loads(items_raw)\n reviews = []\n\n for item in items:\n if 'project' in item.keys():\n reviews.append(item)\n\n return reviews", "response": "Parse a Gerrit reviews list."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfetching the latest version of the group items from Gerrit 2. 
8 and return the list of items.", "response": "def _fetch_gerrit28(self, from_date=DEFAULT_DATETIME):\n \"\"\" Specific fetch for gerrit 2.8 version.\n\n Get open and closed reviews in different queries.\n Take the newer review from both lists and iterate.\n \"\"\"\n\n # Convert date to Unix time\n from_ut = datetime_to_utc(from_date)\n from_ut = from_ut.timestamp()\n\n filter_open = \"status:open\"\n filter_closed = \"status:closed\"\n\n last_item_open = self.client.next_retrieve_group_item()\n last_item_closed = self.client.next_retrieve_group_item()\n reviews_open = self._get_reviews(last_item_open, filter_open)\n reviews_closed = self._get_reviews(last_item_closed, filter_closed)\n last_nreviews_open = len(reviews_open)\n last_nreviews_closed = len(reviews_closed)\n\n while reviews_open or reviews_closed:\n if reviews_open and reviews_closed:\n if reviews_open[0]['lastUpdated'] >= reviews_closed[0]['lastUpdated']:\n review_open = reviews_open.pop(0)\n review = review_open\n else:\n review_closed = reviews_closed.pop(0)\n review = review_closed\n elif reviews_closed:\n review_closed = reviews_closed.pop(0)\n review = review_closed\n else:\n review_open = reviews_open.pop(0)\n review = review_open\n\n updated = review['lastUpdated']\n if updated <= from_ut:\n logger.debug(\"No more updates for %s\" % (self.hostname))\n break\n else:\n yield review\n\n if not reviews_open and last_nreviews_open >= self.max_reviews:\n last_item_open = self.client.next_retrieve_group_item(last_item_open, review_open)\n reviews_open = self._get_reviews(last_item_open, filter_open)\n last_nreviews_open = len(reviews_open)\n if not reviews_closed and last_nreviews_closed >= self.max_reviews:\n last_item_closed = self.client.next_retrieve_group_item(last_item_closed, review_closed)\n reviews_closed = self._get_reviews(last_item_closed, filter_closed)\n last_nreviews_closed = len(reviews_closed)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where 
it\nreturns the Gerrit server version.", "response": "def version(self):\n \"\"\"Return the Gerrit server version.\"\"\"\n\n if self._version:\n return self._version\n\n cmd = self.gerrit_cmd + \" %s \" % (GerritClient.CMD_VERSION)\n\n logger.debug(\"Getting version: %s\" % (cmd))\n raw_data = self.__execute(cmd)\n raw_data = str(raw_data, \"UTF-8\")\n logger.debug(\"Gerrit version: %s\" % (raw_data))\n\n # output: gerrit version 2.10-rc1-988-g333a9dd\n m = re.match(GerritClient.VERSION_REGEX, raw_data)\n\n if not m:\n cause = \"Invalid gerrit version %s\" % raw_data\n raise BackendError(cause=cause)\n\n try:\n major = int(m.group(1))\n minor = int(m.group(2))\n except Exception:\n cause = \"Gerrit client could not determine the server version.\"\n raise BackendError(cause=cause)\n\n self._version = [major, minor]\n return self._version"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef reviews(self, last_item, filter_=None):\n\n cmd = self._get_gerrit_cmd(last_item, filter_)\n\n logger.debug(\"Getting reviews with command: %s\", cmd)\n raw_data = self.__execute(cmd)\n raw_data = str(raw_data, \"UTF-8\")\n\n return raw_data", "response": "Get the reviews starting from last_item."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the item to start from in next reviews group.", "response": "def next_retrieve_group_item(self, last_item=None, entry=None):\n \"\"\"Return the item to start from in next reviews group.\"\"\"\n\n next_item = None\n\n gerrit_version = self.version\n\n if gerrit_version[0] == 2 and gerrit_version[1] > 9:\n if last_item is None:\n next_item = 0\n else:\n next_item = last_item\n elif gerrit_version[0] == 2 and gerrit_version[1] == 9:\n # https://groups.google.com/forum/#!topic/repo-discuss/yQgRR5hlS3E\n cause = \"Gerrit 2.9.0 does not support pagination\"\n raise BackendError(cause=cause)\n else:\n if entry is not None:\n next_item = entry['sortKey']\n\n return 
next_item"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nexecutes a gerrit command against the archive", "response": "def __execute_from_archive(self, cmd):\n \"\"\"Execute gerrit command against the archive\"\"\"\n\n cmd = self.sanitize_for_archive(cmd)\n response = self.archive.retrieve(cmd, None, None)\n\n if isinstance(response, RuntimeError):\n raise response\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nexecuting a gerrit command and store the result in the archive if it fails", "response": "def __execute_from_remote(self, cmd):\n \"\"\"Execute gerrit command with retry if it fails\"\"\"\n\n result = None # data result from the cmd execution\n retries = 0\n\n while retries < self.MAX_RETRIES:\n try:\n result = subprocess.check_output(cmd, shell=True)\n break\n except subprocess.CalledProcessError as ex:\n logger.error(\"gerrit cmd %s failed: %s\", cmd, ex)\n time.sleep(self.RETRY_WAIT * retries)\n retries += 1\n\n if result is None:\n result = RuntimeError(cmd + \" failed \" + str(self.MAX_RETRIES) + \" times. 
Giving up!\")\n\n if self.archive:\n cmd = self.sanitize_for_archive(cmd)\n self.archive.store(cmd, None, None, result)\n\n if isinstance(result, RuntimeError):\n raise result\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef setup_cmd_parser(cls):\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n from_date=True,\n archive=True)\n\n # Gerrit options\n group = parser.parser.add_argument_group('Gerrit arguments')\n group.add_argument('--user', dest='user',\n help=\"Gerrit ssh user\")\n group.add_argument('--max-reviews', dest='max_reviews',\n type=int, default=MAX_REVIEWS,\n help=\"Max number of reviews per ssh query.\")\n group.add_argument('--blacklist-reviews', dest='blacklist_reviews',\n nargs='*',\n help=\"Wrong reviews that must not be retrieved.\")\n group.add_argument('--disable-host-key-check', dest='disable_host_key_check', action='store_true',\n help=\"Don't check remote host identity\")\n group.add_argument('--ssh-port', dest='port',\n default=PORT, type=int,\n help=\"Set SSH port of the Gerrit server\")\n\n # Required arguments\n parser.parser.add_argument('hostname',\n help=\"Hostname of the Gerrit server\")\n\n return parser", "response": "Returns the Gerrit argument parser."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nfetch the items from the backend", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the issues\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n from_date = kwargs['from_date']\n\n logger.info(\"Fetching issues of '%s' distribution from %s\",\n self.distribution, str(from_date))\n\n nissues = 0\n\n for issue in self._fetch_issues(from_date):\n yield issue\n nissues += 1\n\n logger.info(\"Fetch process completed: %s issues fetched\", nissues)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 
3 code\ndef _init_client(self, from_archive=False):\n\n return LaunchpadClient(self.distribution, self.package, self.items_per_page,\n self.sleep_time, self.archive, from_archive)", "response": "Initialize a new LaunchpadClient object."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _fetch_issues(self, from_date):\n\n issues_groups = self.client.issues(start=from_date)\n\n for raw_issues in issues_groups:\n\n issues = json.loads(raw_issues)['entries']\n for issue in issues:\n issue = self.__init_extra_issue_fields(issue)\n issue_id = self.__extract_issue_id(issue['bug_link'])\n\n for field in TARGET_ISSUE_FIELDS:\n\n if not issue[field]:\n continue\n\n if field == 'bug_link':\n issue['bug_data'] = self.__fetch_issue_data(issue_id)\n issue['activity_data'] = [activity for activity in self.__fetch_issue_activities(issue_id)]\n issue['messages_data'] = [message for message in self.__fetch_issue_messages(issue_id)]\n issue['attachments_data'] = [attachment for attachment in self.__fetch_issue_attachments(issue_id)]\n elif field == 'assignee_link':\n issue['assignee_data'] = self.__fetch_user_data('{ASSIGNEE}', issue[field])\n elif field == 'owner_link':\n issue['owner_data'] = self.__fetch_user_data('{OWNER}', issue[field])\n\n yield issue", "response": "Fetch the issues from a project."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __fetch_issue_data(self, issue_id):\n\n raw_issue = self.client.issue(issue_id)\n issue = json.loads(raw_issue)\n\n return issue", "response": "Fetch data associated to an issue"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef __fetch_issue_attachments(self, issue_id):\n\n for attachments_raw in self.client.issue_collection(issue_id, \"attachments\"):\n attachments = json.loads(attachments_raw)\n\n for attachment in attachments['entries']:\n yield 
attachment", "response": "Get all attachments of an issue"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget all the messages of an issue", "response": "def __fetch_issue_messages(self, issue_id):\n \"\"\"Get messages of an issue\"\"\"\n\n for messages_raw in self.client.issue_collection(issue_id, \"messages\"):\n messages = json.loads(messages_raw)\n\n for msg in messages['entries']:\n msg['owner_data'] = self.__fetch_user_data('{OWNER}', msg['owner_link'])\n yield msg"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting activities on an issue", "response": "def __fetch_issue_activities(self, issue_id):\n \"\"\"Get activities on an issue\"\"\"\n\n for activities_raw in self.client.issue_collection(issue_id, \"activity\"):\n activities = json.loads(activities_raw)\n\n for act in activities['entries']:\n act['person_data'] = self.__fetch_user_data('{PERSON}', act['person_link'])\n yield act"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __fetch_user_data(self, tag_type, user_link):\n\n user_name = self.client.user_name(user_link)\n\n user = {}\n\n if not user_name:\n return user\n\n user_raw = self.client.user(user_name)\n user = json.loads(user_raw)\n\n return user", "response": "Fetch data associated to a user"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting the issues from pagination", "response": "def issues(self, start=None):\n \"\"\"Get the issues from pagination\"\"\"\n\n payload = self.__build_payload(size=self.items_per_page, operation=True, startdate=start)\n path = self.__get_url_project()\n return self.__fetch_items(path=path, payload=payload)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget the user data by URL", "response": "def user(self, user_name):\n \"\"\"Get the user data by URL\"\"\"\n\n user = None\n\n if user_name in self._users:\n return 
self._users[user_name]\n\n url_user = self.__get_url(\"~\" + user_name)\n\n logger.info(\"Getting info for %s\" % (url_user))\n\n try:\n raw_user = self.__send_request(url_user)\n user = raw_user\n except requests.exceptions.HTTPError as e:\n if e.response.status_code in [404, 410]:\n logger.warning(\"Data is not available - %s\", url_user)\n user = '{}'\n else:\n raise e\n\n self._users[user_name] = user\n\n return user"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef issue(self, issue_id):\n\n path = urijoin(\"bugs\", str(issue_id))\n url_issue = self.__get_url(path)\n raw_text = self.__send_request(url_issue)\n\n return raw_text", "response": "Get the issue data by its ID"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets a collection list of a given issue", "response": "def issue_collection(self, issue_id, collection_name):\n \"\"\"Get a collection list of a given issue\"\"\"\n\n path = urijoin(\"bugs\", str(issue_id), collection_name)\n url_collection = self.__get_url(path)\n payload = {'ws.size': self.items_per_page, 'ws.start': 0, 'order_by': 'date_last_updated'}\n\n raw_items = self.__fetch_items(path=url_collection, payload=payload)\n\n return raw_items"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef __send_request(self, url, params=None):\n\n r = self.fetch(url, payload=params)\n return r.text", "response": "Send a request to the server and return the response."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nbuild the payload for the current session.", "response": "def __build_payload(self, size, operation=False, startdate=None):\n \"\"\"Build payload\"\"\"\n\n payload = {\n 'ws.size': size,\n 'order_by': 'date_last_updated',\n 'omit_duplicates': 'false',\n 'status': [\"New\", \"Incomplete\", \"Opinion\", \"Invalid\", \"Won't Fix\",\n 
\"Expired\", \"Confirmed\", \"Triaged\", \"In Progress\",\n \"Fix Committed\", \"Fix Released\",\n \"Incomplete (with response)\",\n \"Incomplete (without response)\"]\n }\n\n if operation:\n payload['ws.op'] = 'searchTasks'\n if startdate:\n startdate = startdate.isoformat()\n payload['modified_since'] = startdate\n\n return payload"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfetching the items from Launchpad API using pagination", "response": "def __fetch_items(self, path, payload):\n \"\"\"Return the items from Launchpad API using pagination\"\"\"\n\n page = 0 # current page\n url_next = path\n fetch_data = True\n\n while fetch_data:\n logger.debug(\"Fetching page: %i\", page)\n\n try:\n raw_content = self.__send_request(url_next, payload)\n content = json.loads(raw_content)\n except requests.exceptions.HTTPError as e:\n if e.response.status_code in [410]:\n logger.warning(\"Data is not available - %s\", url_next)\n raw_content = '{\"total_size\": 0, \"start\": 0, \"entries\": []}'\n content = json.loads(raw_content)\n else:\n raise e\n\n if 'next_collection_link' in content:\n url_next = content['next_collection_link']\n payload = None\n else:\n fetch_data = False\n\n yield raw_content\n page += 1"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n\n logger.info(\"Looking for messages from '%s' since %s\",\n self.uri, str(from_date))\n\n mailing_list = GroupsioClient(self.group_name, self.dirpath,\n self.api_token, self.verify)\n mailing_list.fetch()\n\n messages = self._fetch_and_parse_messages(mailing_list, from_date)\n\n for message in messages:\n yield message\n\n logger.info(\"Fetch process completed\")", "response": "Fetch the items from the group"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef fetch(self):\n 
logger.info(\"Downloading mboxes from '%s'\", self.uri)\n logger.debug(\"Storing mboxes in '%s'\", self.dirpath)\n\n if not os.path.exists(self.dirpath):\n os.makedirs(self.dirpath)\n\n group_id = self.__find_group_id()\n\n url = urijoin(GROUPSIO_API_URL, self.DOWNLOAD_ARCHIVES)\n payload = {'group_id': group_id}\n filepath = os.path.join(self.dirpath, MBOX_FILE)\n success = self._download_archive(url, payload, filepath)\n\n return success", "response": "Fetch the mbox files from the remote archiver."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef subscriptions(self, per_page=PER_PAGE):\n url = urijoin(GROUPSIO_API_URL, self.GET_SUBSCRIPTIONS)\n logger.debug(\"Get groupsio paginated subscriptions from \" + url)\n\n keep_fetching = True\n payload = {\n \"limit\": per_page\n }\n\n while keep_fetching:\n r = self.__fetch(url, payload)\n response_raw = r.json()\n subscriptions = response_raw['data']\n yield subscriptions\n\n total_subscriptions = response_raw['total_count']\n logger.debug(\"Subscriptions: %i/%i\" % (response_raw['end_item'], total_subscriptions))\n\n payload['page_token'] = response_raw['next_page_token']\n keep_fetching = response_raw['has_more']", "response": "Fetch the groupsio paginated subscriptions for a given token\n "} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __find_group_id(self):\n\n group_subscriptions = self.subscriptions(self.auth)\n\n for subscriptions in group_subscriptions:\n for sub in subscriptions:\n if sub['group_name'] == self.group_name:\n return sub['group_id']\n\n msg = \"Group id not found for group name %s\" % self.group_name\n raise BackendError(cause=msg)", "response": "Find the id of a group given its name by iterating on the list of subscriptions and returning the first one that matches the group name."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfetching requests from 
groupsio API", "response": "def __fetch(self, url, payload):\n \"\"\"Fetch requests from groupsio API\"\"\"\n\n r = requests.get(url, params=payload, auth=self.auth, verify=self.verify)\n try:\n r.raise_for_status()\n except requests.exceptions.HTTPError as e:\n raise e\n\n return r"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ninitializes mailing lists directory path", "response": "def _pre_init(self):\n \"\"\"Initialize mailing lists directory path\"\"\"\n\n if not self.parsed_args.mboxes_path:\n base_path = os.path.expanduser('~/.perceval/mailinglists/')\n dirpath = os.path.join(base_path, GROUPSIO_URL, 'g', self.parsed_args.group_name)\n else:\n dirpath = self.parsed_args.mboxes_path\n\n setattr(self.parsed_args, 'dirpath', dirpath)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the Groupsio argument parser.", "response": "def setup_cmd_parser(cls):\n \"\"\"Returns the Groupsio argument parser.\"\"\"\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n from_date=True,\n token_auth=True)\n\n # Backend token is required\n action = parser.parser._option_string_actions['--api-token']\n action.required = True\n\n # Optional arguments\n group = parser.parser.add_argument_group('Groupsio arguments')\n group.add_argument('--mboxes-path', dest='mboxes_path',\n help=\"Path where mbox files will be stored\")\n group.add_argument('--no-verify', dest='verify',\n action='store_false',\n help=\"Disable SSL verification\")\n\n # Required arguments\n parser.parser.add_argument('group_name',\n help=\"Name of the group on Groups.io\")\n\n return parser"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngenerates a UUID based on the given parameters.", "response": "def uuid(*args):\n \"\"\"Generate a UUID based on the given parameters.\n\n The UUID will be the SHA1 of the concatenation of the values\n from the list. 
The separator between these values is ':'.\n Each value must be a non-empty string, otherwise, the function\n will raise an exception.\n\n :param *args: list of arguments used to generate the UUID\n\n :returns: a universal unique identifier\n\n :raises ValueError: when any of the values is not a string,\n is empty or `None`.\n \"\"\"\n def check_value(v):\n if not isinstance(v, str):\n raise ValueError(\"%s value is not a string instance\" % str(v))\n elif not v:\n raise ValueError(\"value cannot be None or empty\")\n else:\n return v\n\n s = ':'.join(map(check_value, args))\n\n sha1 = hashlib.sha1(s.encode('utf-8', errors='surrogateescape'))\n uuid_sha1 = sha1.hexdigest()\n\n return uuid_sha1"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nfetch items using the given backend. Generator to get items using the given backend class. When an archive manager is given, this function will store the fetched items in an `Archive`. If an exception is raised, this archive will be removed to avoid corrupted archives. The parameters needed to initialize the `backend` class and get the items are given using `backend_args` dict parameter. :param backend_class: backend class to fetch items :param backend_args: dict of arguments needed to fetch the items :param category: category of the items to retrieve. If None, it will use the default backend category :param filter_classified: remove classified fields from the resulting items :param manager: archive manager needed to store the items :returns: a generator of items", "response": "def fetch(backend_class, backend_args, category, filter_classified=False,\n manager=None):\n \"\"\"Fetch items using the given backend.\n\n Generator to get items using the given backend class. When\n an archive manager is given, this function will store\n the fetched items in an `Archive`. 
If an exception is raised,\n this archive will be removed to avoid corrupted archives.\n\n The parameters needed to initialize the `backend` class and\n get the items are given using `backend_args` dict parameter.\n\n :param backend_class: backend class to fetch items\n :param backend_args: dict of arguments needed to fetch the items\n :param category: category of the items to retrieve.\n If None, it will use the default backend category\n :param filter_classified: remove classified fields from the resulting items\n :param manager: archive manager needed to store the items\n\n :returns: a generator of items\n \"\"\"\n init_args = find_signature_parameters(backend_class.__init__,\n backend_args)\n archive = manager.create_archive() if manager else None\n init_args['archive'] = archive\n\n backend = backend_class(**init_args)\n\n if category:\n backend_args['category'] = category\n if filter_classified:\n backend_args['filter_classified'] = filter_classified\n\n fetch_args = find_signature_parameters(backend.fetch,\n backend_args)\n items = backend.fetch(**fetch_args)\n\n try:\n for item in items:\n yield item\n except Exception as e:\n if manager:\n archive_path = archive.archive_path\n manager.remove_archive(archive_path)\n raise e"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfetches items from an archive manager.", "response": "def fetch_from_archive(backend_class, backend_args, manager,\n category, archived_after):\n \"\"\"Fetch items from an archive manager.\n\n Generator to get the items of a category (previously fetched\n by the given backend class) from an archive manager. 
Only those\n items archived after the given date will be returned.\n\n The parameters needed to initialize `backend` and get the\n items are given using `backend_args` dict parameter.\n\n :param backend_class: backend class to retrieve items\n :param backend_args: dict of arguments needed to retrieve the items\n :param manager: archive manager where the items will be retrieved\n :param category: category of the items to retrieve\n :param archived_after: return items archived after this date\n\n :returns: a generator of archived items\n \"\"\"\n init_args = find_signature_parameters(backend_class.__init__,\n backend_args)\n backend = backend_class(**init_args)\n\n filepaths = manager.search(backend.origin,\n backend.__class__.__name__,\n category,\n archived_after)\n\n for filepath in filepaths:\n backend.archive = Archive(filepath)\n items = backend.fetch_from_archive()\n\n try:\n for item in items:\n yield item\n except ArchiveError as e:\n logger.warning(\"Ignoring %s archive due to: %s\", filepath, str(e))"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfinds available backends under `top_package` and returns a tuple with two dicts: one with `Backend` classes and one with `BackendCommand` classes", "response": "def find_backends(top_package):\n \"\"\"Find available backends.\n\n Look for the Perceval backends and commands under `top_package`\n and its sub-packages. 
When `top_package` defines a namespace,\n backends under that same namespace will be found too.\n\n :param top_package: package storing backends\n\n :returns: a tuple with two dicts: one with `Backend` classes and one\n with `BackendCommand` classes\n \"\"\"\n candidates = pkgutil.walk_packages(top_package.__path__,\n prefix=top_package.__name__ + '.')\n\n modules = [name for _, name, is_pkg in candidates if not is_pkg]\n\n return _import_backends(modules)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nfetch items from a repository.", "response": "def fetch(self, category, filter_classified=False, **kwargs):\n \"\"\"Fetch items from the repository.\n\n The method retrieves items from a repository.\n\n To remove classified fields from the resulting items, set\n the parameter `filter_classified`. Take into account that this\n parameter is incompatible with archiving items. Raw client\n data are archived before any other process. Therefore,\n classified data are stored within the archive. 
To prevent\n possible data leaks or security issues when users do\n not need these fields, archiving and filtering are not\n compatible.\n\n :param category: the category of the items fetched\n :param filter_classified: remove classified fields from the resulting items\n :param kwargs: a list of other parameters (e.g., from_date, offset, etc.,\n specific for each backend)\n\n :returns: a generator of items\n\n :raises BackendError: either when the category is not valid or\n 'filter_classified' and 'archive' are active at the same time.\n \"\"\"\n if category not in self.categories:\n cause = \"%s category not valid for %s\" % (category, self.__class__.__name__)\n raise BackendError(cause=cause)\n\n if filter_classified and self.archive:\n cause = \"classified fields filtering is not compatible with archiving items\"\n raise BackendError(cause=cause)\n\n if self.archive:\n self.archive.init_metadata(self.origin, self.__class__.__name__, self.version, category,\n kwargs)\n\n self.client = self._init_client()\n\n for item in self.fetch_items(category, **kwargs):\n if filter_classified:\n item = self.filter_classified_data(item)\n\n yield self.metadata(item, filter_classified=filter_classified)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfetches the items from an archive.", "response": "def fetch_from_archive(self):\n \"\"\"Fetch the items from an archive.\n\n It returns the items stored within an archive. 
If this method is called but\n no archive was provided, the method will raise an `ArchiveError` exception.\n\n :returns: a generator of items\n\n :raises ArchiveError: raised when an error occurs accessing an archive\n \"\"\"\n if not self.archive:\n raise ArchiveError(cause=\"archive instance was not provided\")\n\n self.client = self._init_client(from_archive=True)\n\n for item in self.fetch_items(self.archive.category, **self.archive.backend_params):\n yield self.metadata(item)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nremoving classified or confidential data from an item.", "response": "def filter_classified_data(self, item):\n \"\"\"Remove classified or confidential data from an item.\n\n It removes those fields that contain data considered as classified.\n Classified fields are defined in the `CLASSIFIED_FIELDS` class attribute.\n\n :param item: fields will be removed from this item\n\n :returns: the same item but with confidential data filtered\n \"\"\"\n item_uuid = uuid(self.origin, self.metadata_id(item))\n\n logger.debug(\"Filtering classified data for item %s\", item_uuid)\n\n for cf in self.CLASSIFIED_FIELDS:\n try:\n _remove_key_from_nested_dict(item, cf)\n except KeyError:\n logger.debug(\"Classified field '%s' not found for item %s; field ignored\",\n '.'.join(cf), item_uuid)\n\n logger.debug(\"Classified data filtered for item %s\", item_uuid)\n\n return item"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nadds metadata to an item.", "response": "def metadata(self, item, filter_classified=False):\n \"\"\"Add metadata to an item.\n\n It adds metadata to a given item such as how and\n when it was fetched. 
The contents from the original item will\n be stored under the 'data' keyword.\n\n :param item: an item fetched by a backend\n :param filter_classified: sets if classified fields were filtered\n \"\"\"\n item = {\n 'backend_name': self.__class__.__name__,\n 'backend_version': self.version,\n 'perceval_version': __version__,\n 'timestamp': datetime_utcnow().timestamp(),\n 'origin': self.origin,\n 'uuid': uuid(self.origin, self.metadata_id(item)),\n 'updated_on': self.metadata_updated_on(item),\n 'classified_fields_filtered': self.classified_fields if filter_classified else None,\n 'category': self.metadata_category(item),\n 'tag': self.tag,\n 'data': item,\n }\n\n return item"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nparses a list of arguments and return a parsed Namespace object.", "response": "def parse(self, *args):\n \"\"\"Parse a list of arguments.\n\n Parse argument strings needed to run a backend command. The result\n will be a `argparse.Namespace` object populated with the values\n obtained after the validation of the parameters.\n\n :param args: argument strings\n\n :result: an object with the parsed values\n \"\"\"\n parsed_args = self.parser.parse_args(args)\n\n # Category was not set, remove it\n if parsed_args.category is None:\n delattr(parsed_args, 'category')\n\n if self._from_date:\n parsed_args.from_date = str_to_datetime(parsed_args.from_date)\n if self._to_date and parsed_args.to_date:\n parsed_args.to_date = str_to_datetime(parsed_args.to_date)\n if self._archive and parsed_args.archived_since:\n parsed_args.archived_since = str_to_datetime(parsed_args.archived_since)\n\n if self._archive and parsed_args.fetch_archive and parsed_args.no_archive:\n raise AttributeError(\"fetch-archive and no-archive arguments are not compatible\")\n if self._archive and parsed_args.fetch_archive and not parsed_args.category:\n raise AttributeError(\"fetch-archive needs a category to work with\")\n\n # Set 
aliases\n for alias, arg in self.aliases.items():\n if (alias not in parsed_args) and (arg in parsed_args):\n value = getattr(parsed_args, arg, None)\n setattr(parsed_args, alias, value)\n\n return parsed_args"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _set_auth_arguments(self, basic_auth=True, token_auth=False):\n\n group = self.parser.add_argument_group('authentication arguments')\n\n if basic_auth:\n group.add_argument('-u', '--backend-user', dest='user',\n help=\"backend user\")\n group.add_argument('-p', '--backend-password', dest='password',\n help=\"backend password\")\n if token_auth:\n group.add_argument('-t', '--api-token', dest='api_token',\n help=\"backend authentication token / API key\")", "response": "Activate authentication arguments parsing"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nactivating archive arguments parsing", "response": "def _set_archive_arguments(self):\n \"\"\"Activate archive arguments parsing\"\"\"\n\n group = self.parser.add_argument_group('archive arguments')\n group.add_argument('--archive-path', dest='archive_path', default=None,\n help=\"directory path to the archives\")\n group.add_argument('--no-archive', dest='no_archive', action='store_true',\n help=\"do not archive data\")\n group.add_argument('--fetch-archive', dest='fetch_archive', action='store_true',\n help=\"fetch data from the archives\")\n group.add_argument('--archived-since', dest='archived_since', default='1970-01-01',\n help=\"retrieve items archived since the given date\")"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _set_output_arguments(self):\n\n group = self.parser.add_argument_group('output arguments')\n group.add_argument('-o', '--output', type=argparse.FileType('w'),\n dest='outfile', default=sys.stdout,\n help=\"output file\")\n group.add_argument('--json-line', dest='json_line', 
action='store_true',\n help=\"produce a JSON line for each output item\")", "response": "Activate output arguments parsing"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nfetching and writing items from the given backend.", "response": "def run(self):\n \"\"\"Fetch and write items.\n\n This method runs the backend to fetch the items from the given\n origin. Items are converted to JSON objects and written to the\n defined output.\n\n If the `fetch-archive` parameter was given as an argument during\n the initialization of the instance, the items will be retrieved\n using the archive manager.\n \"\"\"\n backend_args = vars(self.parsed_args)\n category = backend_args.pop('category', None)\n filter_classified = backend_args.pop('filter_classified', False)\n archived_since = backend_args.pop('archived_since', None)\n\n if self.archive_manager and self.parsed_args.fetch_archive:\n items = fetch_from_archive(self.BACKEND, backend_args,\n self.archive_manager,\n category,\n archived_since)\n else:\n items = fetch(self.BACKEND, backend_args, category,\n filter_classified=filter_classified,\n manager=self.archive_manager)\n\n try:\n for item in items:\n if self.json_line:\n obj = json.dumps(item, separators=(',', ':'), sort_keys=True)\n else:\n obj = json.dumps(item, indent=4, sort_keys=True)\n self.outfile.write(obj)\n self.outfile.write('\\n')\n except IOError as e:\n raise RuntimeError(str(e))\n except Exception as e:\n raise RuntimeError(str(e))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _initialize_archive(self):\n\n if 'archive_path' not in self.parsed_args:\n manager = None\n elif self.parsed_args.no_archive:\n manager = None\n else:\n if not self.parsed_args.archive_path:\n archive_path = os.path.expanduser(ARCHIVES_DEFAULT_PATH)\n else:\n archive_path = self.parsed_args.archive_path\n\n manager = ArchiveManager(archive_path)\n\n self.archive_manager = manager", "response": "Initialize 
the archive based on the parsed parameters"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfetches the items from the backend", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the messages\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n from_date = kwargs['from_date']\n\n logger.info(\"Looking for messages from '%s' on '%s' since %s\",\n self.uri, self.dirpath, str(from_date))\n\n mailing_list = MailingList(self.uri, self.dirpath)\n\n messages = self._fetch_and_parse_messages(mailing_list, from_date)\n\n for message in messages:\n yield message\n\n logger.info(\"Fetch process completed\")"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef metadata_updated_on(item):\n ts = item[MBox.DATE_FIELD]\n ts = str_to_datetime(ts)\n\n return ts.timestamp()", "response": "Extracts the update time from a MBox item."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nparse a mbox file and return an iterator of dictionaries. Each one of this contains an email message. 
", "response": "def parse_mbox(filepath):\n \"\"\"Parse a mbox file.\n\n This method parses a mbox file and returns an iterator of dictionaries.\n Each one of them contains an email message.\n\n :param filepath: path of the mbox to parse\n\n :returns: generator of messages; each message is stored in a\n dictionary of type `requests.structures.CaseInsensitiveDict`\n \"\"\"\n mbox = _MBox(filepath, create=False)\n\n for msg in mbox:\n message = message_to_dict(msg)\n yield message"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _fetch_and_parse_messages(self, mailing_list, from_date):\n\n from_date = datetime_to_utc(from_date)\n\n nmsgs, imsgs, tmsgs = (0, 0, 0)\n\n for mbox in mailing_list.mboxes:\n tmp_path = None\n\n try:\n tmp_path = self._copy_mbox(mbox)\n\n for message in self.parse_mbox(tmp_path):\n tmsgs += 1\n\n if not self._validate_message(message):\n imsgs += 1\n continue\n\n # Ignore those messages sent before the given date\n dt = str_to_datetime(message[MBox.DATE_FIELD])\n\n if dt < from_date:\n logger.debug(\"Message %s sent before %s; skipped\",\n message['unixfrom'], str(from_date))\n tmsgs -= 1\n continue\n\n # Convert 'CaseInsensitiveDict' to dict\n message = self._casedict_to_dict(message)\n\n nmsgs += 1\n logger.debug(\"Message %s parsed\", message['unixfrom'])\n\n yield message\n except (OSError, EOFError) as e:\n logger.warning(\"Ignoring %s mbox due to: %s\", mbox.filepath, str(e))\n except Exception as e:\n if tmp_path and os.path.exists(tmp_path):\n os.remove(tmp_path)\n raise e\n finally:\n if tmp_path and os.path.exists(tmp_path):\n os.remove(tmp_path)\n\n logger.info(\"Done. 
%s/%s messages fetched; %s ignored\",\n nmsgs, tmsgs, imsgs)", "response": "Fetch and parse the messages from a mailing list from a given date."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _copy_mbox(self, mbox):\n\n tmp_path = tempfile.mktemp(prefix='perceval_')\n\n with mbox.container as f_in:\n with open(tmp_path, mode='wb') as f_out:\n for l in f_in:\n f_out.write(l)\n return tmp_path", "response": "Copy the contents of a mbox to a temporary file"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _validate_message(self, message):\n\n # This check is \"case insensitive\" because we're\n # using 'CaseInsensitiveDict' from requests.structures\n # module to store the contents of a message.\n if self.MESSAGE_ID_FIELD not in message:\n logger.warning(\"Field 'Message-ID' not found in message %s; ignoring\",\n message['unixfrom'])\n return False\n\n if not message[self.MESSAGE_ID_FIELD]:\n logger.warning(\"Field 'Message-ID' is empty in message %s; ignoring\",\n message['unixfrom'])\n return False\n\n if self.DATE_FIELD not in message:\n logger.warning(\"Field 'Date' not found in message %s; ignoring\",\n message['unixfrom'])\n return False\n\n if not message[self.DATE_FIELD]:\n logger.warning(\"Field 'Date' is empty in message %s; ignoring\",\n message['unixfrom'])\n return False\n\n try:\n str_to_datetime(message[self.DATE_FIELD])\n except InvalidDateError:\n logger.warning(\"Invalid date %s in message %s; ignoring\",\n message[self.DATE_FIELD], message['unixfrom'])\n return False\n\n return True", "response": "Check if the given message has the mandatory fields."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _casedict_to_dict(self, message):\n message_id = message.pop(self.MESSAGE_ID_FIELD)\n date = message.pop(self.DATE_FIELD)\n\n msg = {k: v for k, v in message.items()}\n 
msg[self.MESSAGE_ID_FIELD] = message_id\n msg[self.DATE_FIELD] = date\n\n return msg", "response": "Convert a message in CaseInsensitiveDict to dict."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_message(self, key):\n\n start, stop = self._lookup(key)\n self._file.seek(start)\n from_line = self._file.readline().replace(mailbox.linesep, b'')\n string = self._file.read(stop - self._file.tell())\n msg = self._message_factory(string.replace(mailbox.linesep, b'\\n'))\n\n try:\n msg.set_from(from_line[5:].decode('ascii'))\n return msg\n except UnicodeDecodeError:\n pass\n\n try:\n msg.set_from(from_line[5:].decode('utf-8'))\n except UnicodeDecodeError:\n msg.set_from(from_line[5:].decode('iso-8859-1'))\n\n return msg", "response": "Return a Message representation or raise a KeyError."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef mboxes(self):\n archives = []\n\n if os.path.isfile(self.dirpath):\n try:\n archives.append(MBoxArchive(self.dirpath))\n except OSError as e:\n logger.warning(\"Ignoring %s mbox due to: %s\", self.dirpath, str(e))\n else:\n for root, _, files in os.walk(self.dirpath):\n for filename in sorted(files):\n try:\n location = os.path.join(root, filename)\n archives.append(MBoxArchive(location))\n except OSError as e:\n logger.warning(\"Ignoring %s mbox due to: %s\", filename, str(e))\n return archives", "response": "Returns the list of `MBoxArchive` objects managed by this mailing list."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfetches commits from a Git repository or a Git log file.", "response": "def fetch(self, category=CATEGORY_COMMIT, from_date=DEFAULT_DATETIME, to_date=DEFAULT_LAST_DATETIME,\n branches=None, latest_items=False, no_update=False):\n \"\"\"Fetch commits.\n\n The method retrieves from a Git repository or a log file\n a list of commits. 
Commits are returned in the same order\n they were obtained.\n\n When the `from_date` parameter is given, it returns items committed\n since the given date.\n\n The list of `branches` is a list of strings, with the names of\n the branches to fetch. If the list of branches is empty, no\n commit is fetched. If the list of branches is None, all commits\n for all branches will be fetched.\n\n The parameter `latest_items` returns only those commits which\n are new since the last time this method was called.\n\n The parameter `no_update` returns all commits without performing\n an update of the repository before.\n\n Take into account that `from_date` and `branches` are ignored\n when the commits are fetched from a Git log file or when\n `latest_items` flag is set.\n\n The class raises a `RepositoryError` exception when an error\n occurs accessing the repository.\n\n :param category: the category of items to fetch\n :param from_date: obtain commits newer than a specific date\n (inclusive)\n :param to_date: obtain commits older than a specific date\n :param branches: names of branches to fetch from (default: None)\n :param latest_items: sync with the repository to fetch only the\n newest commits\n :param no_update: if enabled, don't update the repo with the latest changes\n\n :returns: a generator of commits\n \"\"\"\n if not from_date:\n from_date = DEFAULT_DATETIME\n if not to_date:\n to_date = DEFAULT_LAST_DATETIME\n\n kwargs = {\n 'from_date': from_date,\n 'to_date': to_date,\n 'branches': branches,\n 'latest_items': latest_items,\n 'no_update': no_update\n }\n items = super().fetch(category, **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n to_date = kwargs['to_date']\n branches = kwargs['branches']\n latest_items = kwargs['latest_items']\n no_update = kwargs['no_update']\n\n ncommits = 0\n\n try:\n if 
os.path.isfile(self.gitpath):\n commits = self.__fetch_from_log()\n else:\n commits = self.__fetch_from_repo(from_date, to_date, branches,\n latest_items, no_update)\n\n for commit in commits:\n yield commit\n ncommits += 1\n except EmptyRepositoryError:\n pass\n\n logger.info(\"Fetch process completed: %s commits fetched\",\n ncommits)", "response": "Fetch the commits\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse_git_log_from_file(filepath):\n with open(filepath, 'r', errors='surrogateescape',\n newline=os.linesep) as f:\n parser = GitParser(f)\n\n for commit in parser.parse():\n yield commit", "response": "Parse a Git log file and return an iterator of commits. Each commit is returned as a dictionary with its data."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ninitialize repositories directory path", "response": "def _pre_init(self):\n \"\"\"Initialize repositories directory path\"\"\"\n\n if self.parsed_args.git_log:\n git_path = self.parsed_args.git_log\n elif not self.parsed_args.git_path:\n base_path = os.path.expanduser('~/.perceval/repositories/')\n processed_uri = self.parsed_args.uri.lstrip('/')\n git_path = os.path.join(base_path, processed_uri) + '-git'\n else:\n git_path = self.parsed_args.git_path\n\n setattr(self.parsed_args, 'gitpath', git_path)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef setup_cmd_parser(cls):\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n from_date=True,\n to_date=True)\n\n # Optional arguments\n group = parser.parser.add_argument_group('Git arguments')\n group.add_argument('--branches', dest='branches',\n nargs='+', 
type=str, default=None,\n help=\"Fetch commits only from these branches\")\n\n # Mutual exclusive parameters\n exgroup = group.add_mutually_exclusive_group()\n exgroup.add_argument('--git-path', dest='git_path',\n help=\"Path where the Git repository will be cloned\")\n exgroup.add_argument('--git-log', dest='git_log',\n help=\"Path to the Git log file\")\n\n exgroup_fetch = group.add_mutually_exclusive_group()\n exgroup_fetch.add_argument('--latest-items', dest='latest_items',\n action='store_true',\n help=\"Fetch latest commits added to the repository\")\n exgroup_fetch.add_argument('--no-update', dest='no_update',\n action='store_true',\n help=\"Fetch all commits without updating the repository\")\n\n # Required arguments\n parser.parser.add_argument('uri',\n help=\"URI of the Git log repository\")\n\n return parser", "response": "Returns the Git argument parser."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef parse(self):\n\n for line in self.stream:\n line = line.rstrip('\\n')\n parsed = False\n self.nline += 1\n\n while not parsed:\n parsed = self.handlers[self.state](line)\n\n if self.state == self.COMMIT and self.commit:\n commit = self._build_commit()\n logger.debug(\"Commit %s parsed\", commit['commit'])\n yield commit\n\n # Return the last commit, if any\n if self.commit:\n commit = self._build_commit()\n logger.debug(\"Commit %s parsed\", commit['commit'])\n yield commit", "response": "Parse the Git log stream."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef __get_old_filepath(self, f):\n i = f.find('{')\n j = f.find('}')\n\n if i > -1 and j > -1:\n prefix = f[0:i]\n inner = f[i + 1:f.find(' => ', i)]\n suffix = f[j + 1:]\n return prefix + inner + suffix\n elif ' => ' in f:\n return f.split(' => ')[0]\n else:\n return f", "response": "Get the old filepath of a moved or renamed file."} {"SOURCE": "codesearchnet", 
"instruction": "Can you implement a function in Python 3 that\nclones a Git repository.", "response": "def clone(cls, uri, dirpath):\n \"\"\"Clone a Git repository.\n\n Make a bare copy of the repository stored in `uri` into `dirpath`.\n The repository can be either local or remote.\n\n :param uri: URI of the repository\n :param dirpath: directory where the repository will be cloned\n\n :returns: a `GitRepository` class having cloned the repository\n\n :raises RepositoryError: when an error occurs cloning the given\n repository\n \"\"\"\n cmd = ['git', 'clone', '--bare', uri, dirpath]\n env = {\n 'LANG': 'C',\n 'HOME': os.getenv('HOME', '')\n }\n\n cls._exec(cmd, env=env)\n\n logger.debug(\"Git %s repository cloned into %s\",\n uri, dirpath)\n\n return cls(uri, dirpath)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the total number of objects packed and unpacked on the repository.", "response": "def count_objects(self):\n \"\"\"Count the objects of a repository.\n\n The method returns the total number of objects (packed and unpacked)\n available on the repository.\n\n :raises RepositoryError: when an error occurs counting the objects\n of a repository\n \"\"\"\n cmd_count = ['git', 'count-objects', '-v']\n\n outs = self._exec(cmd_count, cwd=self.dirpath, env=self.gitenv)\n outs = outs.decode('utf-8', errors='surrogateescape').rstrip()\n\n try:\n cobjs = {k: v for k, v in (x.split(': ') for x in outs.split('\\n'))}\n nobjs = int(cobjs['count']) + int(cobjs['in-pack'])\n except KeyError as e:\n error = \"unable to parse 'count-objects' output; reason: '%s' entry not found\" \\\n % e.args[0]\n raise RepositoryError(cause=error)\n except ValueError as e:\n error = \"unable to parse 'count-objects' output; reason: %s\" % str(e)\n raise RepositoryError(cause=error)\n\n logger.debug(\"Git %s repository has %s objects\",\n self.uri, str(nobjs))\n\n return nobjs"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the 
documentation for the following Python 3 function\ndef is_detached(self):\n cmd_sym = ['git', 'symbolic-ref', 'HEAD']\n\n try:\n self._exec(cmd_sym, cwd=self.dirpath, env=self.gitenv)\n except RepositoryError as e:\n if e.msg.find(\"ref HEAD is not a symbolic ref\") == -1:\n raise e\n return True\n else:\n return False", "response": "Check if the repository is detached."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef update(self):\n cmd_update = ['git', 'fetch', 'origin', '+refs/heads/*:refs/heads/*', '--prune']\n self._exec(cmd_update, cwd=self.dirpath, env=self.gitenv)\n\n logger.debug(\"Git %s repository updated into %s\",\n self.uri, self.dirpath)", "response": "Update the local copy of the current repository with the remote repository."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nkeeping the repository in sync.", "response": "def sync(self):\n \"\"\"Keep the repository in sync.\n\n This method will synchronize the repository with its 'origin',\n fetching newest objects and updating references. 
It uses low\n level commands which allow keeping track of which things\n have changed in the repository.\n\n The method also returns a list of hashes related to the new\n commits fetched during the process.\n\n :returns: list of new commits\n\n :raises RepositoryError: when an error occurs synchronizing\n the repository\n \"\"\"\n pack_name, refs = self._fetch_pack()\n\n if pack_name:\n commits = self._read_commits_from_pack(pack_name)\n else:\n commits = []\n logger.debug(\"Git repository %s (%s) does not have any new object\",\n self.uri, self.dirpath)\n\n self._update_references(refs)\n\n logger.debug(\"Git repository %s (%s) is synced\",\n self.uri, self.dirpath)\n\n return commits"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreading the list of commits from the repository.", "response": "def rev_list(self, branches=None):\n \"\"\"Read the list of commits from the repository\n\n The list of branches is a list of strings, with the names of the\n branches to fetch. If the list of branches is empty, no commit\n is fetched. 
If the list of branches is None, all commits\n for all branches will be fetched.\n\n The method returns the Git rev-list of the repository using the\n following options:\n\n git rev-list --topo-order\n\n :param branches: names of branches to fetch from (default: None)\n\n :raises EmptyRepositoryError: when the repository is empty and\n the action cannot be performed\n :raises RepositoryError: when an error occurs executing the command\n \"\"\"\n if self.is_empty():\n logger.warning(\"Git %s repository is empty; unable to get the rev-list\",\n self.uri)\n raise EmptyRepositoryError(repository=self.uri)\n\n cmd_rev_list = ['git', 'rev-list', '--topo-order']\n\n if branches is None:\n cmd_rev_list.extend(['--branches', '--tags', '--remotes=origin'])\n elif len(branches) == 0:\n cmd_rev_list.extend(['--branches', '--tags', '--max-count=0'])\n else:\n branches = ['refs/heads/' + branch for branch in branches]\n cmd_rev_list.extend(branches)\n\n for line in self._exec_nb(cmd_rev_list, cwd=self.dirpath, env=self.gitenv):\n yield line.rstrip('\\n')\n\n logger.debug(\"Git rev-list fetched from %s repository (%s)\",\n self.uri, self.dirpath)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef log(self, from_date=None, to_date=None, branches=None, encoding='utf-8'):\n if self.is_empty():\n logger.warning(\"Git %s repository is empty; unable to get the log\",\n self.uri)\n raise EmptyRepositoryError(repository=self.uri)\n\n cmd_log = ['git', 'log', '--reverse', '--topo-order']\n cmd_log.extend(self.GIT_PRETTY_OUTPUT_OPTS)\n\n if from_date:\n dt = from_date.strftime(\"%Y-%m-%d %H:%M:%S %z\")\n cmd_log.append('--since=' + dt)\n\n if to_date:\n dt = to_date.strftime(\"%Y-%m-%d %H:%M:%S %z\")\n cmd_log.append('--until=' + dt)\n\n if branches is None:\n cmd_log.extend(['--branches', '--tags', '--remotes=origin'])\n elif len(branches) == 0:\n cmd_log.append('--max-count=0')\n else:\n branches = ['refs/heads/' + branch for branch in 
branches]\n cmd_log.extend(branches)\n\n for line in self._exec_nb(cmd_log, cwd=self.dirpath, env=self.gitenv):\n yield line\n\n logger.debug(\"Git log fetched from %s repository (%s)\",\n self.uri, self.dirpath)", "response": "Read the commit log of the current object."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nshow the data of a set of commits.", "response": "def show(self, commits=None, encoding='utf-8'):\n \"\"\"Show the data of a set of commits.\n\n The method returns the output of Git show command for a\n set of commits using the following options:\n\n git show --raw --numstat --pretty=fuller --decorate=full\n --parents -M -C -c [...]\n\n When the list of commits is empty, the command will return\n data about the last commit, like the default behaviour of\n `git show`.\n\n :param commits: list of commits to show data\n :param encoding: encode the output using this format\n\n :returns: a generator where each item is a line from the show output\n\n :raises EmptyRepositoryError: when the repository is empty and\n the action cannot be performed\n :raises RepositoryError: when an error occurs fetching the show output\n \"\"\"\n if self.is_empty():\n logger.warning(\"Git %s repository is empty; unable to run show\",\n self.uri)\n raise EmptyRepositoryError(repository=self.uri)\n\n if commits is None:\n commits = []\n\n cmd_show = ['git', 'show']\n cmd_show.extend(self.GIT_PRETTY_OUTPUT_OPTS)\n cmd_show.extend(commits)\n\n for line in self._exec_nb(cmd_show, cwd=self.dirpath, env=self.gitenv):\n yield line\n\n logger.debug(\"Git show fetched from %s repository (%s)\",\n self.uri, self.dirpath)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _fetch_pack(self):\n\n def prepare_refs(refs):\n return [ref.hash.encode('utf-8') for ref in refs\n if not ref.refname.endswith('^{}')]\n\n def determine_wants(refs):\n remote_refs = 
prepare_refs(self._discover_refs(remote=True))\n local_refs = prepare_refs(self._discover_refs())\n wants = [ref for ref in remote_refs if ref not in local_refs]\n return wants\n\n client, repo_path = dulwich.client.get_transport_and_path(self.uri)\n repo = dulwich.repo.Repo(self.dirpath)\n fd = io.BytesIO()\n\n local_refs = self._discover_refs()\n graph_walker = _GraphWalker(local_refs)\n\n result = client.fetch_pack(repo_path,\n determine_wants,\n graph_walker,\n fd.write)\n refs = [GitRef(ref_hash.decode('utf-8'), ref_name.decode('utf-8'))\n for ref_name, ref_hash in result.refs.items()]\n\n if len(fd.getvalue()) > 0:\n fd.seek(0)\n pack = repo.object_store.add_thin_pack(fd.read, None)\n pack_name = pack.name().decode('utf-8')\n else:\n pack_name = None\n\n return (pack_name, refs)", "response": "Fetch changes and store them in a pack."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreads the commits of a pack.", "response": "def _read_commits_from_pack(self, packet_name):\n \"\"\"Read the commits of a pack.\"\"\"\n\n filepath = 'objects/pack/pack-' + packet_name\n\n cmd_verify_pack = ['git', 'verify-pack', '-v', filepath]\n\n outs = self._exec(cmd_verify_pack, cwd=self.dirpath, env=self.gitenv)\n outs = outs.decode('utf-8', errors='surrogateescape').rstrip()\n\n lines = [line.split(' ') for line in outs.split('\\n')]\n\n # Commits usually come in the pack ordered from newest to oldest\n commits = [parts[0] for parts in lines if parts[1] == 'commit']\n commits.reverse()\n\n return commits"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nupdates references removing old ones.", "response": "def _update_references(self, refs):\n \"\"\"Update references removing old ones.\"\"\"\n\n new_refs = [ref.refname for ref in refs]\n\n # Delete old references\n for old_ref in self._discover_refs():\n if not old_ref.refname.startswith('refs/heads/'):\n continue\n if old_ref.refname in 
new_refs:\n continue\n self._update_ref(old_ref, delete=True)\n\n # Update new references\n for new_ref in refs:\n refname = new_ref.refname\n\n if refname.endswith('^{}'):\n logger.debug(\"Annotated tag %s ignored for updating in sync process\",\n refname)\n continue\n elif not refname.startswith('refs/heads/') and not refname.startswith('refs/tags/'):\n logger.debug(\"Reference %s not needed; ignored for updating in sync process\",\n refname)\n continue\n else:\n self._update_ref(new_ref)\n\n # Prune repository to remove old branches\n cmd = ['git', 'remote', 'prune', 'origin']\n self._exec(cmd, cwd=self.dirpath, env=self.gitenv)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _discover_refs(self, remote=False):\n\n if remote:\n cmd_refs = ['git', 'ls-remote', '-h', '-t', '--exit-code', 'origin']\n sep = '\\t'\n ignored_error_codes = [2]\n else:\n # Check first whether the local repo is empty;\n # Running 'show-ref' in empty repos gives an error\n if self.is_empty():\n raise EmptyRepositoryError(repository=self.uri)\n\n cmd_refs = ['git', 'show-ref', '--heads', '--tags']\n sep = ' '\n ignored_error_codes = [1]\n\n # Error codes returned when no matching refs (i.e., no heads\n # or tags) are found in a repository will be ignored. 
Otherwise,\n # the full process would fail for those situations.\n outs = self._exec(cmd_refs, cwd=self.dirpath,\n env=self.gitenv,\n ignored_error_codes=ignored_error_codes)\n outs = outs.decode('utf-8', errors='surrogateescape').rstrip()\n outs = outs.split('\\n') if outs else []\n\n refs = []\n\n for line in outs:\n data = line.split(sep)\n ref = GitRef(data[0], data[1])\n refs.append(ref)\n\n return refs", "response": "Get the current list of local or remote refs."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _exec_nb(self, cmd, cwd=None, env=None, encoding='utf-8'):\n self.failed_message = None\n\n logger.debug(\"Running command %s (cwd: %s, env: %s)\",\n ' '.join(cmd), cwd, str(env))\n\n try:\n self.proc = subprocess.Popen(cmd,\n stdout=subprocess.PIPE,\n stderr=subprocess.PIPE,\n cwd=cwd,\n env=env)\n err_thread = threading.Thread(target=self._read_stderr,\n kwargs={'encoding': encoding},\n daemon=True)\n err_thread.start()\n for line in self.proc.stdout:\n yield line.decode(encoding, errors='surrogateescape')\n err_thread.join()\n\n self.proc.communicate()\n self.proc.stdout.close()\n self.proc.stderr.close()\n except OSError as e:\n err_thread.join()\n raise RepositoryError(cause=str(e))\n\n if self.proc.returncode != 0:\n cause = \"git command - %s (return code: %d)\" % \\\n (self.failed_message, self.proc.returncode)\n raise RepositoryError(cause=cause)", "response": "Execute a command with a non blocking call."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _read_stderr(self, encoding='utf-8'):\n for line in self.proc.stderr:\n err_line = line.decode(encoding, errors='surrogateescape')\n\n if self.proc.returncode != 0:\n # If the subprocess didn't finish successfully, we expect\n # the last line in stderr to provide the cause\n if self.failed_message is not None:\n # We had a message, there is a newer line, print it\n 
logger.debug(\"Git log stderr: \" + self.failed_message)\n self.failed_message = err_line\n else:\n # The subprocess is successfully up to now, print the line\n logger.debug(\"Git log stderr: \" + err_line)", "response": "Reads the stderr of the git log and writes it to self. failed_message."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nexecutes a command in the directory set by cwd.", "response": "def _exec(cmd, cwd=None, env=None, ignored_error_codes=None,\n encoding='utf-8'):\n \"\"\"Run a command.\n\n Execute `cmd` command in the directory set by `cwd`. Environment\n variables can be set using the `env` dictionary. The output\n data is returned as encoded bytes.\n\n Commands which their returning status codes are non-zero will\n be treated as failed. Error codes considered as valid can be\n ignored giving them in the `ignored_error_codes` list.\n\n :returns: the output of the command as encoded bytes\n\n :raises RepositoryError: when an error occurs running the command\n \"\"\"\n if ignored_error_codes is None:\n ignored_error_codes = []\n\n logger.debug(\"Running command %s (cwd: %s, env: %s)\",\n ' '.join(cmd), cwd, str(env))\n\n try:\n proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,\n stderr=subprocess.PIPE,\n cwd=cwd, env=env)\n (outs, errs) = proc.communicate()\n except OSError as e:\n raise RepositoryError(cause=str(e))\n\n if proc.returncode != 0 and proc.returncode not in ignored_error_codes:\n err = errs.decode(encoding, errors='surrogateescape')\n cause = \"git command - %s\" % err\n raise RepositoryError(cause=cause)\n else:\n logger.debug(errs.decode(encoding, errors='surrogateescape'))\n\n return outs"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fetch(self, category=CATEGORY_TWEET, since_id=None, max_id=None,\n geocode=None, lang=None,\n include_entities=True, tweets_type=TWEET_TYPE_MIXED):\n \"\"\"Fetch the tweets from 
the server.\n\n This method fetches tweets from the TwitterSearch API published in the last seven days.\n\n :param category: the category of items to fetch\n :param since_id: if not null, it returns results with an ID greater than the specified ID\n :param max_id: if not null, it returns results with an ID less than the specified ID\n :param geocode: if enabled, returns tweets by users located at latitude,longitude,\"mi\"|\"km\"\n :param lang: if enabled, restricts tweets to the given language, given by an ISO 639-1 code\n :param include_entities: if disabled, it excludes entities node\n :param tweets_type: type of tweets returned. Default is \u201cmixed\u201d, others are \"recent\" and \"popular\"\n\n :returns: a generator of tweets\n \"\"\"\n kwargs = {\"since_id\": since_id,\n \"max_id\": max_id,\n \"geocode\": geocode,\n \"lang\": lang,\n \"include_entities\": include_entities,\n \"result_type\": tweets_type}\n items = super().fetch(category, **kwargs)\n\n return items", "response": "Fetch the tweets from the TwitterSearch API."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfetch the items from the backend", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the tweets\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n since_id = kwargs['since_id']\n max_id = kwargs['max_id']\n geocode = kwargs['geocode']\n lang = kwargs['lang']\n entities = kwargs['include_entities']\n tweets_type = kwargs['result_type']\n\n logger.info(\"Fetching tweets %s from %s to %s\",\n self.query, str(since_id),\n str(max_id) if max_id else '--')\n\n tweets_ids = []\n min_date = None\n max_date = None\n group_tweets = self.client.tweets(self.query, since_id=since_id, max_id=max_id, geocode=geocode,\n lang=lang, include_entities=entities, result_type=tweets_type)\n\n for tweets in group_tweets:\n for i in range(len(tweets)):\n tweet = 
tweets[i]\n tweets_ids.append(tweet['id'])\n\n if tweets[-1] == tweet:\n min_date = str_to_datetime(tweets[-1]['created_at'])\n\n if tweets[0] == tweet and not max_date:\n max_date = str_to_datetime(tweets[0]['created_at'])\n\n yield tweet\n\n logger.info(\"Fetch process completed: %s (unique %s) tweets fetched, from %s to %s\",\n len(tweets_ids), len(list(set(tweets_ids))), min_date, max_date)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ninitialize the client object.", "response": "def _init_client(self, from_archive=False):\n \"\"\"Init client\"\"\"\n\n return TwitterClient(self.api_token, self.max_items,\n self.sleep_for_rate, self.min_rate_to_sleep, self.sleep_time,\n self.archive, from_archive)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfetching tweets for a given query between since_id and max_id.", "response": "def tweets(self, query, since_id=None, max_id=None, geocode=None, lang=None,\n include_entities=True, result_type=TWEET_TYPE_MIXED):\n \"\"\"Fetch tweets for a given query between since_id and max_id.\n\n :param query: query to fetch tweets\n :param since_id: if not null, it returns results with an ID greater than the specified ID\n :param max_id: if not null, it returns results with an ID less than the specified ID\n :param geocode: if enabled, returns tweets by users located at latitude,longitude,\"mi\"|\"km\"\n :param lang: if enabled, restricts tweets to the given language, given by an ISO 639-1 code\n :param include_entities: if disabled, it excludes entities node\n :param result_type: type of tweets returned. 
Default is \u201cmixed\u201d, others are \"recent\" and \"popular\"\n\n :returns: a generator of tweets\n \"\"\"\n resource = self.base_url\n params = {'q': query,\n 'count': self.max_items}\n\n if since_id:\n params['since_id'] = since_id\n\n if max_id:\n params['max_id'] = max_id\n\n if geocode:\n params['geocode'] = geocode\n\n if lang:\n params['lang'] = lang\n\n params['include_entities'] = include_entities\n params['result_type'] = result_type\n\n while True:\n raw_tweets = self._fetch(resource, params=params)\n tweets = json.loads(raw_tweets)\n\n if not tweets['statuses']:\n break\n\n params['max_id'] = tweets['statuses'][-1]['id'] - 1\n yield tweets['statuses']"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _fetch(self, url, params):\n if not self.from_archive:\n self.sleep_for_rate_limit()\n\n headers = {'Authorization': 'Bearer ' + self.api_key}\n r = self.fetch(url, payload=params, headers=headers)\n\n if not self.from_archive:\n self.update_rate_limit(r)\n\n return r.text", "response": "Fetch a resource from the Twitter API. When not reading from an archive, the method sleeps to honor the rate limit before the request and updates the remaining rate from the response."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef setup_cmd_parser(cls):\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n token_auth=True,\n archive=True)\n\n # Backend token is required\n action = parser.parser._option_string_actions['--api-token']\n action.required = True\n\n # Twitter options\n group = parser.parser.add_argument_group('Twitter arguments')\n group.add_argument('--max-items', dest='max_items',\n type=int, default=MAX_ITEMS,\n help=\"Maximum number of items requested on the same query\")\n group.add_argument('--no-entities', dest='include_entities',\n action='store_false',\n help=\"Exclude entities node\")\n group.add_argument('--geo-code', dest='geocode',\n help=\"Select tweets by users located at latitude,longitude,radius\")\n group.add_argument('--lang', dest='lang',\n help=\"Restrict tweets to the given language in ISO 639-1 code\")\n group.add_argument('--tweets-type', dest='tweets_type', default=TWEET_TYPE_MIXED,\n help=\"Type of tweets returned. 
Default is 'mixed', others are 'recent' and 'popular'\")\n group.add_argument('--sleep-for-rate', dest='sleep_for_rate',\n action='store_true',\n help=\"sleep for getting more rate\")\n group.add_argument('--min-rate-to-sleep', dest='min_rate_to_sleep',\n default=MIN_RATE_LIMIT, type=int,\n help=\"sleep until reset when the rate limit reaches this value\")\n group.add_argument('--sleep-time', dest='sleep_time',\n default=SLEEP_TIME, type=int,\n help=\"minimum sleeping time to avoid too many requests exceptions\")\n\n # Required arguments\n parser.parser.add_argument('query',\n help=\"Search query including operators, max 500 chars\")\n\n return parser", "response": "Returns the Twitter argument parser."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nfetches data from Google API.", "response": "def fetch(self, category=CATEGORY_HITS):\n \"\"\"Fetch data from Google API.\n\n The method retrieves a list of hits for some\n given keywords using the Google API.\n\n :param category: the category of items to fetch\n\n :returns: a generator of data\n \"\"\"\n kwargs = {}\n items = super().fetch(category, **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef fetch_items(self, category, **kwargs):\n logger.info(\"Fetching data for '%s'\", self.keywords)\n\n hits_raw = self.client.hits(self.keywords)\n hits = self.__parse_hits(hits_raw)\n\n yield hits\n\n logger.info(\"Fetch process completed\")", "response": "Fetch items from Google"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nparses the hits returned by the Google Search API", "response": "def __parse_hits(self, hit_raw):\n \"\"\"Parse the hits returned by the Google Search API\"\"\"\n\n # Create the soup and get the desired div\n bs_result = bs4.BeautifulSoup(hit_raw, 'html.parser')\n hit_string = bs_result.find(\"div\", 
id=\"resultStats\").text\n\n # Remove commas or dots\n hit_string = hit_string.replace(',', u'')\n hit_string = hit_string.replace('.', u'')\n\n fetched_on = datetime_utcnow().timestamp()\n id_args = self.keywords[:]\n id_args.append(str(fetched_on))\n\n hits_json = {\n 'fetched_on': fetched_on,\n 'id': uuid(*id_args),\n 'keywords': self.keywords,\n 'type': 'googleSearchHits'\n }\n\n if not hit_string:\n logger.warning(\"No hits for %s\", self.keywords)\n hits_json['hits'] = 0\n\n return hits_json\n\n str_hits = re.search(r'\\d+', hit_string).group(0)\n hits = int(str_hits)\n hits_json['hits'] = hits\n\n return hits_json"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfetching information about a list of keywords.", "response": "def hits(self, keywords):\n \"\"\"Fetch information about a list of keywords.\"\"\"\n\n if len(keywords) == 1:\n query_str = keywords[0]\n else:\n query_str = ' '.join([k for k in keywords])\n\n logger.info(\"Fetching hits for '%s'\", query_str)\n params = {'q': query_str}\n\n # Make the request\n req = self.fetch(GOOGLE_SEARCH_URL, payload=params)\n\n return req.text"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fetch(self, category=CATEGORY_ISSUE, from_date=DEFAULT_DATETIME, to_date=DEFAULT_LAST_DATETIME):\n if not from_date:\n from_date = DEFAULT_DATETIME\n if not to_date:\n to_date = DEFAULT_LAST_DATETIME\n\n from_date = datetime_to_utc(from_date)\n to_date = datetime_to_utc(to_date)\n\n kwargs = {\n 'from_date': from_date,\n 'to_date': to_date\n }\n items = super().fetch(category, **kwargs)\n\n return items", "response": "Fetch the issues and pull requests from a GitHub repository."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nfetch the items from the backend", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the items (issues or pull_requests)\n\n :param category: the category of 
items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n from_date = kwargs['from_date']\n to_date = kwargs['to_date']\n\n if category == CATEGORY_ISSUE:\n items = self.__fetch_issues(from_date, to_date)\n elif category == CATEGORY_PULL_REQUEST:\n items = self.__fetch_pull_requests(from_date, to_date)\n else:\n items = self.__fetch_repo_info()\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nextracting the update time from a GitHub item.", "response": "def metadata_updated_on(item):\n \"\"\"Extracts the update time from a GitHub item.\n\n The timestamp used is extracted from 'updated_at' field.\n This date is converted to UNIX timestamp format. As GitHub\n dates are in UTC the conversion is straightforward.\n\n :param item: item generated by the backend\n\n :returns: a UNIX timestamp\n \"\"\"\n if \"forks_count\" in item:\n return item['fetched_on']\n else:\n ts = item['updated_at']\n ts = str_to_datetime(ts)\n\n return ts.timestamp()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nextract the category from a GitHub item.", "response": "def metadata_category(item):\n \"\"\"Extracts the category from a GitHub item.\n\n This backend generates two types of item which are\n 'issue' and 'pull_request'.\n \"\"\"\n\n if \"base\" in item:\n category = CATEGORY_PULL_REQUEST\n elif \"forks_count\" in item:\n category = CATEGORY_REPO\n else:\n category = CATEGORY_ISSUE\n\n return category"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfetches the issues from the API", "response": "def __fetch_issues(self, from_date, to_date):\n \"\"\"Fetch the issues\"\"\"\n\n issues_groups = self.client.issues(from_date=from_date)\n\n for raw_issues in issues_groups:\n issues = json.loads(raw_issues)\n for issue in issues:\n\n if str_to_datetime(issue['updated_at']) > to_date:\n return\n\n self.__init_extra_issue_fields(issue)\n for 
field in TARGET_ISSUE_FIELDS:\n\n if not issue[field]:\n continue\n\n if field == 'user':\n issue[field + '_data'] = self.__get_user(issue[field]['login'])\n elif field == 'assignee':\n issue[field + '_data'] = self.__get_issue_assignee(issue[field])\n elif field == 'assignees':\n issue[field + '_data'] = self.__get_issue_assignees(issue[field])\n elif field == 'comments':\n issue[field + '_data'] = self.__get_issue_comments(issue['number'])\n elif field == 'reactions':\n issue[field + '_data'] = \\\n self.__get_issue_reactions(issue['number'], issue['reactions']['total_count'])\n\n yield issue"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nfetches the pull requests from the server", "response": "def __fetch_pull_requests(self, from_date, to_date):\n \"\"\"Fetch the pull requests\"\"\"\n\n raw_pulls = self.client.pulls(from_date=from_date)\n for raw_pull in raw_pulls:\n pull = json.loads(raw_pull)\n\n if str_to_datetime(pull['updated_at']) > to_date:\n return\n\n self.__init_extra_pull_fields(pull)\n for field in TARGET_PULL_FIELDS:\n\n if not pull[field]:\n continue\n\n if field == 'user':\n pull[field + '_data'] = self.__get_user(pull[field]['login'])\n elif field == 'merged_by':\n pull[field + '_data'] = self.__get_user(pull[field]['login'])\n elif field == 'review_comments':\n pull[field + '_data'] = self.__get_pull_review_comments(pull['number'])\n elif field == 'requested_reviewers':\n pull[field + '_data'] = self.__get_pull_requested_reviewers(pull['number'])\n elif field == 'commits':\n pull[field + '_data'] = self.__get_pull_commits(pull['number'])\n\n yield pull"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget repo info about stars watchers and forks", "response": "def __fetch_repo_info(self):\n \"\"\"Get repo info about stars, watchers and forks\"\"\"\n\n raw_repo = self.client.repo()\n repo = json.loads(raw_repo)\n\n fetched_on = datetime_utcnow()\n repo['fetched_on'] = 
fetched_on.timestamp()\n\n yield repo"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef __get_issue_reactions(self, issue_number, total_count):\n\n reactions = []\n\n if total_count == 0:\n return reactions\n\n group_reactions = self.client.issue_reactions(issue_number)\n\n for raw_reactions in group_reactions:\n\n for reaction in json.loads(raw_reactions):\n reaction['user_data'] = self.__get_user(reaction['user']['login'])\n reactions.append(reaction)\n\n return reactions", "response": "Get the issue reactions"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef __get_issue_comment_reactions(self, comment_id, total_count):\n\n reactions = []\n\n if total_count == 0:\n return reactions\n\n group_reactions = self.client.issue_comment_reactions(comment_id)\n\n for raw_reactions in group_reactions:\n\n for reaction in json.loads(raw_reactions):\n reaction['user_data'] = self.__get_user(reaction['user']['login'])\n reactions.append(reaction)\n\n return reactions", "response": "Get reactions on issue comments"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef __get_pull_requested_reviewers(self, pr_number):\n\n requested_reviewers = []\n group_requested_reviewers = self.client.pull_requested_reviewers(pr_number)\n\n for raw_requested_reviewers in group_requested_reviewers:\n group_requested_reviewers = json.loads(raw_requested_reviewers)\n\n for requested_reviewer in group_requested_reviewers['users']:\n user_data = self.__get_user(requested_reviewer['login'])\n requested_reviewers.append(user_data)\n\n return requested_reviewers", "response": "Get pull request requested reviewers"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef __get_pull_commits(self, pr_number):\n\n hashes = []\n group_pull_commits = self.client.pull_commits(pr_number)\n\n for 
raw_pull_commits in group_pull_commits:\n\n for commit in json.loads(raw_pull_commits):\n commit_hash = commit['sha']\n hashes.append(commit_hash)\n\n return hashes", "response": "Get the list of commit hashes for a pull request"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef __get_pull_review_comments(self, pr_number):\n\n comments = []\n group_comments = self.client.pull_review_comments(pr_number)\n\n for raw_comments in group_comments:\n\n for comment in json.loads(raw_comments):\n comment_id = comment.get('id')\n\n user = comment.get('user', None)\n if not user:\n logger.warning(\"Missing user info for %s\", comment['url'])\n comment['user_data'] = None\n else:\n comment['user_data'] = self.__get_user(user['login'])\n\n comment['reactions_data'] = \\\n self.__get_pull_review_comment_reactions(comment_id, comment['reactions']['total_count'])\n comments.append(comment)\n\n return comments", "response": "Get all comments for a pull request"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting the list of reactions for a pull review comment", "response": "def __get_pull_review_comment_reactions(self, comment_id, total_count):\n \"\"\"Get pull review comment reactions\"\"\"\n\n reactions = []\n\n if total_count == 0:\n return reactions\n\n group_reactions = self.client.pull_review_comment_reactions(comment_id)\n\n for raw_reactions in group_reactions:\n\n for reaction in json.loads(raw_reactions):\n reaction['user_data'] = self.__get_user(reaction['user']['login'])\n reactions.append(reaction)\n\n return reactions"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting user and organizations data for the login", "response": "def __get_user(self, login):\n \"\"\"Get user and org data for the login\"\"\"\n\n user = {}\n\n if not login:\n return user\n\n user_raw = self.client.user(login)\n user = json.loads(user_raw)\n user_orgs_raw = \\\n 
self.client.user_orgs(login)\n user['organizations'] = json.loads(user_orgs_raw)\n\n return user"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef issue_reactions(self, issue_number):\n\n payload = {\n 'per_page': PER_PAGE,\n 'direction': 'asc',\n 'sort': 'updated'\n }\n\n path = urijoin(\"issues\", str(issue_number), \"reactions\")\n return self.fetch_items(path, payload)", "response": "Get reactions of an issue"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nfetch the issues from the GitHub repository.", "response": "def issues(self, from_date=None):\n \"\"\"Fetch the issues from the repository.\n\n The method retrieves, from a GitHub repository, the issues\n updated since the given date.\n\n :param from_date: obtain issues updated since this date\n\n :returns: a generator of issues\n \"\"\"\n payload = {\n 'state': 'all',\n 'per_page': PER_PAGE,\n 'direction': 'asc',\n 'sort': 'updated'}\n\n if from_date:\n payload['since'] = from_date.isoformat()\n\n path = urijoin(\"issues\")\n return self.fetch_items(path, payload)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfetch the pull requests from the GitHub repository.", "response": "def pulls(self, from_date=None):\n \"\"\"Fetch the pull requests from the repository.\n\n The method retrieves, from a GitHub repository, the pull requests\n updated since the given date.\n\n :param from_date: obtain pull requests updated since this date\n\n :returns: a generator of pull requests\n \"\"\"\n issues_groups = self.issues(from_date=from_date)\n\n for raw_issues in issues_groups:\n issues = json.loads(raw_issues)\n for issue in issues:\n\n if \"pull_request\" not in issue:\n continue\n\n pull_number = issue[\"number\"]\n path = urijoin(self.base_url, 'repos', self.owner, self.repository, \"pulls\", pull_number)\n\n r = self.fetch(path)\n pull = r.text\n\n yield pull"} {"SOURCE": "codesearchnet", "instruction": "Can 
you tell what is the following Python 3 function doing\ndef repo(self):\n\n path = urijoin(self.base_url, 'repos', self.owner, self.repository)\n\n r = self.fetch(path)\n repo = r.text\n\n return repo", "response": "Get the repository data"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef pull_requested_reviewers(self, pr_number):\n\n requested_reviewers_url = urijoin(\"pulls\", str(pr_number), \"requested_reviewers\")\n return self.fetch_items(requested_reviewers_url, {})", "response": "Get a list of pull requested reviewers"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the commits for a pull request", "response": "def pull_commits(self, pr_number):\n \"\"\"Get pull request commits\"\"\"\n\n payload = {\n 'per_page': PER_PAGE,\n }\n\n commit_url = urijoin(\"pulls\", str(pr_number), \"commits\")\n return self.fetch_items(commit_url, payload)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef pull_review_comments(self, pr_number):\n\n payload = {\n 'per_page': PER_PAGE,\n 'direction': 'asc',\n 'sort': 'updated'\n }\n\n comments_url = urijoin(\"pulls\", str(pr_number), \"comments\")\n return self.fetch_items(comments_url, payload)", "response": "Get the comments for a pull request"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef pull_review_comment_reactions(self, comment_id):\n\n payload = {\n 'per_page': PER_PAGE,\n 'direction': 'asc',\n 'sort': 'updated'\n }\n\n path = urijoin(\"pulls\", \"comments\", str(comment_id), \"reactions\")\n return self.fetch_items(path, payload)", "response": "Get reactions of a review comment"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the user information and update the user cache", "response": "def user(self, login):\n \"\"\"Get the user information 
and update the user cache\"\"\"\n user = None\n\n if login in self._users:\n return self._users[login]\n\n url_user = urijoin(self.base_url, 'users', login)\n\n logging.info(\"Getting info for %s\" % (url_user))\n\n r = self.fetch(url_user)\n user = r.text\n self._users[login] = user\n\n return user"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef user_orgs(self, login):\n if login in self._users_orgs:\n return self._users_orgs[login]\n\n url = urijoin(self.base_url, 'users', login, 'orgs')\n try:\n r = self.fetch(url)\n orgs = r.text\n except requests.exceptions.HTTPError as error:\n # 404 not found is wrongly received sometimes\n if error.response.status_code == 404:\n logger.error(\"Can't get github login orgs: %s\", error)\n orgs = '[]'\n else:\n raise error\n\n self._users_orgs[login] = orgs\n\n return orgs", "response": "Get the user public organizations"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _get_token_rate_limit(self, token):\n\n rate_url = urijoin(self.base_url, \"rate_limit\")\n self.session.headers.update({'Authorization': 'token ' + token})\n remaining = 0\n try:\n headers = super().fetch(rate_url).headers\n if self.rate_limit_header in headers:\n remaining = int(headers[self.rate_limit_header])\n except requests.exceptions.HTTPError as error:\n logger.warning(\"Rate limit not initialized: %s\", error)\n return remaining", "response": "Return token s remaining API points"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning array of all tokens remaining API points", "response": "def _get_tokens_rate_limits(self):\n \"\"\"Return array of all tokens remaining API points\"\"\"\n\n remainings = [0] * self.n_tokens\n # Turn off archiving when checking rates, because that would cause\n # archive key conflict (the same URLs giving different responses)\n arch = self.archive\n self.archive = 
None\n for idx, token in enumerate(self.tokens):\n # Pass flag to skip disabling archiving because this function does it\n remainings[idx] = self._get_token_rate_limit(token)\n # Restore archiving to whatever state it was in\n self.archive = arch\n logger.debug(\"Remaining API points: {}\".format(remainings))\n return remainings"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking all API tokens defined and choose one with most remaining API points and update the current token with the new one", "response": "def _choose_best_api_token(self):\n \"\"\"Check all API tokens defined and choose one with most remaining API points\"\"\"\n\n # Return if no tokens given\n if self.n_tokens == 0:\n return\n\n # If multiple tokens given, choose best\n token_idx = 0\n if self.n_tokens > 1:\n remainings = self._get_tokens_rate_limits()\n token_idx = remainings.index(max(remainings))\n logger.debug(\"Remaining API points: {}, chosen index: {}\".format(remainings, token_idx))\n\n # If we have any tokens - use best of them\n self.current_token = self.tokens[token_idx]\n self.session.headers.update({'Authorization': 'token ' + self.current_token})\n # Update rate limit data for the current token\n self._update_current_rate_limit()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _need_check_tokens(self):\n\n if self.n_tokens <= 1 or self.rate_limit is None:\n return False\n elif self.last_rate_limit_checked is None:\n self.last_rate_limit_checked = self.rate_limit\n return True\n\n # If approaching minimum rate limit for sleep\n approaching_limit = float(self.min_rate_to_sleep) * (1.0 + TOKEN_USAGE_BEFORE_SWITCH) + 1\n if self.rate_limit <= approaching_limit:\n self.last_rate_limit_checked = self.rate_limit\n return True\n\n # Only switch token when used predefined factor of the current token's remaining API points\n ratio = float(self.rate_limit) / 
float(self.last_rate_limit_checked)\n if ratio < 1.0 - TOKEN_USAGE_BEFORE_SWITCH:\n self.last_rate_limit_checked = self.rate_limit\n return True\n elif ratio > 1.0:\n self.last_rate_limit_checked = self.rate_limit\n return False\n else:\n return False", "response": "Check if we need to switch GitHub API tokens"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nupdating the rate limit data for the current token", "response": "def _update_current_rate_limit(self):\n \"\"\"Update rate limits data for the current token\"\"\"\n\n url = urijoin(self.base_url, \"rate_limit\")\n try:\n # Turn off archiving when checking rates, because that would cause\n # archive key conflict (the same URLs giving different responses)\n arch = self.archive\n self.archive = None\n response = super().fetch(url)\n self.archive = arch\n self.update_rate_limit(response)\n self.last_rate_limit_checked = self.rate_limit\n except requests.exceptions.HTTPError as error:\n if error.response.status_code == 404:\n logger.warning(\"Rate limit not initialized: %s\", error)\n else:\n raise error"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ninitializes the metadata of the items in the archive.", "response": "def init_metadata(self, origin, backend_name, backend_version,\n category, backend_params):\n \"\"\"Init metadata information.\n\n Metadata is composed of basic information needed to identify\n where archived data came from and how it can be retrieved\n and built into Perceval items.\n\n :param: origin: identifier of the repository\n :param: backend_name: name of the backend\n :param: backend_version: version of the backend\n :param: category: category of the items fetched\n :param: backend_params: dict representation of the fetch parameters\n\n raises ArchiveError: when an error occurs initializing the metadata\n \"\"\"\n created_on = datetime_to_utc(datetime_utcnow())\n created_on_dumped = created_on.isoformat()\n 
backend_params_dumped = pickle.dumps(backend_params, 0)\n\n metadata = (origin, backend_name, backend_version, category,\n backend_params_dumped, created_on_dumped,)\n\n try:\n cursor = self._db.cursor()\n insert_stmt = \"INSERT INTO \" + self.METADATA_TABLE + \" \"\\\n \"(origin, backend_name, backend_version, \" \\\n \"category, backend_params, created_on) \" \\\n \"VALUES (?, ?, ?, ?, ?, ?)\"\n cursor.execute(insert_stmt, metadata)\n\n self._db.commit()\n cursor.close()\n except sqlite3.DatabaseError as e:\n msg = \"metadata initialization error; cause: %s\" % str(e)\n raise ArchiveError(cause=msg)\n\n self.origin = origin\n self.backend_name = backend_name\n self.backend_version = backend_version\n self.category = category\n self.backend_params = backend_params\n self.created_on = created_on\n\n logger.debug(\"Metadata of archive %s initialized to %s\",\n self.archive_path, metadata)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nstoring a raw item in this archive.", "response": "def store(self, uri, payload, headers, data):\n \"\"\"Store a raw item in this archive.\n\n The method will store `data` content in this archive. 
The unique\n identifier for that item will be generated using the rest of the\n parameters.\n\n :param uri: request URI\n :param payload: request payload\n :param headers: request headers\n :param data: data to store in this archive\n\n :raises ArchiveError: when an error occurs storing the given data\n \"\"\"\n hashcode = self.make_hashcode(uri, payload, headers)\n payload_dump = pickle.dumps(payload, 0)\n headers_dump = pickle.dumps(headers, 0)\n data_dump = pickle.dumps(data, 0)\n\n logger.debug(\"Archiving %s with %s %s %s in %s\",\n hashcode, uri, payload, headers, self.archive_path)\n\n try:\n cursor = self._db.cursor()\n insert_stmt = \"INSERT INTO \" + self.ARCHIVE_TABLE + \" (\" \\\n \"id, hashcode, uri, payload, headers, data) \" \\\n \"VALUES(?,?,?,?,?,?)\"\n cursor.execute(insert_stmt, (None, hashcode, uri,\n payload_dump, headers_dump, data_dump))\n self._db.commit()\n cursor.close()\n except sqlite3.IntegrityError as e:\n msg = \"data storage error; cause: duplicated entry %s\" % hashcode\n raise ArchiveError(cause=msg)\n except sqlite3.DatabaseError as e:\n msg = \"data storage error; cause: %s\" % str(e)\n raise ArchiveError(cause=msg)\n\n logger.debug(\"%s data archived in %s\", hashcode, self.archive_path)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nretrieve a raw item from the archive.", "response": "def retrieve(self, uri, payload, headers):\n \"\"\"Retrieve a raw item from the archive.\n\n The method will return the `data` content corresponding to the\n hashcode derived from the given parameters.\n\n :param uri: request URI\n :param payload: request payload\n :param headers: request headers\n\n :returns: the archived data\n\n :raises ArchiveError: when an error occurs retrieving data\n \"\"\"\n hashcode = self.make_hashcode(uri, payload, headers)\n\n logger.debug(\"Retrieving entry %s with %s %s %s in %s\",\n hashcode, uri, payload, headers, self.archive_path)\n\n self._db.row_factory = sqlite3.Row\n\n 
try:\n cursor = self._db.cursor()\n select_stmt = \"SELECT data \" \\\n \"FROM \" + self.ARCHIVE_TABLE + \" \" \\\n \"WHERE hashcode = ?\"\n cursor.execute(select_stmt, (hashcode,))\n row = cursor.fetchone()\n cursor.close()\n except sqlite3.DatabaseError as e:\n msg = \"data retrieval error; cause: %s\" % str(e)\n raise ArchiveError(cause=msg)\n\n if row:\n found = pickle.loads(row['data'])\n else:\n msg = \"entry %s not found in archive %s\" % (hashcode, self.archive_path)\n raise ArchiveError(cause=msg)\n\n return found"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef create(cls, archive_path):\n if os.path.exists(archive_path):\n msg = \"archive %s already exists; remove it before creating a new one\"\n raise ArchiveError(cause=msg % (archive_path))\n\n conn = sqlite3.connect(archive_path)\n\n cursor = conn.cursor()\n cursor.execute(cls.METADATA_CREATE_STMT)\n cursor.execute(cls.ARCHIVE_CREATE_STMT)\n conn.commit()\n\n cursor.close()\n conn.close()\n\n logger.debug(\"Creating archive %s\", archive_path)\n archive = cls(archive_path)\n logger.debug(\"Archive %s was created\", archive_path)\n\n return archive", "response": "Create a brand new archive."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ngenerate a SHA1 hash code based on the given arguments.", "response": "def make_hashcode(uri, payload, headers):\n \"\"\"Generate a SHA1 based on the given arguments.\n\n Hashcodes created by this method will be used as unique identifiers\n for the raw items or resources stored by this archive.\n\n :param uri: URI to the resource\n :param payload: payload of the request needed to fetch the resource\n :param headers: headers of the request needed to fetch the resource\n\n :returns: a SHA1 hash code\n \"\"\"\n def dict_to_json_str(data):\n return json.dumps(data, sort_keys=True)\n\n content = ':'.join([uri, dict_to_json_str(payload), dict_to_json_str(headers)])\n hashcode = 
hashlib.sha1(content.encode('utf-8'))\n return hashcode.hexdigest()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking whether the archive is valid or not. This method will check if tables were created and if they contain valid data.", "response": "def _verify_archive(self):\n \"\"\"Check whether the archive is valid or not.\n\n This method will check if tables were created and if they\n contain valid data.\n \"\"\"\n nentries = self._count_table_rows(self.ARCHIVE_TABLE)\n nmetadata = self._count_table_rows(self.METADATA_TABLE)\n\n if nmetadata > 1:\n msg = \"archive %s metadata corrupted; multiple metadata entries\" % (self.archive_path)\n raise ArchiveError(cause=msg)\n if nmetadata == 0 and nentries > 0:\n msg = \"archive %s metadata is empty but %s entries were archived\" % (self.archive_path, nentries)\n raise ArchiveError(cause=msg)\n\n logger.debug(\"Integrity of archive %s OK; entries: %s rows, metadata: %s rows\",\n self.archive_path, nentries, nmetadata)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _load_metadata(self):\n\n logger.debug(\"Loading metadata information of archive %s\", self.archive_path)\n\n cursor = self._db.cursor()\n select_stmt = \"SELECT origin, backend_name, backend_version, \" \\\n \"category, backend_params, created_on \" \\\n \"FROM \" + self.METADATA_TABLE + \" \" \\\n \"LIMIT 1\"\n cursor.execute(select_stmt)\n row = cursor.fetchone()\n cursor.close()\n\n if row:\n self.origin = row[0]\n self.backend_name = row[1]\n self.backend_version = row[2]\n self.category = row[3]\n self.backend_params = pickle.loads(row[4])\n self.created_on = str_to_datetime(row[5])\n else:\n logger.debug(\"Metadata of archive %s was empty\", self.archive_path)\n\n logger.debug(\"Metadata of archive %s loaded\", self.archive_path)", "response": "Load the metadata from the archive file"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfetching the number of 
rows in a table", "response": "def _count_table_rows(self, table_name):\n \"\"\"Fetch the number of rows in a table\"\"\"\n\n cursor = self._db.cursor()\n select_stmt = \"SELECT COUNT(*) FROM \" + table_name\n\n try:\n cursor.execute(select_stmt)\n row = cursor.fetchone()\n except sqlite3.DatabaseError as e:\n msg = \"invalid archive file; cause: %s\" % str(e)\n raise ArchiveError(cause=msg)\n finally:\n cursor.close()\n\n return row[0]"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncreates a new archive with the given name and a random SHA1 hash.", "response": "def create_archive(self):\n \"\"\"Create a new archive.\n\n The method creates in the filesystem a brand new archive with\n a random SHA1 as its name. The first byte of the hashcode will\n be the name of the subdirectory; the remaining bytes, the\n archive name.\n\n :returns: a new `Archive` object\n\n :raises ArchiveManagerError: when an error occurs creating the\n new archive\n \"\"\"\n hashcode = uuid.uuid4().hex\n archive_dir = os.path.join(self.dirpath, hashcode[0:2])\n archive_name = hashcode[2:] + self.STORAGE_EXT\n archive_path = os.path.join(archive_dir, archive_name)\n\n if not os.path.exists(archive_dir):\n os.makedirs(archive_dir)\n\n try:\n archive = Archive.create(archive_path)\n except ArchiveError as e:\n raise ArchiveManagerError(cause=str(e))\n\n return archive"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nremove an archive. This method deletes from the filesystem the archive stored in `archive_path`. 
:param archive_path: path to the archive :raises ArchiveManagerError: when an error occurs removing the archive", "response": "def remove_archive(self, archive_path):\n \"\"\"Remove an archive.\n\n This method deletes from the filesystem the archive stored\n in `archive_path`.\n\n :param archive_path: path to the archive\n\n :raises ArchiveManagerError: when an error occurs removing the\n archive\n \"\"\"\n try:\n Archive(archive_path)\n except ArchiveError as e:\n raise ArchiveManagerError(cause=str(e))\n\n os.remove(archive_path)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef search(self, origin, backend_name, category, archived_after):\n archives = self._search_archives(origin, backend_name,\n category, archived_after)\n archives = [(fp, date) for fp, date in archives]\n archives = [fp for fp, _ in sorted(archives, key=lambda x: x[1])]\n\n return archives", "response": "Search archives.\n\n Get the archives which store data based on the given parameters.\n These parameters define what the origin was (`origin`), how data\n was fetched (`backend_name`) and data type ('category').\n Only those archives created on or after `archived_after` will be\n returned.\n\n The method returns a list with the file paths to those archives.\n The list is sorted by the date of creation of each archive.\n\n :param origin: data origin\n :param backend_name: backend used to fetch data\n :param category: type of the items fetched by the backend\n :param archived_after: get archives created on or after this date\n\n :returns: a list with archive names which match the search criteria"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsearches archives using filters.", "response": "def _search_archives(self, origin, backend_name, category, archived_after):\n \"\"\"Search archives using filters.\"\"\"\n\n for archive_path in self._search_files():\n try:\n archive = Archive(archive_path)\n 
except ArchiveError:\n continue\n\n match = archive.origin == origin and \\\n archive.backend_name == backend_name and \\\n archive.category == category and \\\n archive.created_on >= archived_after\n\n if not match:\n continue\n\n yield archive_path, archive.created_on"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _search_files(self):\n\n for root, _, files in os.walk(self.dirpath):\n for filename in files:\n location = os.path.join(root, filename)\n yield location", "response": "Retrieve the file paths stored under the base path."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef check_compressed_file_type(filepath):\n def compressed_file_type(content):\n magic_dict = {\n b'\\x1f\\x8b\\x08': 'gz',\n b'\\x42\\x5a\\x68': 'bz2',\n b'PK\\x03\\x04': 'zip'\n }\n\n for magic, filetype in magic_dict.items():\n if content.startswith(magic):\n return filetype\n\n return None\n\n with open(filepath, mode='rb') as f:\n magic_number = f.read(4)\n return compressed_file_type(magic_number)", "response": "Check if the file is a compressed file supported by the tool."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef months_range(from_date, to_date):\n start = datetime.datetime(from_date.year, from_date.month, 1)\n end = datetime.datetime(to_date.year, to_date.month, 1)\n\n month_gen = dateutil.rrule.rrule(freq=dateutil.rrule.MONTHLY,\n dtstart=start, until=end)\n months = [d for d in month_gen]\n\n pos = 0\n for x in range(1, len(months)):\n yield months[pos], months[x]\n pos = x", "response": "Generate a generator of months starting on from_date until to_date."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef message_to_dict(msg):\n def parse_headers(msg):\n headers = {}\n\n for header, value in msg.items():\n hv = []\n\n for text, 
charset in email.header.decode_header(value):\n if type(text) == bytes:\n charset = charset if charset else 'utf-8'\n try:\n text = text.decode(charset, errors='surrogateescape')\n except (UnicodeError, LookupError):\n # Try again with a 7bit encoding\n text = text.decode('ascii', errors='surrogateescape')\n hv.append(text)\n\n v = ' '.join(hv)\n headers[header] = v if v else None\n\n return headers\n\n def parse_payload(msg):\n body = {}\n\n if not msg.is_multipart():\n payload = decode_payload(msg)\n subtype = msg.get_content_subtype()\n body[subtype] = [payload]\n else:\n # Include all the attached texts if it is multipart\n # Ignores binary parts by default\n for part in email.iterators.typed_subpart_iterator(msg):\n payload = decode_payload(part)\n subtype = part.get_content_subtype()\n body.setdefault(subtype, []).append(payload)\n\n return {k: '\\n'.join(v) for k, v in body.items()}\n\n def decode_payload(msg_or_part):\n charset = msg_or_part.get_content_charset('utf-8')\n payload = msg_or_part.get_payload(decode=True)\n\n try:\n payload = payload.decode(charset, errors='surrogateescape')\n except (UnicodeError, LookupError):\n # Try again with a 7bit encoding\n payload = payload.decode('ascii', errors='surrogateescape')\n return payload\n\n # The function starts here\n message = requests.structures.CaseInsensitiveDict()\n\n if isinstance(msg, mailbox.mboxMessage):\n message['unixfrom'] = msg.get_from()\n else:\n message['unixfrom'] = None\n\n try:\n for k, v in parse_headers(msg).items():\n message[k] = v\n message['body'] = parse_payload(msg)\n except UnicodeError as e:\n raise ParseError(cause=str(e))\n\n return message", "response": "Convert an email. message. 
Message object into a dictionary."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef remove_invalid_xml_chars(raw_xml):\n illegal_unichrs = [(0x00, 0x08), (0x0B, 0x1F),\n (0x7F, 0x84), (0x86, 0x9F)]\n\n illegal_ranges = ['%s-%s' % (chr(low), chr(high))\n for (low, high) in illegal_unichrs\n if low < sys.maxunicode]\n\n illegal_xml_re = re.compile('[%s]' % ''.join(illegal_ranges))\n\n purged_xml = ''\n\n for c in raw_xml:\n if illegal_xml_re.search(c) is not None:\n c = ' '\n purged_xml += c\n\n return purged_xml", "response": "Remove control and invalid characters from an XML stream."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef xml_to_dict(raw_xml):\n def node_to_dict(node):\n d = {}\n d.update(node.items())\n\n text = getattr(node, 'text', None)\n\n if text is not None:\n d['__text__'] = text\n\n childs = {}\n for child in node:\n childs.setdefault(child.tag, []).append(node_to_dict(child))\n\n d.update(childs.items())\n\n return d\n\n purged_xml = remove_invalid_xml_chars(raw_xml)\n\n try:\n tree = xml.etree.ElementTree.fromstring(purged_xml)\n except xml.etree.ElementTree.ParseError as e:\n cause = \"XML stream %s\" % (str(e))\n raise ParseError(cause=cause)\n\n d = node_to_dict(tree)\n\n return d", "response": "Convert a XML stream into a dictionary."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n\n logger.info(\"Fetching issues of '%s' from %s\",\n self.url, str(from_date))\n\n nissues = 0\n\n for issue_id in self.__fetch_issues_ids(from_date):\n issue = self.__fetch_and_parse_issue(issue_id)\n\n for key in USER_FIELDS:\n if key not in issue:\n continue\n\n user = self.__get_or_fetch_user(issue[key]['id'])\n issue[key + '_data'] = user\n\n for journal in issue['journals']:\n if 'user' not in journal:\n 
continue\n\n user = self.__get_or_fetch_user(journal['user']['id'])\n journal['user_data'] = user\n\n yield issue\n nissues += 1\n\n logger.info(\"Fetch process completed: %s issues fetched\", nissues)", "response": "Fetch the items from the backend"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nparses a Redmine issues JSON stream and return a list iterator.", "response": "def parse_issues(raw_json):\n \"\"\"Parse a Redmine issues JSON stream.\n\n The method parses a JSON stream and returns a list iterator.\n Each item is a dictionary that contains the issue parsed data.\n\n :param raw_json: JSON string to parse\n\n :returns: a generator of parsed issues\n \"\"\"\n results = json.loads(raw_json)\n\n issues = results['issues']\n for issue in issues:\n yield issue"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef issues(self, from_date=DEFAULT_DATETIME,\n offset=None, max_issues=MAX_ISSUES):\n \"\"\"Get the information of a list of issues.\n\n :param from_date: retrieve issues that were updated from that date;\n dates are converted to UTC\n :param offset: starting position for the search\n :param max_issues: maximum number of issues to return per query\n \"\"\"\n resource = self.RISSUES + self.CJSON\n\n ts = datetime_to_utc(from_date)\n ts = ts.strftime(\"%Y-%m-%dT%H:%M:%SZ\")\n\n # By default, Redmine returns open issues only.\n # Parameter 'status_id' is set to get all the statuses.\n params = {\n self.PSTATUS_ID: '*',\n self.PSORT: self.PUPDATED_ON,\n self.PUPDATED_ON: '>=' + ts,\n self.PLIMIT: max_issues\n }\n\n if offset is not None:\n params[self.POFFSET] = offset\n\n response = self._call(resource, params)\n\n return response", "response": "Get the information of a list of issues."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the information of the given issue.", "response": "def issue(self, issue_id):\n \"\"\"Get 
the information of the given issue.\n\n :param issue_id: issue identifier\n \"\"\"\n resource = urijoin(self.RISSUES, str(issue_id) + self.CJSON)\n\n params = {\n self.PINCLUDE: ','.join([self.CATTACHMENTS, self.CCHANGESETS,\n self.CCHILDREN, self.CJOURNALS,\n self.CRELATIONS, self.CWATCHERS])\n }\n\n response = self._call(resource, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the information of the given user.", "response": "def user(self, user_id):\n \"\"\"Get the information of the given user.\n\n :param user_id: user identifier\n \"\"\"\n resource = urijoin(self.RUSERS, str(user_id) + self.CJSON)\n\n params = {}\n\n response = self._call(resource, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsanitizing the payload of a HTTP request for storing or retrieving archived items.", "response": "def sanitize_for_archive(url, headers, payload):\n \"\"\"Sanitize payload of a HTTP request by removing the token information\n before storing/retrieving archived items\n\n :param: url: HTTP url request\n :param: headers: HTTP headers request\n :param: payload: HTTP payload request\n\n :returns url, headers and the sanitized payload\n \"\"\"\n if RedmineClient.PKEY in payload:\n payload.pop(RedmineClient.PKEY)\n\n return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncall to get a resource.", "response": "def _call(self, resource, params):\n \"\"\"Call to get a resource.\n\n :param method: resource to get\n :param params: dict with the HTTP parameters needed to get\n the given resource\n \"\"\"\n url = self.URL % {'base': self.base_url, 'resource': resource}\n\n if self.api_token:\n params[self.PKEY] = self.api_token\n\n logger.debug(\"Redmine client requests: %s params: %s\",\n resource, str(params))\n\n r = self.fetch(url, payload=params, verify=False)\n\n return r.text"} {"SOURCE": "codesearchnet", 
"instruction": "How would you implement a function in Python 3 that\nfetches the items from the backend", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the messages\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n from_date = kwargs['from_date']\n\n logger.info(\"Looking for messages from '%s' since %s\",\n self.url, str(from_date))\n\n mailing_list = HyperKittyList(self.url, self.dirpath)\n mailing_list.fetch(from_date=from_date)\n\n messages = self._fetch_and_parse_messages(mailing_list, from_date)\n\n for message in messages:\n yield message\n\n logger.info(\"Fetch process completed\")"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfetching the mbox files from the remote archiver.", "response": "def fetch(self, from_date=DEFAULT_DATETIME):\n \"\"\"Fetch the mbox files from the remote archiver.\n\n This method stores the archives in the path given during the\n initialization of this object.\n\n HyperKitty archives are accessed month by month and stored following\n the schema year-month. 
Archives are fetched from the given month\n till the current month.\n\n :param from_date: fetch archives that store messages\n equal or after the given date; only year and month values\n are compared\n\n :returns: a list of tuples, storing the links and paths of the\n fetched archives\n \"\"\"\n logger.info(\"Downloading mboxes from '%s' to since %s\",\n self.client.base_url, str(from_date))\n logger.debug(\"Storing mboxes in '%s'\", self.dirpath)\n\n self.client.fetch(self.client.base_url)\n\n from_date = datetime_to_utc(from_date)\n to_end = datetime_utcnow()\n to_end += dateutil.relativedelta.relativedelta(months=1)\n\n months = months_range(from_date, to_end)\n\n fetched = []\n\n if not os.path.exists(self.dirpath):\n os.makedirs(self.dirpath)\n\n tmbox = 0\n\n for dts in months:\n tmbox += 1\n start, end = dts[0], dts[1]\n filename = start.strftime(\"%Y-%m.mbox.gz\")\n filepath = os.path.join(self.dirpath, filename)\n\n url = urijoin(self.client.base_url, 'export', filename)\n\n params = {\n 'start': start.strftime(\"%Y-%m-%d\"),\n 'end': end.strftime(\"%Y-%m-%d\")\n }\n\n success = self._download_archive(url, params, filepath)\n\n if success:\n fetched.append((url, filepath))\n\n logger.info(\"%s/%s MBoxes downloaded\", len(fetched), tmbox)\n\n return fetched"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfetching data from a Docker Hub repository.", "response": "def fetch(self, category=CATEGORY_DOCKERHUB_DATA):\n \"\"\"Fetch data from a Docker Hub repository.\n\n The method retrieves, from a repository stored in Docker Hub,\n its data which includes number of pulls, stars, description,\n among other data.\n\n :param category: the category of items to fetch\n\n :returns: a generator of data\n \"\"\"\n kwargs = {}\n items = super().fetch(category, **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nfetch the Dockher Hub items", "response": "def fetch_items(self, category, 
**kwargs):\n \"\"\"Fetch the Docker Hub items\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n logger.info(\"Fetching data from '%s' repository of '%s' owner\",\n self.repository, self.owner)\n\n raw_data = self.client.repository(self.owner, self.repository)\n fetched_on = datetime_utcnow().timestamp()\n\n data = self.parse_json(raw_data)\n data['fetched_on'] = fetched_on\n yield data\n\n logger.info(\"Fetch process completed\")"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfetches information about a repository.", "response": "def repository(self, owner, repository):\n \"\"\"Fetch information about a repository.\"\"\"\n\n url = urijoin(self.base_url, self.RREPOSITORY, owner, repository)\n\n logger.debug(\"DockerHub client requests: %s\", url)\n\n response = self.fetch(url)\n\n return response.text"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nmapping extra information for a custom field.", "response": "def map_custom_field(custom_fields, fields):\n \"\"\"Add extra information for custom fields.\n\n :param custom_fields: set of custom fields with the extra information\n :param fields: fields of the issue where to add the extra information\n\n :returns: a set of items with the extra information mapped\n \"\"\"\n def build_cf(cf, v):\n return {'id': cf['id'], 'name': cf['name'], 'value': v}\n\n return {\n k: build_cf(custom_fields[k], v)\n for k, v in fields.items()\n if k in custom_fields\n }"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef filter_custom_fields(fields):\n\n custom_fields = {}\n\n sorted_fields = [field for field in fields if field['custom'] is True]\n\n for custom_field in sorted_fields:\n custom_fields[custom_field['id']] = custom_field\n\n return custom_fields", "response": "Filter custom fields from a given set of fields."} {"SOURCE": 
"codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n\n logger.info(\"Looking for issues at site '%s', in project '%s' and updated from '%s'\",\n self.url, self.project, str(from_date))\n\n whole_pages = self.client.get_issues(from_date)\n\n fields = json.loads(self.client.get_fields())\n custom_fields = filter_custom_fields(fields)\n\n for whole_page in whole_pages:\n issues = self.parse_issues(whole_page)\n for issue in issues:\n mapping = map_custom_field(custom_fields, issue['fields'])\n for k, v in mapping.items():\n issue['fields'][k] = v\n\n comments_data = self.__get_issue_comments(issue['id'])\n issue['comments_data'] = comments_data\n\n yield issue", "response": "Fetch the issues and items from the backend"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef parse_issues(raw_page):\n raw_issues = json.loads(raw_page)\n issues = raw_issues['issues']\n for issue in issues:\n yield issue", "response": "Parse a JIRA API raw response."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_items(self, from_date, url, expand_fields=True):\n start_at = 0\n\n req = self.fetch(url, payload=self.__build_payload(start_at, from_date, expand_fields))\n issues = req.text\n\n data = req.json()\n titems = data['total']\n nitems = data['maxResults']\n\n start_at += min(nitems, titems)\n self.__log_status(start_at, titems, url)\n\n while issues:\n yield issues\n issues = None\n\n if data['startAt'] + nitems < titems:\n req = self.fetch(url, payload=self.__build_payload(start_at, from_date, expand_fields))\n\n data = req.json()\n start_at += nitems\n issues = req.text\n self.__log_status(start_at, titems, url)", "response": "Retrieve all the items from a given date."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the 
following Python 3 function\ndef get_issues(self, from_date):\n url = urijoin(self.base_url, self.RESOURCE, self.VERSION_API, 'search')\n issues = self.get_items(from_date, url)\n\n return issues", "response": "Retrieve all the issues from a given date."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_comments(self, issue_id):\n url = urijoin(self.base_url, self.RESOURCE, self.VERSION_API, self.ISSUE, issue_id, self.COMMENT)\n comments = self.get_items(DEFAULT_DATETIME, url, expand_fields=False)\n\n return comments", "response": "Retrieve all the comments of a given issue."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nretrieve all the fields available.", "response": "def get_fields(self):\n \"\"\"Retrieve all the fields available.\"\"\"\n\n url = urijoin(self.base_url, self.RESOURCE, self.VERSION_API, 'field')\n req = self.fetch(url)\n\n return req.text"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nfetch the builds from Jenkins url.", "response": "def fetch(self, category=CATEGORY_BUILD):\n \"\"\"Fetch the builds from the url.\n\n The method retrieves, from a Jenkins url, the\n builds updated since the given date.\n\n :param category: the category of items to fetch\n\n :returns: a generator of builds\n \"\"\"\n\n kwargs = {}\n items = super().fetch(category, **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfetch the contents of the items in the specified category.", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the contents\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a generator of items\n \"\"\"\n logger.info(\"Looking for projects at url '%s'\", self.url)\n\n nbuilds = 0 # number of builds processed\n njobs = 0 # number of jobs processed\n\n projects = json.loads(self.client.get_jobs())\n jobs = 
projects['jobs']\n\n for job in jobs:\n logger.debug(\"Adding builds from %s (%i/%i)\",\n job['url'], njobs, len(jobs))\n\n try:\n raw_builds = self.client.get_builds(job['name'])\n except requests.exceptions.HTTPError as e:\n if e.response.status_code == 500:\n logger.warning(e)\n logger.warning(\"Unable to fetch builds from job %s; skipping\",\n job['url'])\n continue\n else:\n raise e\n\n if not raw_builds:\n continue\n\n try:\n builds = json.loads(raw_builds)\n except ValueError:\n logger.warning(\"Unable to parse builds from job %s; skipping\",\n job['url'])\n continue\n\n builds = builds['builds']\n for build in builds:\n yield build\n nbuilds += 1\n\n njobs += 1\n\n logger.info(\"Total number of jobs: %i/%i\", njobs, len(jobs))\n logger.info(\"Total number of builds: %i\", nbuilds)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nretrieve all jobs from Jenkins", "response": "def get_jobs(self):\n \"\"\" Retrieve all jobs\"\"\"\n\n url_jenkins = urijoin(self.base_url, \"api\", \"json\")\n\n response = self.fetch(url_jenkins)\n return response.text"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nretrieving all builds from a job", "response": "def get_builds(self, job_name):\n \"\"\" Retrieve all builds from a job\"\"\"\n\n if self.blacklist_jobs and job_name in self.blacklist_jobs:\n logger.warning(\"Not getting blacklisted job: %s\", job_name)\n return\n\n payload = {'depth': self.detail_depth}\n url_build = urijoin(self.base_url, \"job\", job_name, \"api\", \"json\")\n\n response = self.fetch(url_build, payload=payload)\n return response.text"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfetches the questions at the given category and return a generator of items", "response": "def fetch_items(self, category, **kwargs):\n \"\"\"Fetch the questions\n\n :param category: the category of items to fetch\n :param kwargs: backend arguments\n\n :returns: a 
generator of items\n \"\"\"\n from_date = kwargs['from_date']\n\n logger.info(\"Looking for questions at site '%s', with tag '%s' and updated from '%s'\",\n self.site, self.tagged, str(from_date))\n\n whole_pages = self.client.get_questions(from_date)\n\n for whole_page in whole_pages:\n questions = self.parse_questions(whole_page)\n for question in questions:\n yield question"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nparses a StackExchange API raw response retrieving the items", "response": "def parse_questions(raw_page):\n \"\"\"Parse a StackExchange API raw response.\n\n The method parses the API response retrieving the\n questions from the received items\n\n :param items: items from where to parse the questions\n\n :returns: a generator of questions\n \"\"\"\n raw_questions = json.loads(raw_page)\n questions = raw_questions['items']\n for question in questions:\n yield question"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_questions(self, from_date):\n\n page = 1\n url = urijoin(self.base_url, self.VERSION_API, \"questions\")\n\n req = self.fetch(url, payload=self.__build_payload(page, from_date))\n questions = req.text\n\n data = req.json()\n tquestions = data['total']\n nquestions = data['page_size']\n\n self.__log_status(data['quota_remaining'],\n data['quota_max'],\n nquestions,\n tquestions)\n\n while questions:\n yield questions\n questions = None\n\n if data['has_more']:\n page += 1\n\n backoff = data.get('backoff', None)\n if backoff:\n logger.debug(\"Expensive query. 
Wait %s secs to send a new request\",\n backoff)\n time.sleep(float(backoff))\n\n req = self.fetch(url, payload=self.__build_payload(page, from_date))\n data = req.json()\n questions = req.text\n nquestions += data['page_size']\n self.__log_status(data['quota_remaining'],\n data['quota_max'],\n nquestions,\n tquestions)", "response": "Retrieve all the questions from a given date."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef sanitize_for_archive(url, headers, payload):\n if 'key' in payload:\n payload.pop('key')\n\n return url, headers, payload", "response": "Sanitize the payload of a HTTP request for storing or retrieving archived items."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef setup_cmd_parser(cls):\n\n parser = BackendCommandArgumentParser(cls.BACKEND.CATEGORIES,\n from_date=True,\n token_auth=True,\n archive=True)\n\n # StackExchange options\n group = parser.parser.add_argument_group('StackExchange arguments')\n group.add_argument('--site', dest='site',\n required=True,\n help=\"StackExchange site\")\n group.add_argument('--tagged', dest='tagged',\n help=\"filter items by question Tag\")\n group.add_argument('--max-questions', dest='max_questions',\n type=int, default=MAX_QUESTIONS,\n help=\"Maximum number of questions requested in the same query\")\n\n return parser", "response": "Returns the StackExchange argument parser."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfetch the pages from a MediaWiki url.", "response": "def fetch(self, category=CATEGORY_PAGE, from_date=DEFAULT_DATETIME, reviews_api=False):\n \"\"\"Fetch the pages from the backend url.\n\n The method retrieves, from a MediaWiki url, the\n wiki pages.\n\n :param category: the category of items to fetch\n :param from_date: obtain pages updated since this date\n :param reviews_api: use the reviews API available 
in MediaWiki >= 1.27\n\n :returns: a generator of pages\n \"\"\"\n if from_date == DEFAULT_DATETIME:\n from_date = None\n else:\n from_date = datetime_to_utc(from_date)\n\n kwargs = {\"from_date\": from_date, \"reviews_api\": reviews_api}\n items = super().fetch(category, **kwargs)\n\n return items"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fetch_items(self, category, **kwargs):\n from_date = kwargs['from_date']\n reviews_api = kwargs['reviews_api']\n\n mediawiki_version = self.client.get_version()\n logger.info(\"MediaWiki version: %s\", mediawiki_version)\n\n if reviews_api:\n if ((mediawiki_version[0] == 1 and mediawiki_version[1] >= 27) or mediawiki_version[0] > 1):\n fetcher = self.__fetch_1_27(from_date)\n else:\n logger.warning(\"Reviews API only available in MediaWiki >= 1.27\")\n logger.warning(\"Using the Pages API instead\")\n fetcher = self.__fetch_pre1_27(from_date)\n else:\n fetcher = self.__fetch_pre1_27(from_date)\n\n for page_reviews in fetcher:\n yield page_reviews", "response": "Fetch the pages and items from the MediaWiki API."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting the max date in unixtime format from reviews.", "response": "def __get_max_date(self, reviews):\n \"\"\"\"Get the max date in unixtime format from reviews.\"\"\"\n max_ts = 0\n for review in reviews:\n ts = str_to_datetime(review['timestamp'])\n ts = datetime_to_utc(ts)\n if ts.timestamp() > max_ts:\n max_ts = ts.timestamp()\n return max_ts"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef __fetch_1_27(self, from_date=None):\n\n logger.info(\"Looking for pages at url '%s'\", self.url)\n\n npages = 0 # number of pages processed\n tpages = 0 # number of total pages\n pages_done = [] # pages already retrieved in reviews API\n\n namespaces_contents = self.__get_namespaces_contents()\n\n arvcontinue = '' # pagination for 
getting revisions and their pages\n        while arvcontinue is not None:\n            raw_pages = self.client.get_pages_from_allrevisions(namespaces_contents, from_date, arvcontinue)\n            data_json = json.loads(raw_pages)\n            arvcontinue = data_json['continue']['arvcontinue'] if 'continue' in data_json else None\n            pages_json = data_json['query']['allrevisions']\n            for page in pages_json:\n\n                if page['pageid'] in pages_done:\n                    logger.debug(\"Page %s already processed; skipped\", page['pageid'])\n                    continue\n\n                tpages += 1\n                pages_done.append(page['pageid'])\n                page_reviews = self.__get_page_reviews(page)\n\n                if not page_reviews:\n                    logger.warning(\"Revisions not found in %s [page id: %s], page skipped\",\n                                   page['title'], page['pageid'])\n                    continue\n\n                yield page_reviews\n                npages += 1\n\n        logger.info(\"Total number of pages: %i, skipped %i\", tpages, tpages - npages)", "response": "Fetch the pages from the backend url for MediaWiki >= 1.27."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfetch the pages from the backend url.", "response": "def __fetch_pre1_27(self, from_date=None):\n        \"\"\"Fetch the pages from the backend url.\n\n        The method retrieves, from a MediaWiki url, the\n        wiki pages.\n\n        :returns: a generator of pages\n        \"\"\"\n\n        def fetch_incremental_changes(namespaces_contents):\n            # Use recent changes API to get the pages from date\n            npages = 0  # number of pages processed\n            tpages = 0  # number of total pages\n            pages_done = []  # pages already retrieved in reviews API\n\n            rccontinue = ''\n            hole_created = True  # To detect that incremental is not complete\n            while rccontinue is not None:\n                raw_pages = self.client.get_recent_pages(namespaces_contents, rccontinue)\n                data_json = json.loads(raw_pages)\n\n                if 'query-continue' in data_json:\n                    # < 1.27\n                    rccontinue = data_json['query-continue']['recentchanges']['rccontinue']\n                elif 'continue' in data_json:\n                    # >= 1.27\n                    rccontinue = data_json['continue']['rccontinue']\n                else:\n                    rccontinue = 
None\n\n pages_json = data_json['query']['recentchanges']\n for page in pages_json:\n\n page_ts = dateutil.parser.parse(page['timestamp'])\n if from_date >= page_ts:\n # The rest of recent changes are older than from_date\n logger.debug(\"All recent changes newer than %s processed.\", from_date)\n rccontinue = None\n hole_created = False\n break\n\n if page['pageid'] in pages_done:\n logger.debug(\"Page %s already processed; skipped\", page['pageid'])\n continue\n\n tpages += 1\n pages_done.append(page['pageid'])\n page_reviews = self.__get_page_reviews(page)\n\n if not page_reviews:\n logger.warning(\"Revisions not found in %s [page id: %s], page skipped\",\n page['title'], page['pageid'])\n continue\n\n yield page_reviews\n npages += 1\n if hole_created:\n logger.error(\"Incremental update NOT completed. Hole in history created.\")\n logger.info(\"Total number of pages: %i, skipped %i\", tpages, tpages - npages)\n\n def fetch_all_pages(namespaces_contents):\n # Use get all pages API to get pages\n npages = 0 # number of pages processed\n tpages = 0 # number of total pages\n pages_done = [] # pages already retrieved in reviews API\n\n for ns in namespaces_contents:\n apcontinue = '' # pagination for getting pages\n logger.debug(\"Getting pages for namespace: %s\", ns)\n while apcontinue is not None:\n raw_pages = self.client.get_pages(ns, apcontinue)\n data_json = json.loads(raw_pages)\n if 'query-continue' in data_json:\n # < 1.27\n apcontinue = data_json['query-continue']['allpages']['apcontinue']\n elif 'continue' in data_json:\n # >= 1.27\n apcontinue = data_json['continue']['apcontinue']\n else:\n apcontinue = None\n pages_json = data_json['query']['allpages']\n for page in pages_json:\n\n if page['pageid'] in pages_done:\n logger.debug(\"Page %s already processed; skipped\", page['pageid'])\n continue\n\n tpages += 1\n pages_done.append(page['pageid'])\n page_reviews = self.__get_page_reviews(page)\n\n if not page_reviews:\n logger.warning(\"Revisions not 
found in %s [page id: %s], page skipped\",\n page['title'], page['pageid'])\n continue\n\n yield page_reviews\n npages += 1\n logger.info(\"Total number of pages: %i, skipped %i\", tpages, tpages - npages)\n\n logger.info(\"Looking for pages at url '%s'\", self.url)\n\n # from_date can not be older than MAX_RECENT_DAYS days ago\n if from_date:\n if (datetime_utcnow() - from_date).days >= MAX_RECENT_DAYS:\n cause = \"Can't get incremental pages older than %i days.\" % MAX_RECENT_DAYS\n cause += \" Do a complete analysis without from_date for older changes.\"\n raise BackendError(cause=cause)\n\n namespaces_contents = self.__get_namespaces_contents()\n\n if not from_date:\n return fetch_all_pages(namespaces_contents)\n else:\n return fetch_incremental_changes(namespaces_contents)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nruns an API command.", "response": "def call(self, params):\n \"\"\"Run an API command.\n :param cgi: cgi command to run on the server\n :param params: dict with the HTTP parameters needed to run\n the given command\n \"\"\"\n logger.debug(\"MediaWiki client calls API: %s params: %s\",\n self.base_url, str(params))\n\n req = self.fetch(self.base_url, payload=params)\n return req.text"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nretrieving all pages from a namespace starting from apcontinue.", "response": "def get_pages(self, namespace, apcontinue=''):\n \"\"\"Retrieve all pages from a namespace starting from apcontinue.\"\"\"\n params = {\n \"action\": \"query\",\n \"list\": \"allpages\",\n \"aplimit\": self.limit,\n \"apnamespace\": namespace,\n \"format\": \"json\"\n }\n if apcontinue:\n params['apcontinue'] = apcontinue\n\n return self.call(params)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_recent_pages(self, namespaces, rccontinue=''):\n\n namespaces.sort()\n params = {\n \"action\": 
\"query\",\n \"list\": \"recentchanges\",\n \"rclimit\": self.limit,\n \"rcnamespace\": \"|\".join(namespaces),\n \"rcprop\": \"title|timestamp|ids\",\n \"format\": \"json\"\n }\n if rccontinue:\n params['rccontinue'] = rccontinue\n\n return self.call(params)", "response": "Retrieve recent pages from all namespaces starting from rccontinue."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fetch(self, category=CATEGORY_MESSAGE, offset=DEFAULT_OFFSET, chats=None):\n if not offset:\n offset = DEFAULT_OFFSET\n\n kwargs = {\"offset\": offset, \"chats\": chats}\n items = super().fetch(category, **kwargs)\n\n return items", "response": "Fetch the messages from the server."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fetch_items(self, category, **kwargs):\n offset = kwargs['offset']\n chats = kwargs['chats']\n\n logger.info(\"Looking for messages of '%s' bot from offset '%s'\",\n self.bot, offset)\n\n if chats is not None:\n if len(chats) == 0:\n logger.warning(\"Chat list filter is empty. 
No messages will be returned\")\n else:\n logger.info(\"Messages which belong to chats %s will be fetched\",\n '[' + ','.join(str(ch_id) for ch_id in chats) + ']')\n\n nmsgs = 0\n\n while True:\n raw_json = self.client.updates(offset=offset)\n messages = [msg for msg in self.parse_messages(raw_json)]\n\n if len(messages) == 0:\n break\n\n for msg in messages:\n offset = max(msg['update_id'], offset)\n\n if not self._filter_message_by_chats(msg, chats):\n logger.debug(\"Message %s does not belong to any chat; filtered\",\n msg['message']['message_id'])\n continue\n\n yield msg\n nmsgs += 1\n\n offset += 1\n\n logger.info(\"Fetch process completed: %s messages fetched\",\n nmsgs)", "response": "Fetch the items of a specific category"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse_messages(raw_json):\n result = json.loads(raw_json)\n\n messages = result['result']\n for msg in messages:\n yield msg", "response": "Parse a Telegram JSON messages list."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _init_client(self, from_archive=False):\n\n return TelegramBotClient(self.bot_token, self.archive, from_archive)", "response": "Initialize a TelegramBotClient object."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck if a message can be filtered based in a list of chats.", "response": "def _filter_message_by_chats(self, message, chats):\n \"\"\"Check if a message can be filtered based in a list of chats.\n\n This method returns `True` when the message was sent to a chat\n of the given list. 
It also returns `True` when chats is `None`.\n\n        :param message: Telegram message\n        :param chats: list of chat, groups and channels identifiers\n\n        :returns: `True` when the message can be filtered; otherwise,\n            it returns `False`\n        \"\"\"\n        if chats is None:\n            return True\n\n        chat_id = message['message']['chat']['id']\n\n        return chat_id in chats"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfetch the messages that a bot can read.", "response": "def updates(self, offset=None):\n        \"\"\"Fetch the messages that a bot can read.\n\n        When the `offset` is given it will retrieve all the messages\n        that are greater or equal to that offset. Take into account\n        that, due to how the API works, all previous messages will\n        be removed from the server.\n\n        :param offset: fetch the messages starting on this offset\n        \"\"\"\n        params = {}\n\n        if offset:\n            params[self.OFFSET] = offset\n\n        response = self._call(self.UPDATES_METHOD, params)\n\n        return response"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nsanitize the url of an HTTP request by removing the token information before storing or retrieving archived items", "response": "def sanitize_for_archive(url, headers, payload):\n        \"\"\"Sanitize URL of an HTTP request by removing the token information\n        before storing/retrieving archived items\n\n        :param: url: HTTP url request\n        :param: headers: HTTP headers request\n        :param: payload: HTTP payload request\n\n        :returns the sanitized url, plus the headers and payload\n        \"\"\"\n        url = re.sub('bot.*/', 'botXXXXX/', url)\n\n        return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncall the given resource.", "response": "def _call(self, method, params):\n        \"\"\"Retrieve the given resource.\n\n        :param resource: resource to retrieve\n        :param params: dict with the HTTP parameters needed to retrieve\n            the given resource\n        \"\"\"\n        url = self.base_url % {'token': 
self.bot_token, 'method': method}\n\n logger.debug(\"Telegram bot calls method: %s params: %s\",\n method, str(params))\n\n r = self.fetch(url, payload=params)\n\n return r.text"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef fetch_items(self, category, **kwargs):\n offset = kwargs['offset']\n\n logger.info(\"Fetching articles of '%s' group on '%s' offset %s\",\n self.group, self.host, str(offset))\n\n narts, iarts, tarts = (0, 0, 0)\n\n _, _, first, last, _ = self.client.group(self.group)\n\n if offset <= last:\n first = max(first, offset)\n _, overview = self.client.over((first, last))\n else:\n overview = []\n\n tarts = len(overview)\n\n logger.debug(\"Total number of articles to fetch: %s\", tarts)\n\n for article_id, _ in overview:\n try:\n article_raw = self.client.article(article_id)\n article = self.__parse_article(article_raw)\n except ParseError:\n logger.warning(\"Error parsing %s article; skipping\",\n article_id)\n iarts += 1\n continue\n except nntplib.NNTPTemporaryError as e:\n logger.warning(\"Error '%s' fetching article %s; skipping\",\n e.response, article_id)\n iarts += 1\n continue\n\n yield article\n narts += 1", "response": "Fetch the items of a specific category"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef metadata(self, item, filter_classified=False):\n item = super().metadata(item, filter_classified=filter_classified)\n item['offset'] = item['data']['offset']\n\n return item", "response": "NNTP metadata. 
This decorator method adds the offset field to each item."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse_article(raw_article):\n        try:\n            message = email.message_from_string(raw_article)\n            article = message_to_dict(message)\n        except UnicodeEncodeError as e:\n            raise ParseError(cause=str(e))\n        return article", "response": "Parses an NNTP article stored in a string object\n        and returns a dictionary of type requests.structures.CaseInsensitiveDict"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _fetch(self, method, args):\n        if self.from_archive:\n            data = self._fetch_from_archive(method, args)\n        else:\n            data = self._fetch_from_remote(method, args)\n\n        return data", "response": "Fetch NNTP data from the archive or from the remote server."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nfetch the data from the API.", "response": "def _fetch_article(self, article_id):\n        \"\"\"Fetch article data\n\n        :param article_id: id of the article to fetch\n        \"\"\"\n        fetched_data = self.handler.article(article_id)\n        data = {\n            'number': fetched_data[1].number,\n            'message_id': fetched_data[1].message_id,\n            'lines': fetched_data[1].lines\n        }\n\n        return data"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _fetch_from_remote(self, method, args):\n        try:\n            if method == NNTTPClient.GROUP:\n                data = self.handler.group(args)\n            elif method == NNTTPClient.OVER:\n                data = self.handler.over(args)\n            elif method == NNTTPClient.ARTICLE:\n                data = self._fetch_article(args)\n        except nntplib.NNTPTemporaryError as e:\n            data = e\n            raise e\n        finally:\n            if self.archive:\n                self.archive.store(method, args, None, data)\n\n        return data", "response": "Fetch data from NNTP and store it in the archive."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfetches data from 
the archive", "response": "def _fetch_from_archive(self, method, args):\n \"\"\"Fetch data from the archive\n\n :param method: the name of the command to execute\n :param args: the arguments required by the command\n \"\"\"\n if not self.archive:\n raise ArchiveError(cause=\"Archive not provided\")\n\n data = self.archive.retrieve(method, args, None)\n\n if isinstance(data, nntplib.NNTPTemporaryError):\n raise data\n\n return data"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfetching the data from a given URL.", "response": "def fetch(self, url, payload=None, headers=None, method=GET, stream=False, verify=True):\n \"\"\"Fetch the data from a given URL.\n\n :param url: link to the resource\n :param payload: payload of the request\n :param headers: headers of the request\n :param method: type of request call (GET or POST)\n :param stream: defer downloading the response body until the response content is available\n :param verify: verifying the SSL certificate\n\n :returns a response object\n \"\"\"\n if self.from_archive:\n response = self._fetch_from_archive(url, payload, headers)\n else:\n response = self._fetch_from_remote(url, payload, headers, method, stream, verify)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncreate a http session and initialize the retry object.", "response": "def _create_http_session(self):\n \"\"\"Create a http session and initialize the retry object.\"\"\"\n\n self.session = requests.Session()\n\n if self.headers:\n self.session.headers.update(self.headers)\n\n retries = urllib3.util.Retry(total=self.max_retries,\n connect=self.max_retries_on_connect,\n read=self.max_retries_on_read,\n redirect=self.max_retries_on_redirect,\n status=self.max_retries_on_status,\n method_whitelist=self.method_whitelist,\n status_forcelist=self.status_forcelist,\n backoff_factor=self.sleep_time,\n raise_on_redirect=self.raise_on_redirect,\n 
raise_on_status=self.raise_on_status,\n                                     respect_retry_after_header=self.respect_retry_after_header)\n\n        self.session.mount('http://', requests.adapters.HTTPAdapter(max_retries=retries))\n        self.session.mount('https://', requests.adapters.HTTPAdapter(max_retries=retries))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef setup_rate_limit_handler(self, sleep_for_rate=False, min_rate_to_sleep=MIN_RATE_LIMIT,\n                                 rate_limit_header=RATE_LIMIT_HEADER,\n                                 rate_limit_reset_header=RATE_LIMIT_RESET_HEADER):\n        \"\"\"Setup the rate limit handler.\n\n        :param sleep_for_rate: sleep until rate limit is reset\n        :param min_rate_to_sleep: minimum rate needed to make the fetching process sleep\n        :param rate_limit_header: header from which to extract the rate limit data\n        :param rate_limit_reset_header: header from which to extract the rate limit reset data\n        \"\"\"\n        self.rate_limit = None\n        self.rate_limit_reset_ts = None\n        self.sleep_for_rate = sleep_for_rate\n        self.rate_limit_header = rate_limit_header\n        self.rate_limit_reset_header = rate_limit_reset_header\n\n        if min_rate_to_sleep > self.MAX_RATE_LIMIT:\n            msg = \"Minimum rate to sleep value exceeded (%d).\"\n            msg += \" High values might cause the client to sleep forever.\"\n            msg += \" Reset to %d.\"\n            self.min_rate_to_sleep = self.MAX_RATE_LIMIT\n            logger.warning(msg, min_rate_to_sleep, self.MAX_RATE_LIMIT)\n        else:\n            self.min_rate_to_sleep = min_rate_to_sleep", "response": "Setup the rate limit handler."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef sleep_for_rate_limit(self):\n        if self.rate_limit is not None and self.rate_limit <= self.min_rate_to_sleep:\n            seconds_to_reset = self.calculate_time_to_reset()\n\n            if seconds_to_reset < 0:\n                logger.warning(\"Value of sleep for rate limit is negative, reset it to 0\")\n                seconds_to_reset = 0\n\n            cause = \"Rate limit exhausted.\"\n            if self.sleep_for_rate:\n                logger.info(\"%s Waiting %i 
secs for rate limit reset.\", cause, seconds_to_reset)\n                time.sleep(seconds_to_reset)\n            else:\n                raise RateLimitError(cause=cause, seconds_to_reset=seconds_to_reset)", "response": "The fetching process sleeps until the rate limit is reset."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nupdates the rate limit and time to reset the rate limit.", "response": "def update_rate_limit(self, response):\n        \"\"\"Update the rate limit and the time to reset\n        from the response headers.\n\n        :param: response: the response object\n        \"\"\"\n        if self.rate_limit_header in response.headers:\n            self.rate_limit = int(response.headers[self.rate_limit_header])\n            logger.debug(\"Rate limit: %s\", self.rate_limit)\n        else:\n            self.rate_limit = None\n\n        if self.rate_limit_reset_header in response.headers:\n            self.rate_limit_reset_ts = int(response.headers[self.rate_limit_reset_header])\n            logger.debug(\"Rate limit reset: %s\", self.calculate_time_to_reset())\n        else:\n            self.rate_limit_reset_ts = None"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fetch_items(self, category, **kwargs):\n        from_date = kwargs['from_date']\n\n        logger.info(\"Fetching messages of '%s' from %s\",\n                    self.uri, str(from_date))\n\n        nmessages = 0\n        archives = self.__retrieve_archives(from_date)\n\n        for archive in archives:\n            logger.debug(\"Parsing supybot archive %s\", archive)\n\n            for message in self.parse_supybot_log(archive):\n                dt = str_to_datetime(message['timestamp'])\n\n                if dt < from_date:\n                    logger.debug(\"Message %s sent before %s; skipped\",\n                                 str(dt), str(from_date))\n                    continue\n\n                yield message\n                nmessages += 1\n\n        logger.info(\"Fetch process completed: %s messages fetched\",\n                    nmessages)", "response": "Fetch the items of the given category from the given date."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nparses a Supybot IRC log file and return an iterator of 
dictionaries. Each dictionary contains a message for each entry in the file.", "response": "def parse_supybot_log(filepath):\n    \"\"\"Parse a Supybot IRC log file.\n\n    The method parses the Supybot IRC log file and returns an iterator of\n    dictionaries. Each one of them contains a message from the file.\n\n    :param filepath: path to the IRC log file\n\n    :returns: a generator of parsed messages\n\n    :raises ParseError: raised when the format of the Supybot log file\n        is invalid\n    :raises OSError: raised when an error occurs reading the\n        given file\n    \"\"\"\n    with open(filepath, 'r', errors='surrogateescape',\n              newline=os.linesep) as f:\n        parser = SupybotParser(f)\n\n        try:\n            for message in parser.parse():\n                yield message\n        except ParseError as e:\n            cause = \"file: %s; reason: %s\" % (filepath, str(e))\n            raise ParseError(cause=cause)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef __retrieve_archives(self, from_date):\n\n        archives = []\n\n        candidates = self.__list_supybot_archives()\n\n        for candidate in candidates:\n            dt = self.__parse_date_from_filepath(candidate)\n\n            if dt.date() >= from_date.date():\n                archives.append((dt, candidate))\n            else:\n                logger.debug(\"Archive %s stored before %s; skipped\",\n                             candidate, str(from_date))\n\n        archives.sort(key=lambda x: x[0])\n\n        return [archive[1] for archive in archives]", "response": "Retrieve the Supybot archives after the given date"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nlist the filepath of the archives stored in dirpath", "response": "def __list_supybot_archives(self):\n        \"\"\"List the filepath of the archives stored in dirpath\"\"\"\n\n        archives = []\n\n        for root, _, files in os.walk(self.dirpath):\n            for filename in files:\n                location = os.path.join(root, filename)\n                archives.append(location)\n\n        return archives"} 
{"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse(self):\n for line in self.stream:\n line = line.rstrip('\\n')\n self.nline += 1\n\n if self.SUPYBOT_EMPTY_REGEX.match(line):\n continue\n\n ts, msg = self._parse_supybot_timestamp(line)\n\n if self.SUPYBOT_EMPTY_COMMENT_REGEX.match(msg):\n continue\n elif self.SUPYBOT_EMPTY_COMMENT_ACTION_REGEX.match(msg):\n continue\n elif self.SUPYBOT_EMPTY_BOT_REGEX.match(msg):\n continue\n\n itype, nick, body = self._parse_supybot_msg(msg)\n item = self._build_item(ts, itype, nick, body)\n\n yield item", "response": "Parse a Supybot IRC stream and return an iterator of dicts. Each dictionary contains information about the date type nick and body of a single log entry."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _parse_supybot_timestamp(self, line):\n\n m = self.SUPYBOT_TIMESTAMP_REGEX.match(line)\n\n if not m:\n msg = \"date expected on line %s\" % (str(self.nline))\n raise ParseError(cause=msg)\n\n ts = m.group('ts')\n msg = m.group('msg')\n\n return ts, msg", "response": "Parse the supybot timestamp section."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nparses the supybot message section.", "response": "def _parse_supybot_msg(self, line):\n \"\"\"Parse message section\"\"\"\n\n patterns = [(self.SUPYBOT_COMMENT_REGEX, self.TCOMMENT),\n (self.SUPYBOT_COMMENT_ACTION_REGEX, self.TCOMMENT),\n (self.SUPYBOT_SERVER_REGEX, self.TSERVER),\n (self.SUPYBOT_BOT_REGEX, self.TCOMMENT)]\n\n for p in patterns:\n m = p[0].match(line)\n if not m:\n continue\n return p[1], m.group('nick'), m.group('body').strip()\n\n msg = \"invalid message on line %s\" % (str(self.nline))\n raise ParseError(cause=msg)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef fetch_items(self, category, **kwargs):\n\n 
from_date = kwargs['from_date']\n\n        logger.info(\"Looking for topics at '%s', updated from '%s'\",\n                    self.url, str(from_date))\n\n        ntopics = 0\n\n        topics_ids = self.__fetch_and_parse_topics_ids(from_date)\n\n        for topic_id in topics_ids:\n            topic = self.__fetch_and_parse_topic(topic_id)\n            ntopics += 1\n            yield topic\n\n        logger.info(\"Fetch process completed: %s topics fetched\",\n                    ntopics)", "response": "Fetch the items from the backend"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef __parse_topics_page(self, raw_json):\n        topics_page = json.loads(raw_json)\n\n        topics_ids = []\n\n        for topic in topics_page['topic_list']['topics']:\n            topic_id = topic['id']\n            if topic['last_posted_at'] is None:\n                logger.warning(\"Topic %s with last_posted_at null. Ignoring it.\", topic['title'])\n                continue\n            updated_at = str_to_datetime(topic['last_posted_at'])\n            pinned = topic['pinned']\n            topics_ids.append((topic_id, updated_at, pinned))\n\n        return topics_ids", "response": "Parse a topics page stream."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nretrieve the #page summaries of the latest topics.", "response": "def topics_page(self, page=None):\n        \"\"\"Retrieve the #page summaries of the latest topics.\n\n        :param page: number of page to retrieve\n        \"\"\"\n        params = {\n            self.PKEY: self.api_key,\n            self.PPAGE: page\n        }\n\n        # http://example.com/latest.json\n        response = self._call(self.ALL_TOPICS, self.TOPICS_SUMMARY,\n                              params=params)\n\n        return response"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef topic(self, topic_id):\n        params = {\n            self.PKEY: self.api_key\n        }\n\n        # http://example.com/t/8.json\n        response = self._call(self.TOPIC, topic_id,\n                              params=params)\n\n        return response", "response": "Retrieve the topic with topic_id identifier."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 
function doing\ndef post(self, post_id):\n        params = {\n            self.PKEY: self.api_key\n        }\n\n        # http://example.com/posts/10.json\n        response = self._call(self.POSTS, post_id,\n                              params=params)\n\n        return response", "response": "Retrieve the post with post_id identifier."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nsanitize the payload of an HTTP request for storing or retrieving archived items.", "response": "def sanitize_for_archive(url, headers, payload):\n        \"\"\"Sanitize payload of an HTTP request by removing the token information\n        before storing/retrieving archived items\n\n        :param: url: HTTP url request\n        :param: headers: HTTP headers request\n        :param: payload: HTTP payload request\n\n        :returns url, headers and the sanitized payload\n        \"\"\"\n        if DiscourseClient.PKEY in payload:\n            payload.pop(DiscourseClient.PKEY)\n\n        return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nruns an API command.", "response": "def _call(self, res, res_id, params):\n        \"\"\"Run an API command.\n\n        :param res: type of resource to fetch\n        :param res_id: identifier of the resource\n        :param params: dict with the HTTP parameters needed to run\n            the given command\n        \"\"\"\n        if res:\n            url = urijoin(self.base_url, res, res_id)\n        else:\n            url = urijoin(self.base_url, res_id)\n        url += self.TJSON\n\n        logger.debug(\"Discourse client calls resource: %s %s params: %s\",\n                     res, res_id, str(params))\n\n        r = self.fetch(url, payload=params)\n        return r.text"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nfetches the items from the backend", "response": "def fetch_items(self, category, **kwargs):\n        \"\"\"Fetch the tasks\n\n        :param category: the category of items to fetch\n        :param kwargs: backend arguments\n\n        :returns: a generator of items\n        \"\"\"\n        from_date = kwargs['from_date']\n\n        logger.info(\"Fetching tasks of '%s' from %s\", self.url, 
str(from_date))\n\n ntasks = 0\n\n for task in self.__fetch_tasks(from_date):\n yield task\n ntasks += 1\n\n logger.info(\"Fetch process completed: %s tasks fetched\", ntasks)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nParse a Phabricator tasks JSON stream and return a list iterator.", "response": "def parse_tasks(raw_json):\n \"\"\"Parse a Phabricator tasks JSON stream.\n\n The method parses a JSON stream and returns a list iterator.\n Each item is a dictionary that contains the task parsed data.\n\n :param raw_json: JSON string to parse\n\n :returns: a generator of parsed tasks\n \"\"\"\n results = json.loads(raw_json)\n\n tasks = results['result']['data']\n for t in tasks:\n yield t"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef parse_users(raw_json):\n results = json.loads(raw_json)\n\n users = results['result']\n for u in users:\n yield u", "response": "Parse a Phabricator users JSON stream and return a list iterator."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef tasks(self, from_date=DEFAULT_DATETIME):\n # Convert 'from_date' to epoch timestamp.\n # Zero value (1970-01-01 00:00:00) is not allowed for\n # 'modifiedStart' so it will be set to 1, by default.\n ts = int(datetime_to_utc(from_date).timestamp()) or 1\n\n consts = {\n self.PMODIFIED_START: ts\n }\n\n attachments = {\n self.PPROJECTS: True\n }\n\n params = {\n self.PCONSTRAINTS: consts,\n self.PATTACHMENTS: attachments,\n self.PORDER: self.VOUTDATED,\n }\n\n while True:\n r = self._call(self.MANIPHEST_TASKS, params)\n yield r\n j = json.loads(r)\n after = j['result']['cursor']['after']\n if not after:\n break\n params[self.PAFTER] = after", "response": "Retrieve all tasks modified since a specific date."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nretrieves tasks transactions. 
:param phids: list of tasks identifiers", "response": "def transactions(self, *phids):\n \"\"\"Retrieve tasks transactions.\n\n :param phids: list of tasks identifiers\n \"\"\"\n params = {\n self.PIDS: phids\n }\n\n response = self._call(self.MANIPHEST_TRANSACTIONS, params)\n\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef users(self, *phids):\n params = {\n self.PHIDS: phids\n }\n\n response = self._call(self.PHAB_USERS, params)\n\n return response", "response": "Retrieve users.\n\n :param phids: list of users identifiers"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef phids(self, *phids):\n params = {\n self.PHIDS: phids\n }\n\n response = self._call(self.PHAB_PHIDS, params)\n\n return response", "response": "Retrieve data about PHIDs."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsanitize the payload of a HTTP request by removing the token information before storing or retrieving archived items", "response": "def sanitize_for_archive(url, headers, payload):\n \"\"\"Sanitize payload of a HTTP request by removing the token information\n before storing/retrieving archived items\n\n :param: url: HTTP url request\n :param: headers: HTTP headers request\n :param: payload: HTTP payload request\n\n :returns url, headers and the sanitized payload\n \"\"\"\n if '__conduit__' in payload['params']:\n params = json.loads(payload['params'])\n params.pop('__conduit__')\n payload['params'] = json.dumps(params, sort_keys=True)\n\n return url, headers, payload"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncalling a method. 
:param method: method to call :param params: dict with the HTTP parameters needed to call the given method :raises ConduitError: when an error is returned by the server", "response": "def _call(self, method, params):\n \"\"\"Call a method.\n\n :param method: method to call\n :param params: dict with the HTTP parameters needed to call\n the given method\n\n :raises ConduitError: when an error is returned by the server\n \"\"\"\n url = self.URL % {'base': self.base_url, 'method': method}\n\n # Conduit and POST parameters\n params['__conduit__'] = {'token': self.api_token}\n\n data = {\n 'params': json.dumps(params, sort_keys=True),\n 'output': 'json',\n '__conduit__': True\n }\n\n logger.debug(\"Phabricator Conduit client requests: %s params: %s\",\n method, str(data))\n\n r = self.fetch(url, payload=data, method=HttpClient.POST, verify=False)\n\n # Check for possible Conduit API errors\n result = r.json()\n\n if result['error_code']:\n raise ConduitError(error=result['error_info'],\n code=result['error_code'])\n\n return r.text"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef fetch_items(self, category, **kwargs):\n\n from_date = kwargs['from_date']\n\n logger.info(\"Fetching historical contents of '%s' from %s\",\n self.url, str(from_date))\n\n nhcs = 0\n\n contents = self.__fetch_contents_summary(from_date)\n contents = [content for content in contents]\n\n for content in contents:\n cid = content['id']\n content_url = urijoin(self.origin, content['_links']['webui'])\n\n hcs = self.__fetch_historical_contents(cid, from_date)\n\n for hc in hcs:\n hc['content_url'] = content_url\n hc['ancestors'] = content.get('ancestors', [])\n\n yield hc\n nhcs += 1\n\n logger.info(\"Fetch process completed: %s historical contents fetched\",\n nhcs)", "response": "Fetch the contents of the items in the given category"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following 
Python 3 code\ndef metadata_id(item):\n cid = item['id']\n cversion = item['version']['number']\n\n return str(cid) + '#v' + str(cversion)", "response": "Extracts the identifier from a Confluence item."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse_contents_summary(raw_json):\n summary = json.loads(raw_json)\n\n contents = summary['results']\n for c in contents:\n yield c", "response": "Parse a Confluence summary JSON list."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef contents(self, from_date=DEFAULT_DATETIME,\n offset=None, max_contents=MAX_CONTENTS):\n \"\"\"Get the contents of a repository.\n\n This method returns an iterator that manages the pagination\n over contents. Take into account that the seconds of `from_date`\n parameter will be ignored because the API only works with\n hours and minutes.\n\n :param from_date: fetch the contents updated since this date\n :param offset: fetch the contents starting from this offset\n :param limit: maximum number of contents to fetch per request\n \"\"\"\n resource = self.RCONTENTS + '/' + self.MSEARCH\n\n # Set confluence query parameter (cql)\n date = from_date.strftime(\"%Y-%m-%d %H:%M\")\n cql = self.VCQL % {'date': date}\n\n # Set parameters\n params = {\n self.PCQL: cql,\n self.PLIMIT: max_contents,\n self.PEXPAND: self.PANCESTORS\n }\n\n if offset:\n params[self.PSTART] = offset\n\n for response in self._call(resource, params):\n yield response", "response": "Get the contents of a repository."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef historical_content(self, content_id, version):\n resource = self.RCONTENTS + '/' + str(content_id)\n\n params = {\n self.PVERSION: version,\n self.PSTATUS: self.VHISTORICAL,\n self.PEXPAND: ','.join(self.VEXPAND)\n }\n\n # Only one item is returned\n response = [response for response in 
self._call(resource, params)]\n return response[0]", "response": "Get the historical content for the given content_id and version."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nretrieves the given resource.", "response": "def _call(self, resource, params):\n \"\"\"Retrieve the given resource.\n\n :param resource: resource to retrieve\n :param params: dict with the HTTP parameters needed to retrieve\n the given resource\n \"\"\"\n url = self.URL % {'base': self.base_url, 'resource': resource}\n\n logger.debug(\"Confluence client requests: %s params: %s\",\n resource, str(params))\n\n while True:\n r = self.fetch(url, payload=params)\n yield r.text\n\n # Pagination is available when 'next' link exists\n j = r.json()\n if '_links' not in j:\n break\n if 'next' not in j['_links']:\n break\n\n url = urijoin(self.base_url, j['_links']['next'])\n params = {}"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _parse_result(self):\n ''' Parse the result property, extracting the value\n and unit of measure '''\n if self.result is not None:\n uom = testXMLAttribute(self.result, \"uom\")\n value_str = testXMLValue(self.result)\n try:\n value = float(value_str)\n except:\n raise ValueError(\"Error parsing measurement value\")\n self.result = Measurement(value, uom)", "response": "Parse the result property and extract the value\n and unit of measure"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn a capabilities url", "response": "def capabilities_url(self, service_url):\n \"\"\"Return a capabilities url\n \"\"\"\n qs = []\n if service_url.find('?') != -1:\n qs = cgi.parse_qsl(service_url.split('?')[1])\n\n params = [x[0] for x in qs]\n\n if 'service' not in params:\n qs.append(('service', 'WFS'))\n if 'request' not in params:\n qs.append(('request', 'GetCapabilities'))\n if 'version' not in params:\n qs.append(('version', self.version))\n\n urlqs = 
urlencode(tuple(qs))\n return service_url.split('?')[0] + '?' + urlqs"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget and parse a WFS capabilities document, returning an instance of WFSCapabilitiesInfoset", "response": "def read(self, url, timeout=30):\n \"\"\"Get and parse a WFS capabilities document, returning an\n instance of WFSCapabilitiesInfoset\n\n Parameters\n ----------\n url : string\n The URL to the WFS capabilities document.\n timeout : number\n A timeout value (in seconds) for the request.\n \"\"\"\n request = self.capabilities_url(url)\n u = openURL(request, timeout=timeout,\n username=self.username, password=self.password)\n return etree.fromstring(u.read())"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nparses a WFS capabilities document, returning an instance of WFSCapabilitiesInfoset", "response": "def readString(self, st):\n \"\"\"Parse a WFS capabilities document, returning an\n instance of WFSCapabilitiesInfoset\n\n string should be an XML capabilities document\n \"\"\"\n if not isinstance(st, str) and not isinstance(st, bytes):\n raise ValueError(\"String must be of type string or bytes, not %s\" % type(st))\n return etree.fromstring(st)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nparsing the result element of the observation type", "response": "def _parse_result(self):\n ''' Parse the result element of the observation type '''\n if self.result is not None:\n result = self.result.find(nspv(\n \"wml2:MeasurementTimeseries\"))\n self.result = MeasurementTimeseries(result)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef conformance(self):\n\n url = self._build_url('conformance')\n LOGGER.debug('Request: {}'.format(url))\n response = requests.get(url, headers=REQUEST_HEADERS).json()\n return response", "response": "Returns the current conformance object."} 
{"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the metadata for a feature collection", "response": "def collection(self, collection_name):\n \"\"\"\n implements Requirement 15 (/req/core/sfc-md-op)\n\n @type collection_name: string\n @param collection_name: name of collection\n\n @returns: feature collection metadata\n \"\"\"\n\n path = 'collections/{}'.format(collection_name)\n url = self._build_url(path)\n LOGGER.debug('Request: {}'.format(url))\n response = requests.get(url, headers=REQUEST_HEADERS).json()\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning a list of items in a collection", "response": "def collection_items(self, collection_name, **kwargs):\n \"\"\"\n implements Requirement 17 (/req/core/fc-op)\n\n @type collection_name: string\n @param collection_name: name of collection\n @type bbox: list\n @param bbox: list of minx,miny,maxx,maxy\n @type time: string\n @param time: time extent or time instant\n @type limit: int\n @param limit: limit number of features\n @type startindex: int\n @param startindex: start position of results\n\n @returns: feature results\n \"\"\"\n\n if 'bbox' in kwargs:\n kwargs['bbox'] = ','.join(kwargs['bbox'])\n\n path = 'collections/{}/items'.format(collection_name)\n url = self._build_url(path)\n LOGGER.debug('Request: {}'.format(url))\n response = requests.get(url, headers=REQUEST_HEADERS,\n params=kwargs).json()\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_schema(url, typename, version='1.0.0', timeout=30, username=None, password=None):\n\n url = _get_describefeaturetype_url(url, version, typename)\n res = openURL(url, timeout=timeout, username=username, password=password)\n root = etree.fromstring(res.read())\n\n if ':' in typename:\n typename = typename.split(':')[1]\n type_element = findall(root, '{%s}element' % XS_NAMESPACE,\n 
attribute_name='name', attribute_value=typename)[0]\n complex_type = type_element.attrib['type'].split(\":\")[1]\n elements = _get_elements(complex_type, root)\n nsmap = None\n if hasattr(root, 'nsmap'):\n nsmap = root.nsmap\n return _construct_schema(elements, nsmap)", "response": "Parses DescribeFeatureType response and creates schema compatible with Fiona"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget elements from XML element", "response": "def _get_elements(complex_type, root):\n \"\"\"Get attribute elements\n \"\"\"\n\n found_elements = []\n element = findall(root, '{%s}complexType' % XS_NAMESPACE,\n attribute_name='name', attribute_value=complex_type)[0]\n found_elements = findall(element, '{%s}element' % XS_NAMESPACE)\n\n return found_elements"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _construct_schema(elements, nsmap):\n\n schema = {\n 'properties': {},\n 'geometry': None\n }\n\n schema_key = None\n gml_key = None\n\n # if nsmap is defined, use it\n if nsmap:\n for key in nsmap:\n if nsmap[key] == XS_NAMESPACE:\n schema_key = key\n if nsmap[key] in GML_NAMESPACES:\n gml_key = key\n # if no nsmap is defined, we have to guess\n else:\n gml_key = 'gml'\n schema_key = 'xsd'\n\n mappings = {\n 'PointPropertyType': 'Point',\n 'PolygonPropertyType': 'Polygon',\n 'LineStringPropertyType': 'LineString',\n 'MultiPointPropertyType': 'MultiPoint',\n 'MultiLineStringPropertyType': 'MultiLineString',\n 'MultiPolygonPropertyType': 'MultiPolygon',\n 'MultiGeometryPropertyType': 'MultiGeometry',\n 'GeometryPropertyType': 'GeometryCollection',\n 'SurfacePropertyType': '3D Polygon',\n 'MultiSurfacePropertyType': '3D MultiPolygon'\n }\n\n for element in elements:\n data_type = element.attrib['type'].replace(gml_key + ':', '')\n name = element.attrib['name']\n\n if data_type in mappings:\n schema['geometry'] = mappings[data_type]\n schema['geometry_column'] = name\n 
else:\n schema['properties'][name] = data_type.replace(schema_key+':', '')\n\n if schema['properties'] or schema['geometry']:\n return schema\n else:\n return None", "response": "Construct a fiona schema based on given elements and nsmap."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets url for describefeaturetype request", "response": "def _get_describefeaturetype_url(url, version, typename):\n \"\"\"Get url for describefeaturetype request\n\n :return str: url\n \"\"\"\n\n query_string = []\n if url.find('?') != -1:\n query_string = cgi.parse_qsl(url.split('?')[1])\n\n params = [x[0] for x in query_string]\n\n if 'service' not in params:\n query_string.append(('service', 'WFS'))\n if 'request' not in params:\n query_string.append(('request', 'DescribeFeatureType'))\n if 'version' not in params:\n query_string.append(('version', version))\n\n query_string.append(('typeName', typename))\n\n urlqs = urlencode(tuple(query_string))\n return url.split('?')[0] + '?' 
+ urlqs"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef complex_input_with_reference():\n \n print(\"\\ncomplex_input_with_reference ...\")\n\n wps = WebProcessingService('http://localhost:8094/wps', verbose=verbose)\n\n processid = 'wordcount'\n textdoc = ComplexDataInput(\"http://www.gutenberg.org/files/28885/28885-h/28885-h.htm\") # alice in wonderland\n inputs = [(\"text\", textdoc)]\n # list of tuple (output identifier, asReference attribute, mimeType attribute)\n # when asReference or mimeType is None - the wps service will use its default option\n outputs = [(\"output\",True,'some/mime-type')]\n\n execution = wps.execute(processid, inputs, output=outputs)\n monitorExecution(execution)\n\n # show status\n print('percent complete', execution.percentCompleted)\n print('status message', execution.statusMessage)\n\n for output in execution.processOutputs:\n print('identifier=%s, dataType=%s, data=%s, reference=%s' % (output.identifier, output.dataType, output.data, output.reference))", "response": "Use ComplexDataInput with a reference to a document\n "} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef movie_list(self, **kwargs):\n path = self._get_path('movie_list')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the list of Movie genres."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef tv_list(self, **kwargs):\n path = self._get_path('tv_list')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the list of TV genres."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef movies(self, **kwargs):\n path = self._get_id_path('movies')\n\n response = self._GET(path, 
kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the list of movies for a particular genre by id. By default only the movies with 10 or more votes are included."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the basic movie information for a specific movie id.", "response": "def info(self, **kwargs):\n \"\"\"\n Get the basic movie information for a specific movie id.\n\n Args:\n language: (optional) ISO 639-1 code.\n append_to_response: (optional) Comma separated, any movie method.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('info')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the alternative titles for a specific movie id.", "response": "def alternative_titles(self, **kwargs):\n \"\"\"\n Get the alternative titles for a specific movie id.\n\n Args:\n country: (optional) ISO 3166-1 code.\n append_to_response: (optional) Comma separated, any movie method.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('alternative_titles')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef credits(self, **kwargs):\n path = self._get_id_path('credits')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the cast and crew information for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef external_ids(self, **kwargs):\n path = self._get_id_path('external_ids')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get 
the external ids for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef images(self, **kwargs):\n path = self._get_id_path('images')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the images for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the plot keywords for a specific movie id.", "response": "def keywords(self):\n \"\"\"\n Get the plot keywords for a specific movie id.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('keywords')\n\n response = self._GET(path)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef recommendations(self, **kwargs):\n path = self._get_id_path('recommendations')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get a list of recommended movies for a movie."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef release_dates(self, **kwargs):\n path = self._get_id_path('release_dates')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the release dates and certification for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef releases(self, **kwargs):\n path = self._get_id_path('releases')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the release date and certification information by country for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the videos for a specific movie id.", "response": "def 
videos(self, **kwargs):\n \"\"\"\n Get the videos (trailers, teasers, clips, etc...) for a\n specific movie id.\n\n Args:\n append_to_response: (optional) Comma separated, any movie method.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('videos')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef translations(self, **kwargs):\n path = self._get_id_path('translations')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the translations for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef similar_movies(self, **kwargs):\n path = self._get_id_path('similar_movies')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the similar movies for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef reviews(self, **kwargs):\n path = self._get_id_path('reviews')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the reviews for a particular movie id."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef changes(self, **kwargs):\n path = self._get_id_path('changes')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the changes for a specific movie id."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting the list of upcoming movies. This list refreshes every day.", "response": "def upcoming(self, **kwargs):\n \"\"\"\n Get the list of upcoming movies. 
This list refreshes every day.\n The maximum number of items this list will include is 100.\n\n Args:\n page: (optional) Minimum value of 1. Expected value is an integer.\n language: (optional) ISO 639-1 code.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('upcoming')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget the list of movies playing in theatres. This list refreshes every day.", "response": "def now_playing(self, **kwargs):\n \"\"\"\n Get the list of movies playing in theatres. This list refreshes\n every day. The maximum number of items this list will include is 100.\n\n Args:\n page: (optional) Minimum value of 1. Expected value is an integer.\n language: (optional) ISO 639-1 code.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('now_playing')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the top rated movies.", "response": "def top_rated(self, **kwargs):\n \"\"\"\n Get the list of top rated movies. By default, this list will only\n include movies that have 10 or more votes. This list refreshes every\n day.\n\n Args:\n page: (optional) Minimum value of 1. 
Expected value is an integer.\n language: (optional) ISO 639-1 code.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('top_rated')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef account_states(self, **kwargs):\n path = self._get_id_path('account_states')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "This method lets users get the rating, watchlist and favorite status the account has for a specific movie."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef rating(self, **kwargs):\n path = self._get_id_path('rating')\n\n payload = {\n 'value': kwargs.pop('value', None),\n }\n\n response = self._POST(path, kwargs, payload)\n self._set_attrs_to_values(response)\n return response", "response": "This method lets users rate a movie. 
A valid session id or guest_session_id is required."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef movie_credits(self, **kwargs):\n path = self._get_id_path('movie_credits')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the movie credits for a specific person id."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef tv_credits(self, **kwargs):\n path = self._get_id_path('tv_credits')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the TV credits for a specific person id."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef combined_credits(self, **kwargs):\n path = self._get_id_path('combined_credits')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the combined credits for a specific person id."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef info(self, **kwargs):\n path = self._get_credit_id_path('info')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Returns the detailed information about a particular credit record."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ndiscovers TV shows by different types of data like average rating, number of votes, genres and network.", "response": "def tv(self, **kwargs):\n \"\"\"\n Discover TV shows by different types of data like average rating, \n number of votes, genres, the network they aired on and air dates.\n\n Args:\n page: (optional) Minimum 1, maximum 1000.\n language: (optional) ISO 639-1 code.\n sort_by: (optional) Available options are 'vote_average.desc', \n 'vote_average.asc', 
'first_air_date.desc', \n 'first_air_date.asc', 'popularity.desc', 'popularity.asc'\n first_air_year: (optional) Filter the results release dates to \n matches that include this value. Expected value \n is a year.\n vote_count.gte or vote_count_gte: (optional) Only include TV shows \n that are equal to,\n or have vote count higher than this value. Expected\n value is an integer.\n vote_average.gte or vote_average_gte: (optional) Only include TV \n shows that are equal \n to, or have a higher average rating than this \n value. Expected value is a float.\n with_genres: (optional) Only include TV shows with the specified \n genres. Expected value is an integer (the id of a \n genre). Multiple values can be specified. Comma \n separated indicates an 'AND' query, while a \n pipe (|) separated value indicates an 'OR'.\n with_networks: (optional) Filter TV shows to include a specific \n network. Expected value is an integer (the id of a\n network). They can be comma separated to indicate an\n 'AND' query.\n first_air_date.gte or first_air_date_gte: (optional) The minimum \n release to include. \n Expected format is 'YYYY-MM-DD'.\n first_air_date.lte or first_air_date_lte: (optional) The maximum \n release to include. \n Expected format is 'YYYY-MM-DD'.\n \n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n # Periods are not allowed in keyword arguments but several API \n # arguments contain periods. 
See both usages in tests/test_discover.py.\n for param in kwargs:\n if '_lte' in param:\n kwargs[param.replace('_lte', '.lte')] = kwargs.pop(param)\n if '_gte' in param:\n kwargs[param.replace('_gte', '.gte')] = kwargs.pop(param)\n \n path = self._get_path('tv')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef info(self, **kwargs):\n path = self._get_path('info')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the system wide configuration info."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef list(self, **kwargs):\n path = self._get_path('movie_list')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Returns the list of supported certifications for movies."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets basic information for an account.", "response": "def info(self, **kwargs):\n \"\"\"\n Get the basic information for an account.\n\n Call this method first, before calling other Account methods.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('info')\n kwargs.update({'session_id': self.session_id})\n\n response = self._GET(path, kwargs)\n self.id = response['id']\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef watchlist_movies(self, **kwargs):\n path = self._get_id_path('watchlist_movies')\n kwargs.update({'session_id': self.session_id})\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the list of movies on an account watchlist."} {"SOURCE": "codesearchnet", 
"instruction": "Create a Python 3 function to\ngenerate a valid request token for user based authentication.", "response": "def token_new(self, **kwargs):\n \"\"\"\n Generate a valid request token for user based authentication.\n\n A request token is required to ask the user for permission to\n access their account.\n\n After obtaining the request_token, either:\n (1) Direct your user to:\n https://www.themoviedb.org/authenticate/REQUEST_TOKEN\n or:\n (2) Call token_validate_with_login() below.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('token_new')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nauthenticating a user with a TMDb username and password and return a dictionary that contains the JSON response from the API.", "response": "def token_validate_with_login(self, **kwargs):\n \"\"\"\n Authenticate a user with a TMDb username and password. 
The user\n must have a verified email address and be registered on TMDb.\n\n Args:\n request_token: The token you generated for the user to approve.\n username: The user's username on TMDb.\n password: The user's password on TMDb.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('token_validate_with_login')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngenerates a new session id for user based authentication.", "response": "def session_new(self, **kwargs):\n \"\"\"\n Generate a session id for user based authentication.\n\n A session id is required in order to use any of the write methods.\n\n Args:\n request_token: The token you generated for the user to approve.\n The token needs to be approved before being\n used here.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('session_new')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ngenerate a guest session id.", "response": "def guest_session_new(self, **kwargs):\n \"\"\"\n Generate a guest session id.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('guest_session_new')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget a list of rated movies for a specific guest session id.", "response": "def rated_movies(self, **kwargs):\n \"\"\"\n Get a list of rated movies for a specific guest session id.\n\n Args:\n page: (optional) Minimum 1, maximum 1000.\n sort_by: (optional) 'created_at.asc' | 'created_at.desc'\n language: (optional) ISO 639-1 
code.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_guest_session_id_path('rated_movies')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck to see if a movie id is already added to a list.", "response": "def item_status(self, **kwargs):\n \"\"\"\n Check to see if a movie id is already added to a list.\n\n Args:\n movie_id: The id of the movie.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('item_status')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef create_list(self, **kwargs):\n path = self._get_path('create_list')\n kwargs.update({'session_id': self.session_id})\n\n payload = {\n 'name': kwargs.pop('name', None), \n 'description': kwargs.pop('description', None),\n }\n if 'language' in kwargs:\n payload['language'] = kwargs['language']\n\n response = self._POST(path, kwargs, payload)\n self._set_attrs_to_values(response)\n return response", "response": "Create a new list."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nremoves a movie from a list that the user created.", "response": "def remove_item(self, **kwargs):\n \"\"\"\n Delete movies from a list that the user created.\n\n A valid session id is required.\n\n Args:\n media_id: A movie id.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('remove_item')\n kwargs.update({'session_id': self.session_id})\n\n payload = {\n 'media_id': kwargs.pop('media_id', None), \n }\n\n response = self._POST(path, kwargs, payload)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef clear_list(self, **kwargs):\n path = self._get_id_path('clear')\n kwargs.update({'session_id': self.session_id})\n\n payload = {}\n\n response = self._POST(path, kwargs, payload)\n self._set_attrs_to_values(response)\n return response", "response": "Clears all of the items within a list."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets the content ratings for a TV Series.", "response": "def content_ratings(self, **kwargs):\n \"\"\"\n Get the content ratings for a TV Series.\n\n Args:\n language: (optional) ISO 639 code.\n append_to_response: (optional) Comma separated, any collection\n method.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('content_ratings')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting the similar TV series for a specific TV series id.", "response": "def similar(self, **kwargs):\n \"\"\"\n Get the similar TV series for a specific TV series id.\n\n Args:\n page: (optional) Minimum value of 1. Expected value is an integer.\n language: (optional) ISO 639-1 code.\n append_to_response: (optional) Comma separated, any TV method.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_id_path('similar')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef on_the_air(self, **kwargs):\n path = self._get_path('on_the_air')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Query the list of TV shows that are currently on the air. 
This is a simple API call."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef airing_today(self, **kwargs):\n path = self._get_path('airing_today')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the list of TV shows that air today."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef info(self, **kwargs):\n path = self._get_series_id_season_number_path('info')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the primary information about a TV season by its season number."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the cast & crew credits for a TV season by season number.", "response": "def credits(self, **kwargs):\n \"\"\"\n Get the cast & crew credits for a TV season by season number.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_series_id_season_number_path('credits')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef external_ids(self, **kwargs):\n path = self._get_series_id_season_number_path('external_ids')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the external ids that we have stored for a TV season by season\n number."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef images(self, **kwargs):\n path = self._get_series_id_season_number_path('images')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the images that we have stored for a TV season by season\n number."} 
{"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget the videos that have been added to a TV season.", "response": "def videos(self, **kwargs):\n \"\"\"\n Get the videos that have been added to a TV season (trailers, teasers,\n etc...).\n\n Args:\n language: (optional) ISO 639 code.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_series_id_season_number_path('videos')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef info(self, **kwargs):\n path = self._get_series_id_season_number_episode_number_path('info')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the primary information about a TV episode by combination of season and episode number."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef credits(self, **kwargs):\n path = self._get_series_id_season_number_episode_number_path('credits')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the TV episode credits by combination of season and episode number."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef external_ids(self, **kwargs):\n path = self._get_series_id_season_number_episode_number_path(\n 'external_ids')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the external ids for a TV episode by combination of a season and episode number."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef images(self, **kwargs):\n path = self._get_series_id_season_number_episode_number_path('images')\n\n response = self._GET(path, kwargs)\n 
self._set_attrs_to_values(response)\n return response", "response": "Get the images for a TV episode by combination of season number and episode number."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef videos(self, **kwargs):\n path = self._get_series_id_season_number_episode_number_path('videos')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Get the videos that have been added to a TV episode."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nset attributes to values.", "response": "def _set_attrs_to_values(self, response={}):\n \"\"\"\n Set attributes to dictionary values.\n\n - e.g.\n >>> import tmdbsimple as tmdb\n >>> movie = tmdb.Movies(103332)\n >>> response = movie.info()\n >>> movie.title # instead of response['title']\n \"\"\"\n if isinstance(response, dict):\n for key in response.keys():\n if not hasattr(self, key) or not callable(getattr(self, key)):\n setattr(self, key, response[key])"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsearches for movies by title.", "response": "def movie(self, **kwargs):\n \"\"\"\n Search for movies by title.\n\n Args:\n query: CGI escaped string.\n page: (optional) Minimum value of 1. Expected value is an integer.\n language: (optional) ISO 639-1 code.\n include_adult: (optional) Toggle the inclusion of adult titles. \n Expected value is True or False.\n year: (optional) Filter the results release dates to matches that \n include this value.\n primary_release_year: (optional) Filter the results so that only \n the primary release dates have this value.\n search_type: (optional) By default, the search type is 'phrase'. \n This is almost guaranteed to be the option you will want. \n It's a great all purpose search type and by far the \n most tuned for every day querying. 
For those wanting \n more of an \"autocomplete\" type search, set this \n option to 'ngram'.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('movie')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nsearch for collections by name.", "response": "def collection(self, **kwargs):\n \"\"\"\n Search for collections by name.\n\n Args:\n query: CGI escaped string.\n page: (optional) Minimum value of 1. Expected value is an integer.\n language: (optional) ISO 639-1 code.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('collection')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsearching for TV shows by title.", "response": "def tv(self, **kwargs):\n \"\"\"\n Search for TV shows by title.\n\n Args:\n query: CGI escaped string.\n page: (optional) Minimum value of 1. Expected value is an integer.\n language: (optional) ISO 639-1 code.\n first_air_date_year: (optional) Filter the results to only match \n shows that have an air date with this value.\n search_type: (optional) By default, the search type is 'phrase'. \n This is almost guaranteed to be the option you will want. \n It's a great all purpose search type and by far the \n most tuned for every day querying. 
For those wanting \n more of an \"autocomplete\" type search, set this \n option to 'ngram'.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('tv')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsearching for people by name.", "response": "def person(self, **kwargs):\n \"\"\"\n Search for people by name.\n\n Args:\n query: CGI escaped string.\n page: (optional) Minimum value of 1. Expected value is an integer.\n include_adult: (optional) Toggle the inclusion of adult titles. \n Expected value is True or False.\n search_type: (optional) By default, the search type is 'phrase'. \n This is almost guaranteed to be the option you will want. \n It's a great all purpose search type and by far the \n most tuned for every day querying. For those wanting \n more of an \"autocomplete\" type search, set this \n option to 'ngram'.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('person')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef company(self, **kwargs):\n path = self._get_path('company')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response", "response": "Search for companies by name."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsearches for keywords by name.", "response": "def keyword(self, **kwargs):\n \"\"\"\n Search for keywords by name.\n\n Args:\n query: CGI escaped string.\n page: (optional) Minimum value of 1. 
Expected value is an integer.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('keyword')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsearch the movie, tv show and person collections with a single query.", "response": "def multi(self, **kwargs):\n \"\"\"\n Search the movie, tv show and person collections with a single query.\n\n Args:\n query: CGI escaped string.\n page: (optional) Minimum value of 1. Expected value is an integer.\n language: (optional) ISO 639-1 code.\n include_adult: (optional) Toggle the inclusion of adult titles.\n Expected value is True or False.\n\n Returns:\n A dict representation of the JSON returned from the API.\n \"\"\"\n path = self._get_path('multi')\n\n response = self._GET(path, kwargs)\n self._set_attrs_to_values(response)\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nnormalizes and tokenizes text. This is lifted from NIST mteval-v11a.pl.", "response": "def normalize(s):\n '''Normalize and tokenize text. 
This is lifted from NIST mteval-v11a.pl.'''\n # Added to bypass NIST-style pre-processing of hyp and ref files -- wade\n if (nonorm):\n return s.split()\n try:\n s.split()\n except:\n s = \" \".join(s)\n # language-independent part:\n for (pattern, replace) in normalize1:\n s = re.sub(pattern, replace, s)\n s = xml.sax.saxutils.unescape(s, {'"':'\"'})\n # language-dependent part (assuming Western languages):\n s = \" %s \" % s\n if not preserve_case:\n s = s.lower() # this might not be identical to the original\n return [tok for tok in normalize3.split(s) if tok and tok != ' ']"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ntakes a list of reference sentences for a single segment and returns an object that encapsulates everything that BLEU needs to know about them.", "response": "def cook_refs(refs, n=4):\n '''Takes a list of reference sentences for a single segment\n and returns an object that encapsulates everything that BLEU\n needs to know about them.'''\n \n refs = [normalize(ref) for ref in refs]\n maxcounts = {}\n for ref in refs:\n counts = count_ngrams(ref, n)\n for (ngram,count) in list(counts.items()):\n maxcounts[ngram] = max(maxcounts.get(ngram,0), count)\n return ([len(ref) for ref in refs], maxcounts)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ntake a list of reference sentences and returns a set that encapsulates everything that bleualign wants.", "response": "def cook_ref_set(ref, n=4):\n '''Takes a reference sentences for a single segment\n and returns an object that encapsulates everything that BLEU\n needs to know about them. 
Also provides a set because bleualign wants it'''\n ref = normalize(ref)\n counts = count_ngrams(ref, n)\n return (len(ref), counts, frozenset(counts))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef align_probability(i, j, source_sentences, target_sentences, alignment, params):\n l_s = sum(source_sentences[i - offset] for offset in range(alignment[0]))\n l_t = sum(target_sentences[j - offset] for offset in range(alignment[1]))\n try:\n # actually, the paper says l_s * params.VARIANCE_CHARACTERS, this is based on the C\n # reference implementation. With l_s in the denominator, insertions are impossible.\n m = (l_s + l_t / params.AVERAGE_CHARACTERS) / 2\n delta = (l_t - l_s * params.AVERAGE_CHARACTERS) / math.sqrt(m * params.VARIANCE_CHARACTERS)\n except ZeroDivisionError:\n delta = infinity\n\n return 2 * (1 - norm_cdf(abs(delta))) * params.PRIORS[alignment]", "response": "Returns the probability of the two sentences C{source_sentences[i]} and C{target_sentences[j]} being aligned with a specific alignment."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef align_blocks(source_sentences, target_sentences, params = LanguageIndependent):\n alignment_types = list(params.PRIORS.keys())\n\n # there are always three rows in the history (with the last of them being filled)\n # and the rows are always |target_text| + 2, so that we never have to do\n # boundary checks\n D = [(len(target_sentences) + 2) * [0] for x in range(2)]\n\n # for the first sentence, only substitution, insertion or deletion are\n # allowed, and they are all equally likely ( == 1)\n\n D.append([0, 1])\n try:\n D[-2][1] = 1\n D[-2][2] = 1\n except:\n pass\n\n backlinks = {}\n\n for i in range(len(source_sentences)):\n for j in range(len(target_sentences)):\n m = []\n for a in alignment_types:\n k = D[-(1 + a[0])][j + 2 - a[1]]\n if k > 0:\n p = k * \\\n align_probability(i, j, 
source_sentences, target_sentences, a, params)\n m.append((p, a))\n\n if len(m) > 0:\n v = max(m)\n backlinks[(i, j)] = v[1]\n D[-1].append(v[0])\n else:\n backlinks[(i, j)] = (1, 1)\n D[-1].append(0)\n\n D.pop(0)\n D.append([0, 0])\n\n return trace(backlinks, source_sentences, target_sentences)", "response": "Creates the sentence alignment of two blocks of texts."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncreate a list of sentences that aligns two texts.", "response": "def align_texts(source_blocks, target_blocks, params = LanguageIndependent):\n \"\"\"Creates the sentence alignment of two texts.\n\n Texts can consist of several blocks. Block boundaries cannot be crossed by sentence \n alignment links. \n\n Each block consists of a list that contains the lengths (in characters) of the sentences\n in this block.\n \n @param source_blocks: The list of blocks in the source text.\n @param target_blocks: The list of blocks in the target text.\n @param params: the sentence alignment parameters.\n\n @returns: A list of sentence alignment lists\n \"\"\"\n if len(source_blocks) != len(target_blocks):\n raise ValueError(\"Source and target texts do not have the same number of blocks.\")\n \n return [align_blocks(source_block, target_block, params) \n for source_block, target_block in zip(source_blocks, target_blocks)]"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef split_at(it, split_value):\n def _chunk_iterator(first):\n v = first\n while v != split_value:\n yield v\n v = next(it)\n \n while True:\n yield _chunk_iterator(next(it))", "response": "Splits an iterator C { it } at values of split_value."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nparses a stream of tokens and splits it into sentences and blocks.", "response": "def parse_token_stream(stream, soft_delimiter, hard_delimiter):\n \"\"\"Parses a stream of tokens and 
splits it into sentences (using C{soft_delimiter} tokens) \n and blocks (using C{hard_delimiter} tokens) for use with the L{align_texts} function.\n \"\"\"\n return [\n [sum(len(token) for token in sentence_it) \n for sentence_it in split_at(block_it, soft_delimiter)]\n for block_it in split_at(stream, hard_delimiter)]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_descriptors_from_module(mdl, submodule=False):\n warnings.warn(\"use get_descriptors_in_module\", DeprecationWarning)\n __all__ = getattr(mdl, \"__all__\", None)\n if __all__ is None:\n __all__ = dir(mdl)\n\n all_functions = (getattr(mdl, name) for name in __all__ if name[:1] != \"_\")\n\n if submodule:\n descs = [\n d\n for fn in all_functions\n if is_descriptor_class(fn) or isinstance(fn, ModuleType)\n for d in (\n [fn] if is_descriptor_class(fn)\n else get_descriptors_from_module(fn, submodule=True)\n )\n ]\n else:\n descs = [\n fn\n for fn in all_functions\n if is_descriptor_class(fn)\n ]\n\n return descs", "response": "r Get all the descriptors from a module."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_descriptors_in_module(mdl, submodule=True):\n __all__ = getattr(mdl, \"__all__\", None)\n if __all__ is None:\n __all__ = dir(mdl)\n\n all_values = (getattr(mdl, name) for name in __all__ if name[:1] != \"_\")\n\n if submodule:\n for v in all_values:\n if is_descriptor_class(v):\n yield v\n if isinstance(v, ModuleType):\n for v in get_descriptors_in_module(v, submodule=True):\n yield v\n\n else:\n for v in all_values:\n if is_descriptor_class(v):\n yield v", "response": "r Returns an iterator over all the descriptors in a module."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nregisters Descriptors from json object objects.", "response": "def register_json(self, obj):\n \"\"\"Register Descriptors from 
json descriptor objects.\n\n Parameters:\n obj(list or dict): descriptors to register\n\n \"\"\"\n if not isinstance(obj, list):\n obj = [obj]\n\n self.register(Descriptor.from_json(j) for j in obj)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef register(self, desc, version=None, ignore_3D=False):\n if version is None:\n version = __version__\n\n version = StrictVersion(version)\n return self._register(desc, version, ignore_3D)", "response": "Register a new descriptor."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef echo(self, s, file=sys.stdout, end=\"\\n\"):\n p = getattr(self, \"_progress_bar\", None)\n if p is not None:\n p.write(s, file=file, end=\"\\n\")\n return\n\n print(s, file=file, end=\"\\n\")", "response": "Output a string to a file."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef pandas(self, mols, nproc=None, nmols=None, quiet=False, ipynb=False, id=-1):\n from .pandas_module import MordredDataFrame, Series\n\n if isinstance(mols, Series):\n index = mols.index\n else:\n index = None\n\n return MordredDataFrame(\n (list(r) for r in self.map(mols, nproc, nmols, quiet, ipynb, id)),\n columns=[str(d) for d in self.descriptors],\n index=index,\n )", "response": "Returns a pandas DataFrame with the descriptors of the given mols."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef is_descriptor_class(desc, include_abstract=False):\n return (\n isinstance(desc, type)\n and issubclass(desc, Descriptor)\n and (True if include_abstract else not inspect.isabstract(desc))\n )", "response": "Check whether a class is a calculatable descriptor class."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nconverts to json serializable dictionary.", "response": "def to_json(self):\n \"\"\"Convert 
to json serializable dictionary.\n\n Returns:\n dict: dictionary of descriptor\n\n \"\"\"\n d, ps = self._to_json()\n if len(ps) == 0:\n return {\"name\": d}\n else:\n return {\"name\": d, \"args\": ps}"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef coord(self):\n if not self.require_3D:\n self.fail(AttributeError(\"use 3D coordinate in 2D descriptor\"))\n\n return self._context.get_coord(self)", "response": "Get 3D coordinate.\n\n Returns:\n numpy.array[3, N]: coordinate matrix"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef rethrow_zerodiv(self):\n with np.errstate(divide=\"raise\", invalid=\"raise\"):\n try:\n yield\n except (FloatingPointError, ZeroDivisionError) as e:\n self.fail(ZeroDivisionError(*e.args))", "response": "Raises ZeroDivisionError and raises a Failure."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef atomic_sa(self, i):\n sa = 4.0 * np.pi * self.rads2[i]\n\n neighbors = self.neighbors.get(i)\n\n if neighbors is None:\n return sa\n\n XYZi = self.xyzs[i, np.newaxis].T\n\n sphere = self.sphere * self.rads[i] + XYZi\n N = sphere.shape[1]\n\n for j, _ in neighbors:\n XYZj = self.xyzs[j, np.newaxis].T\n\n d2 = (sphere - XYZj) ** 2\n mask = (d2[0] + d2[1] + d2[2]) > self.rads2[j]\n sphere = np.compress(mask, sphere, axis=1)\n\n return sa * sphere.shape[1] / N", "response": "r Returns the atomic surface area of the atom with the given index."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef surface_area(self):\n return [self.atomic_sa(i) for i in range(len(self.rads))]", "response": "rCalculate all atomic surface area."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef from_mol(cls, mol, conformer=-1, solvent_radius=1.4, level=4):\n rs = 
atoms_to_numpy(lambda a: vdw_radii[a.GetAtomicNum()] + solvent_radius, mol)\n\n conf = mol.GetConformer(conformer)\n\n ps = np.array([list(conf.GetAtomPosition(i)) for i in range(mol.GetNumAtoms())])\n\n return cls(rs, ps, level)", "response": "rConstruct a SurfaceArea from rdkit Mol type."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a Descriptor instance from a json dict.", "response": "def _Descriptor_from_json(self, obj):\n \"\"\"Create Descriptor instance from json dict.\n\n Parameters:\n obj(dict): descriptor dict\n\n Returns:\n Descriptor: descriptor\n\n \"\"\"\n descs = getattr(self, \"_all_descriptors\", None)\n\n if descs is None:\n from mordred import descriptors\n descs = {\n cls.__name__: cls\n for cls in get_descriptors_in_module(descriptors)\n }\n descs[ConstDescriptor.__name__] = ConstDescriptor\n self._all_descriptors = descs\n\n return _from_json(obj, descs)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fill_missing(self, value=np.nan):\n return self.__class__(\n self.mol,\n [(value if is_missing(v) else v) for v in self.values()],\n self.keys(),\n )", "response": "Returns a new object with the missing values replaced."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef drop_missing(self):\n newvalues = []\n newdescs = []\n for d, v in self.items():\n if not is_missing(v):\n newvalues.append(v)\n newdescs.append(d)\n\n return self.__class__(self.mol, newvalues, newdescs)", "response": "r Delete missing value."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting items. 
Returns an Iterable of tuples.", "response": "def items(self):\n r\"\"\"Get items.\n\n Returns:\n Iterable[(Descriptor, value)]\n\n \"\"\"\n return ((k, v) for k, v in zip(self.keys(), self.values()))"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nconvert Result to dict.", "response": "def asdict(self, rawkey=False):\n r\"\"\"Convert Result to dict.\n\n Parameters:\n rawkey(bool):\n * True: dict key is Descriptor instance\n * False: dict key is str\n\n Returns:\n dict\n\n \"\"\"\n if rawkey:\n return dict(self.items())\n else:\n return {\n str(k): v\n for k, v in self.items()\n }"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef log_calls(func):\n '''Decorator to log function calls.'''\n def wrapper(*args, **kargs):\n callStr = \"%s(%s)\" % (func.__name__, \", \".join([repr(p) for p in args] + [\"%s=%s\" % (k, repr(v)) for (k, v) in list(kargs.items())]))\n debug(\">> %s\", callStr)\n ret = func(*args, **kargs)\n debug(\"<< %s: %s\", callStr, repr(ret))\n return ret\n return wrapper", "response": "Decorator to log function calls."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef synchronized(func):\n '''Decorator to synchronize function.'''\n func.__lock__ = threading.Lock()\n def synced_func(*args, **kargs):\n with func.__lock__:\n return func(*args, **kargs)\n return synced_func", "response": "Decorator to synchronize function."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nshow current progress message to stderr.", "response": "def progress(msg, *args):\n '''Show current progress message to stderr.\n This function will remember the previous message so that next time,\n it will clear the previous message before showing next one.\n '''\n # Don't show any progress if the output is directed to a file.\n if not (sys.stdout.isatty() and 
sys.stderr.isatty()):\n return\n\n text = (msg % args)\n if progress.prev_message:\n sys.stderr.write(' ' * len(progress.prev_message) + '\\r')\n sys.stderr.write(text + '\\r')\n progress.prev_message = text"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef fail(message, exc_info=None, status=1, stacktrace=False):\n '''Utility function to handle runtime failures gracefully.\n Show concise information if possible, then terminate program.\n '''\n text = message\n if exc_info:\n text += str(exc_info)\n error(text)\n if stacktrace:\n error(traceback.format_exc())\n clean_tempfiles()\n if __name__ == '__main__':\n sys.exit(status)\n else:\n raise RuntimeError(status)", "response": "Utility function to handle runtime failures gracefully."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef tempfile_get(target):\n '''Get a temp filename for atomic download.'''\n fn = '%s-%s.tmp' % (target, ''.join(random.Random().sample(\"0123456789abcdefghijklmnopqrstuvwxyz\", 15)))\n TEMP_FILES.add(fn)\n return fn", "response": "Get a temp filename for atomic download."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef clean_tempfiles():\n '''Clean up temp files'''\n for fn in TEMP_FILES:\n if os.path.exists(fn):\n os.unlink(fn)", "response": "Clean up temp files"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_loggers(self):\n '''Return a list of the logger methods: (debug, info, warn, error)'''\n\n return self.log.debug, self.log.info, self.log.warn, self.log.error", "response": "Return a list of the logger methods"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets the fixed part of the path without wildcard", "response": "def get_fixed_path(self):\n '''Get the fixed part of the path without wildcard'''\n 
pi = self.path.split(PATH_SEP)\n fi = []\n for p in pi:\n if '*' in p or '?' in p:\n break\n fi.append(p)\n return PATH_SEP.join(fi)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_legal_params(self, method):\n '''Given a API name, list all legal parameters using boto3 service model.'''\n if method not in self.client.meta.method_to_api_mapping:\n # Injected methods. Ignore.\n return []\n api = self.client.meta.method_to_api_mapping[method]\n shape = self.client.meta.service_model.operation_model(api).input_shape\n if shape is None:\n # No params needed for this API.\n return []\n return shape.members.keys()", "response": "Given a API name list all legal parameters using boto3 service model."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncombine existing parameters with extra options supplied from command line options. Carefully merge special type of parameter if needed.", "response": "def merge_opt_params(self, method, kargs):\n '''Combine existing parameters with extra options supplied from command line\n options. 
Carefully merge special types of parameters if needed.\n '''\n for key in self.legal_params[method]:\n if not hasattr(self.opt, key) or getattr(self.opt, key) is None:\n continue\n if key in kargs and type(kargs[key]) == dict:\n assert(type(getattr(self.opt, key)) == dict)\n # Merge two dictionaries.\n for k, v in getattr(self.opt, key).items():\n kargs[key][k] = v\n else:\n # Overwrite values.\n kargs[key] = getattr(self.opt, key)\n\n return kargs"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef add_options(parser):\n '''Add the whole list of API parameters into optparse.'''\n for param, param_type, param_doc in BotoClient.EXTRA_CLIENT_PARAMS:\n parser.add_option('--API-' + param, help=param_doc, type=param_type, dest=param)", "response": "Add the whole list of API parameters into optparse."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef join(self):\n '''Override original join() with a timeout and handle keyboard interrupt.'''\n self.all_tasks_done.acquire()\n try:\n while self.unfinished_tasks:\n self.all_tasks_done.wait(1000)\n\n # Child thread has exceptions, fail main thread too.\n if self.exc_info:\n fail('[Thread Failure] ', exc_info=self.exc_info)\n except KeyboardInterrupt:\n raise Failure('Interrupted by user')\n finally:\n self.all_tasks_done.release()", "response": "Override original join with a timeout and handle keyboard interrupt."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef terminate(self, exc_info=None):\n '''Terminate all threads by deleting the queue and forcing the child threads\n to quit.\n '''\n if exc_info:\n self.exc_info = exc_info\n try:\n while self.get_nowait():\n self.task_done()\n except Queue.Empty:\n pass", "response": "Terminate all threads by deleting the queue and forcing the child threads\n to quit."} {"SOURCE": "codesearchnet", "instruction": "Make a 
summary of the following Python 3 code\ndef add_task(self, func_name, *args, **kargs):\n '''Utility function to add a single task into task queue'''\n self.tasks.put((func_name, 0, args, kargs))", "response": "Utility function to add a single task into task queue"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef join(self):\n '''Utility function to wait all tasks to complete'''\n self.tasks.join()\n\n # Force each thread to break loop.\n for worker in self.workers:\n self.tasks.put(None)\n\n # Wait for all thread to terminate.\n for worker in self.workers:\n worker.join()\n worker.s3 = None", "response": "Utility function to wait all tasks to complete"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef processed(self):\n '''Increase the processed task counter and show progress message'''\n self.processed_tasks += 1\n qsize = self.tasks.qsize()\n if qsize > 0:\n progress('[%d task(s) completed, %d remaining, %d thread(s)]', self.processed_tasks, qsize, len(self.workers))\n else:\n progress('[%d task(s) completed, %d thread(s)]', self.processed_tasks, len(self.workers))", "response": "Increase the processed task counter and show progress message"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef s3_keys_from_env():\n '''Retrieve S3 access keys from the environment, or None if not present.'''\n env = os.environ\n if S3_ACCESS_KEY_NAME in env and S3_SECRET_KEY_NAME in env:\n keys = (env[S3_ACCESS_KEY_NAME], env[S3_SECRET_KEY_NAME])\n debug(\"read S3 keys from environment\")\n return keys\n else:\n return None", "response": "Retrieve S3 access keys from the environment or None if not present."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nretrieve S3 access keys from the commandline or None if not present.", "response": "def 
s3_keys_from_cmdline(opt):\n '''Retrieve S3 access keys from the command line, or None if not present.'''\n if opt.access_key != None and opt.secret_key != None:\n keys = (opt.access_key, opt.secret_key)\n debug(\"read S3 keys from commandline\")\n return keys\n else:\n return None"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef s3_keys_from_s3cfg(opt):\n '''Retrieve S3 access key settings from s3cmd's config file, if present; otherwise return None.'''\n try:\n if opt.s3cfg != None:\n s3cfg_path = \"%s\" % opt.s3cfg\n else:\n s3cfg_path = \"%s/.s3cfg\" % os.environ[\"HOME\"]\n if not os.path.exists(s3cfg_path):\n return None\n config = ConfigParser.ConfigParser()\n config.read(s3cfg_path)\n keys = config.get(\"default\", \"access_key\"), config.get(\"default\", \"secret_key\")\n debug(\"read S3 keys from %s file\", s3cfg_path)\n return keys\n except Exception as e:\n info(\"could not read S3 keys from %s file; skipping (%s)\", s3cfg_path, e)\n return None", "response": "Retrieve S3 access key settings from s3cmd's config file, if present; otherwise return None."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ninitializing s3 access keys from command line or config file.", "response": "def init_s3_keys(opt):\n '''Initialize s3 access keys from the command line, environment variables, or s3cfg config file.'''\n S3Handler.S3_KEYS = S3Handler.s3_keys_from_cmdline(opt) or S3Handler.s3_keys_from_env() \\\n or S3Handler.s3_keys_from_s3cfg(opt)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef connect(self):\n '''Connect to S3 storage'''\n try:\n if S3Handler.S3_KEYS:\n self.s3 = BotoClient(self.opt, S3Handler.S3_KEYS[0], S3Handler.S3_KEYS[1])\n else:\n self.s3 = BotoClient(self.opt)\n except Exception as e:\n raise RetryFailure('Unable to connect to s3: %s' % e)", "response": "Connect to S3 storage"} {"SOURCE": "codesearchnet", 
"instruction": "Make a summary of the following Python 3 code\ndef s3walk(self, basedir, show_dir=None):\n '''Walk through an S3 directory. This function initiates a walk with a basedir.\n It also supports multiple wildcards.\n '''\n # Provide the default value from command line if no override.\n if not show_dir:\n show_dir = self.opt.show_dir\n\n # Trailing slash normalization: we want ls 's3://foo/bar/' to give\n # the same result as ls 's3://foo/bar'. Since we\n # call partial_match() to check wildcards, we need to ensure the number\n # of slashes stays the same when we do this.\n if basedir[-1] == PATH_SEP:\n basedir = basedir[0:-1]\n\n s3url = S3URL(basedir)\n result = []\n\n pool = ThreadPool(ThreadUtil, self.opt)\n pool.s3walk(s3url, s3url.get_fixed_path(), s3url.path, result)\n pool.join()\n\n # automatic directory detection\n if not show_dir and len(result) == 1 and result[0]['is_dir']:\n path = result[0]['name']\n s3url = S3URL(path)\n result = []\n pool = ThreadPool(ThreadUtil, self.opt)\n pool.s3walk(s3url, s3url.get_fixed_path(), s3url.path, result)\n pool.join()\n\n def compare(x, y):\n '''Comparator for ls output: directories first, then by name.'''\n # Python 3 removed the built-in cmp(), so compare manually.\n if x['is_dir'] != y['is_dir']:\n return -1 if x['is_dir'] else 1\n return (x['name'] > y['name']) - (x['name'] < y['name'])\n return sorted(result, key=cmp_to_key(compare))", "response": "This function initiates a walk with a basedir. 
It also supports multiple wildcards."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nwalks through local directories from root basedir and returns a list of all local files.", "response": "def local_walk(self, basedir):\n '''Walk through local directories from root basedir'''\n result = []\n\n for root, dirs, files in os.walk(basedir):\n for f in files:\n result.append(os.path.join(root, f))\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef source_expand(self, source):\n '''Expand the wildcards for an S3 path. This emulates the shell expansion\n for wildcards if the input is a local path.\n '''\n result = []\n\n if not isinstance(source, list):\n source = [source]\n\n for src in source:\n # XXX Hacky: We need to disable recursive when we expand the input\n # parameters, need to pass this as an override parameter if\n # provided.\n tmp = self.opt.recursive\n self.opt.recursive = False\n result += [f['name'] for f in self.s3walk(src, True)]\n self.opt.recursive = tmp\n\n if (len(result) == 0) and (not self.opt.ignore_empty_source):\n fail(\"[Runtime Failure] Source doesn't exist.\")\n\n return result", "response": "Expand the wildcards for an S3 path. 
This emulates the shell expansion\n for wildcards if the input is a local path."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef put_single_file(self, pool, source, target):\n '''Upload a single file or a directory by adding a task into queue'''\n if os.path.isdir(source):\n if self.opt.recursive:\n for f in (f for f in self.local_walk(source) if not os.path.isdir(f)):\n target_url = S3URL(target)\n # deal with ./ or ../ here by normalizing the path.\n joined_path = os.path.normpath(os.path.join(target_url.path, os.path.relpath(f, source)))\n pool.upload(f, S3URL.combine('s3', target_url.bucket, joined_path))\n else:\n message('omitting directory \"%s\".' % source)\n else:\n pool.upload(source, target)", "response": "Upload a single file or a directory by adding a task into queue"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nuploading files to S3.", "response": "def put_files(self, source, target):\n '''Upload files to S3.\n This function can handle multiple file uploads if source is a list.\n Also, it works for recursive mode which copies all files and keeps the\n directory structure under the given source directory.\n '''\n pool = ThreadPool(ThreadUtil, self.opt)\n if not isinstance(source, list):\n source = [source]\n\n if target[-1] == PATH_SEP:\n for src in source:\n self.put_single_file(pool, src, os.path.join(target, self.get_basename(src)))\n else:\n if len(source) == 1:\n self.put_single_file(pool, source[0], target)\n else:\n raise Failure('Target \"%s\" is not a directory (with a trailing slash).' 
% target)\n\n pool.join()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef create_bucket(self, source):\n '''Use the create_bucket API to create a new bucket'''\n s3url = S3URL(source)\n\n message('Creating %s', source)\n if not self.opt.dry_run:\n resp = self.s3.create_bucket(Bucket=s3url.bucket)\n if resp['ResponseMetadata'][\"HTTPStatusCode\"] == 200:\n message('Done.')\n else:\n raise Failure('Unable to create bucket %s' % source)", "response": "Use the create_bucket API to create a new bucket"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef update_privilege(self, obj, target):\n '''Get privileges from metadata of the source in s3, and apply them to target'''\n if 'privilege' in obj['Metadata']:\n os.chmod(target, int(obj['Metadata']['privilege'], 8))", "response": "Read the privilege metadata of the source object in s3 and apply it to the local target."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef print_files(self, source):\n '''Print out a series of files'''\n sources = self.source_expand(source)\n\n for source in sources:\n s3url = S3URL(source)\n response = self.s3.get_object(Bucket=s3url.bucket, Key=s3url.path)\n message('%s', response['Body'].read())", "response": "Print out a series of files"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ndownloading a single file or a directory by adding a task into queue", "response": "def get_single_file(self, pool, source, target):\n '''Download a single file or a directory by adding a task into queue'''\n if source[-1] == PATH_SEP:\n if self.opt.recursive:\n basepath = S3URL(source).path\n for f in (f for f in self.s3walk(source) if not f['is_dir']):\n pool.download(f['name'], os.path.join(target, os.path.relpath(S3URL(f['name']).path, basepath)))\n else:\n message('omitting directory \"%s\".' 
% source)\n else:\n pool.download(source, target)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ndownload files. This function can handle multiple files if source S3 URL has wildcard characters. It also handles recursive mode by download all files and keep the directory structure.", "response": "def get_files(self, source, target):\n '''Download files.\n This function can handle multiple files if source S3 URL has wildcard\n characters. It also handles recursive mode by download all files and\n keep the directory structure.\n '''\n pool = ThreadPool(ThreadUtil, self.opt)\n source = self.source_expand(source)\n\n if os.path.isdir(target):\n for src in source:\n self.get_single_file(pool, src, os.path.join(target, self.get_basename(S3URL(src).path)))\n else:\n if len(source) > 1:\n raise Failure('Target \"%s\" is not a directory.' % target)\n # Get file if it exists on s3 otherwise do nothing\n elif len(source) == 1:\n self.get_single_file(pool, source[0], target)\n else:\n #Source expand may return empty list only if ignore-empty-source is set to true\n pass\n\n pool.join()"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nremoving remote files that are not present in the local source.", "response": "def delete_removed_files(self, source, target):\n '''Remove remote files that are not present in the local source.\n (Obsolete) It is used for old sync command now.\n '''\n message(\"Deleting files found in %s and not in %s\", source, target)\n if os.path.isdir(source):\n unecessary = []\n basepath = S3URL(target).path\n for f in [f for f in self.s3walk(target) if not f['is_dir']]:\n local_name = os.path.join(source, os.path.relpath(S3URL(f['name']).path, basepath))\n if not os.path.isfile(local_name):\n message(\"%s not found locally, adding to delete queue\", local_name)\n unecessary.append(f['name'])\n if len(unecessary) > 0:\n pool = ThreadPool(ThreadUtil, self.opt)\n for del_file in 
unecessary:\n pool.delete(del_file)\n pool.join()\n else:\n raise Failure('Source \"%s\" is not a directory.' % target)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef cp_single_file(self, pool, source, target, delete_source):\n '''Copy a single file or a directory by adding a task into queue'''\n if source[-1] == PATH_SEP:\n if self.opt.recursive:\n basepath = S3URL(source).path\n for f in (f for f in self.s3walk(source) if not f['is_dir']):\n pool.copy(f['name'], os.path.join(target, os.path.relpath(S3URL(f['name']).path, basepath)), delete_source=delete_source)\n else:\n message('omitting directory \"%s\".' % source)\n else:\n pool.copy(source, target, delete_source=delete_source)", "response": "Copy a single file or a directory by adding a task into queue"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef cp_files(self, source, target, delete_source=False):\n '''Copy files\n This function can handle multiple files if source S3 URL has wildcard\n characters. It also handles recursive mode by copying all files and\n keep the directory structure.\n '''\n pool = ThreadPool(ThreadUtil, self.opt)\n source = self.source_expand(source)\n\n if target[-1] == PATH_SEP:\n for src in source:\n self.cp_single_file(pool, src, os.path.join(target, self.get_basename(S3URL(src).path)), delete_source)\n else:\n if len(source) > 1:\n raise Failure('Target \"%s\" is not a directory (with a trailing slash).' 
% target)\n # Copy file if it exists otherwise do nothing\n elif len(source) == 1:\n self.cp_single_file(pool, source[0], target, delete_source)\n else:\n # Source expand may return empty list only if ignore-empty-source is set to true\n pass\n\n pool.join()", "response": "Copy files from source to target."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef del_files(self, source):\n '''Delete files on S3'''\n src_files = []\n for obj in self.s3walk(source):\n if not obj['is_dir']: # ignore directories\n src_files.append(obj['name'])\n\n pool = ThreadPool(ThreadUtil, self.opt)\n pool.batch_delete(src_files)\n pool.join()", "response": "Delete files on S3"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the size component of the given s3url.", "response": "def size(self, source):\n '''Get the size component of the given s3url. If it is a\n directory, combine the sizes of all the files under\n that directory. 
Subdirectories will not be counted unless\n --recursive option is set.\n '''\n result = []\n for src in self.source_expand(source):\n size = 0\n for f in self.s3walk(src):\n size += f['size']\n result.append((src, size))\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncalculate MD5 hash code for a local file", "response": "def file_hash(self, filename, block_size=2**20):\n '''Calculate MD5 hash code for a local file'''\n m = hashlib.md5()\n with open(filename, 'rb') as f:\n while True:\n data = f.read(block_size)\n if not data:\n break\n m.update(data)\n return m.hexdigest()"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_md5(self):\n '''Get or calculate MD5 value of the local file.'''\n if self.md5 is None:\n self.md5 = self.file_hash(self.filename)\n return self.md5", "response": "Get or calculate MD5 value of the local file."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nensuring all directories are created for a given target file.", "response": "def mkdirs(self, target):\n '''Ensure all directories are created for a given target file.'''\n path = os.path.dirname(target)\n if path and path != PATH_SEP and not os.path.isdir(path):\n # Multi-threading means there will be interleaved execution\n # between the check and creation of the directory.\n try:\n os.makedirs(path)\n except OSError as ose:\n if ose.errno != errno.EEXIST:\n raise Failure('Unable to create directory (%s)' % (path,))"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncheck MD5 for a local file and a remote file.", "response": "def sync_check(self, md5cache, remoteKey):\n '''Check MD5 for a local file and a remote file.\n Return True if they have the same md5 hash, otherwise False.\n '''\n if not remoteKey:\n return False\n if not os.path.exists(md5cache.filename):\n return False\n localmd5 = 
md5cache.get_md5()\n\n # check multiple md5 locations\n return ('ETag' in remoteKey and remoteKey['ETag'] == '\"%s\"' % localmd5) or \\\n ('md5' in remoteKey and remoteKey['md5'] == localmd5) or \\\n ('md5' in remoteKey['Metadata'] and remoteKey['Metadata']['md5'] == localmd5)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef s3walk(self, s3url, s3dir, filter_path, result):\n '''Thread worker for s3walk.\n Recursively walk into all subdirectories if they still match the filter\n path partially.\n '''\n\n paginator = self.s3.get_paginator('list_objects')\n filter_path_level = filter_path.count(PATH_SEP)\n\n for page in paginator.paginate(Bucket=s3url.bucket, Prefix=s3dir, Delimiter=PATH_SEP, PaginationConfig={'PageSize': 1000}):\n # Get subdirectories first.\n for obj in page.get('CommonPrefixes') or []:\n obj_name = obj['Prefix']\n\n if not self.partial_match(obj_name, filter_path):\n continue\n\n if self.opt.recursive or (obj_name.count(PATH_SEP) != filter_path_level + 1):\n self.pool.s3walk(s3url, obj_name, filter_path, result)\n else:\n self.conditional(result, {\n 'name': S3URL.combine(s3url.proto, s3url.bucket, obj_name),\n 'is_dir': True,\n 'size': 0,\n 'last_modified': None\n })\n\n # Then get all items in this folder.\n for obj in page.get('Contents') or []:\n obj_name = obj['Key']\n if not self.partial_match(obj_name, filter_path):\n continue\n\n if self.opt.recursive or obj_name.count(PATH_SEP) == filter_path_level:\n self.conditional(result, {\n 'name': S3URL.combine(s3url.proto, s3url.bucket, obj_name),\n 'is_dir': False,\n 'size': obj['Size'],\n 'last_modified': obj['LastModified']\n })", "response": "Thread worker for s3walk"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks all file item with given conditions.", "response": "def conditional(self, result, obj):\n '''Check all file item with given conditions.'''\n fileonly = 
(self.opt.last_modified_before is not None) or (self.opt.last_modified_after is not None)\n\n if obj['is_dir']:\n if not fileonly:\n result.append(obj)\n return\n\n if (self.opt.last_modified_before is not None) and obj['last_modified'] >= self.opt.last_modified_before:\n return\n\n if (self.opt.last_modified_after is not None) and obj['last_modified'] <= self.opt.last_modified_after:\n return\n\n result.append(obj)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_file_splits(self, id, source, target, fsize, splitsize):\n '''Get file splits for upload/download/copy operation.'''\n pos = 0\n part = 1 # S3 part id starts from 1\n mpi = ThreadUtil.MultipartItem(id)\n splits = []\n\n while pos < fsize:\n chunk = min(splitsize, fsize - pos)\n assert(chunk > 0)\n splits.append((source, target, mpi, pos, chunk, part))\n part += 1\n pos += chunk\n mpi.total = len(splits)\n\n return splits", "response": "Get file splits for upload or download operation."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting privileges of a local file", "response": "def get_file_privilege(self, source):\n '''Get privileges of a local file'''\n try:\n return str(oct(os.stat(source).st_mode)[-3:])\n except Exception as e:\n raise Failure('Could not get stat for %s, error_message = %s', source, e)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef lookup(self, s3url):\n '''Get the s3 object with the S3 URL. Return None if not exist.'''\n try:\n return self.s3.head_object(Bucket=s3url.bucket, Key=s3url.path)\n except BotoClient.ClientError as e:\n if e.response['ResponseMetadata']['HTTPStatusCode'] == 404:\n return None\n else:\n raise e", "response": "Get the s3 object with the S3 URL. 
Return None if not exist."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read_file_chunk(self, source, pos, chunk):\n '''Read local file chunk'''\n if chunk==0:\n return StringIO()\n data = None\n with open(source, 'rb') as f:\n f.seek(pos)\n data = f.read(chunk)\n if not data:\n raise Failure('Unable to read data from source: %s' % source)\n return StringIO(data)", "response": "Read a chunk of data from a local file"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef upload(self, source, target, mpi=None, pos=0, chunk=0, part=0):\n '''Thread worker for upload operation.'''\n s3url = S3URL(target)\n obj = self.lookup(s3url)\n\n # Initialization: Set up multithreaded uploads.\n if not mpi:\n fsize = os.path.getsize(source)\n md5cache = LocalMD5Cache(source)\n\n # optional checks\n if self.opt.dry_run:\n message('%s => %s', source, target)\n return\n elif self.opt.sync_check and self.sync_check(md5cache, obj):\n message('%s => %s (synced)', source, target)\n return\n elif not self.opt.force and obj:\n raise Failure('File already exists: %s' % target)\n\n if fsize < self.opt.max_singlepart_upload_size:\n data = self.read_file_chunk(source, 0, fsize)\n self.s3.put_object(Bucket=s3url.bucket,\n Key=s3url.path,\n Body=data,\n Metadata={'md5': md5cache.get_md5(),\n 'privilege': self.get_file_privilege(source)})\n message('%s => %s', source, target)\n return\n\n # Here we need to have our own md5 value because multipart upload calculates\n # different md5 values.\n response = self.s3.create_multipart_upload(Bucket=s3url.bucket,\n Key=s3url.path,\n Metadata={'md5': md5cache.get_md5(),\n 'privilege': self.get_file_privilege(source)})\n upload_id = response['UploadId']\n\n for args in self.get_file_splits(upload_id, source, target, fsize, self.opt.multipart_split_size):\n self.pool.upload(*args)\n return\n\n data = self.read_file_chunk(source, pos, 
chunk)\n response = self.s3.upload_part(Bucket=s3url.bucket, Key=s3url.path, UploadId=mpi.id, Body=data, PartNumber=part)\n\n # Finalize\n if mpi.complete({'ETag': response['ETag'], 'PartNumber': part}):\n try:\n self.s3.complete_multipart_upload(Bucket=s3url.bucket, Key=s3url.path, UploadId=mpi.id, MultipartUpload={'Parts': mpi.sorted_parts()})\n message('%s => %s', source, target)\n except Exception as e:\n message('Unable to complete upload: %s', str(e))\n self.s3.abort_multipart_upload(Bucket=s3url.bucket, Key=s3url.path, UploadId=mpi.id)\n raise RetryFailure('Upload failed: Unable to complete upload %s.' % source)", "response": "Thread worker for uploading a file."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _verify_file_size(self, obj, downloaded_file):\n '''Verify the file size of the downloaded file.'''\n file_size = os.path.getsize(downloaded_file)\n if int(obj['ContentLength']) != file_size:\n raise RetryFailure('Downloaded file size inconsistent: %s' % (repr(obj)))", "response": "Verify the file size of the downloaded file."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nwriting a chunk of data to a local file.", "response": "def write_file_chunk(self, target, pos, chunk, body):\n '''Write local file chunk'''\n fd = os.open(target, os.O_CREAT | os.O_WRONLY)\n try:\n os.lseek(fd, pos, os.SEEK_SET)\n data = body.read(chunk)\n num_bytes_written = os.write(fd, data)\n if(num_bytes_written != len(data)):\n raise RetryFailure('Number of bytes written inconsistent: %s != %s' % (num_bytes_written, sys.getsizeof(data)))\n finally:\n os.close(fd)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef download(self, source, target, mpi=None, pos=0, chunk=0, part=0):\n '''Thread worker for download operation.'''\n s3url = S3URL(source)\n obj = self.lookup(s3url)\n if obj is None:\n raise Failure('The obj \"%s\" does not exists.' 
% (s3url.path,))\n\n # Initialization: Set up multithreaded downloads.\n if not mpi:\n # optional checks\n if self.opt.dry_run:\n message('%s => %s', source, target)\n return\n elif self.opt.sync_check and self.sync_check(LocalMD5Cache(target), obj):\n message('%s => %s (synced)', source, target)\n return\n elif not self.opt.force and os.path.exists(target):\n raise Failure('File already exists: %s' % target)\n\n fsize = int(obj['ContentLength'])\n\n # Small file optimization.\n if fsize < self.opt.max_singlepart_download_size:\n # Create a single part to chain back main download operation.\n mpi = ThreadUtil.MultipartItem(tempfile_get(target))\n mpi.total = 1\n pos = 0\n chunk = fsize\n # Continue as one part download.\n else:\n # Here we use temp filename as the id of mpi.\n for args in self.get_file_splits(tempfile_get(target), source, target, fsize, self.opt.multipart_split_size):\n self.pool.download(*args)\n return\n\n tempfile = mpi.id\n if self.opt.recursive:\n self.mkdirs(tempfile)\n\n # Download part of the file, range is inclusive.\n response = self.s3.get_object(Bucket=s3url.bucket, Key=s3url.path, Range='bytes=%d-%d' % (pos, pos + chunk - 1))\n self.write_file_chunk(tempfile, pos, chunk, response['Body'])\n\n # Finalize\n if mpi.complete({'PartNumber': part}):\n try:\n self.update_privilege(obj, tempfile)\n self._verify_file_size(obj, tempfile)\n tempfile_set(tempfile, target)\n message('%s => %s', source, target)\n except Exception as e:\n # Note that we don't retry in this case, because\n # We are going to remove the temp file, and if we\n # retry here with original parameters (wrapped in\n # the task item), it would fail anyway\n tempfile_set(tempfile, None)\n raise Failure('Download Failure: %s, Source: %s.' 
% (e, source))", "response": "Thread worker for download operation."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef copy(self, source, target, mpi=None, pos=0, chunk=0, part=0, delete_source=False):\n '''Copy a single file from source to target using the boto3 S3 library.'''\n\n if self.opt.dry_run:\n message('%s => %s' % (source, target))\n return\n\n source_url = S3URL(source)\n target_url = S3URL(target)\n\n if not mpi:\n obj = self.lookup(source_url)\n fsize = int(obj['ContentLength'])\n\n if fsize < self.opt.max_singlepart_copy_size:\n self.s3.copy_object(Bucket=target_url.bucket, Key=target_url.path,\n CopySource={'Bucket': source_url.bucket, 'Key': source_url.path})\n\n message('%s => %s' % (source, target))\n if delete_source:\n self.delete(source)\n\n return\n\n response = self.s3.create_multipart_upload(Bucket=target_url.bucket,\n Key=target_url.path,\n Metadata=obj['Metadata'])\n upload_id = response['UploadId']\n\n for args in self.get_file_splits(upload_id, source, target, fsize, self.opt.multipart_split_size):\n self.pool.copy(*args, delete_source=delete_source)\n return\n\n response = self.s3.upload_part_copy(Bucket=target_url.bucket,\n Key=target_url.path,\n CopySource={'Bucket': source_url.bucket, 'Key': source_url.path},\n CopySourceRange='bytes=%d-%d' % (pos, pos + chunk - 1),\n UploadId=mpi.id,\n PartNumber=part)\n\n if mpi.complete({'ETag': response['CopyPartResult']['ETag'], 'PartNumber': part}):\n try:\n # Finalize copy operation.\n self.s3.complete_multipart_upload(Bucket=target_url.bucket, Key=target_url.path, UploadId=mpi.id, MultipartUpload={'Parts': mpi.sorted_parts()})\n\n if delete_source:\n self.delete(source)\n\n message('%s => %s' % (source, target))\n except Exception as e:\n message('Unable to complete upload: %s', str(e))\n # Abort on the target, where the multipart upload was created.\n self.s3.abort_multipart_upload(Bucket=target_url.bucket, Key=target_url.path, UploadId=mpi.id)\n raise RetryFailure('Copy failed: Unable to complete copy 
%s.' % source)", "response": "Copy a single file from source to target using boto S3 library."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nthread worker for download operation.", "response": "def delete(self, source):\n '''Thread worker for download operation.'''\n s3url = S3URL(source)\n\n message('Delete %s', source)\n if not self.opt.dry_run:\n self.s3.delete_object(Bucket=s3url.bucket, Key=s3url.path)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ndelete a list of files in batch of batch_delete_size ( default = 1000.", "response": "def batch_delete(self, sources):\n '''Delete a list of files in batch of batch_delete_size (default=1000).'''\n assert(type(sources) == list)\n\n if len(sources) == 0:\n return\n elif len(sources) == 1:\n self.delete(sources[0])\n elif len(sources) > self.opt.batch_delete_size:\n for i in range(0, len(sources), self.opt.batch_delete_size):\n self.pool.batch_delete(sources[i:i+self.opt.batch_delete_size])\n else:\n bucket = S3URL(sources[0]).bucket\n deletes = []\n for source in sources:\n s3url = S3URL(source)\n if s3url.bucket != bucket:\n raise Failure('Unable to delete keys in different bucket %s and %s.' % (s3url.bucket, bucket))\n deletes.append({'Key': s3url.path})\n\n response = self.s3.delete_objects(Bucket=bucket, Delete={'Objects': deletes})\n\n # Output result of deletion.\n for res in response.get('Deleted') or []:\n message('Delete %s', S3URL.combine('s3', bucket, res['Key']))\n\n for err in response.get('Errors') or []:\n message('Error deleting %s, code(%s) %s', S3URL.combine('s3', bucket, res['Key']), err['Code'], err['Message'])\n\n if response.get('Errors') is not None:\n raise RetryFailure('Unable to complete deleting %d files.' % len(response.get('Errors')))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef run(self, args):\n '''Main entry to handle commands. 
Dispatch to individual command handler.'''\n if len(args) == 0:\n raise InvalidArgument('No command provided')\n cmd = args[0]\n if cmd + '_handler' in CommandHandler.__dict__:\n CommandHandler.__dict__[cmd + '_handler'](self, args)\n else:\n raise InvalidArgument('Unknown command %s' % cmd)", "response": "Main entry to handle commands. Dispatch to individual command handler."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef validate(self, format, args):\n '''Validate input parameters with given format.\n This function also checks for wildcards for recursive mode.\n '''\n fmtMap = {\n 'cmd': 'Command',\n 's3': 's3 path',\n 'local': 'local path'\n }\n fmts = format.split('|')\n if len(fmts) != len(args):\n raise InvalidArgument('Invalid number of parameters')\n\n for i, fmt in enumerate(fmts):\n valid = False\n for f in fmt.split(','):\n if f == 'cmd' and args[i] + '_handler' in CommandHandler.__dict__:\n valid = True\n if f == 's3' and S3URL.is_valid(args[i]):\n valid = True\n if f == 'local' and not S3URL.is_valid(args[i]):\n valid = True\n if not valid:\n raise InvalidArgument('Invalid parameter: %s, %s expected' % (args[i], fmtMap[fmt.split(',')[0]]))", "response": "Validate input parameters with given format."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef pretty_print(self, objlist):\n '''Pretty print the result of s3walk. 
Here we calculate the maximum width\n of each column and align them.\n '''\n\n def normalize_time(timestamp):\n '''Normalize the timestamp format for pretty print.'''\n if timestamp is None:\n return ' ' * 16\n\n return TIMESTAMP_FORMAT % (timestamp.year, timestamp.month, timestamp.day, timestamp.hour, timestamp.minute)\n\n cwidth = [0, 0, 0]\n format = '%%%ds %%%ds %%-%ds'\n\n # Calculate maximum width for each column.\n result = []\n for obj in objlist:\n last_modified = normalize_time(obj['last_modified'])\n size = str(obj['size']) if not obj['is_dir'] else 'DIR'\n name = obj['name']\n item = (last_modified, size, name)\n for i, value in enumerate(item):\n if cwidth[i] < len(value):\n cwidth[i] = len(value)\n result.append(item)\n\n # Format output.\n for item in result:\n text = (format % tuple(cwidth)) % item\n message('%s', text.rstrip())", "response": "Pretty print the result of s3walk. Here we calculate the maximum width of each column and align them."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef mb_handler(self, args):\n '''Handler for mb command'''\n if len(args) == 1:\n raise InvalidArgument('No s3 bucketname provided')\n\n self.validate('cmd|s3', args)\n self.s3handler().create_bucket(args[1])", "response": "Handler for mb command"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef put_handler(self, args):\n '''Handler for put command'''\n\n # Special check for shell expansion\n if len(args) < 3:\n raise InvalidArgument('Invalid number of parameters')\n self.validate('|'.join(['cmd'] + ['local'] * (len(args) - 2) + ['s3']), args)\n\n source = args[1:-1] # shell expansion\n target = args[-1]\n\n self.s3handler().put_files(source, target)", "response": "Handler for put command."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef cat_handler(self, args):\n '''Handler for cat 
command'''\n\n self.validate('cmd|s3', args)\n source = args[1]\n\n self.s3handler().print_files(source)", "response": "Handler for cat command"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef dsync_handler(self, args):\n '''Handler for dsync command.'''\n self.opt.recursive = True\n self.opt.sync_check = True\n self.opt.force = True\n\n self.validate('cmd|s3,local|s3,local', args)\n source = args[1]\n target = args[2]\n\n self.s3handler().dsync_files(source, target)", "response": "Handler for dsync command."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef cp_handler(self, args):\n '''Handler for cp command'''\n\n self.validate('cmd|s3|s3', args)\n source = args[1]\n target = args[2]\n self.s3handler().cp_files(source, target)", "response": "Handler for cp command"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef mv_handler(self, args):\n '''Handler for mv command'''\n\n self.validate('cmd|s3|s3', args)\n source = args[1]\n target = args[2]\n self.s3handler().cp_files(source, target, delete_source=True)", "response": "Handler for mv command"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef del_handler(self, args):\n '''Handler for del command'''\n self.validate('cmd|s3', args)\n source = args[1]\n self.s3handler().del_files(source)", "response": "Handler for del command"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef du_handler(self, args):\n '''Handler for size command'''\n for src, size in self.s3handler().size(args[1:]):\n message('%s\\t%s' % (size, src))", "response": "Handler for size command"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _totalsize_handler(self, args):\n '''Handler of total_size 
command'''\n total_size = 0\n for src, size in self.s3handler().size(args[1:]):\n total_size += size\n message(str(total_size))", "response": "Handler of total_size command"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsearches for date information in the string", "response": "def match_date(self, value):\n '''Search for date information in the string'''\n m = self.REGEX_DATE.search(value)\n date = datetime.datetime.utcnow().date()\n if m:\n date = datetime.date(int(m.group(1)), int(m.group(2)), int(m.group(3)))\n value = self.REGEX_DATE.sub('', value)\n return (date, value)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsearches for time information in the string", "response": "def match_time(self, value):\n '''Search for time information in the string'''\n m = self.REGEX_TIME.search(value)\n time = datetime.datetime.utcnow().time()\n if m:\n time = datetime.time(int(m.group(1)), int(m.group(2)))\n value = self.REGEX_TIME.sub('', value)\n return (time, value)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef match_delta(self, value):\n '''Search for timedelta information in the string'''\n m = self.REGEX_DELTA.search(value)\n delta = datetime.timedelta(days=0)\n if m:\n d = int(m.group(1))\n if m.group(3) == 'ago' or m.group(3) == 'before':\n d = -d\n\n if m.group(2) == 'minute':\n delta = datetime.timedelta(minutes=d)\n elif m.group(2) == 'hour':\n delta = datetime.timedelta(hours=d)\n elif m.group(2) == 'day':\n delta = datetime.timedelta(days=d)\n elif m.group(2) == 'week':\n delta = datetime.timedelta(weeks=d)\n value = self.REGEX_DELTA.sub('', value)\n return (delta, value)", "response": "Search for timedelta information in the string"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ntake json as dictionary parameter", "response": "def check_dict(self, opt, value):\n 
'''Take json as dictionary parameter'''\n try:\n return json.loads(value)\n except:\n raise optparse.OptionValueError(\"Option %s: invalid dict value: %r\" % (opt, value))"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndiscovering gateways using multicast.", "response": "def discover_gateways(self):\n \"\"\"Discover gateways using multicast\"\"\"\n\n _socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)\n _socket.settimeout(5.0)\n if self._interface != 'any':\n _socket.bind((self._interface, 0))\n\n for gateway in self._gateways_config:\n host = gateway.get('host')\n port = gateway.get('port')\n sid = gateway.get('sid')\n\n if not (host and port and sid):\n continue\n try:\n ip_address = socket.gethostbyname(host)\n if gateway.get('disable'):\n _LOGGER.info(\n 'Xiaomi Gateway %s is disabled by configuration', sid)\n self.disabled_gateways.append(ip_address)\n continue\n _LOGGER.info(\n 'Xiaomi Gateway %s configured at IP %s:%s',\n sid, ip_address, port)\n\n self.gateways[ip_address] = XiaomiGateway(\n ip_address, port, sid,\n gateway.get('key'), self._device_discovery_retries,\n self._interface, gateway.get('proto'))\n except OSError as error:\n _LOGGER.error(\n \"Could not resolve %s: %s\", host, error)\n\n try:\n _socket.sendto('{\"cmd\":\"whois\"}'.encode(),\n (self.MULTICAST_ADDRESS, self.GATEWAY_DISCOVERY_PORT))\n\n while True:\n data, (ip_add, _) = _socket.recvfrom(1024)\n if data is None or ip_add in self.gateways:\n continue\n\n if ip_add in self.gateways.keys() or ip_add in self.disabled_gateways:\n continue\n\n resp = json.loads(data.decode())\n if resp[\"cmd\"] != 'iam':\n _LOGGER.error(\"Response does not match return cmd\")\n continue\n\n if resp[\"model\"] not in GATEWAY_MODELS:\n _LOGGER.error(\"Response must be gateway model\")\n continue\n\n disabled = False\n gateway_key = None\n for gateway in self._gateways_config:\n sid = gateway.get('sid')\n if sid is None or sid == resp[\"sid\"]:\n gateway_key = 
gateway.get('key')\n if sid and sid == resp['sid'] and gateway.get('disable'):\n disabled = True\n\n sid = resp[\"sid\"]\n if disabled:\n _LOGGER.info(\"Xiaomi Gateway %s is disabled by configuration\",\n sid)\n self.disabled_gateways.append(ip_add)\n else:\n _LOGGER.info('Xiaomi Gateway %s found at IP %s', sid, ip_add)\n self.gateways[ip_add] = XiaomiGateway(\n ip_add, resp[\"port\"], sid, gateway_key,\n self._device_discovery_retries, self._interface,\n resp[\"proto_version\"] if \"proto_version\" in resp else None)\n\n except socket.timeout:\n _LOGGER.info(\"Gateway discovery finished in 5 seconds\")\n _socket.close()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef listen(self):\n\n _LOGGER.info('Creating Multicast Socket')\n self._mcastsocket = self._create_mcast_socket()\n self._listening = True\n thread = Thread(target=self._listen_to_msg, args=())\n self._threads.append(thread)\n thread.daemon = True\n thread.start()", "response": "Start listening to messages."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nsend data to the hub.", "response": "def write_to_hub(self, sid, **kwargs):\n \"\"\"Send data to gateway to turn on / off device\"\"\"\n if self.key is None:\n _LOGGER.error('Gateway Key is not provided. Can not send commands to the gateway.')\n return False\n data = {}\n for key in kwargs:\n data[key] = kwargs[key]\n if not self.token:\n _LOGGER.debug('Gateway Token was not obtained yet. 
Cannot send commands to the gateway.')\n return False\n\n cmd = dict()\n cmd['cmd'] = 'write'\n cmd['sid'] = sid\n if int(self.proto[0:1]) == 1:\n data['key'] = self._get_key()\n cmd['data'] = data\n else:\n cmd['key'] = self._get_key()\n cmd['params'] = [data]\n resp = self._send_cmd(json.dumps(cmd), \"write_ack\") if int(self.proto[0:1]) == 1 \\\n else self._send_cmd(json.dumps(cmd), \"write_rsp\")\n _LOGGER.debug(\"write_ack << %s\", resp)\n if _validate_data(resp):\n return True\n if not _validate_keyerror(resp):\n return False\n\n # If 'invalid key' message we ask for a new token\n resp = self._send_cmd('{\"cmd\" : \"get_id_list\"}', \"get_id_list_ack\") if int(self.proto[0:1]) == 1 \\\n else self._send_cmd('{\"cmd\" : \"discovery\"}', \"discovery_rsp\")\n _LOGGER.debug(\"get_id_list << %s\", resp)\n\n if resp is None or \"token\" not in resp:\n _LOGGER.error('No new token from gateway. Can not send commands to the gateway.')\n return False\n self.token = resp['token']\n if int(self.proto[0:1]) == 1:\n data['key'] = self._get_key()\n cmd['data'] = data\n else:\n cmd['key'] = self._get_key()\n cmd['params'] = [data]\n resp = self._send_cmd(json.dumps(cmd), \"write_ack\") if int(self.proto[0:1]) == 1 \\\n else self._send_cmd(json.dumps(cmd), \"write_rsp\")\n _LOGGER.debug(\"write_ack << %s\", resp)\n return _validate_data(resp)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting data from hub.", "response": "def get_from_hub(self, sid):\n \"\"\"Get data from gateway\"\"\"\n cmd = '{ \"cmd\":\"read\",\"sid\":\"' + sid + '\"}'\n resp = self._send_cmd(cmd, \"read_ack\") if int(self.proto[0:1]) == 1 else self._send_cmd(cmd, \"read_rsp\")\n _LOGGER.debug(\"read_ack << %s\", resp)\n return self.push_data(resp)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef push_data(self, data):\n if not _validate_data(data):\n return False\n jdata = json.loads(data['data']) if 
int(self.proto[0:1]) == 1 else _list2map(data['params'])\n if jdata is None:\n return False\n sid = data['sid']\n for func in self.callbacks[sid]:\n func(jdata, data)\n return True", "response": "Push data broadcasted from gateway to device"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_key(self):\n init_vector = bytes(bytearray.fromhex('17996d093d28ddb3ba695a2e6f58562e'))\n encryptor = Cipher(algorithms.AES(self.key.encode()), modes.CBC(init_vector),\n backend=default_backend()).encryptor()\n ciphertext = encryptor.update(self.token.encode()) + encryptor.finalize()\n if isinstance(ciphertext, str): # For Python 2 compatibility\n return ''.join('{:02x}'.format(ord(x)) for x in ciphertext)\n return ''.join('{:02x}'.format(x) for x in ciphertext)", "response": "Get key using token from gateway"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef train(hparams, *args):\n # Initialize experiments and track all the hyperparameters\n exp = Experiment(\n name=hparams.test_tube_exp_name,\n # Location to save the metrics.\n save_dir=hparams.log_path,\n autosave=False,\n )\n exp.argparse(hparams)\n\n # Pretend to train.\n x = torch.rand((1, hparams.x_val))\n for train_step in range(0, 100):\n y = torch.rand((hparams.x_val, 1))\n out = x.mm(y)\n exp.log({'fake_err': out.item()})\n\n # Save exp when done.\n exp.save()", "response": "Train your awesome model with the given hyperparameters."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef train(hparams, *args):\n # Initialize experiments and track all the hyperparameters\n exp = Experiment(\n name=hparams.test_tube_exp_name,\n # Location to save the metrics.\n save_dir=hparams.log_path,\n # The experiment version is optional, but using the one \n # from SLURM means the exp will not collide with other\n # versions if SLURM runs 
multiple at once.\n version=hparams.hpc_exp_number,\n autosave=False,\n )\n exp.argparse(hparams)\n\n # Pretend to train.\n x = hparams.x_val\n for train_step in range(0, 100):\n y = hparams.y_val\n out = x * y\n exp.log({'fake_err': out.item()}) # Log metrics.\n\n # Save exp when done.\n exp.save()", "response": "Train your awesome model with hyperparameters."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncall by RQ when there is a failure in a worker process.", "response": "def exception_handler(job, *exc_info):\n \"\"\"\n Called by RQ when there is a failure in a worker.\n\n NOTE: Make sure that in your RQ worker process, rollbar.init() has been called with\n handler='blocking'. The default handler, 'thread', does not work from inside an RQ worker.\n \"\"\"\n # Report data about the job with the exception.\n job_info = job.to_dict()\n # job_info['data'] is the pickled representation of the job, and doesn't json-serialize well.\n # repr() works nicely.\n job_info['data'] = repr(job_info['data'])\n\n extra_data = {'job': job_info}\n payload_data = {'framework': 'rq'}\n \n rollbar.report_exc_info(exc_info, extra_data=extra_data, payload_data=payload_data)\n\n # continue to the next handler\n return True"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef patch_debugtoolbar(settings):\n try:\n from pyramid_debugtoolbar import tbtools\n except ImportError:\n return\n\n rollbar_web_base = settings.get('rollbar.web_base', DEFAULT_WEB_BASE)\n if rollbar_web_base.endswith('/'):\n rollbar_web_base = rollbar_web_base[:-1]\n\n def insert_rollbar_console(request, html):\n # insert after the closing \n item_uuid = request.environ.get('rollbar.uuid')\n if not item_uuid:\n return html\n\n url = '%s/item/uuid/?uuid=%s' % (rollbar_web_base, item_uuid)\n link = 'View in Rollbar' % url\n new_data = \"
Rollbar: %s
\" % link\n insertion_marker = \"\"\n replacement = insertion_marker + new_data\n return html.replace(insertion_marker, replacement, 1)\n\n # patch tbtools.Traceback.render_full\n old_render_full = tbtools.Traceback.render_full\n\n def new_render_full(self, request, *args, **kw):\n html = old_render_full(self, request, *args, **kw)\n return insert_rollbar_console(request, html)\n\n tbtools.Traceback.render_full = new_render_full", "response": "Patches pyramid_debugtoolbar to display a link to the related rollbar item."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nincludes pyramid entry point in rollbar.", "response": "def includeme(config):\n \"\"\"\n Pyramid entry point\n \"\"\"\n settings = config.registry.settings\n\n config.add_tween('rollbar.contrib.pyramid.rollbar_tween_factory', over=EXCVIEW)\n\n # run patch_debugtoolbar, unless they disabled it\n if asbool(settings.get('rollbar.patch_debugtoolbar', True)):\n patch_debugtoolbar(settings)\n\n def hook(request, data):\n data['framework'] = 'pyramid'\n\n if request:\n request.environ['rollbar.uuid'] = data['uuid']\n\n if request.matched_route:\n data['context'] = request.matched_route.name\n\n rollbar.BASE_DATA_HOOK = hook\n\n kw = parse_settings(settings)\n\n access_token = kw.pop('access_token')\n environment = kw.pop('environment', 'production')\n\n if kw.get('scrub_fields'):\n kw['scrub_fields'] = set([str.strip(x) for x in kw.get('scrub_fields').split('\\n') if x])\n\n if kw.get('exception_level_filters'):\n r = DottedNameResolver()\n exception_level_filters = []\n for line in kw.get('exception_level_filters').split('\\n'):\n if line:\n dotted_path, level = line.split()\n\n try:\n cls = r.resolve(dotted_path)\n exception_level_filters.append((cls, level))\n except ImportError:\n log.error('Could not import %r' % dotted_path)\n\n kw['exception_level_filters'] = exception_level_filters\n\n kw['enabled'] = asbool(kw.get('enabled', True))\n\n rollbar.init(access_token, 
environment, **kw)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _ensure_log_handler(self):\n if log.handlers:\n return\n handler = logging.StreamHandler()\n formatter = logging.Formatter(\n '%(asctime)s %(levelname)-5.5s [%(name)s][%(threadName)s] %(message)s')\n handler.setFormatter(formatter)\n log.addHandler(handler)", "response": "Ensure that the log handler is set up."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_request():\n\n # TODO(cory): add in a generic _get_locals_request() which\n # will iterate up through the call stack and look for a variable\n # that appears to be valid request object.\n for fn in (_get_bottle_request,\n _get_flask_request,\n _get_pyramid_request,\n _get_pylons_request):\n try:\n req = fn()\n if req is not None:\n return req\n except:\n pass\n\n return None", "response": "Get the current request object."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ninitialize the Rollbar module with the given access token and environment.", "response": "def init(access_token, environment='production', scrub_fields=None, url_fields=None, **kw):\n \"\"\"\n Saves configuration variables in this module's SETTINGS.\n\n access_token: project access token. Get this from the Rollbar UI:\n - click \"Settings\" in the top nav\n - click \"Projects\" in the left nav\n - copy-paste the appropriate token.\n environment: environment name. 
Can be any string; suggestions: 'production', 'development',\n 'staging', 'yourname'\n **kw: provided keyword arguments will override keys in SETTINGS.\n \"\"\"\n global SETTINGS, agent_log, _initialized, _transforms, _serialize_transform, _threads\n\n if scrub_fields is not None:\n SETTINGS['scrub_fields'] = list(scrub_fields)\n if url_fields is not None:\n SETTINGS['url_fields'] = list(url_fields)\n\n # Merge the extra config settings into SETTINGS\n SETTINGS = dict_merge(SETTINGS, kw)\n if _initialized:\n # NOTE: Temp solution to not being able to re-init.\n # New versions of pyrollbar will support re-initialization\n # via the (not-yet-implemented) configure() method.\n if not SETTINGS.get('suppress_reinit_warning'):\n log.warning('Rollbar already initialized. Ignoring re-init.')\n return\n\n SETTINGS['access_token'] = access_token\n SETTINGS['environment'] = environment\n\n if SETTINGS.get('allow_logging_basic_config'):\n logging.basicConfig()\n\n if SETTINGS.get('handler') == 'agent':\n agent_log = _create_agent_log()\n\n # We will perform these transforms in order:\n # 1. Serialize the payload to be all python built-in objects\n # 2. Scrub the payloads based on the key suffixes in SETTINGS['scrub_fields']\n # 3. Scrub URLs in the payload for keys that end with 'url'\n # 4. Optional - If local variable gathering is enabled, transform the\n # trace frame values using the ShortReprTransform.\n _serialize_transform = SerializableTransform(safe_repr=SETTINGS['locals']['safe_repr'],\n whitelist_types=SETTINGS['locals']['whitelisted_types'])\n _transforms = [\n ScrubRedactTransform(),\n _serialize_transform,\n ScrubTransform(suffixes=[(field,) for field in SETTINGS['scrub_fields']], redact_char='*'),\n ScrubUrlTransform(suffixes=[(field,) for field in SETTINGS['url_fields']], params_to_scrub=SETTINGS['scrub_fields'])\n ]\n\n # A list of key prefixes to apply our shortener transform to. 
The request\n # being included in the body key is old behavior and is being retained for\n # backwards compatibility.\n shortener_keys = [\n ('request', 'POST'),\n ('request', 'json'),\n ('body', 'request', 'POST'),\n ('body', 'request', 'json'),\n ]\n\n if SETTINGS['locals']['enabled']:\n shortener_keys.append(('body', 'trace', 'frames', '*', 'code'))\n shortener_keys.append(('body', 'trace', 'frames', '*', 'args', '*'))\n shortener_keys.append(('body', 'trace', 'frames', '*', 'kwargs', '*'))\n shortener_keys.append(('body', 'trace', 'frames', '*', 'locals', '*'))\n\n shortener_keys.extend(SETTINGS['shortener_keys'])\n\n shortener = ShortenerTransform(safe_repr=SETTINGS['locals']['safe_repr'],\n keys=shortener_keys,\n **SETTINGS['locals']['sizes'])\n _transforms.append(shortener)\n _threads = queue.Queue()\n events.reset()\n filters.add_builtin_filters(SETTINGS)\n\n _initialized = True"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreport an exception to Rollbar.", "response": "def report_exc_info(exc_info=None, request=None, extra_data=None, payload_data=None, level=None, **kw):\n \"\"\"\n Reports an exception to Rollbar, using exc_info (from calling sys.exc_info())\n\n exc_info: optional, should be the result of calling sys.exc_info(). If omitted, sys.exc_info() will be called here.\n request: optional, a WebOb, Werkzeug-based or Sanic request object.\n extra_data: optional, will be included in the 'custom' section of the payload\n payload_data: optional, dict that will override values in the final payload\n (e.g. 
'level' or 'fingerprint')\n kw: provided for legacy purposes; unused.\n\n Example usage:\n\n rollbar.init(access_token='YOUR_PROJECT_ACCESS_TOKEN')\n try:\n do_something()\n except:\n rollbar.report_exc_info(sys.exc_info(), request, {'foo': 'bar'}, {'level': 'warning'})\n \"\"\"\n if exc_info is None:\n exc_info = sys.exc_info()\n\n try:\n return _report_exc_info(exc_info, request, extra_data, payload_data, level=level)\n except Exception as e:\n log.exception(\"Exception while reporting exc_info to Rollbar. %r\", e)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreport a message to Rollbar.", "response": "def report_message(message, level='error', request=None, extra_data=None, payload_data=None):\n \"\"\"\n Reports an arbitrary string message to Rollbar.\n\n message: the string body of the message\n level: level to report at. One of: 'critical', 'error', 'warning', 'info', 'debug'\n request: the request object for the context of the message\n extra_data: dictionary of params to include with the message. 'body' is reserved.\n payload_data: param names to pass in the 'data' level of the payload; overrides defaults.\n \"\"\"\n try:\n return _report_message(message, level, request, extra_data, payload_data)\n except Exception as e:\n log.exception(\"Exception while reporting message to Rollbar. 
%r\", e)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef send_payload(payload, access_token):\n payload = events.on_payload(payload)\n if payload is False:\n return\n\n payload_str = _serialize_payload(payload)\n\n handler = SETTINGS.get('handler')\n if handler == 'blocking':\n _send_payload(payload_str, access_token)\n elif handler == 'agent':\n agent_log.error(payload_str)\n elif handler == 'tornado':\n if TornadoAsyncHTTPClient is None:\n log.error('Unable to find tornado')\n return\n _send_payload_tornado(payload_str, access_token)\n elif handler == 'gae':\n if AppEngineFetch is None:\n log.error('Unable to find AppEngine URLFetch module')\n return\n _send_payload_appengine(payload_str, access_token)\n elif handler == 'twisted':\n if treq is None:\n log.error('Unable to find Treq')\n return\n _send_payload_twisted(payload_str, access_token)\n else:\n # default to 'thread'\n thread = threading.Thread(target=_send_payload, args=(payload_str, access_token))\n _threads.put(thread)\n thread.start()", "response": "Send a payload to the current thread."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsearching a project for items that match the input criteria.", "response": "def search_items(title, return_fields=None, access_token=None, endpoint=None, **search_fields):\n \"\"\"\n Searches a project for items that match the input criteria.\n\n title: all or part of the item's title to search for.\n return_fields: the fields that should be returned for each item.\n e.g. ['id', 'project_id', 'status'] will return a dict containing\n only those fields for each item.\n access_token: a project access token. 
If this is not provided,\n the one provided to init() will be used instead.\n search_fields: additional fields to include in the search.\n currently supported: status, level, environment\n \"\"\"\n if not title:\n return []\n\n if return_fields is not None:\n return_fields = ','.join(return_fields)\n\n return _get_api('search/',\n title=title,\n fields=return_fields,\n access_token=access_token,\n endpoint=endpoint,\n **search_fields)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncreating a new log file for use with rollbar - agent", "response": "def _create_agent_log():\n \"\"\"\n Creates .rollbar log file for use with rollbar-agent\n \"\"\"\n log_file = SETTINGS['agent.log_file']\n if not log_file.endswith('.rollbar'):\n log.error(\"Provided agent log file does not end with .rollbar, which it must. \"\n \"Using default instead.\")\n log_file = DEFAULTS['agent.log_file']\n\n retval = logging.getLogger('rollbar_agent')\n handler = logging.FileHandler(log_file, 'a', 'utf-8')\n formatter = logging.Formatter('%(message)s')\n handler.setFormatter(formatter)\n retval.addHandler(handler)\n retval.setLevel(logging.WARNING)\n return retval"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _report_exc_info(exc_info, request, extra_data, payload_data, level=None):\n\n if not _check_config():\n return\n\n filtered_level = _filtered_level(exc_info[1])\n if level is None:\n level = filtered_level\n\n filtered_exc_info = events.on_exception_info(exc_info,\n request=request,\n extra_data=extra_data,\n payload_data=payload_data,\n level=level)\n\n if filtered_exc_info is False:\n return\n\n cls, exc, trace = filtered_exc_info\n\n data = _build_base_data(request)\n if level is not None:\n data['level'] = level\n\n # walk the trace chain to collect cause and context exceptions\n trace_chain = _walk_trace_chain(cls, exc, trace)\n\n extra_trace_data = None\n if len(trace_chain) > 1:\n 
data['body'] = {\n 'trace_chain': trace_chain\n }\n if payload_data and ('body' in payload_data) and ('trace' in payload_data['body']):\n extra_trace_data = payload_data['body']['trace']\n del payload_data['body']['trace']\n else:\n data['body'] = {\n 'trace': trace_chain[0]\n }\n\n if extra_data:\n extra_data = extra_data\n if not isinstance(extra_data, dict):\n extra_data = {'value': extra_data}\n if extra_trace_data:\n extra_data = dict_merge(extra_data, extra_trace_data)\n data['custom'] = extra_data\n if extra_trace_data and not extra_data:\n data['custom'] = extra_trace_data\n\n request = _get_actual_request(request)\n _add_request_data(data, request)\n _add_person_data(data, request)\n _add_lambda_context_data(data)\n data['server'] = _build_server_data()\n\n if payload_data:\n data = dict_merge(data, payload_data)\n\n payload = _build_payload(data)\n send_payload(payload, payload.get('access_token'))\n\n return data['uuid']", "response": "Internal function to report exception info."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreports a message to the server.", "response": "def _report_message(message, level, request, extra_data, payload_data):\n \"\"\"\n Called by report_message() wrapper\n \"\"\"\n if not _check_config():\n return\n\n filtered_message = events.on_message(message,\n request=request,\n extra_data=extra_data,\n payload_data=payload_data,\n level=level)\n\n if filtered_message is False:\n return\n\n data = _build_base_data(request, level=level)\n\n # message\n data['body'] = {\n 'message': {\n 'body': filtered_message\n }\n }\n\n if extra_data:\n extra_data = extra_data\n data['body']['message'].update(extra_data)\n\n request = _get_actual_request(request)\n _add_request_data(data, request)\n _add_person_data(data, request)\n _add_lambda_context_data(data)\n data['server'] = _build_server_data()\n\n if payload_data:\n data = dict_merge(data, payload_data)\n\n payload = _build_payload(data)\n 
send_payload(payload, payload.get('access_token'))\n\n return data['uuid']"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nbuilds a dictionary describing the logged-in user using data from the request, trying request.rollbar_person first, then user, then user_id", "response": "def _build_person_data(request):\n \"\"\"\n Returns a dictionary describing the logged-in user using data from `request`.\n\n Try request.rollbar_person first, then 'user', then 'user_id'\n \"\"\"\n if hasattr(request, 'rollbar_person'):\n rollbar_person_prop = request.rollbar_person\n try:\n person = rollbar_person_prop()\n except TypeError:\n person = rollbar_person_prop\n\n if person and isinstance(person, dict):\n return person\n else:\n return None\n\n if hasattr(request, 'user'):\n user_prop = request.user\n try:\n user = user_prop()\n except TypeError:\n user = user_prop\n\n if not user:\n return None\n elif isinstance(user, dict):\n return user\n else:\n retval = {}\n if getattr(user, 'id', None):\n retval['id'] = text(user.id)\n elif getattr(user, 'user_id', None):\n retval['id'] = text(user.user_id)\n\n # id is required, so only include username/email if we have an id\n if retval.get('id'):\n username = getattr(user, 'username', None)\n email = getattr(user, 'email', None)\n retval.update({\n 'username': username,\n 'email': email\n })\n return retval\n\n if hasattr(request, 'user_id'):\n user_id_prop = request.user_id\n try:\n user_id = user_id_prop()\n except TypeError:\n user_id = user_id_prop\n\n if not user_id:\n return None\n return {'id': text(user_id)}"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding lambda context data to the data dict", "response": "def _add_lambda_context_data(data):\n \"\"\"\n Attempts to add information from the lambda context if it exists\n \"\"\"\n global _CURRENT_LAMBDA_CONTEXT\n context = _CURRENT_LAMBDA_CONTEXT\n if context is None:\n return\n try:\n lambda_data = {\n 'lambda': 
{\n 'remaining_time_in_millis': context.get_remaining_time_in_millis(),\n 'function_name': context.function_name,\n 'function_version': context.function_version,\n 'arn': context.invoked_function_arn,\n 'request_id': context.aws_request_id,\n }\n }\n if 'custom' in data:\n data['custom'] = dict_merge(data['custom'], lambda_data)\n else:\n data['custom'] = lambda_data\n except Exception as e:\n log.exception(\"Exception while adding lambda context data: %r\", e)\n finally:\n _CURRENT_LAMBDA_CONTEXT = None"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nbuilding and adds the request data to the data dict.", "response": "def _add_request_data(data, request):\n \"\"\"\n Attempts to build request data; if successful, sets the 'request' key on `data`.\n \"\"\"\n try:\n request_data = _build_request_data(request)\n except Exception as e:\n log.exception(\"Exception while building request_data for Rollbar payload: %r\", e)\n else:\n if request_data:\n _filter_ip(request_data, SETTINGS['capture_ip'])\n data['request'] = request_data"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns True if we should add local variables for the given frame.", "response": "def _check_add_locals(frame, frame_num, total_frames):\n \"\"\"\n Returns True if we should record local variables for the given frame.\n \"\"\"\n # Include the last frames locals\n # Include any frame locals that came from a file in the project's root\n return any(((frame_num == total_frames - 1),\n ('root' in SETTINGS and (frame.get('filename') or '').lower().startswith((SETTINGS['root'] or '').lower()))))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _build_request_data(request):\n\n # webob (pyramid)\n if WebobBaseRequest and isinstance(request, WebobBaseRequest):\n return _build_webob_request_data(request)\n\n # django\n if DjangoHttpRequest and isinstance(request, DjangoHttpRequest):\n 
return _build_django_request_data(request)\n\n # django rest framework\n if RestFrameworkRequest and isinstance(request, RestFrameworkRequest):\n return _build_django_request_data(request)\n\n # werkzeug (flask)\n if WerkzeugRequest and isinstance(request, WerkzeugRequest):\n return _build_werkzeug_request_data(request)\n\n # tornado\n if TornadoRequest and isinstance(request, TornadoRequest):\n return _build_tornado_request_data(request)\n\n # bottle\n if BottleRequest and isinstance(request, BottleRequest):\n return _build_bottle_request_data(request)\n\n # Sanic\n if SanicRequest and isinstance(request, SanicRequest):\n return _build_sanic_request_data(request)\n\n # falcon\n if FalconRequest and isinstance(request, FalconRequest):\n return _build_falcon_request_data(request)\n\n # Plain wsgi (should be last)\n if isinstance(request, dict) and 'wsgi.version' in request:\n return _build_wsgi_request_data(request)\n\n return None", "response": "Builds the data dictionary from the request."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nbuilding server environment data dictionary.", "response": "def _build_server_data():\n \"\"\"\n Returns a dictionary containing information about the server environment.\n \"\"\"\n # server environment\n server_data = {\n 'host': socket.gethostname(),\n 'pid': os.getpid()\n }\n\n # argv does not always exist in embedded python environments\n argv = getattr(sys, 'argv', None)\n if argv:\n server_data['argv'] = argv\n\n for key in ['branch', 'root']:\n if SETTINGS.get(key):\n server_data[key] = SETTINGS[key]\n\n return server_data"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nbuild the full payload for the current user.", "response": "def _build_payload(data):\n \"\"\"\n Returns the full payload as a string.\n \"\"\"\n\n for k, v in iteritems(data):\n data[k] = _transform(v, key=(k,))\n\n payload = {\n 'access_token': SETTINGS['access_token'],\n 'data': data\n }\n\n 
return payload"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef compose(chosung, joongsung, jongsung=u''):\n\n if jongsung is None: jongsung = u''\n\n try:\n chosung_index = CHO.index(chosung)\n joongsung_index = JOONG.index(joongsung)\n jongsung_index = JONG.index(jongsung)\n except Exception:\n raise NotHangulException('No valid Hangul character index')\n\n return unichr(0xAC00 + chosung_index * NUM_JOONG * NUM_JONG + joongsung_index * NUM_JONG + jongsung_index)", "response": "This function returns a Hangul letter by composing the specified chosung and joongsung and jongsung."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef has_jongsung(letter):\n if len(letter) != 1:\n raise Exception('The target string must be one letter.')\n if not is_hangul(letter):\n raise NotHangulException('The target string must be Hangul')\n\n code = lt.hangul_index(letter)\n return code % NUM_JONG > 0", "response": "Check whether this letter contains Jongsung."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef attach(word, josa=EUN_NEUN):\n last_letter = word.strip()[-1]\n try:\n _, _, letter_jong = letter.decompose(last_letter)\n except NotHangulException:\n letter_jong = letter.get_substituent_of(last_letter)\n\n if letter_jong in ('', josa['except']):\n return word + josa['has']\n\n return word + josa['not']", "response": "add josa at the end of this word"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef is_inside_except(node):\n current = node\n while current and not isinstance(current.parent, astroid.ExceptHandler):\n current = current.parent\n\n return current and current is current.parent.name", "response": "Returns true if the given node is inside the name of an except handler."} {"SOURCE": "codesearchnet", "instruction": "Write a 
Python 3 script to\nreturn true if given node is inside lambda", "response": "def is_inside_lambda(node: astroid.node_classes.NodeNG) -> bool:\n \"\"\"Return true if given node is inside lambda\"\"\"\n parent = node.parent\n while parent is not None:\n if isinstance(parent, astroid.Lambda):\n return True\n parent = parent.parent\n return False"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck if an assignment node in an except handler clobbers an existing variable.", "response": "def clobber_in_except(\n node: astroid.node_classes.NodeNG\n) -> Tuple[bool, Tuple[str, str]]:\n \"\"\"Checks if an assignment node in an except handler clobbers an existing\n variable.\n\n Returns (True, args for W0623) if assignment clobbers an existing variable,\n (False, None) otherwise.\n \"\"\"\n if isinstance(node, astroid.AssignAttr):\n return True, (node.attrname, \"object %r\" % (node.expr.as_string(),))\n if isinstance(node, astroid.AssignName):\n name = node.name\n if is_builtin(name):\n return (True, (name, \"builtins\"))\n\n stmts = node.lookup(name)[1]\n if stmts and not isinstance(\n stmts[0].assign_type(),\n (astroid.Assign, astroid.AugAssign, astroid.ExceptHandler),\n ):\n return True, (name, \"outer scope (line %s)\" % stmts[0].fromlineno)\n return False, None"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef is_super(node: astroid.node_classes.NodeNG) -> bool:\n if getattr(node, \"name\", None) == \"super\" and node.root().name == BUILTINS_NAME:\n return True\n return False", "response": "Returns True if the node is referencing the super builtin function\n "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning true if the function does nothing but raising an exception", "response": "def is_error(node: astroid.node_classes.NodeNG) -> bool:\n \"\"\"return true if the function does nothing but raising an exception\"\"\"\n for child_node in 
node.get_children():\n if isinstance(child_node, astroid.Raise):\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning True if the given node is an object from the builtin module.", "response": "def is_builtin_object(node: astroid.node_classes.NodeNG) -> bool:\n \"\"\"Returns True if the given node is an object from the __builtin__ module.\"\"\"\n return node and node.root().name == BUILTINS_NAME"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef is_defined_before(var_node: astroid.node_classes.NodeNG) -> bool:\n varname = var_node.name\n _node = var_node.parent\n while _node:\n if is_defined_in_scope(var_node, varname, _node):\n return True\n _node = _node.parent\n # possibly multiple statements on the same line using semi colon separator\n stmt = var_node.statement()\n _node = stmt.previous_sibling()\n lineno = stmt.fromlineno\n while _node and _node.fromlineno == lineno:\n for assign_node in _node.nodes_of_class(astroid.AssignName):\n if assign_node.name == varname:\n return True\n for imp_node in _node.nodes_of_class((astroid.ImportFrom, astroid.Import)):\n if varname in [name[1] or name[0] for name in imp_node.names]:\n return True\n _node = _node.previous_sibling()\n return False", "response": "return True if the variable node is defined before the given node"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn true if the given Name node is used in function or lambda default argument s value", "response": "def is_default_argument(node: astroid.node_classes.NodeNG) -> bool:\n \"\"\"return true if the given Name node is used in function or lambda\n default argument's value\n \"\"\"\n parent = node.scope()\n if isinstance(parent, (astroid.FunctionDef, astroid.Lambda)):\n for default_node in parent.args.defaults:\n for default_name_node in default_node.nodes_of_class(astroid.Name):\n if 
default_name_node is node:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef is_func_decorator(node: astroid.node_classes.NodeNG) -> bool:\n parent = node.parent\n while parent is not None:\n if isinstance(parent, astroid.Decorators):\n return True\n if parent.is_statement or isinstance(\n parent,\n (astroid.Lambda, scoped_nodes.ComprehensionScope, scoped_nodes.ListComp),\n ):\n break\n parent = parent.parent\n return False", "response": "return true if the name is used in function decorator"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef is_ancestor_name(\n frame: astroid.node_classes.NodeNG, node: astroid.node_classes.NodeNG\n) -> bool:\n \"\"\"return True if `frame` is an astroid.Class node with `node` in the\n subtree of its bases attribute\n \"\"\"\n try:\n bases = frame.bases\n except AttributeError:\n return False\n for base in bases:\n if node in base.nodes_of_class(astroid.Name):\n return True\n return False", "response": "Returns True if frame is a class node with node in the subtree of its bases attribute."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the higher parent which is not an AssignName Tuple or List node", "response": "def assign_parent(node: astroid.node_classes.NodeNG) -> astroid.node_classes.NodeNG:\n \"\"\"return the higher parent which is not an AssignName, Tuple or List node\n \"\"\"\n while node and isinstance(node, (astroid.AssignName, astroid.Tuple, astroid.List)):\n node = node.parent\n return node"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef overrides_a_method(class_node: 
astroid.node_classes.NodeNG, name: str) -> bool:\n for ancestor in class_node.ancestors():\n if name in ancestor and isinstance(ancestor[name], astroid.FunctionDef):\n return True\n return False", "response": "return True if is a method overridden from an ancestor"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef check_messages(*messages: str) -> Callable:\n\n def store_messages(func):\n func.checks_msgs = messages\n return func\n\n return store_messages", "response": "decorator to store messages that are handled by a checker method"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nparse a format string into a set of mapping keys and arguments.", "response": "def parse_format_string(\n format_string: str\n) -> Tuple[Set[str], int, Dict[str, str], List[str]]:\n \"\"\"Parses a format string, returning a tuple of (keys, num_args), where keys\n is the set of mapping keys in the format string, and num_args is the number\n of arguments required by the format string. 
Raises\n IncompleteFormatString or UnsupportedFormatCharacter if a\n parse error occurs.\"\"\"\n keys = set()\n key_types = dict()\n pos_types = []\n num_args = 0\n\n def next_char(i):\n i += 1\n if i == len(format_string):\n raise IncompleteFormatString\n return (i, format_string[i])\n\n i = 0\n while i < len(format_string):\n char = format_string[i]\n if char == \"%\":\n i, char = next_char(i)\n # Parse the mapping key (optional).\n key = None\n if char == \"(\":\n depth = 1\n i, char = next_char(i)\n key_start = i\n while depth != 0:\n if char == \"(\":\n depth += 1\n elif char == \")\":\n depth -= 1\n i, char = next_char(i)\n key_end = i - 1\n key = format_string[key_start:key_end]\n\n # Parse the conversion flags (optional).\n while char in \"#0- +\":\n i, char = next_char(i)\n # Parse the minimum field width (optional).\n if char == \"*\":\n num_args += 1\n i, char = next_char(i)\n else:\n while char in string.digits:\n i, char = next_char(i)\n # Parse the precision (optional).\n if char == \".\":\n i, char = next_char(i)\n if char == \"*\":\n num_args += 1\n i, char = next_char(i)\n else:\n while char in string.digits:\n i, char = next_char(i)\n # Parse the length modifier (optional).\n if char in \"hlL\":\n i, char = next_char(i)\n # Parse the conversion type (mandatory).\n if PY3K:\n flags = \"diouxXeEfFgGcrs%a\"\n else:\n flags = \"diouxXeEfFgGcrs%\"\n if char not in flags:\n raise UnsupportedFormatCharacter(i)\n if key:\n keys.add(key)\n key_types[key] = char\n elif char != \"%\":\n num_args += 1\n pos_types.append(char)\n i += 1\n return keys, num_args, key_types, pos_types"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef collect_string_fields(format_string) -> Iterable[Optional[str]]:\n formatter = string.Formatter()\n try:\n parseiterator = formatter.parse(format_string)\n for result in parseiterator:\n if all(item is None for item in result[1:]):\n # not a replacement format\n continue\n 
name = result[1]\n nested = result[2]\n yield name\n if nested:\n for field in collect_string_fields(nested):\n yield field\n except ValueError as exc:\n # Probably the format string is invalid.\n if exc.args[0].startswith(\"cannot switch from manual\"):\n # On Jython, parsing a string with both manual\n # and automatic positions will fail with a ValueError,\n # while on CPython it will simply return the fields,\n # the validation being done in the interpreter (?).\n # We're just returning two mixed fields in order\n # to trigger the format-combined-specification check.\n yield \"\"\n yield \"1\"\n return\n raise IncompleteFormatString(format_string)", "response": "Given a format string return an iterator of all the valid format fields."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef parse_format_method_string(\n format_string: str\n) -> Tuple[List[Tuple[str, List[Tuple[bool, str]]]], int, int]:\n \"\"\"\n Parses a PEP 3101 format string, returning a tuple of\n (keyword_arguments, implicit_pos_args_cnt, explicit_pos_args),\n where keyword_arguments is the set of mapping keys in the format string, implicit_pos_args_cnt\n is the number of arguments required by the format string and\n explicit_pos_args is the number of arguments passed with the position.\n \"\"\"\n keyword_arguments = []\n implicit_pos_args_cnt = 0\n explicit_pos_args = set()\n for name in collect_string_fields(format_string):\n if name and str(name).isdigit():\n explicit_pos_args.add(str(name))\n elif name:\n keyname, fielditerator = split_format_field_names(name)\n if isinstance(keyname, numbers.Number):\n # In Python 2 it will return long which will lead\n # to different output between 2 and 3\n explicit_pos_args.add(str(keyname))\n keyname = int(keyname)\n try:\n keyword_arguments.append((keyname, list(fielditerator)))\n except ValueError:\n raise IncompleteFormatString()\n else:\n implicit_pos_args_cnt += 1\n return 
keyword_arguments, implicit_pos_args_cnt, len(explicit_pos_args)", "response": "Parses a PEP 3101 format string into a tuple of (keyword_arguments, implicit_pos_args_cnt, explicit_pos_args)."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef is_attr_protected(attrname: str) -> bool:\n return (\n attrname[0] == \"_\"\n and attrname != \"_\"\n and not (attrname.startswith(\"__\") and attrname.endswith(\"__\"))\n )", "response": "Returns True if the given attribute name is protected."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef node_frame_class(\n node: astroid.node_classes.NodeNG\n) -> Optional[astroid.node_classes.NodeNG]:\n \"\"\"return klass node for a method node (or a staticmethod or a\n classmethod), return null otherwise\n \"\"\"\n klass = node.frame()\n\n while klass is not None and not isinstance(klass, astroid.ClassDef):\n if klass.parent is None:\n klass = None\n else:\n klass = klass.parent.frame()\n\n return klass", "response": "return the class node for a method node (or a staticmethod or a classmethod), None otherwise"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking that the given attribute name is private.", "response": "def is_attr_private(attrname: str) -> Optional[Match[str]]:\n \"\"\"Check that attribute name is private (at least two leading underscores,\n at most one trailing underscore)\n \"\"\"\n regex = re.compile(\"^_{2,}.*[^_]+_?$\")\n return regex.match(attrname)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_argument_from_call(\n call_node: astroid.Call, position: int = None, keyword: str = None\n) -> astroid.Name:\n \"\"\"Returns the specified argument from a function call.\n\n :param astroid.Call call_node: Node representing a function call to check.\n :param int position: position of the argument.\n :param str keyword: the keyword 
of the argument.\n\n :returns: The node representing the argument, None if the argument is not found.\n :rtype: astroid.Name\n :raises ValueError: if both position and keyword are None.\n :raises NoSuchArgumentError: if no argument at the provided position or with\n the provided keyword.\n \"\"\"\n if position is None and keyword is None:\n raise ValueError(\"Must specify at least one of: position or keyword.\")\n if position is not None:\n try:\n return call_node.args[position]\n except IndexError:\n pass\n if keyword and call_node.keywords:\n for arg in call_node.keywords:\n if arg.arg == keyword:\n return arg.value\n\n raise NoSuchArgumentError", "response": "Returns the argument from a function call."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef inherit_from_std_ex(node: astroid.node_classes.NodeNG) -> bool:\n ancestors = node.ancestors() if hasattr(node, \"ancestors\") else []\n for ancestor in itertools.chain([node], ancestors):\n if (\n ancestor.name in (\"Exception\", \"BaseException\")\n and ancestor.root().name == EXCEPTIONS_MODULE\n ):\n return True\n return False", "response": "Returns true if the given class node is subclass of exceptions. 
Exception."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if the given exception handler catches any of the given error_type.", "response": "def error_of_type(handler: astroid.ExceptHandler, error_type) -> bool:\n \"\"\"\n Check if the given exception handler catches\n the given error_type.\n\n The *handler* parameter is a node, representing an ExceptHandler node.\n The *error_type* can be an exception, such as AttributeError,\n the name of an exception, or it can be a tuple of errors.\n The function will return True if the handler catches any of the\n given errors.\n \"\"\"\n\n def stringify_error(error):\n if not isinstance(error, str):\n return error.__name__\n return error\n\n if not isinstance(error_type, tuple):\n error_type = (error_type,) # type: ignore\n expected_errors = {stringify_error(error) for error in error_type} # type: ignore\n if not handler.type:\n return True\n return handler.catch(expected_errors)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ndetect if the given function node is decorated with a property.", "response": "def decorated_with_property(node: astroid.FunctionDef) -> bool:\n \"\"\" Detect if the given function node is decorated with a property. 
\"\"\"\n if not node.decorators:\n return False\n for decorator in node.decorators.nodes:\n if not isinstance(decorator, astroid.Name):\n continue\n try:\n if _is_property_decorator(decorator):\n return True\n except astroid.InferenceError:\n pass\n return False"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndetermining if the function node has a decorator with the qualified name argName.", "response": "def decorated_with(func: astroid.FunctionDef, qnames: Iterable[str]) -> bool:\n \"\"\"Determine if the `func` node has a decorator with the qualified name `qname`.\"\"\"\n decorators = func.decorators.nodes if func.decorators else []\n for decorator_node in decorators:\n try:\n if any(\n i is not None and i.qname() in qnames for i in decorator_node.infer()\n ):\n return True\n except astroid.InferenceError:\n continue\n return False"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a dictionary of unimplemented abstract methods for the given node.", "response": "def unimplemented_abstract_methods(\n node: astroid.node_classes.NodeNG, is_abstract_cb: astroid.FunctionDef = None\n) -> Dict[str, astroid.node_classes.NodeNG]:\n \"\"\"\n Get the unimplemented abstract methods for the given *node*.\n\n A method can be considered abstract if the callback *is_abstract_cb*\n returns a ``True`` value. The check defaults to verifying that\n a method is decorated with abstract methods.\n The function will work only for new-style classes. 
For old-style\n classes, it will simply return an empty dictionary.\n For the rest of them, it will return a dictionary of abstract method\n names and their inferred objects.\n \"\"\"\n if is_abstract_cb is None:\n is_abstract_cb = partial(decorated_with, qnames=ABC_METHODS)\n visited = {} # type: Dict[str, astroid.node_classes.NodeNG]\n try:\n mro = reversed(node.mro())\n except NotImplementedError:\n # Old style class, it will not have a mro.\n return {}\n except astroid.ResolveError:\n # Probably inconsistent hierarchy, don't try\n # to figure this out here.\n return {}\n for ancestor in mro:\n for obj in ancestor.values():\n infered = obj\n if isinstance(obj, astroid.AssignName):\n infered = safe_infer(obj)\n if not infered:\n # Might be an abstract function,\n # but since we don't have enough information\n # in order to take this decision, we're taking\n # the *safe* decision instead.\n if obj.name in visited:\n del visited[obj.name]\n continue\n if not isinstance(infered, astroid.FunctionDef):\n if obj.name in visited:\n del visited[obj.name]\n if isinstance(infered, astroid.FunctionDef):\n # It's critical to use the original name,\n # since after inferring, an object can be something\n # else than expected, as in the case of the\n # following assignment.\n #\n # class A:\n # def keys(self): pass\n # __iter__ = keys\n abstract = is_abstract_cb(infered)\n if abstract:\n visited[obj.name] = infered\n elif not abstract and obj.name in visited:\n del visited[obj.name]\n return visited"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef find_try_except_wrapper_node(\n node: astroid.node_classes.NodeNG\n) -> Union[astroid.ExceptHandler, astroid.TryExcept]:\n \"\"\"Return the ExceptHandler or the TryExcept node in which the node is.\"\"\"\n current = node\n ignores = (astroid.ExceptHandler, astroid.TryExcept)\n while current and not isinstance(current.parent, ignores):\n current = current.parent\n\n if 
current and isinstance(current.parent, ignores):\n return current.parent\n return None", "response": "Return the ExceptHandler or the TryExcept node in which the node is."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking if the given node is from a fallback import block.", "response": "def is_from_fallback_block(node: astroid.node_classes.NodeNG) -> bool:\n \"\"\"Check if the given node is from a fallback import block.\"\"\"\n context = find_try_except_wrapper_node(node)\n if not context:\n return False\n\n if isinstance(context, astroid.ExceptHandler):\n other_body = context.parent.body\n handlers = context.parent.handlers\n else:\n other_body = itertools.chain.from_iterable(\n handler.body for handler in context.handlers\n )\n handlers = context.handlers\n\n has_fallback_imports = any(\n isinstance(import_node, (astroid.ImportFrom, astroid.Import))\n for import_node in other_body\n )\n ignores_import_error = _except_handlers_ignores_exception(handlers, ImportError)\n return ignores_import_error or has_fallback_imports"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_exception_handlers(\n node: astroid.node_classes.NodeNG, exception=Exception\n) -> List[astroid.ExceptHandler]:\n \"\"\"Return the collections of handlers handling the exception in arguments.\n\n Args:\n node (astroid.NodeNG): A node that is potentially wrapped in a try except.\n exception (builtin.Exception or str): exception or name of the exception.\n\n Returns:\n list: the collection of handlers that are handling the exception or None.\n\n \"\"\"\n context = find_try_except_wrapper_node(node)\n if isinstance(context, astroid.TryExcept):\n return [\n handler for handler in context.handlers if error_of_type(handler, exception)\n ]\n return None", "response": "Returns the list of handlers handling the exception in arguments."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 
script for\nchecking if the node is directly under a Try or Except statement.", "response": "def is_node_inside_try_except(node: astroid.Raise) -> bool:\n \"\"\"Check if the node is directly under a Try/Except statement.\n (but not under an ExceptHandler!)\n\n Args:\n node (astroid.Raise): the node raising the exception.\n\n Returns:\n bool: True if the node is inside a try/except statement, False otherwise.\n \"\"\"\n context = find_try_except_wrapper_node(node)\n return isinstance(context, astroid.TryExcept)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck if the given node ignores the given exception.", "response": "def node_ignores_exception(\n node: astroid.node_classes.NodeNG, exception=Exception\n) -> bool:\n \"\"\"Check if the node is in a TryExcept which handles the given exception.\n\n If the exception is not given, the function is going to look for bare\n excepts.\n \"\"\"\n managing_handlers = get_exception_handlers(node, exception)\n if not managing_handlers:\n return False\n return any(managing_handlers)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns true if the given class node should be considered as an abstract", "response": "def class_is_abstract(node: astroid.ClassDef) -> bool:\n \"\"\"return true if the given class node should be considered as an abstract\n class\n \"\"\"\n for method in node.methods():\n if method.parent.frame() is node:\n if method.is_abstract(pass_is_abstract=False):\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the inferred value for the given node.", "response": "def safe_infer(\n node: astroid.node_classes.NodeNG, context=None\n) -> Optional[astroid.node_classes.NodeNG]:\n \"\"\"Return the inferred value for the given node.\n\n Return None if inference failed or if there is some ambiguity (more than\n one node has been inferred).\n \"\"\"\n try:\n inferit = 
node.infer(context=context)\n value = next(inferit)\n except astroid.InferenceError:\n return None\n try:\n next(inferit)\n return None # None if there is ambiguity on the inferred node\n except astroid.InferenceError:\n return None # there is some kind of ambiguity\n except StopIteration:\n return value"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef node_type(node: astroid.node_classes.NodeNG) -> Optional[type]:\n # check there is only one possible type for the assign node. Else we\n # don't handle it for now\n types = set()\n try:\n for var_type in node.infer():\n if var_type == astroid.Uninferable or is_none(var_type):\n continue\n types.add(var_type)\n if len(types) > 1:\n return None\n except astroid.InferenceError:\n return None\n return types.pop() if types else None", "response": "Return the inferred type for a node."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncheck if the given function node is a singledispatch function.", "response": "def is_registered_in_singledispatch_function(node: astroid.FunctionDef) -> bool:\n \"\"\"Check if the given function node is a singledispatch function.\"\"\"\n\n singledispatch_qnames = (\n \"functools.singledispatch\",\n \"singledispatch.singledispatch\",\n )\n\n if not isinstance(node, astroid.FunctionDef):\n return False\n\n decorators = node.decorators.nodes if node.decorators else []\n for decorator in decorators:\n # func.register are function calls\n if not isinstance(decorator, astroid.Call):\n continue\n\n func = decorator.func\n if not isinstance(func, astroid.Attribute) or func.attrname != \"register\":\n continue\n\n try:\n func_def = next(func.expr.infer())\n except astroid.InferenceError:\n continue\n\n if isinstance(func_def, astroid.FunctionDef):\n # pylint: disable=redundant-keyword-arg; some flow inference goes wrong here\n return decorated_with(func_def, singledispatch_qnames)\n\n return False"} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_node_last_lineno(node: astroid.node_classes.NodeNG) -> int:\n # 'finalbody' is always the last clause in a try statement, if present\n if getattr(node, \"finalbody\", False):\n return get_node_last_lineno(node.finalbody[-1])\n # For if, while, and for statements 'orelse' is always the last clause.\n # For try statements 'orelse' is the last in the absence of a 'finalbody'\n if getattr(node, \"orelse\", False):\n return get_node_last_lineno(node.orelse[-1])\n # try statements have the 'handlers' last if there is no 'orelse' or 'finalbody'\n if getattr(node, \"handlers\", False):\n return get_node_last_lineno(node.handlers[-1])\n # All compound statements have a 'body'\n if getattr(node, \"body\", False):\n return get_node_last_lineno(node.body[-1])\n # Not a compound statement\n return node.lineno", "response": "Get the last lineno of the given node."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncheck if the postponed evaluation of annotations is enabled", "response": "def is_postponed_evaluation_enabled(node: astroid.node_classes.NodeNG) -> bool:\n \"\"\"Check if the postponed evaluation of annotations is enabled\"\"\"\n name = \"annotations\"\n module = node.root()\n stmt = module.locals.get(name)\n return (\n stmt\n and isinstance(stmt[0], astroid.ImportFrom)\n and stmt[0].modname == \"__future__\"\n )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck if first node is a subclass of second node.", "response": "def is_subclass_of(child: astroid.ClassDef, parent: astroid.ClassDef) -> bool:\n \"\"\"\n Check if first node is a subclass of second node.\n :param child: Node to check for subclass.\n :param parent: Node to check for superclass.\n :returns: True if child is derived from parent. 
False otherwise.\n \"\"\"\n if not all(isinstance(node, astroid.ClassDef) for node in (child, parent)):\n return False\n\n for ancestor in child.ancestors():\n try:\n if astroid.helpers.is_subtype(ancestor, parent):\n return True\n except _NonDeducibleTypeHierarchy:\n continue\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the qualified names of the given module", "response": "def _qualified_names(modname):\n \"\"\"Split the names of the given module into subparts\n\n For example,\n _qualified_names('pylint.checkers.ImportsChecker')\n returns\n ['pylint', 'pylint.checkers', 'pylint.checkers.ImportsChecker']\n \"\"\"\n names = modname.split(\".\")\n return [\".\".join(names[0 : i + 1]) for i in range(len(names))]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting a prepared module name from the given import node.", "response": "def _get_import_name(importnode, modname):\n \"\"\"Get a prepared module name from the given import node\n\n In the case of relative imports, this will return the\n absolute qualified module name, which might be useful\n for debugging. Otherwise, the initial module name\n is returned unchanged.\n \"\"\"\n if isinstance(importnode, astroid.ImportFrom):\n if importnode.level:\n root = importnode.root()\n if isinstance(root, astroid.Module):\n modname = root.relative_to_absolute_name(\n modname, level=importnode.level\n )\n return modname"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the first node where base. name is imported or None if not found", "response": "def _get_first_import(node, context, name, base, level, alias):\n \"\"\"return the node where [base.] 
is imported or None if not found\n \"\"\"\n fullname = \"%s.%s\" % (base, name) if base else name\n\n first = None\n found = False\n for first in context.body:\n if first is node:\n continue\n if first.scope() is node.scope() and first.fromlineno > node.fromlineno:\n continue\n if isinstance(first, astroid.Import):\n if any(fullname == iname[0] for iname in first.names):\n found = True\n break\n elif isinstance(first, astroid.ImportFrom):\n if level == first.level:\n for imported_name, imported_alias in first.names:\n if fullname == \"%s.%s\" % (first.modname, imported_name):\n found = True\n break\n if (\n name != \"*\"\n and name == imported_name\n and not (alias or imported_alias)\n ):\n found = True\n break\n if found:\n break\n if found and not astroid.are_exclusive(first, node):\n return first\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nmakes a list of 2 - tuple module list_of_files_which_import_this_module", "response": "def _make_tree_defs(mod_files_list):\n \"\"\"get a list of 2-uple (module, list_of_files_which_import_this_module),\n it will return a dictionary to represent this as a tree\n \"\"\"\n tree_defs = {}\n for mod, files in mod_files_list:\n node = (tree_defs, ())\n for prefix in mod.split(\".\"):\n node = node[0].setdefault(prefix, [{}, []])\n node[1] += files\n return tree_defs"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a string which represents imports as a tree", "response": "def _repr_tree_defs(data, indent_str=None):\n \"\"\"return a string which represents imports as a tree\"\"\"\n lines = []\n nodes = data.items()\n for i, (mod, (sub, files)) in enumerate(sorted(nodes, key=lambda x: x[0])):\n if not files:\n files = \"\"\n else:\n files = \"(%s)\" % \",\".join(sorted(files))\n if indent_str is None:\n lines.append(\"%s %s\" % (mod, files))\n sub_indent_str = \" \"\n else:\n lines.append(r\"%s\\-%s %s\" % (indent_str, mod, 
files))\n if i == len(nodes) - 1:\n sub_indent_str = \"%s \" % indent_str\n else:\n sub_indent_str = \"%s| \" % indent_str\n if sub:\n lines.append(_repr_tree_defs(sub, sub_indent_str))\n return \"\\n\".join(lines)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nwrites dependencies as a graphviz file", "response": "def _dependencies_graph(filename, dep_info):\n \"\"\"write dependencies as a dot (graphviz) file\n \"\"\"\n done = {}\n printer = DotBackend(filename[:-4], rankdir=\"LR\")\n printer.emit('URL=\".\" node[shape=\"box\"]')\n for modname, dependencies in sorted(dep_info.items()):\n done[modname] = 1\n printer.emit_node(modname)\n for depmodname in dependencies:\n if depmodname not in done:\n done[depmodname] = 1\n printer.emit_node(depmodname)\n for depmodname, dependencies in sorted(dep_info.items()):\n for modname in dependencies:\n printer.emit_edge(modname, depmodname)\n printer.generate(filename)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngenerates a graph and add some information about it in the report s section", "response": "def _make_graph(filename, dep_info, sect, gtype):\n \"\"\"generate a dependencies graph and add some information about it in the\n report's section\n \"\"\"\n _dependencies_graph(filename, dep_info)\n sect.append(Paragraph(\"%simports graph has been written to %s\" % (gtype, filename)))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef open(self):\n self.linter.add_stats(dependencies={})\n self.linter.add_stats(cycles=[])\n self.stats = self.linter.stats\n self.import_graph = collections.defaultdict(set)\n self._module_pkg = {} # mapping of modules to the pkg they belong in\n self._excluded_edges = collections.defaultdict(set)\n self._ignored_modules = get_global_option(self, \"ignored-modules\", default=[])\n # Build a mapping {'module': 'preferred-module'}\n self.preferred_modules = 
dict(\n module.split(\":\")\n for module in self.config.preferred_modules\n if \":\" in module\n )", "response": "called before visiting project"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef close(self):\n if self.linter.is_message_enabled(\"cyclic-import\"):\n graph = self._import_graph_without_ignored_edges()\n vertices = list(graph)\n for cycle in get_cycles(graph, vertices=vertices):\n self.add_message(\"cyclic-import\", args=\" -> \".join(cycle))", "response": "called before visiting project"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ntriggering when an import statement is seen", "response": "def visit_import(self, node):\n \"\"\"triggered when an import statement is seen\"\"\"\n self._check_reimport(node)\n self._check_import_as_rename(node)\n\n modnode = node.root()\n names = [name for name, _ in node.names]\n if len(names) >= 2:\n self.add_message(\"multiple-imports\", args=\", \".join(names), node=node)\n\n for name in names:\n self._check_deprecated_module(node, name)\n self._check_preferred_module(node, name)\n imported_module = self._get_imported_module(node, name)\n if isinstance(node.parent, astroid.Module):\n # Allow imports nested\n self._check_position(node)\n if isinstance(node.scope(), astroid.Module):\n self._record_import(node, imported_module)\n\n if imported_module is None:\n continue\n\n self._check_relative_import(modnode, node, imported_module, name)\n self._add_imported_module(node, imported_module.name)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ntriggers when an import statement is seen", "response": "def visit_importfrom(self, node):\n \"\"\"triggered when a from statement is seen\"\"\"\n basename = node.modname\n imported_module = self._get_imported_module(node, basename)\n\n self._check_import_as_rename(node)\n self._check_misplaced_future(node)\n self._check_deprecated_module(node, basename)\n 
self._check_preferred_module(node, basename)\n self._check_wildcard_imports(node, imported_module)\n self._check_same_line_imports(node)\n self._check_reimport(node, basename=basename, level=node.level)\n\n if isinstance(node.parent, astroid.Module):\n # Allow imports nested\n self._check_position(node)\n if isinstance(node.scope(), astroid.Module):\n self._record_import(node, imported_module)\n if imported_module is None:\n return\n modnode = node.root()\n self._check_relative_import(modnode, node, imported_module, basename)\n\n for name, _ in node.names:\n if name != \"*\":\n self._add_imported_module(node, \"%s.%s\" % (imported_module.name, name))\n else:\n self._add_imported_module(node, imported_module.name)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck if the node import or importfrom node position is correct Send a message if it is not well placed", "response": "def _check_position(self, node):\n \"\"\"Check `node` import or importfrom node position is correct\n\n Send a message if `node` comes before another instruction\n \"\"\"\n # if a first non-import instruction has already been encountered,\n # it means the import comes after it and therefore is not well placed\n if self._first_non_import_node:\n self.add_message(\"wrong-import-position\", node=node, args=node.as_string())"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _record_import(self, node, importedmodnode):\n if isinstance(node, astroid.ImportFrom):\n importedname = node.modname\n else:\n importedname = importedmodnode.name if importedmodnode else None\n if not importedname:\n importedname = node.names[0][0].split(\".\")[0]\n\n if isinstance(node, astroid.ImportFrom) and (node.level or 0) >= 1:\n # We need the importedname with first point to detect local package\n # Example of node:\n # 'from .my_package1 import MyClass1'\n # the output should be '.my_package1' instead of 'my_package1'\n # Example of 
node:\n # 'from . import my_package2'\n # the output should be '.my_package2' instead of '{pyfile}'\n importedname = \".\" + importedname\n\n self._imports_stack.append((node, importedname))", "response": "Record the package node imports from"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _check_imports_order(self, _module_node):\n std_imports = []\n third_party_imports = []\n first_party_imports = []\n # need of a list that holds third or first party ordered import\n external_imports = []\n local_imports = []\n third_party_not_ignored = []\n first_party_not_ignored = []\n local_not_ignored = []\n isort_obj = isort.SortImports(\n file_contents=\"\",\n known_third_party=self.config.known_third_party,\n known_standard_library=self.config.known_standard_library,\n )\n for node, modname in self._imports_stack:\n if modname.startswith(\".\"):\n package = \".\" + modname.split(\".\")[1]\n else:\n package = modname.split(\".\")[0]\n nested = not isinstance(node.parent, astroid.Module)\n ignore_for_import_order = not self.linter.is_message_enabled(\n \"wrong-import-order\", node.fromlineno\n )\n import_category = isort_obj.place_module(package)\n node_and_package_import = (node, package)\n if import_category in (\"FUTURE\", \"STDLIB\"):\n std_imports.append(node_and_package_import)\n wrong_import = (\n third_party_not_ignored\n or first_party_not_ignored\n or local_not_ignored\n )\n if self._is_fallback_import(node, wrong_import):\n continue\n if wrong_import and not nested:\n self.add_message(\n \"wrong-import-order\",\n node=node,\n args=(\n 'standard import \"%s\"' % node.as_string(),\n '\"%s\"' % wrong_import[0][0].as_string(),\n ),\n )\n elif import_category == \"THIRDPARTY\":\n third_party_imports.append(node_and_package_import)\n external_imports.append(node_and_package_import)\n if not nested and not ignore_for_import_order:\n third_party_not_ignored.append(node_and_package_import)\n wrong_import = 
first_party_not_ignored or local_not_ignored\n if wrong_import and not nested:\n self.add_message(\n \"wrong-import-order\",\n node=node,\n args=(\n 'third party import \"%s\"' % node.as_string(),\n '\"%s\"' % wrong_import[0][0].as_string(),\n ),\n )\n elif import_category == \"FIRSTPARTY\":\n first_party_imports.append(node_and_package_import)\n external_imports.append(node_and_package_import)\n if not nested and not ignore_for_import_order:\n first_party_not_ignored.append(node_and_package_import)\n wrong_import = local_not_ignored\n if wrong_import and not nested:\n self.add_message(\n \"wrong-import-order\",\n node=node,\n args=(\n 'first party import \"%s\"' % node.as_string(),\n '\"%s\"' % wrong_import[0][0].as_string(),\n ),\n )\n elif import_category == \"LOCALFOLDER\":\n local_imports.append((node, package))\n if not nested and not ignore_for_import_order:\n local_not_ignored.append((node, package))\n return std_imports, external_imports, local_imports", "response": "Checks the imports of a module node are grouped by category."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking relative import. node is either an Import or From node modname the imported module name. importedasname CTYPE is the name of the imported module.", "response": "def _check_relative_import(\n self, modnode, importnode, importedmodnode, importedasname\n ):\n \"\"\"check relative import. 
node is either an Import or From node, modname\n the imported module name.\n \"\"\"\n if not self.linter.is_message_enabled(\"relative-import\"):\n return None\n if importedmodnode.file is None:\n return False # built-in module\n if modnode is importedmodnode:\n return False # module importing itself\n if modnode.absolute_import_activated() or getattr(importnode, \"level\", None):\n return False\n if importedmodnode.name != importedasname:\n # this must be a relative import...\n self.add_message(\n \"relative-import\",\n args=(importedasname, importedmodnode.name),\n node=importnode,\n )\n return None\n return None"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nnotifying an imported module", "response": "def _add_imported_module(self, node, importedmodname):\n \"\"\"notify an imported module, used to analyze dependencies\"\"\"\n module_file = node.root().file\n context_name = node.root().name\n base = os.path.splitext(os.path.basename(module_file))[0]\n\n try:\n importedmodname = astroid.modutils.get_module_part(\n importedmodname, module_file\n )\n except ImportError:\n pass\n\n if context_name == importedmodname:\n self.add_message(\"import-self\", node=node)\n\n elif not astroid.modutils.is_standard_module(importedmodname):\n # if this is not a package __init__ module\n if base != \"__init__\" and context_name not in self._module_pkg:\n # record the module's parent, or the module itself if this is\n # a top level module, as the package it belongs to\n self._module_pkg[context_name] = context_name.rsplit(\".\", 1)[0]\n\n # handle dependencies\n importedmodnames = self.stats[\"dependencies\"].setdefault(\n importedmodname, set()\n )\n if context_name not in importedmodnames:\n importedmodnames.add(context_name)\n\n # update import graph\n self.import_graph[context_name].add(importedmodname)\n if not self.linter.is_message_enabled(\"cyclic-import\", line=node.lineno):\n self._excluded_edges[context_name].add(importedmodname)"} 
{"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks if the module is deprecated", "response": "def _check_deprecated_module(self, node, mod_path):\n \"\"\"check if the module is deprecated\"\"\"\n for mod_name in self.config.deprecated_modules:\n if mod_path == mod_name or mod_path.startswith(mod_name + \".\"):\n self.add_message(\"deprecated-module\", node=node, args=mod_path)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _check_preferred_module(self, node, mod_path):\n if mod_path in self.preferred_modules:\n self.add_message(\n \"preferred-module\",\n node=node,\n args=(self.preferred_modules[mod_path], mod_path),\n )", "response": "check if the module has a preferred replacement"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _check_reimport(self, node, basename=None, level=None):\n if not self.linter.is_message_enabled(\"reimported\"):\n return\n\n frame = node.frame()\n root = node.root()\n contexts = [(frame, level)]\n if root is not frame:\n contexts.append((root, None))\n\n for known_context, known_level in contexts:\n for name, alias in node.names:\n first = _get_first_import(\n node, known_context, name, basename, known_level, alias\n )\n if first is not None:\n self.add_message(\n \"reimported\", node=node, args=(name, first.fromlineno)\n )", "response": "check if the import is necessary"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a verbatim layout for displaying dependencies", "response": "def _report_external_dependencies(self, sect, _, _dummy):\n \"\"\"return a verbatim layout for displaying dependencies\"\"\"\n dep_info = _make_tree_defs(self._external_dependencies_info().items())\n if not dep_info:\n raise EmptyReportError()\n tree_str = _repr_tree_defs(dep_info)\n sect.append(VerbatimText(tree_str))"} 
{"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _report_dependencies_graph(self, sect, _, _dummy):\n dep_info = self.stats[\"dependencies\"]\n if not dep_info or not (\n self.config.import_graph\n or self.config.ext_import_graph\n or self.config.int_import_graph\n ):\n raise EmptyReportError()\n filename = self.config.import_graph\n if filename:\n _make_graph(filename, dep_info, sect, \"\")\n filename = self.config.ext_import_graph\n if filename:\n _make_graph(filename, self._external_dependencies_info(), sect, \"external \")\n filename = self.config.int_import_graph\n if filename:\n _make_graph(filename, self._internal_dependencies_info(), sect, \"internal \")", "response": "write dependencies as a dot (graphviz) file"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _filter_dependencies_graph(self, internal):\n graph = collections.defaultdict(set)\n for importee, importers in self.stats[\"dependencies\"].items():\n for importer in importers:\n package = self._module_pkg.get(importer, importer)\n is_inside = importee.startswith(package)\n if is_inside and internal or not is_inside and not internal:\n graph[importee].add(importer)\n return graph", "response": "build the internal or external dependency graph"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreads a config file and returns a list of options", "response": "def get_default_options():\n \"\"\"\n Read config file and return list of options\n \"\"\"\n options = []\n home = os.environ.get(\"HOME\", \"\")\n if home:\n rcfile = os.path.join(home, RCFILE)\n try:\n options = open(rcfile).read().split()\n except IOError:\n pass # ignore if no config file found\n return options"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ninserts default options to sys. 
argv", "response": "def insert_default_options():\n \"\"\"insert default options to sys.argv\n \"\"\"\n options = get_default_options()\n options.reverse()\n for arg in options:\n sys.argv.insert(1, arg)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_visibility(name):\n if SPECIAL.match(name):\n visibility = \"special\"\n elif PRIVATE.match(name):\n visibility = \"private\"\n elif PROTECTED.match(name):\n visibility = \"protected\"\n\n else:\n visibility = \"public\"\n return visibility", "response": "return the visibility from a name"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn true if the node should be treated", "response": "def show_attr(self, node):\n \"\"\"return true if the node should be treated\n \"\"\"\n visibility = get_visibility(getattr(node, \"name\", node))\n return not self.__mode & VIS_MOD[visibility]"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef walk(self, node, _done=None):\n if _done is None:\n _done = set()\n if node in _done:\n raise AssertionError((id(node), node, node.parent))\n _done.add(node)\n self.visit(node)\n for child_node in node.get_children():\n assert child_node is not node\n self.walk(child_node, _done)\n self.leave(node)\n assert node.parent is not node", "response": "walk on the tree from node getting callbacks from handler"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_callbacks(self, node):\n klass = node.__class__\n methods = self._cache.get(klass)\n if methods is None:\n handler = self.handler\n kid = klass.__name__.lower()\n e_method = getattr(\n handler, \"visit_%s\" % kid, getattr(handler, \"visit_default\", None)\n )\n l_method = getattr(\n handler, \"leave_%s\" % kid, getattr(handler, \"leave_default\", None)\n )\n self._cache[klass] = (e_method, l_method)\n else:\n e_method, l_method = methods\n return 
e_method, l_method", "response": "get callbacks from handler for the visited node"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nwalks on the tree from node getting callbacks from handler", "response": "def visit(self, node):\n \"\"\"walk on the tree from , getting callbacks from handler\"\"\"\n method = self.get_callbacks(node)[0]\n if method is not None:\n method(node)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nwalks on the tree from node getting callbacks from handler", "response": "def leave(self, node):\n \"\"\"walk on the tree from , getting callbacks from handler\"\"\"\n method = self.get_callbacks(node)[1]\n if method is not None:\n method(node)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef visit(self, node):\n if node in self._visited:\n return None\n self._visited[node] = 1 # FIXME: use set ?\n methods = self.get_callbacks(node)\n if methods[0] is not None:\n methods[0](node)\n if hasattr(node, \"locals\"): # skip Instance and other proxy\n for local_node in node.values():\n self.visit(local_node)\n if methods[1] is not None:\n return methods[1](node)\n return None", "response": "launch the visit starting from the given node"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking the consistency of the message ids.", "response": "def check_consistency(self) -> None:\n \"\"\"Check the consistency of msgid.\n\n msg ids for a checker should be a string of len 4, where the two first\n characters are the checker id and the two last the msg id in this\n checker.\n\n :raises InvalidMessageError: If the checker id in the messages are not\n always the same. 
\"\"\"\n checker_id = None\n existing_ids = []\n for message in self.messages:\n if checker_id is not None and checker_id != message.msgid[1:3]:\n error_msg = \"Inconsistent checker part in message id \"\n error_msg += \"'{}' (expected 'x{checker_id}xx' \".format(\n message.msgid, checker_id=checker_id\n )\n error_msg += \"because we already had {existing_ids}).\".format(\n existing_ids=existing_ids\n )\n raise InvalidMessageError(error_msg)\n checker_id = message.msgid[1:3]\n existing_ids.append(message.msgid)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_call(self, node):\n try:\n for inferred in node.func.infer():\n if inferred is astroid.Uninferable:\n continue\n elif inferred.root().name == OPEN_MODULE:\n if getattr(node.func, \"name\", None) in OPEN_FILES:\n self._check_open_mode(node)\n elif inferred.root().name == UNITTEST_CASE:\n self._check_redundant_assert(node, inferred)\n elif isinstance(inferred, astroid.ClassDef):\n if inferred.qname() == THREADING_THREAD:\n self._check_bad_thread_instantiation(node)\n elif inferred.qname() == SUBPROCESS_POPEN:\n self._check_for_preexec_fn_in_popen(node)\n elif isinstance(inferred, astroid.FunctionDef):\n name = inferred.qname()\n if name == COPY_COPY:\n self._check_shallow_copy_environ(node)\n elif name in ENV_GETTERS:\n self._check_env_function(node, inferred)\n elif name == SUBPROCESS_RUN and PY35:\n self._check_for_check_kw_in_run(node)\n self._check_deprecated_method(node, inferred)\n except astroid.InferenceError:\n return", "response": "Visit a Call node."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking that a datetime was infered.", "response": "def _check_datetime(self, node):\n \"\"\" Check that a datetime was infered.\n If so, emit boolean-datetime warning.\n \"\"\"\n try:\n infered = next(node.infer())\n except astroid.InferenceError:\n return\n if isinstance(infered, Instance) and infered.qname() == 
\"datetime.time\":\n self.add_message(\"boolean-datetime\", node=node)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _check_open_mode(self, node):\n try:\n mode_arg = utils.get_argument_from_call(node, position=1, keyword=\"mode\")\n except utils.NoSuchArgumentError:\n return\n if mode_arg:\n mode_arg = utils.safe_infer(mode_arg)\n if isinstance(mode_arg, astroid.Const) and not _check_mode_str(\n mode_arg.value\n ):\n self.add_message(\"bad-open-mode\", node=node, args=mode_arg.value)", "response": "Check that the mode argument of an open or file call is valid."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nmanage message of different type and in the context of path.", "response": "def handle_message(self, msg):\n \"\"\"Manage message of different type and in the context of path.\"\"\"\n self.messages.append(\n {\n \"type\": msg.category,\n \"module\": msg.module,\n \"obj\": msg.obj,\n \"line\": msg.line,\n \"column\": msg.column,\n \"path\": msg.path,\n \"symbol\": msg.symbol,\n \"message\": html.escape(msg.msg or \"\", quote=False),\n \"message-id\": msg.msg_id,\n }\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets title for objects", "response": "def get_title(self, node):\n \"\"\"get title for objects\"\"\"\n title = node.name\n if self.module_names:\n title = \"%s.%s\" % (node.root().name, title)\n return title"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nset different default options with _default dictionary", "response": "def _set_default_options(self):\n \"\"\"set different default options with _default dictionary\"\"\"\n self.module_names = self._set_option(self.config.module_names)\n all_ancestors = self._set_option(self.config.all_ancestors)\n all_associated = self._set_option(self.config.all_associated)\n anc_level, association_level = (0, 0)\n if all_ancestors:\n anc_level = 
-1\n        if all_associated:\n            association_level = -1\n        if self.config.show_ancestors is not None:\n            anc_level = self.config.show_ancestors\n        if self.config.show_associated is not None:\n            association_level = self.config.show_associated\n        self.anc_level, self.association_level = anc_level, association_level"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns true if the node should be shown (builtins are hidden unless show_builtin is set).", "response": "def show_node(self, node):\n        \"\"\"true if builtins and not show_builtins\"\"\"\n        if self.config.show_builtin:\n            return True\n        return node.root().name != BUILTINS_NAME"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef add_class(self, node):\n        self.linker.visit(node)\n        self.classdiagram.add_object(self.get_title(node), node)", "response": "visit one class and add it to diagram"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget ancestors of a class node", "response": "def get_ancestors(self, node, level):\n        \"\"\"return ancestor nodes of a class node\"\"\"\n        if level == 0:\n            return\n        for ancestor in node.ancestors(recurs=False):\n            if not self.show_node(ancestor):\n                continue\n            yield ancestor"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning associated nodes of a class node", "response": "def get_associated(self, klass_node, level):\n        \"\"\"return associated nodes of a class node\"\"\"\n        if level == 0:\n            return\n        for association_nodes in list(klass_node.instance_attrs_type.values()) + list(\n            klass_node.locals_type.values()\n        ):\n            for node in association_nodes:\n                if isinstance(node, astroid.Instance):\n                    node = node._proxied\n                if not (isinstance(node, astroid.ClassDef) and self.show_node(node)):\n                    continue\n                yield node"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef extract_classes(self, klass_node, anc_level, association_level):\n        if 
self.classdiagram.has_node(klass_node) or not self.show_node(klass_node):\n return\n self.add_class(klass_node)\n\n for ancestor in self.get_ancestors(klass_node, anc_level):\n self.extract_classes(ancestor, anc_level - 1, association_level)\n\n for node in self.get_associated(klass_node, association_level):\n self.extract_classes(node, anc_level, association_level - 1)", "response": "extract recursively classes related to klass_node"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nvisits a pyreverse. utils. Project node create a diagram definition for classes", "response": "def visit_project(self, node):\n \"\"\"visit a pyreverse.utils.Project node\n\n create a diagram definition for packages\n \"\"\"\n mode = self.config.mode\n if len(node.modules) > 1:\n self.pkgdiagram = PackageDiagram(\"packages %s\" % node.name, mode)\n else:\n self.pkgdiagram = None\n self.classdiagram = ClassDiagram(\"classes %s\" % node.name, mode)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef leave_project(self, node): # pylint: disable=unused-argument\n if self.pkgdiagram:\n return self.pkgdiagram, self.classdiagram\n return (self.classdiagram,)", "response": "leave the pyreverse. utils. Project node\n\n return the generated diagram definition"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef visit_module(self, node):\n if self.pkgdiagram:\n self.linker.visit(node)\n self.pkgdiagram.add_object(node.name, node)", "response": "visit an astroid. Module node add this class to the package diagram definition\n "} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef visit_classdef(self, node):\n anc_level, association_level = self._get_levels()\n self.extract_classes(node, anc_level, association_level)", "response": "visit an astroid. 
Class node add this class to the class diagram definition\n "} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef visit_importfrom(self, node):\n if self.pkgdiagram:\n self.pkgdiagram.add_from_depend(node, node.modname)", "response": "visit astroid. ImportFrom and catch modules for package diagram\n "} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef class_diagram(self, project, klass):\n\n self.classdiagram = ClassDiagram(klass, self.config.mode)\n if len(project.modules) > 1:\n module, klass = klass.rsplit(\".\", 1)\n module = project.get_module(module)\n else:\n module = project.modules[0]\n klass = klass.split(\".\")[-1]\n klass = next(module.ilookup(klass))\n\n anc_level, association_level = self._get_levels()\n self.extract_classes(klass, anc_level, association_level)\n return self.classdiagram", "response": "return a class diagram definition for the given klass and its\n related klasses\n "} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_diadefs(self, project, linker):\n\n # read and interpret diagram definitions (Diadefs)\n diagrams = []\n generator = ClassDiadefGenerator(linker, self)\n for klass in self.config.classes:\n diagrams.append(generator.class_diagram(project, klass))\n if not diagrams:\n diagrams = DefaultDiadefGenerator(linker, self).visit(project)\n for diagram in diagrams:\n diagram.extract_relationships()\n return diagrams", "response": "Get the diagrams configuration data"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _is_owner_ignored(owner, name, ignored_classes, ignored_modules):\n ignored_modules = set(ignored_modules)\n module_name = owner.root().name\n module_qname = owner.root().qname()\n if any(\n module_name in ignored_modules\n or module_qname in ignored_modules\n or 
fnmatch.fnmatch(module_qname, ignore)\n for ignore in ignored_modules\n ):\n return True\n\n ignored_classes = set(ignored_classes)\n if hasattr(owner, \"qname\"):\n qname = owner.qname()\n else:\n qname = \"\"\n return any(ignore in (name, qname) for ignore in ignored_classes)", "response": "Check if the given owner should be ignored."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ngive an owner and a name try to find similar names.", "response": "def _similar_names(owner, attrname, distance_threshold, max_choices):\n \"\"\"Given an owner and a name, try to find similar names\n\n The similar names are searched given a distance metric and only\n a given number of choices will be returned.\n \"\"\"\n possible_names = []\n names = _node_names(owner)\n\n for name in names:\n if name == attrname:\n continue\n\n distance = _string_distance(attrname, name)\n if distance <= distance_threshold:\n possible_names.append((name, distance))\n\n # Now get back the values with a minimum, up to the given\n # limit or choices.\n picked = [\n name\n for (name, _) in heapq.nsmallest(\n max_choices, possible_names, key=operator.itemgetter(1)\n )\n ]\n return sorted(picked)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a list of nodes that should be emitted for the given owner.", "response": "def _emit_no_member(node, owner, owner_name, ignored_mixins=True, ignored_none=True):\n \"\"\"Try to see if no-member should be emitted for the given owner.\n\n The following cases are ignored:\n\n * the owner is a function and it has decorators.\n * the owner is an instance and it has __getattr__, __getattribute__ implemented\n * the module is explicitly ignored from no-member checks\n * the owner is a class and the name can be found in its metaclass.\n * The access node is protected by an except handler, which handles\n AttributeError, Exception or bare except.\n \"\"\"\n # pylint: 
disable=too-many-return-statements\n if node_ignores_exception(node, AttributeError):\n return False\n if ignored_none and isinstance(owner, astroid.Const) and owner.value is None:\n return False\n if is_super(owner) or getattr(owner, \"type\", None) == \"metaclass\":\n return False\n if ignored_mixins and owner_name[-5:].lower() == \"mixin\":\n return False\n if isinstance(owner, astroid.FunctionDef) and owner.decorators:\n return False\n if isinstance(owner, (astroid.Instance, astroid.ClassDef)):\n if owner.has_dynamic_getattr():\n # Issue #2565: Don't ignore enums, as they have a `__getattr__` but it's not\n # invoked at this point.\n try:\n metaclass = owner.metaclass()\n except exceptions.MroError:\n return False\n if metaclass:\n return metaclass.qname() == \"enum.EnumMeta\"\n return False\n if not has_known_bases(owner):\n return False\n if isinstance(owner, objects.Super):\n # Verify if we are dealing with an invalid Super object.\n # If it is invalid, then there's no point in checking that\n # it has the required attribute. 
Also, don't fail if the\n # MRO is invalid.\n try:\n owner.super_mro()\n except (exceptions.MroError, exceptions.SuperError):\n return False\n if not all(map(has_known_bases, owner.type.mro())):\n return False\n if isinstance(owner, astroid.Module):\n try:\n owner.getattr(\"__getattr__\")\n return False\n except astroid.NotFoundError:\n pass\n if node.attrname.startswith(\"_\" + owner_name):\n # Test if an attribute has been mangled ('private' attribute)\n unmangled_name = node.attrname.split(\"_\" + owner_name)[-1]\n try:\n if owner.getattr(unmangled_name, context=None) is not None:\n return False\n except astroid.NotFoundError:\n return True\n return True"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if the given node has a parent of the given type.", "response": "def _has_parent_of_type(node, node_type, statement):\n \"\"\"Check if the given node has a parent of the given type.\"\"\"\n parent = node.parent\n while not isinstance(parent, node_type) and statement.parent_of(parent):\n parent = parent.parent\n return isinstance(parent, node_type)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck if the given name is used as a variadic argument.", "response": "def _is_name_used_as_variadic(name, variadics):\n \"\"\"Check if the given name is used as a variadic argument.\"\"\"\n return any(\n variadic.value == name or variadic.value.parent_of(name)\n for variadic in variadics\n )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nverifying if the given call node has a specific call context at hand.", "response": "def _no_context_variadic(node, variadic_name, variadic_type, variadics):\n \"\"\"Verify if the given call node has variadic nodes without context\n\n This is a workaround for handling cases of nested call functions\n which don't have the specific call context at hand.\n Variadic arguments (variable positional arguments and variable\n 
keyword arguments) are inferred, inherently wrong, by astroid\n as a Tuple, respectively a Dict with empty elements.\n This can lead pylint to believe that a function call receives\n too few arguments.\n \"\"\"\n statement = node.statement()\n for name in statement.nodes_of_class(astroid.Name):\n if name.name != variadic_name:\n continue\n\n inferred = safe_infer(name)\n if isinstance(inferred, (astroid.List, astroid.Tuple)):\n length = len(inferred.elts)\n elif isinstance(inferred, astroid.Dict):\n length = len(inferred.items)\n else:\n continue\n\n inferred_statement = inferred.statement()\n if not length and isinstance(inferred_statement, astroid.FunctionDef):\n is_in_starred_context = _has_parent_of_type(node, variadic_type, statement)\n used_as_starred_argument = _is_name_used_as_variadic(name, variadics)\n if is_in_starred_context or used_as_starred_argument:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _infer_from_metaclass_constructor(cls, func):\n context = astroid.context.InferenceContext()\n\n class_bases = astroid.List()\n class_bases.postinit(elts=cls.bases)\n\n attrs = astroid.Dict()\n local_names = [(name, values[-1]) for name, values in cls.locals.items()]\n attrs.postinit(local_names)\n\n builder_args = astroid.Tuple()\n builder_args.postinit([cls.name, class_bases, attrs])\n\n context.callcontext = astroid.context.CallContext(builder_args)\n try:\n inferred = next(func.infer_call_result(func, context), None)\n except astroid.InferenceError:\n return None\n return inferred or None", "response": "Try to infer the class from the given metaclass constructor."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef register(linter):\n linter.register_checker(TypeChecker(linter))\n linter.register_checker(IterableChecker(linter))", "response": "required method to auto register this 
checker"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef visit_attribute(self, node):\n for pattern in self.config.generated_members:\n # attribute is marked as generated, stop here\n if re.match(pattern, node.attrname):\n return\n if re.match(pattern, node.as_string()):\n return\n\n try:\n inferred = list(node.expr.infer())\n except exceptions.InferenceError:\n return\n\n # list of (node, nodename) which are missing the attribute\n missingattr = set()\n\n non_opaque_inference_results = [\n owner\n for owner in inferred\n if owner is not astroid.Uninferable\n and not isinstance(owner, astroid.nodes.Unknown)\n ]\n if (\n len(non_opaque_inference_results) != len(inferred)\n and self.config.ignore_on_opaque_inference\n ):\n # There is an ambiguity in the inference. Since we can't\n # make sure that we won't emit a false positive, we just stop\n # whenever the inference returns an opaque inference object.\n return\n\n for owner in non_opaque_inference_results:\n name = getattr(owner, \"name\", None)\n if _is_owner_ignored(\n owner, name, self.config.ignored_classes, self.config.ignored_modules\n ):\n continue\n\n try:\n if not [\n n\n for n in owner.getattr(node.attrname)\n if not isinstance(n.statement(), astroid.AugAssign)\n ]:\n missingattr.add((owner, name))\n continue\n except AttributeError:\n # XXX method / function\n continue\n except exceptions.NotFoundError:\n # This can't be moved before the actual .getattr call,\n # because there can be more values inferred and we are\n # stopping after the first one which has the attribute in question.\n # The problem is that if the first one has the attribute,\n # but we continue to the next values which doesn't have the\n # attribute, then we'll have a false positive.\n # So call this only after the call has been made.\n if not _emit_no_member(\n node,\n owner,\n name,\n ignored_mixins=self.config.ignore_mixin_members,\n ignored_none=self.config.ignore_none,\n 
):\n continue\n missingattr.add((owner, name))\n continue\n # stop on the first found\n break\n else:\n # we have not found any node with the attributes, display the\n # message for infered nodes\n done = set()\n for owner, name in missingattr:\n if isinstance(owner, astroid.Instance):\n actual = owner._proxied\n else:\n actual = owner\n if actual in done:\n continue\n done.add(actual)\n\n msg, hint = self._get_nomember_msgid_hint(node, owner)\n self.add_message(\n msg,\n node=node,\n args=(owner.display_type(), name, node.attrname, hint),\n confidence=INFERENCE,\n )", "response": "check that the accessed attribute exists and if not we ll consider the code as correct if it doesn t."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking that if assigning to a function call the function is and if it is returning something valuable", "response": "def visit_assign(self, node):\n \"\"\"check that if assigning to a function call, the function is\n possibly returning something valuable\n \"\"\"\n if not isinstance(node.value, astroid.Call):\n return\n function_node = safe_infer(node.value.func)\n # skip class, generator and incomplete function definition\n funcs = (astroid.FunctionDef, astroid.UnboundMethod, astroid.BoundMethod)\n if not (\n isinstance(function_node, funcs)\n and function_node.root().fully_defined()\n and not function_node.decorators\n ):\n return\n if (\n function_node.is_generator()\n or function_node.is_abstract(pass_is_abstract=False)\n or isinstance(function_node, astroid.AsyncFunctionDef)\n ):\n return\n returns = list(\n function_node.nodes_of_class(astroid.Return, skip_klass=astroid.FunctionDef)\n )\n if not returns:\n self.add_message(\"assignment-from-no-return\", node=node)\n else:\n for rnode in returns:\n if not (\n isinstance(rnode.value, astroid.Const)\n and rnode.value.value is None\n or rnode.value is None\n ):\n break\n else:\n self.add_message(\"assignment-from-none\", node=node)"} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _check_uninferable_call(self, node):\n if not isinstance(node.func, astroid.Attribute):\n return\n\n # Look for properties. First, obtain\n # the lhs of the Attribute node and search the attribute\n # there. If that attribute is a property or a subclass of properties,\n # then most likely it's not callable.\n\n # TODO: since astroid doesn't understand descriptors very well\n # we will not handle them here, right now.\n\n expr = node.func.expr\n klass = safe_infer(expr)\n if (\n klass is None\n or klass is astroid.Uninferable\n or not isinstance(klass, astroid.Instance)\n ):\n return\n\n try:\n attrs = klass._proxied.getattr(node.func.attrname)\n except exceptions.NotFoundError:\n return\n\n for attr in attrs:\n if attr is astroid.Uninferable:\n continue\n if not isinstance(attr, astroid.FunctionDef):\n continue\n\n # Decorated, see if it is decorated with a property.\n # Also, check the returns and see if they are callable.\n if decorated_with_property(attr):\n\n try:\n all_returns_are_callable = all(\n return_node.callable() or return_node is astroid.Uninferable\n for return_node in attr.infer_call_result(node)\n )\n except astroid.InferenceError:\n continue\n\n if not all_returns_are_callable:\n self.add_message(\n \"not-callable\", node=node, args=node.func.as_string()\n )\n break", "response": "Check that the given uninferable Call node does not call an actual function."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef visit_call(self, node):\n called = safe_infer(node.func)\n # only function, generator and object defining __call__ are allowed\n # Ignore instances of descriptors since astroid cannot properly handle them\n # yet\n if called and not called.callable():\n if isinstance(called, astroid.Instance) and (\n not has_known_bases(called)\n or (\n isinstance(called.scope(), astroid.ClassDef)\n and 
\"__get__\" in called.locals\n )\n ):\n # Don't emit if we can't make sure this object is callable.\n pass\n else:\n self.add_message(\"not-callable\", node=node, args=node.func.as_string())\n\n self._check_uninferable_call(node)\n\n try:\n called, implicit_args, callable_name = _determine_callable(called)\n except ValueError:\n # Any error occurred during determining the function type, most of\n # those errors are handled by different warnings.\n return\n\n if called.args.args is None:\n # Built-in functions have no argument information.\n return\n\n if len(called.argnames()) != len(set(called.argnames())):\n # Duplicate parameter name (see duplicate-argument). We can't really\n # make sense of the function call in this case, so just return.\n return\n\n # Build the set of keyword arguments, checking for duplicate keywords,\n # and count the positional arguments.\n call_site = astroid.arguments.CallSite.from_call(node)\n\n # Warn about duplicated keyword arguments, such as `f=24, **{'f': 24}`\n for keyword in call_site.duplicated_keywords:\n self.add_message(\"repeated-keyword\", node=node, args=(keyword,))\n\n if call_site.has_invalid_arguments() or call_site.has_invalid_keywords():\n # Can't make sense of this.\n return\n\n num_positional_args = len(call_site.positional_arguments)\n keyword_args = list(call_site.keyword_arguments.keys())\n\n # Determine if we don't have a context for our call and we use variadics.\n if isinstance(node.scope(), astroid.FunctionDef):\n has_no_context_positional_variadic = _no_context_variadic_positional(node)\n has_no_context_keywords_variadic = _no_context_variadic_keywords(node)\n else:\n has_no_context_positional_variadic = (\n has_no_context_keywords_variadic\n ) = False\n\n # These are coming from the functools.partial implementation in astroid\n already_filled_positionals = getattr(called, \"filled_positionals\", 0)\n already_filled_keywords = getattr(called, \"filled_keywords\", {})\n\n keyword_args += 
list(already_filled_keywords)\n num_positional_args += implicit_args + already_filled_positionals\n\n # Analyze the list of formal parameters.\n\n num_mandatory_parameters = len(called.args.args) - len(called.args.defaults)\n parameters = []\n parameter_name_to_index = {}\n for i, arg in enumerate(called.args.args):\n if isinstance(arg, astroid.Tuple):\n name = None\n # Don't store any parameter names within the tuple, since those\n # are not assignable from keyword arguments.\n else:\n assert isinstance(arg, astroid.AssignName)\n # This occurs with:\n # def f( (a), (b) ): pass\n name = arg.name\n parameter_name_to_index[name] = i\n if i >= num_mandatory_parameters:\n defval = called.args.defaults[i - num_mandatory_parameters]\n else:\n defval = None\n parameters.append([(name, defval), False])\n\n kwparams = {}\n for i, arg in enumerate(called.args.kwonlyargs):\n if isinstance(arg, astroid.Keyword):\n name = arg.arg\n else:\n assert isinstance(arg, astroid.AssignName)\n name = arg.name\n kwparams[name] = [called.args.kw_defaults[i], False]\n\n # Match the supplied arguments against the function parameters.\n\n # 1. Match the positional arguments.\n for i in range(num_positional_args):\n if i < len(parameters):\n parameters[i][1] = True\n elif called.args.vararg is not None:\n # The remaining positional arguments get assigned to the *args\n # parameter.\n break\n else:\n # Too many positional arguments.\n self.add_message(\n \"too-many-function-args\", node=node, args=(callable_name,)\n )\n break\n\n # 2. 
Match the keyword arguments.\n for keyword in keyword_args:\n if keyword in parameter_name_to_index:\n i = parameter_name_to_index[keyword]\n if parameters[i][1]:\n # Duplicate definition of function parameter.\n\n # Might be too hardcoded, but this can actually\n # happen when using str.format and `self` is passed\n # by keyword argument, as in `.format(self=self)`.\n # It's perfectly valid to so, so we're just skipping\n # it if that's the case.\n if not (keyword == \"self\" and called.qname() in STR_FORMAT):\n self.add_message(\n \"redundant-keyword-arg\",\n node=node,\n args=(keyword, callable_name),\n )\n else:\n parameters[i][1] = True\n elif keyword in kwparams:\n if kwparams[keyword][1]: # XXX is that even possible?\n # Duplicate definition of function parameter.\n self.add_message(\n \"redundant-keyword-arg\",\n node=node,\n args=(keyword, callable_name),\n )\n else:\n kwparams[keyword][1] = True\n elif called.args.kwarg is not None:\n # The keyword argument gets assigned to the **kwargs parameter.\n pass\n else:\n # Unexpected keyword argument.\n self.add_message(\n \"unexpected-keyword-arg\", node=node, args=(keyword, callable_name)\n )\n\n # 3. 
Match the **kwargs, if any.\n if node.kwargs:\n for i, [(name, defval), assigned] in enumerate(parameters):\n # Assume that *kwargs provides values for all remaining\n # unassigned named parameters.\n if name is not None:\n parameters[i][1] = True\n else:\n # **kwargs can't assign to tuples.\n pass\n\n # Check that any parameters without a default have been assigned\n # values.\n for [(name, defval), assigned] in parameters:\n if (defval is None) and not assigned:\n if name is None:\n display_name = \"\"\n else:\n display_name = repr(name)\n # TODO(cpopa): this should be removed after PyCQA/astroid/issues/177\n if not has_no_context_positional_variadic:\n self.add_message(\n \"no-value-for-parameter\",\n node=node,\n args=(display_name, callable_name),\n )\n\n for name in kwparams:\n defval, assigned = kwparams[name]\n if defval is None and not assigned and not has_no_context_keywords_variadic:\n self.add_message(\"missing-kwoa\", node=node, args=(name, callable_name))", "response": "Check that the called functions and methods are inferred to callable objects and that the arguments passed to the function match the parameters in\n ."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef visit_unaryop(self, node):\n\n for error in node.type_errors():\n # Let the error customize its output.\n self.add_message(\"invalid-unary-operand-type\", args=str(error), node=node)", "response": "Detect TypeErrors for unary operands."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncalls when a node is visited.", "response": "def visit_call(self, node):\n \"\"\"Called when a :class:`.astroid.node_classes.Call` node is visited.\n\n See :mod:`astroid` for the description of available nodes.\n\n :param node: The node to check.\n :type node: astroid.node_classes.Call\n \"\"\"\n if not (\n isinstance(node.func, astroid.Attribute)\n and isinstance(node.func.expr, astroid.Name)\n and 
node.func.expr.name == self.config.store_locals_indicator\n and node.func.attrname == \"create\"\n ):\n return\n in_class = node.frame()\n for param in node.args:\n in_class.locals[param.name] = node"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning an iterator on interfaces implemented by the given class node.", "response": "def interfaces(node, herited=True, handler_func=_iface_hdlr):\n \"\"\"Return an iterator on interfaces implemented by the given class node.\"\"\"\n # FIXME: what if __implements__ = (MyIFace, MyParent.__implements__)...\n try:\n implements = bases.Instance(node).getattr(\"__implements__\")[0]\n except exceptions.NotFoundError:\n return\n if not herited and implements.frame() is not node:\n return\n found = set()\n missing = False\n for iface in node_classes.unpack_infer(implements):\n if iface is astroid.Uninferable:\n missing = True\n continue\n if iface not in found and handler_func(iface):\n found.add(iface)\n yield iface\n if missing:\n raise exceptions.InferenceError()"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a Project from a list of files or modules", "response": "def project_from_files(\n files, func_wrapper=_astroid_wrapper, project_name=\"no name\", black_list=(\"CVS\",)\n):\n \"\"\"return a Project from a list of files or modules\"\"\"\n # build the project representation\n astroid_manager = manager.AstroidManager()\n project = Project(project_name)\n for something in files:\n if not os.path.exists(something):\n fpath = modutils.file_from_modpath(something.split(\".\"))\n elif os.path.isdir(something):\n fpath = os.path.join(something, \"__init__.py\")\n else:\n fpath = something\n ast = func_wrapper(astroid_manager.ast_from_file, fpath)\n if ast is None:\n continue\n # XXX why is first file defining the project.path ?\n project.path = project.path or ast.file\n project.add_module(ast)\n base_name = ast.name\n # recurse in 
package except if __init__ was explicitly given\n if ast.package and something.find(\"__init__\") == -1:\n # recurse on others packages / modules if this is a package\n for fpath in modutils.get_module_files(\n os.path.dirname(ast.file), black_list\n ):\n ast = func_wrapper(astroid_manager.ast_from_file, fpath)\n if ast is None or ast.name == base_name:\n continue\n project.add_module(ast)\n return project"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nvisit a pyreverse. utils. Project node optionally tag the node with a unique id", "response": "def visit_project(self, node):\n \"\"\"visit a pyreverse.utils.Project node\n\n * optionally tag the node with a unique id\n \"\"\"\n if self.tag:\n node.uid = self.generate_id()\n for module in node.modules:\n self.visit(module)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nvisit an astroid. Package node optionally tag the node with a unique id", "response": "def visit_package(self, node):\n \"\"\"visit an astroid.Package node\n\n * optionally tag the node with a unique id\n \"\"\"\n if self.tag:\n node.uid = self.generate_id()\n for subelmt in node.values():\n self.visit(subelmt)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nvisiting an astroid. 
Class node get the locals_type instance_attrs_type mappings set the uid attribute and build it", "response": "def visit_classdef(self, node):\n \"\"\"visit an astroid.Class node\n\n * set the locals_type and instance_attrs_type mappings\n * set the implements list and build it\n * optionally tag the node with a unique id\n \"\"\"\n if hasattr(node, \"locals_type\"):\n return\n node.locals_type = collections.defaultdict(list)\n if self.tag:\n node.uid = self.generate_id()\n # resolve ancestors\n for baseobj in node.ancestors(recurs=False):\n specializations = getattr(baseobj, \"specializations\", [])\n specializations.append(node)\n baseobj.specializations = specializations\n # resolve instance attributes\n node.instance_attrs_type = collections.defaultdict(list)\n for assignattrs in node.instance_attrs.values():\n for assignattr in assignattrs:\n self.handle_assignattr_type(assignattr, node)\n # resolve implemented interface\n try:\n node.implements = list(interfaces(node, self.inherited_interfaces))\n except astroid.InferenceError:\n node.implements = ()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nvisit an astroid. 
Function node", "response": "def visit_functiondef(self, node):\n \"\"\"visit an astroid.Function node\n\n * set the locals_type mapping\n * optionally tag the node with a unique id\n \"\"\"\n if hasattr(node, \"locals_type\"):\n return\n node.locals_type = collections.defaultdict(list)\n if self.tag:\n node.uid = self.generate_id()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef visit_assignname(self, node):\n # avoid double parsing done by different Linkers.visit\n # running over the same project:\n if hasattr(node, \"_handled\"):\n return\n node._handled = True\n if node.name in node.frame():\n frame = node.frame()\n else:\n # the name has been defined as 'global' in the frame and belongs\n # there.\n frame = node.root()\n try:\n if not hasattr(frame, \"locals_type\"):\n # If the frame doesn't have a locals_type yet,\n # it means it wasn't yet visited. Visit it now\n # to add what's missing from it.\n if isinstance(frame, astroid.ClassDef):\n self.visit_classdef(frame)\n elif isinstance(frame, astroid.FunctionDef):\n self.visit_functiondef(frame)\n else:\n self.visit_module(frame)\n\n current = frame.locals_type[node.name]\n values = set(node.infer())\n frame.locals_type[node.name] = list(set(current) | values)\n except astroid.InferenceError:\n pass", "response": "handle an astroid. AssignName node\n handle locals_type\n "} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef handle_assignattr_type(node, parent):\n try:\n values = set(node.infer())\n current = set(parent.instance_attrs_type[node.attrname])\n parent.instance_attrs_type[node.attrname] = list(current | values)\n except astroid.InferenceError:\n pass", "response": "handle an astroid. assignattr node\n\n handle an astroid. assignattr node\n\n handle instance_attrs_type\n "} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nvisit an astroid. 
Import node resolve module dependencies", "response": "def visit_import(self, node):\n \"\"\"visit an astroid.Import node\n\n resolve module dependencies\n \"\"\"\n context_file = node.root().file\n for name in node.names:\n relative = modutils.is_relative(name[0], context_file)\n self._imported_module(node, name[0], relative)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nvisits an astroid. ImportFrom node resolve module dependencies", "response": "def visit_importfrom(self, node):\n \"\"\"visit an astroid.ImportFrom node\n\n resolve module dependencies\n \"\"\"\n basename = node.modname\n context_file = node.root().file\n if context_file is not None:\n relative = modutils.is_relative(basename, context_file)\n else:\n relative = False\n for name in node.names:\n if name[0] == \"*\":\n continue\n # analyze dependencies\n fullname = \"%s.%s\" % (basename, name[0])\n if fullname.find(\".\") > -1:\n try:\n # TODO: don't use get_module_part,\n # missing package precedence\n fullname = modutils.get_module_part(fullname, context_file)\n except ImportError:\n continue\n if fullname != basename:\n self._imported_module(node, fullname, relative)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef compute_module(self, context_name, mod_path):\n package_dir = os.path.dirname(self.project.path)\n if context_name == mod_path:\n return 0\n if modutils.is_standard_module(mod_path, (package_dir,)):\n return 1\n return 0", "response": "return true if the module should be added to dependencies"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _imported_module(self, node, mod_path, relative):\n module = node.root()\n context_name = module.name\n if relative:\n mod_path = \"%s.%s\" % (\".\".join(context_name.split(\".\")[:-1]), mod_path)\n if self.compute_module(context_name, mod_path):\n # handle dependencies\n if not hasattr(module, 
\"depends\"):\n module.depends = []\n mod_paths = module.depends\n if mod_path not in mod_paths:\n mod_paths.append(mod_path)", "response": "Notify an imported module used to analyze dependencies"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef implements(obj, interface):\n kimplements = getattr(obj, \"__implements__\", ())\n if not isinstance(kimplements, (list, tuple)):\n kimplements = (kimplements,)\n for implementedinterface in kimplements:\n if issubclass(implementedinterface, interface):\n return True\n return False", "response": "Return True if the give object implements the given interface."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns the ANSI escape code corresponding to color and style", "response": "def _get_ansi_code(color=None, style=None):\n \"\"\"return ansi escape code corresponding to color and style\n\n :type color: str or None\n :param color:\n the color name (see `ANSI_COLORS` for available values)\n or the color number when 256 colors are available\n\n :type style: str or None\n :param style:\n style string (see `ANSI_COLORS` for available values). 
To get\n several style effects at the same time, use a coma as separator.\n\n :raise KeyError: if an unexistent color or style identifier is given\n\n :rtype: str\n :return: the built escape code\n \"\"\"\n ansi_code = []\n if style:\n style_attrs = utils._splitstrip(style)\n for effect in style_attrs:\n ansi_code.append(ANSI_STYLES[effect])\n if color:\n if color.isdigit():\n ansi_code.extend([\"38\", \"5\"])\n ansi_code.append(color)\n else:\n ansi_code.append(ANSI_COLORS[color])\n if ansi_code:\n return ANSI_PREFIX + \";\".join(ansi_code) + ANSI_END\n return \"\""} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nregister the reporter classes with the linter.", "response": "def register(linter):\n \"\"\"Register the reporter classes with the linter.\"\"\"\n linter.register_reporter(TextReporter)\n linter.register_reporter(ParseableTextReporter)\n linter.register_reporter(VSTextReporter)\n linter.register_reporter(ColorizedTextReporter)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef handle_message(self, msg):\n if msg.module not in self._modules:\n if msg.module:\n self.writeln(\"************* Module %s\" % msg.module)\n self._modules.add(msg.module)\n else:\n self.writeln(\"************* \")\n self.write_message(msg)", "response": "manage message of different type and in the context of path"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef handle_message(self, msg):\n if msg.module not in self._modules:\n color, style = self._get_decoration(\"S\")\n if msg.module:\n modsep = colorize_ansi(\n \"************* Module %s\" % msg.module, color, style\n )\n else:\n modsep = colorize_ansi(\"************* %s\" % msg.module, color, style)\n self.writeln(modsep)\n self._modules.add(msg.module)\n color, style = self._get_decoration(msg.C)\n\n msg = msg._replace(\n **{\n attr: colorize_ansi(getattr(msg, 
attr), color, style)\n for attr in (\"msg\", \"symbol\", \"category\", \"C\")\n }\n )\n self.write_message(msg)", "response": "manage message of different types and colorize output\n using ansi escape codes"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nopening a vcg graph", "response": "def open_graph(self, **args):\n \"\"\"open a vcg graph\n \"\"\"\n self._stream.write(\"%sgraph:{\\n\" % self._indent)\n self._inc_indent()\n self._write_attributes(GRAPH_ATTRS, **args)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndrawing an edge from a node to another.", "response": "def edge(self, from_node, to_node, edge_type=\"\", **args):\n \"\"\"draw an edge from a node to another.\n \"\"\"\n self._stream.write(\n '%s%sedge: {sourcename:\"%s\" targetname:\"%s\"'\n % (self._indent, edge_type, from_node, to_node)\n )\n self._write_attributes(EDGE_ATTRS, **args)\n self._stream.write(\"}\\n\")"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _write_attributes(self, attributes_dict, **args):\n for key, value in args.items():\n try:\n _type = attributes_dict[key]\n except KeyError:\n raise Exception(\n \"\"\"no such attribute %s\npossible attributes are %s\"\"\"\n % (key, attributes_dict.keys())\n )\n\n if not _type:\n self._stream.write('%s%s:\"%s\"\\n' % (self._indent, key, value))\n elif _type == 1:\n self._stream.write(\"%s%s:%s\\n\" % (self._indent, key, int(value)))\n elif value in _type:\n self._stream.write(\"%s%s:%s\\n\" % (self._indent, key, value))\n else:\n raise Exception(\n \"\"\"value %s isn\\'t correct for attribute %s\ncorrect values are %s\"\"\"\n % (value, key, _type)\n )", "response": "write graph node or edge attributes"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function that,\ngiven a list of format specifiers, returns the final access path", "response": "def get_access_path(key, parts):\n \"\"\" Given a 
list of format specifiers, returns\n the final access path (e.g. a.b.c[0][1]).\n \"\"\"\n path = []\n for is_attribute, specifier in parts:\n if is_attribute:\n path.append(\".{}\".format(specifier))\n else:\n path.append(\"[{!r}]\".format(specifier))\n return str(key) + \"\".join(path)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef register(linter):\n linter.register_checker(StringFormatChecker(linter))\n linter.register_checker(StringConstantChecker(linter))", "response": "required method to auto register this checker"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef str_eval(token):\n if token[0:2].lower() in (\"fr\", \"rf\"):\n token = token[2:]\n elif token[0].lower() in (\"r\", \"u\", \"f\"):\n token = token[1:]\n if token[0:3] in ('\"\"\"', \"'''\"):\n return token[3:-3]\n return token[1:-1]", "response": "Return a string from a token."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _check_new_format(self, node, func):\n # TODO: skip (for now) format nodes which don't have\n # an explicit string on the left side of the format operation.\n # We do this because our inference engine can't properly handle\n # redefinitions of the original string.\n # For more details, see issue 287.\n #\n # Note that there may not be any left side at all, if the format method\n # has been assigned to another variable. See issue 351. 
For example:\n #\n # fmt = 'some string {}'.format\n # fmt('arg')\n if isinstance(node.func, astroid.Attribute) and not isinstance(\n node.func.expr, astroid.Const\n ):\n return\n if node.starargs or node.kwargs:\n return\n try:\n strnode = next(func.bound.infer())\n except astroid.InferenceError:\n return\n if not (isinstance(strnode, astroid.Const) and isinstance(strnode.value, str)):\n return\n try:\n call_site = CallSite.from_call(node)\n except astroid.InferenceError:\n return\n\n try:\n fields, num_args, manual_pos = utils.parse_format_method_string(\n strnode.value\n )\n except utils.IncompleteFormatString:\n self.add_message(\"bad-format-string\", node=node)\n return\n\n positional_arguments = call_site.positional_arguments\n named_arguments = call_site.keyword_arguments\n named_fields = {field[0] for field in fields if isinstance(field[0], str)}\n if num_args and manual_pos:\n self.add_message(\"format-combined-specification\", node=node)\n return\n\n check_args = False\n # Consider \"{[0]} {[1]}\" as num_args.\n num_args += sum(1 for field in named_fields if field == \"\")\n if named_fields:\n for field in named_fields:\n if field and field not in named_arguments:\n self.add_message(\n \"missing-format-argument-key\", node=node, args=(field,)\n )\n for field in named_arguments:\n if field not in named_fields:\n self.add_message(\n \"unused-format-string-argument\", node=node, args=(field,)\n )\n # num_args can be 0 if manual_pos is not.\n num_args = num_args or manual_pos\n if positional_arguments or num_args:\n empty = any(True for field in named_fields if field == \"\")\n if named_arguments or empty:\n # Verify the required number of positional arguments\n # only if the .format got at least one keyword argument.\n # This means that the format strings accepts both\n # positional and named fields and we should warn\n # when one of the them is missing or is extra.\n check_args = True\n else:\n check_args = True\n if check_args:\n # num_args can be 0 if 
manual_pos is not.\n num_args = num_args or manual_pos\n if len(positional_arguments) > num_args:\n self.add_message(\"too-many-format-args\", node=node)\n elif len(positional_arguments) < num_args:\n self.add_message(\"too-few-format-args\", node=node)\n\n self._detect_vacuous_formatting(node, positional_arguments)\n self._check_new_format_specifiers(node, fields, named_arguments)", "response": "Check the new string formatting."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck if the format of the object is correct.", "response": "def _check_new_format_specifiers(self, node, fields, named):\n \"\"\"\n Check attribute and index access in the format\n string (\"{0.a}\" and \"{0[a]}\").\n \"\"\"\n for key, specifiers in fields:\n # Obtain the argument. If it can't be obtained\n # or infered, skip this check.\n if key == \"\":\n # {[0]} will have an unnamed argument, defaulting\n # to 0. It will not be present in `named`, so use the value\n # 0 for it.\n key = 0\n if isinstance(key, numbers.Number):\n try:\n argname = utils.get_argument_from_call(node, key)\n except utils.NoSuchArgumentError:\n continue\n else:\n if key not in named:\n continue\n argname = named[key]\n if argname in (astroid.Uninferable, None):\n continue\n try:\n argument = next(argname.infer())\n except astroid.InferenceError:\n continue\n if not specifiers or argument is astroid.Uninferable:\n # No need to check this key if it doesn't\n # use attribute / item access\n continue\n if argument.parent and isinstance(argument.parent, astroid.Arguments):\n # Ignore any object coming from an argument,\n # because we can't infer its value properly.\n continue\n previous = argument\n parsed = []\n for is_attribute, specifier in specifiers:\n if previous is astroid.Uninferable:\n break\n parsed.append((is_attribute, specifier))\n if is_attribute:\n try:\n previous = previous.getattr(specifier)[0]\n except astroid.NotFoundError:\n if (\n hasattr(previous, 
\"has_dynamic_getattr\")\n and previous.has_dynamic_getattr()\n ):\n # Don't warn if the object has a custom __getattr__\n break\n path = get_access_path(key, parsed)\n self.add_message(\n \"missing-format-attribute\",\n args=(specifier, path),\n node=node,\n )\n break\n else:\n warn_error = False\n if hasattr(previous, \"getitem\"):\n try:\n previous = previous.getitem(astroid.Const(specifier))\n except (\n astroid.AstroidIndexError,\n astroid.AstroidTypeError,\n astroid.AttributeInferenceError,\n ):\n warn_error = True\n except astroid.InferenceError:\n break\n if previous is astroid.Uninferable:\n break\n else:\n try:\n # Lookup __getitem__ in the current node,\n # but skip further checks, because we can't\n # retrieve the looked object\n previous.getattr(\"__getitem__\")\n break\n except astroid.NotFoundError:\n warn_error = True\n if warn_error:\n path = get_access_path(key, parsed)\n self.add_message(\n \"invalid-format-index\", args=(specifier, path), node=node\n )\n break\n\n try:\n previous = next(previous.infer())\n except astroid.InferenceError:\n # can't check further if we can't infer it\n break"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef process_non_raw_string_token(self, prefix, string_body, start_row):\n # Walk through the string; if we see a backslash then escape the next\n # character, and skip over it. 
If we see a non-escaped character,\n # alert, and continue.\n #\n # Accept a backslash when it escapes a backslash, or a quote, or\n # end-of-line, or one of the letters that introduce a special escape\n # sequence \n #\n # TODO(mbp): Maybe give a separate warning about the rarely-used\n # \\a \\b \\v \\f?\n #\n # TODO(mbp): We could give the column of the problem character, but\n # add_message doesn't seem to have a way to pass it through at present.\n i = 0\n while True:\n i = string_body.find(\"\\\\\", i)\n if i == -1:\n break\n # There must be a next character; having a backslash at the end\n # of the string would be a SyntaxError.\n next_char = string_body[i + 1]\n match = string_body[i : i + 2]\n if next_char in self.UNICODE_ESCAPE_CHARACTERS:\n if \"u\" in prefix:\n pass\n elif (_PY3K or self._unicode_literals) and \"b\" not in prefix:\n pass # unicode by default\n else:\n self.add_message(\n \"anomalous-unicode-escape-in-string\",\n line=start_row,\n args=(match,),\n )\n elif next_char not in self.ESCAPE_CHARACTERS:\n self.add_message(\n \"anomalous-backslash-in-string\", line=start_row, args=(match,)\n )\n # Whether it was a valid escape or not, backslash followed by\n # another character can always be consumed whole: the second\n # character can never be the start of a new backslash escape.\n i += 2", "response": "Process a non - raw string token."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_section(self, layout):\n self.section += 1\n self.writeln()\n self.format_children(layout)\n self.section -= 1\n self.writeln()", "response": "display a section as text\n "} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef visit_evaluationsection(self, layout):\n self.section += 1\n self.format_children(layout)\n self.section -= 1\n self.writeln()", "response": "Display an evaluation section as a text."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 
function for\ndisplaying a table as text", "response": "def visit_table(self, layout):\n \"\"\"display a table as text\"\"\"\n table_content = self.get_table_content(layout)\n # get columns width\n cols_width = [0] * len(table_content[0])\n for row in table_content:\n for index, col in enumerate(row):\n cols_width[index] = max(cols_width[index], len(col))\n self.default_table(layout, table_content, cols_width)\n self.writeln()"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_verbatimtext(self, layout):\n self.writeln(\"::\\n\")\n for line in layout.data.splitlines():\n self.writeln(\" \" + line)\n self.writeln()", "response": "display a verbatim text layout as text"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef add_renamed_message(self, old_id, old_symbol, new_symbol):\n message_definition = self.get_message_definitions(new_symbol)[0]\n message_definition.old_names.append((old_id, old_symbol))\n self._register_alternative_name(message_definition, old_id, old_symbol)", "response": "Register the old ID and symbol for a warning that was renamed."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef register_messages_from_checker(self, checker):\n checker.check_consistency()\n for message in checker.messages:\n self.register_message(message)", "response": "Register all messages from a checker."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nregisters a message with consistency in mind.", "response": "def register_message(self, message):\n \"\"\"Register a MessageDefinition with consistency in mind.\n\n :param MessageDefinition message: The message definition being added.\n \"\"\"\n self._check_id_and_symbol_consistency(message.msgid, message.symbol)\n self._check_symbol(message.msgid, message.symbol)\n self._check_msgid(message.msgid, 
message.symbol)\n for old_name in message.old_names:\n self._check_symbol(message.msgid, old_name[1])\n self._messages_definitions[message.symbol] = message\n self._register_alternative_name(message, message.msgid, message.symbol)\n for old_id, old_symbol in message.old_names:\n self._register_alternative_name(message, old_id, old_symbol)\n self._msgs_by_category[message.msgid[0]].append(message.msgid)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck that a symbol is not already used.", "response": "def _check_symbol(self, msgid, symbol):\n \"\"\"Check that a symbol is not already used. \"\"\"\n other_message = self._messages_definitions.get(symbol)\n if other_message:\n self._raise_duplicate_msg_id(symbol, msgid, other_message.msgid)\n else:\n alternative_msgid = None\n alternative_message = self._alternative_names.get(symbol)\n if alternative_message:\n if alternative_message.symbol == symbol:\n alternative_msgid = alternative_message.msgid\n else:\n for old_msgid, old_symbol in alternative_message.old_names:\n if old_symbol == symbol:\n alternative_msgid = old_msgid\n break\n if msgid != alternative_msgid:\n self._raise_duplicate_msg_id(symbol, msgid, alternative_msgid)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _raise_duplicate_symbol(msgid, symbol, other_symbol):\n symbols = [symbol, other_symbol]\n symbols.sort()\n error_message = \"Message id '{msgid}' cannot have both \".format(msgid=msgid)\n error_message += \"'{other_symbol}' and '{symbol}' as symbolic name.\".format(\n other_symbol=symbols[0], symbol=symbols[1]\n )\n raise InvalidMessageError(error_message)", "response": "Raise an error when a symbol is duplicated."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _raise_duplicate_msg_id(symbol, msgid, other_msgid):\n msgids = [msgid, other_msgid]\n msgids.sort()\n error_message = \"Message symbol 
'{symbol}' cannot be used for \".format(\n symbol=symbol\n )\n error_message += \"'{other_msgid}' and '{msgid}' at the same time.\".format(\n other_msgid=msgids[0], msgid=msgids[1]\n )\n raise InvalidMessageError(error_message)", "response": "Raise an error when a msgid is duplicated."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the list of Message Definitions corresponding to the given message id or symbol.", "response": "def get_message_definitions(self, msgid_or_symbol: str) -> list:\n \"\"\"Returns the Message object for this message.\n\n :param str msgid_or_symbol: msgid_or_symbol may be either a numeric or symbolic id.\n :raises UnknownMessageError: if the message id is not defined.\n :rtype: List of MessageDefinition\n :return: A message definition corresponding to msgid_or_symbol\n \"\"\"\n if msgid_or_symbol[1:].isdigit():\n msgid_or_symbol = msgid_or_symbol.upper()\n for source in (self._alternative_names, self._messages_definitions):\n try:\n return [source[msgid_or_symbol]]\n except KeyError:\n pass\n error_msg = \"No such message id or symbol '{msgid_or_symbol}'.\".format(\n msgid_or_symbol=msgid_or_symbol\n )\n raise UnknownMessageError(error_msg)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngenerating a user-consumable representation of a message.", "response": "def get_msg_display_string(self, msgid):\n \"\"\"Generates a user-consumable representation of a message.\n\n Can be just the message ID or the ID and the symbol.\n \"\"\"\n message_definitions = self.get_message_definitions(msgid)\n if len(message_definitions) == 1:\n return repr(message_definitions[0].symbol)\n return repr([md.symbol for md in message_definitions])"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndisplays the help messages for the given message identifiers", "response": "def help_message(self, msgids):\n \"\"\"Display help messages for the 
given message identifiers\"\"\"\n for msgid in msgids:\n try:\n for message_definition in self.get_message_definitions(msgid):\n print(message_definition.format_help(checkerref=True))\n print(\"\")\n except UnknownMessageError as ex:\n print(ex)\n print(\"\")\n continue"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\noutputting full messages list documentation in ReST format.", "response": "def list_messages(self):\n \"\"\"Output full messages list documentation in ReST format. \"\"\"\n messages = sorted(self._messages_definitions.values(), key=lambda m: m.msgid)\n for message in messages:\n if not message.may_be_emitted():\n continue\n print(message.format_help(checkerref=False))\n print(\"\")"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nrequiring method to auto register this checker.", "response": "def register(linter):\n \"\"\"Required method to auto register this checker.\n\n :param linter: Main interface object for Pylint plugins\n :type linter: Pylint object\n \"\"\"\n warnings.warn(\n \"This plugin is deprecated, use pylint.extensions.docparams instead.\",\n DeprecationWarning,\n )\n linter.register_checker(docparams.DocstringParameterChecker(linter))"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\noutput full documentation in ReST format for all extension modules", "response": "def builder_inited(app):\n \"\"\"Output full documentation in ReST format for all extension modules\"\"\"\n # PACKAGE/docs/exts/pylint_extensions.py --> PACKAGE/\n base_path = os.path.dirname(\n os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\n )\n # PACKAGE/ --> PACKAGE/pylint/extensions\n ext_path = os.path.join(base_path, \"pylint\", \"extensions\")\n modules = []\n doc_files = {}\n for filename in os.listdir(ext_path):\n name, ext = os.path.splitext(filename)\n if name[0] == \"_\" or name in DEPRECATED_MODULES:\n continue\n if ext == \".py\":\n 
modules.append(\"pylint.extensions.%s\" % name)\n elif ext == \".rst\":\n doc_files[\"pylint.extensions.\" + name] = os.path.join(ext_path, filename)\n modules.sort()\n if not modules:\n sys.exit(\"No Pylint extensions found?\")\n\n linter = PyLinter()\n linter.load_plugin_modules(modules)\n\n extensions_doc = os.path.join(\n base_path, \"doc\", \"technical_reference\", \"extensions.rst\"\n )\n with open(extensions_doc, \"w\") as stream:\n stream.write(\"Optional Pylint checkers in the extensions module\\n\")\n stream.write(\"=================================================\\n\\n\")\n stream.write(\"Pylint provides the following optional plugins:\\n\\n\")\n for module in modules:\n stream.write(\"- :ref:`{}`\\n\".format(module))\n stream.write(\"\\n\")\n stream.write(\n \"You can activate any or all of these extensions \"\n \"by adding a ``load-plugins`` line to the ``MASTER`` \"\n \"section of your ``.pylintrc``, for example::\\n\"\n )\n stream.write(\n \"\\n load-plugins=pylint.extensions.docparams,\"\n \"pylint.extensions.docstyle\\n\\n\"\n )\n by_module = get_plugins_info(linter, doc_files)\n for module, info in sorted(by_module.items()):\n linter._print_checker_doc(info[\"name\"], info, stream=stream)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the number of CPU cores available for virtualized or containerized environments.", "response": "def _cpu_count() -> int:\n \"\"\"Use sched_affinity if available for virtualized or containerized environments.\"\"\"\n sched_getaffinity = getattr(os, \"sched_getaffinity\", None)\n # pylint: disable=not-callable,using-constant-test\n if sched_getaffinity:\n return len(sched_getaffinity(0))\n if multiprocessing:\n return multiprocessing.cpu_count()\n return 1"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nmakes total errors / warnings report", "response": "def report_total_messages_stats(sect, stats, previous_stats):\n \"\"\"make total 
errors / warnings report\"\"\"\n lines = [\"type\", \"number\", \"previous\", \"difference\"]\n lines += checkers.table_lines_from_stats(\n stats, previous_stats, (\"convention\", \"refactor\", \"warning\", \"error\")\n )\n sect.append(report_nodes.Table(children=lines, cols=4, rheaders=1))"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef report_messages_stats(sect, stats, _):\n if not stats[\"by_msg\"]:\n # don't print this report when we didn't detected any errors\n raise exceptions.EmptyReportError()\n in_order = sorted(\n [\n (value, msg_id)\n for msg_id, value in stats[\"by_msg\"].items()\n if not msg_id.startswith(\"I\")\n ]\n )\n in_order.reverse()\n lines = (\"message id\", \"occurrences\")\n for value, msg_id in in_order:\n lines += (msg_id, str(value))\n sect.append(report_nodes.Table(children=lines, cols=2, rheaders=1))", "response": "make messages type report"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef report_messages_by_module_stats(sect, stats, _):\n if len(stats[\"by_module\"]) == 1:\n # don't print this report when we are analysing a single module\n raise exceptions.EmptyReportError()\n by_mod = collections.defaultdict(dict)\n for m_type in (\"fatal\", \"error\", \"warning\", \"refactor\", \"convention\"):\n total = stats[m_type]\n for module in stats[\"by_module\"].keys():\n mod_total = stats[\"by_module\"][module][m_type]\n if total == 0:\n percent = 0\n else:\n percent = float((mod_total) * 100) / total\n by_mod[module][m_type] = percent\n sorted_result = []\n for module, mod_info in by_mod.items():\n sorted_result.append(\n (\n mod_info[\"error\"],\n mod_info[\"warning\"],\n mod_info[\"refactor\"],\n mod_info[\"convention\"],\n module,\n )\n )\n sorted_result.sort()\n sorted_result.reverse()\n lines = [\"module\", \"error\", \"warning\", \"refactor\", \"convention\"]\n for line in sorted_result:\n # Don't report clean modules.\n 
if all(entry == 0 for entry in line[:-1]):\n continue\n lines.append(line[-1])\n for val in line[:-1]:\n lines.append(\"%.2f\" % val)\n if len(lines) == 5:\n raise exceptions.EmptyReportError()\n sect.append(report_nodes.Table(children=lines, cols=5, rheaders=1))", "response": "make errors / warnings by modules report"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nprocessing some options in the sequence of key = value pairs", "response": "def preprocess_options(args, search_for):\n \"\"\"look for some options (keys of ) which have to be processed\n before others\n\n values of are callback functions to call when the option is\n found\n \"\"\"\n i = 0\n while i < len(args):\n arg = args[i]\n if arg.startswith(\"--\"):\n try:\n option, val = arg[2:].split(\"=\", 1)\n except ValueError:\n option, val = arg[2:], None\n try:\n cb, takearg = search_for[option]\n except KeyError:\n i += 1\n else:\n del args[i]\n if takearg and val is None:\n if i >= len(args) or args[i].startswith(\"-\"):\n msg = \"Option %s expects a value\" % option\n raise ArgumentPreprocessingError(msg)\n val = args[i]\n del args[i]\n elif not takearg and val is not None:\n msg = \"Option %s doesn't expects a value\" % option\n raise ArgumentPreprocessingError(msg)\n cb(option, val)\n else:\n i += 1"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nfixing sys. 
path for running the linter checks.", "response": "def fix_import_path(args):\n \"\"\"Prepare sys.path for running the linter checks.\n\n Within this context, each of the given arguments is importable.\n Paths are added to sys.path in corresponding order to the arguments.\n We avoid adding duplicate directories to sys.path.\n `sys.path` is reset to its original value upon exiting this context.\n \"\"\"\n orig = list(sys.path)\n changes = []\n for arg in args:\n path = _get_python_path(arg)\n if path in changes:\n continue\n else:\n changes.append(path)\n sys.path[:] = changes + [\".\"] + sys.path\n try:\n yield\n finally:\n sys.path[:] = orig"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef load_plugin_modules(self, modnames):\n for modname in modnames:\n if modname in self._dynamic_plugins:\n continue\n self._dynamic_plugins.add(modname)\n module = modutils.load_module_from_name(modname)\n module.register(self)", "response": "load a list of module names which are pylint plugins and load\n and register them"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef load_plugin_configuration(self):\n for modname in self._dynamic_plugins:\n module = modutils.load_module_from_name(modname)\n if hasattr(module, \"load_configuration\"):\n module.load_configuration(self)", "response": "Call the load_configuration hook for plugins\n\n "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\noverriding from config.OptionsProviderMixin to handle some special options", "response": "def set_option(self, optname, value, action=None, optdict=None):\n \"\"\"overridden from config.OptionsProviderMixin to handle some\n special options\n \"\"\"\n if optname in self._options_methods or optname in self._bw_options_methods:\n if value:\n try:\n meth = self._options_methods[optname]\n except KeyError:\n meth = self._bw_options_methods[optname]\n warnings.warn(\n \"%s 
is deprecated, replace it by %s\"\n % (optname, optname.split(\"-\")[0]),\n DeprecationWarning,\n )\n value = utils._check_csv(value)\n if isinstance(value, (list, tuple)):\n for _id in value:\n meth(_id, ignore_unknown=True)\n else:\n meth(value)\n return # no need to call set_option, disable/enable methods do it\n elif optname == \"output-format\":\n self._reporter_name = value\n # If the reporters are already available, load\n # the reporter class.\n if self._reporters:\n self._load_reporter()\n\n try:\n checkers.BaseTokenChecker.set_option(self, optname, value, action, optdict)\n except config.UnsupportedAction:\n print(\"option %s can't be read from config file\" % optname, file=sys.stderr)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef register_checker(self, checker):\n assert checker.priority <= 0, \"checker priority can't be >= 0\"\n self._checkers[checker.name].append(checker)\n for r_id, r_title, r_cb in checker.reports:\n self.register_report(r_id, r_title, r_cb, checker)\n self.register_options_provider(checker)\n if hasattr(checker, \"msgs\"):\n self.msgs_store.register_messages_from_checker(checker)\n checker.load_defaults()\n\n # Register the checker, but disable all of its messages.\n # TODO(cpopa): we should have a better API for this.\n if not getattr(checker, \"enabled\", True):\n self.disable(checker.name)", "response": "register a new checker"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef error_mode(self):\n self._error_mode = True\n self.disable_noerror_messages()\n self.disable(\"miscellaneous\")\n if self._python3_porting_mode:\n self.disable(\"all\")\n for msg_id in self._checker_messages(\"python3\"):\n if msg_id.startswith(\"E\"):\n self.enable(msg_id)\n config_parser = self.cfgfile_parser\n if config_parser.has_option(\"MESSAGES CONTROL\", \"disable\"):\n value = config_parser.get(\"MESSAGES CONTROL\", 
\"disable\")\n self.global_set_option(\"disable\", value)\n else:\n self.disable(\"python3\")\n self.set_option(\"reports\", False)\n self.set_option(\"persistent\", False)\n self.set_option(\"score\", False)", "response": "error mode enable only errors ; no reports no persistent"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef python3_porting_mode(self):\n self.disable(\"all\")\n self.enable(\"python3\")\n if self._error_mode:\n # The error mode was activated, using the -E flag.\n # So we'll need to enable only the errors from the\n # Python 3 porting checker.\n for msg_id in self._checker_messages(\"python3\"):\n if msg_id.startswith(\"E\"):\n self.enable(msg_id)\n else:\n self.disable(msg_id)\n config_parser = self.cfgfile_parser\n if config_parser.has_option(\"MESSAGES CONTROL\", \"disable\"):\n value = config_parser.get(\"MESSAGES CONTROL\", \"disable\")\n self.global_set_option(\"disable\", value)\n self._python3_porting_mode = True", "response": "Disable all other checkers and enable Python 3 warnings."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nprocess tokens from the current module to search for module level options and block level options", "response": "def process_tokens(self, tokens):\n \"\"\"process tokens from the current module to search for module/block\n level options\n \"\"\"\n control_pragmas = {\"disable\", \"enable\"}\n for (tok_type, content, start, _, _) in tokens:\n if tok_type != tokenize.COMMENT:\n continue\n match = OPTION_RGX.search(content)\n if match is None:\n continue\n\n first_group = match.group(1)\n if (\n first_group.strip() == \"disable-all\"\n or first_group.strip() == \"skip-file\"\n ):\n if first_group.strip() == \"disable-all\":\n self.add_message(\n \"deprecated-pragma\",\n line=start[0],\n args=(\"disable-all\", \"skip-file\"),\n )\n self.add_message(\"file-ignored\", line=start[0])\n self._ignore_file = True\n return\n try:\n opt, value 
= first_group.split(\"=\", 1)\n except ValueError:\n self.add_message(\n \"bad-inline-option\", args=first_group.strip(), line=start[0]\n )\n continue\n opt = opt.strip()\n if opt in self._options_methods or opt in self._bw_options_methods:\n try:\n meth = self._options_methods[opt]\n except KeyError:\n meth = self._bw_options_methods[opt]\n # found a \"(dis|en)able-msg\" pragma deprecated suppression\n self.add_message(\n \"deprecated-pragma\",\n line=start[0],\n args=(opt, opt.replace(\"-msg\", \"\")),\n )\n for msgid in utils._splitstrip(value):\n # Add the line where a control pragma was encountered.\n if opt in control_pragmas:\n self._pragma_lineno[msgid] = start[0]\n\n try:\n if (opt, msgid) == (\"disable\", \"all\"):\n self.add_message(\n \"deprecated-pragma\",\n line=start[0],\n args=(\"disable=all\", \"skip-file\"),\n )\n self.add_message(\"file-ignored\", line=start[0])\n self._ignore_file = True\n return\n meth(msgid, \"module\", start[0])\n except exceptions.UnknownMessageError:\n self.add_message(\"bad-option-value\", args=msgid, line=start[0])\n else:\n self.add_message(\"unrecognized-inline-option\", args=opt, line=start[0])"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_checkers(self):\n return [self] + [\n c\n for _checkers in self._checkers.values()\n for c in _checkers\n if c is not self\n ]", "response": "return all available checkers as a list"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets all the checker names that this linter knows about.", "response": "def get_checker_names(self):\n \"\"\"Get all the checker names that this linter knows about.\"\"\"\n current_checkers = self.get_checkers()\n return sorted(\n {check.name for check in current_checkers if check.name != \"master\"}\n )"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef prepare_checkers(self):\n if not 
self.config.reports:\n self.disable_reporters()\n # get needed checkers\n neededcheckers = [self]\n for checker in self.get_checkers()[1:]:\n messages = {msg for msg in checker.msgs if self.is_message_enabled(msg)}\n if messages or any(self.report_is_enabled(r[0]) for r in checker.reports):\n neededcheckers.append(checker)\n # Sort checkers by priority\n neededcheckers = sorted(\n neededcheckers, key=operator.attrgetter(\"priority\"), reverse=True\n )\n return neededcheckers", "response": "return checkers needed for activated messages and reports"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef check(self, files_or_modules):\n # initialize msgs_state now that all messages have been registered into\n # the store\n for msg in self.msgs_store.messages:\n if not msg.may_be_emitted():\n self._msgs_state[msg.msgid] = False\n\n if not isinstance(files_or_modules, (list, tuple)):\n files_or_modules = (files_or_modules,)\n\n if self.config.jobs == 1:\n self._do_check(files_or_modules)\n else:\n self._parallel_check(files_or_modules)", "response": "main checking entry: check a list of files or modules from their\n name."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef expand_files(self, modules):\n result, errors = utils.expand_modules(\n modules, self.config.black_list, self.config.black_list_re\n )\n for error in errors:\n message = modname = error[\"mod\"]\n key = error[\"key\"]\n self.set_current_module(modname)\n if key == \"fatal\":\n message = str(error[\"ex\"]).replace(os.getcwd() + os.sep, \"\")\n self.add_message(key, args=message)\n return result", "response": "get modules and errors from a list of modules and handle errors\n "} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef set_current_module(self, modname, filepath=None):\n if not modname and filepath is None:\n return\n 
self.reporter.on_set_current_module(modname, filepath)\n self.current_name = modname\n self.current_file = filepath or modname\n self.stats[\"by_module\"][modname] = {}\n self.stats[\"by_module\"][modname][\"statement\"] = 0\n for msg_cat in MSG_TYPES.values():\n self.stats[\"by_module\"][modname][msg_cat] = 0", "response": "set the name of the currently analyzed module and init statistics for it\n "} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_ast(self, filepath, modname):\n try:\n return MANAGER.ast_from_file(filepath, modname, source=True)\n except astroid.AstroidSyntaxError as ex:\n # pylint: disable=no-member\n self.add_message(\n \"syntax-error\", line=getattr(ex.error, \"lineno\", 0), args=str(ex.error)\n )\n except astroid.AstroidBuildingException as ex:\n self.add_message(\"parse-error\", args=ex)\n except Exception as ex:\n import traceback\n\n traceback.print_exc()\n self.add_message(\"astroid-error\", args=(ex.__class__, ex))", "response": "return an ast representation for a module"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef check_astroid_module(self, ast_node, walker, rawcheckers, tokencheckers):\n try:\n tokens = utils.tokenize_module(ast_node)\n except tokenize.TokenError as ex:\n self.add_message(\"syntax-error\", line=ex.args[1][0], args=ex.args[0])\n return None\n\n if not ast_node.pure_python:\n self.add_message(\"raw-checker-failed\", args=ast_node.name)\n else:\n # assert astroid.file.endswith('.py')\n # invoke ITokenChecker interface on self to fetch module/block\n # level options\n self.process_tokens(tokens)\n if self._ignore_file:\n return False\n # walk ast to collect line numbers\n self.file_state.collect_block_lines(self.msgs_store, ast_node)\n # run raw and tokens checkers\n for checker in rawcheckers:\n checker.process_module(ast_node)\n for checker in tokencheckers:\n 
checker.process_tokens(tokens)\n # generate events to astroid checkers\n walker.walk(ast_node)\n return True", "response": "Check a module from its astroid representation."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef open(self):\n self.stats = {\"by_module\": {}, \"by_msg\": {}}\n MANAGER.always_load_extensions = self.config.unsafe_load_any_extension\n MANAGER.max_inferable_values = self.config.limit_inference_results\n MANAGER.extension_package_whitelist.update(self.config.extension_pkg_whitelist)\n for msg_cat in MSG_TYPES.values():\n self.stats[msg_cat] = 0", "response": "initialize counters for the current locale"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef generate_reports(self):\n # Display whatever messages are left on the reporter.\n self.reporter.display_messages(report_nodes.Section())\n\n if self.file_state.base_name is not None:\n # load previous results if any\n previous_stats = config.load_results(self.file_state.base_name)\n # XXX code below needs refactoring to be more reporter agnostic\n self.reporter.on_close(self.stats, previous_stats)\n if self.config.reports:\n sect = self.make_reports(self.stats, previous_stats)\n else:\n sect = report_nodes.Section()\n\n if self.config.reports:\n self.reporter.display_reports(sect)\n self._report_evaluation()\n # save results if persistent run\n if self.config.persistent:\n config.save_results(self.stats, self.file_state.base_name)\n else:\n self.reporter.on_close(self.stats, {})", "response": "Generate reports for the current package."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nmake the global evaluation report", "response": "def _report_evaluation(self):\n \"\"\"make the global evaluation report\"\"\"\n # check with at least check 1 statements (usually 0 when there is a\n # syntax error preventing pylint from further 
processing)\n previous_stats = config.load_results(self.file_state.base_name)\n if self.stats[\"statement\"] == 0:\n return\n\n # get a global note for the code\n evaluation = self.config.evaluation\n try:\n note = eval(evaluation, {}, self.stats) # pylint: disable=eval-used\n except Exception as ex:\n msg = \"An exception occurred while rating: %s\" % ex\n else:\n self.stats[\"global_note\"] = note\n msg = \"Your code has been rated at %.2f/10\" % note\n pnote = previous_stats.get(\"global_note\")\n if pnote is not None:\n msg += \" (previous run: %.2f/10, %+.2f)\" % (pnote, note - pnote)\n\n if self.config.score:\n sect = report_nodes.EvaluationSection(msg)\n self.reporter.display_reports(sect)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef cb_add_plugins(self, name, value):\n self._plugins.extend(utils._splitstrip(value))", "response": "callback for adding plugins to the list of plugins to load"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef cb_generate_config(self, *args, **kwargs):\n self.linter.generate_config(skipsections=(\"COMMANDS\",))\n sys.exit(0)", "response": "optik callback for sample config file generation"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef cb_list_groups(self, *args, **kwargs):\n for check in self.linter.get_checker_names():\n print(check)\n sys.exit(0)", "response": "List all the check groups that pylint knows about."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nwrap the text on the given line length.", "response": "def normalize_text(text, line_len=80, indent=\"\"):\n \"\"\"Wrap the text on the given line length.\"\"\"\n return \"\\n\".join(\n textwrap.wrap(\n text, width=line_len, initial_indent=indent, subsequent_indent=indent\n )\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef 
get_module_and_frameid(node):\n frame = node.frame()\n module, obj = \"\", []\n while frame:\n if isinstance(frame, Module):\n module = frame.name\n else:\n obj.append(getattr(frame, \"name\", \"\"))\n try:\n frame = frame.parent.frame()\n except AttributeError:\n frame = None\n obj.reverse()\n return module, \".\".join(obj)", "response": "return the module name and the frame id in the module"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndecoding line from encoding or decode with default encoding", "response": "def safe_decode(line, encoding, *args, **kwargs):\n \"\"\"return decoded line from encoding or decode with default encoding\"\"\"\n try:\n return line.decode(encoding or sys.getdefaultencoding(), *args, **kwargs)\n except LookupError:\n return line.decode(sys.getdefaultencoding(), *args, **kwargs)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndetermining if the basename is matched in a regex.", "response": "def _basename_in_blacklist_re(base_name, black_list_re):\n \"\"\"Determines if the basename is matched in a regex blacklist\n\n :param str base_name: The basename of the file\n :param list black_list_re: A collection of regex patterns to match against.\n Successful matches are blacklisted.\n\n :returns: `True` if the basename is blacklisted, `False` otherwise.\n :rtype: bool\n \"\"\"\n for file_pattern in black_list_re:\n if file_pattern.match(base_name):\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ntaking a list of files or modules and return a list of tuple containing the file and module name which have to be actually checked", "response": "def expand_modules(files_or_modules, black_list, black_list_re):\n \"\"\"take a list of files/modules/packages and return the list of tuple\n (file, module name) which have to be actually checked\n \"\"\"\n result = []\n errors = []\n for something in files_or_modules:\n if basename(something) 
in black_list:\n continue\n if _basename_in_blacklist_re(basename(something), black_list_re):\n continue\n if exists(something):\n # this is a file or a directory\n try:\n modname = \".\".join(modutils.modpath_from_file(something))\n except ImportError:\n modname = splitext(basename(something))[0]\n if isdir(something):\n filepath = join(something, \"__init__.py\")\n else:\n filepath = something\n else:\n # suppose it's a module or package\n modname = something\n try:\n filepath = modutils.file_from_modpath(modname.split(\".\"))\n if filepath is None:\n continue\n except (ImportError, SyntaxError) as ex:\n # FIXME p3k : the SyntaxError is a Python bug and should be\n # removed as soon as possible http://bugs.python.org/issue10588\n errors.append({\"key\": \"fatal\", \"mod\": modname, \"ex\": ex})\n continue\n\n filepath = normpath(filepath)\n modparts = (modname or something).split(\".\")\n\n try:\n spec = modutils.file_info_from_modpath(modparts, path=sys.path)\n except ImportError:\n # Might not be acceptable, don't crash.\n is_namespace = False\n is_directory = isdir(something)\n else:\n is_namespace = modutils.is_namespace(spec)\n is_directory = modutils.is_directory(spec)\n\n if not is_namespace:\n result.append(\n {\n \"path\": filepath,\n \"name\": modname,\n \"isarg\": True,\n \"basepath\": filepath,\n \"basename\": modname,\n }\n )\n\n has_init = (\n not (modname.endswith(\".__init__\") or modname == \"__init__\")\n and basename(filepath) == \"__init__.py\"\n )\n\n if has_init or is_namespace or is_directory:\n for subfilepath in modutils.get_module_files(\n dirname(filepath), black_list, list_all=is_namespace\n ):\n if filepath == subfilepath:\n continue\n if _basename_in_blacklist_re(basename(subfilepath), black_list_re):\n continue\n\n modpath = _modpath_from_file(subfilepath, is_namespace)\n submodname = \".\".join(modpath)\n result.append(\n {\n \"path\": subfilepath,\n \"name\": submodname,\n \"isarg\": False,\n \"basepath\": filepath,\n 
\"basename\": modname,\n }\n )\n return result, errors"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef register_plugins(linter, directory):\n imported = {}\n for filename in listdir(directory):\n base, extension = splitext(filename)\n if base in imported or base == \"__pycache__\":\n continue\n if (\n extension in PY_EXTS\n and base != \"__init__\"\n or (not extension and isdir(join(directory, base)))\n ):\n try:\n module = modutils.load_module_from_file(join(directory, filename))\n except ValueError:\n # empty module name (usually emacs auto-save files)\n continue\n except ImportError as exc:\n print(\n \"Problem importing module %s: %s\" % (filename, exc), file=sys.stderr\n )\n else:\n if hasattr(module, \"register\"):\n module.register(linter)\n imported[base] = 1", "response": "load all modules and packages in the given directory and register plugins"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nretrieving an option defined by a given checker.", "response": "def get_global_option(checker, option, default=None):\n \"\"\" Retrieve an option defined by the given *checker* or\n by all known option providers.\n\n It will look in the list of all options providers\n until the given *option* will be found.\n If the option wasn't found, the *default* value will be returned.\n \"\"\"\n # First, try in the given checker's config.\n # After that, look in the options providers.\n\n try:\n return getattr(checker.config, option.replace(\"-\", \"_\"))\n except AttributeError:\n pass\n for provider in checker.linter.options_providers:\n for options in provider.options:\n if options[0] == option:\n return getattr(provider.config, option.replace(\"-\", \"_\"))\n return default"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsplitting the string given as argument on sep by default. 
Empty strings are discarded.", "response": "def _splitstrip(string, sep=\",\"):\n \"\"\"return a list of stripped strings by splitting the string given as\n argument on `sep` (',' by default). Empty strings are discarded.\n\n >>> _splitstrip('a, b, c , 4,,')\n ['a', 'b', 'c', '4']\n >>> _splitstrip('a')\n ['a']\n >>> _splitstrip('a,\\nb,\\nc,')\n ['a', 'b', 'c']\n\n :type string: str or unicode\n :param string: a csv line\n\n :type sep: str or unicode\n :param sep: field separator, default to the comma (',')\n\n :rtype: list of str or unicode\n :return: the list of stripped substrings\n \"\"\"\n return [word.strip() for word in string.split(sep) if word.strip()]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nremove optional quotes from the string", "response": "def _unquote(string):\n \"\"\"remove optional quotes (simple or double) from the string\n\n :type string: str or unicode\n :param string: an optionally quoted string\n\n :rtype: str or unicode\n :return: the unquoted string (or the input string if it wasn't quoted)\n \"\"\"\n if not string:\n return string\n if string[0] in \"\\\"'\":\n string = string[1:]\n if string[-1] in \"\\\"'\":\n string = string[:-1]\n return string"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _comment(string):\n lines = [line.strip() for line in string.splitlines()]\n return \"# \" + (\"%s# \" % linesep).join(lines)", "response": "return string as a comment"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning the user input's value from a compiled value", "response": "def _format_option_value(optdict, value):\n \"\"\"return the user input's value from a 'compiled' value\"\"\"\n if isinstance(value, (list, tuple)):\n value = \",\".join(_format_option_value(optdict, item) for item in value)\n elif isinstance(value, dict):\n value = \",\".join(\"%s:%s\" % (k, v) for k, v in 
value.items())\n elif hasattr(value, \"match\"): # optdict.get('type') == 'regexp'\n # compiled regexp\n value = value.pattern\n elif optdict.get(\"type\") == \"yn\":\n value = \"yes\" if value else \"no\"\n elif isinstance(value, str) and value.isspace():\n value = \"'%s'\" % value\n return value"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef format_section(stream, section, options, doc=None):\n if doc:\n print(_comment(doc), file=stream)\n print(\"[%s]\" % section, file=stream)\n _ini_format(stream, options)", "response": "format an options section using the INI format"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nformatting options using the INI format", "response": "def _ini_format(stream, options):\n \"\"\"format options using the INI format\"\"\"\n for optname, optdict, value in options:\n value = _format_option_value(optdict, value)\n help_opt = optdict.get(\"help\")\n if help_opt:\n help_opt = normalize_text(help_opt, line_len=79, indent=\"# \")\n print(file=stream)\n print(help_opt, file=stream)\n else:\n print(file=stream)\n if value is None:\n print(\"#%s=\" % optname, file=stream)\n else:\n value = str(value).strip()\n if re.match(r\"^([\\w-]+,)+[\\w-]+$\", str(value)):\n separator = \"\\n \" + \" \" * len(optname)\n value = separator.join(x + \",\" for x in str(value).split(\",\"))\n # remove trailing ',' from last element of the list\n value = value[:-1]\n print(\"%s=%s\" % (optname, value), file=stream)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ninserts a child node at the given index", "response": "def insert(self, index, child):\n \"\"\"insert a child node\"\"\"\n self.children.insert(index, child)\n child.parent = self"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _get_visit_name(self):\n try:\n # pylint: disable=no-member\n return 
self.TYPE.replace(\"-\", \"_\")\n # pylint: disable=broad-except\n except Exception:\n return self.__class__.__name__.lower()", "response": "Returns the visit name for the mixed class."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef append(self, child):\n assert child not in self.parents()\n VNode.append(self, child)", "response": "appends a child node to the end of the tree"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the ancestor nodes", "response": "def parents(self):\n \"\"\"return the ancestor nodes\"\"\"\n assert self.parent is not self\n if self.parent is None:\n return []\n return [self.parent] + self.parent.parents()"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nformat and write the given layout into the given stream", "response": "def format(self, layout, stream=None, encoding=None):\n \"\"\"format and write the given layout into the stream object\n\n unicode policy: unicode strings may be found in the layout;\n try to call stream.write with it, but give it back encoded using\n the given encoding if it fails\n \"\"\"\n if stream is None:\n stream = sys.stdout\n if not encoding:\n encoding = getattr(stream, \"encoding\", \"UTF-8\")\n self.encoding = encoding or \"UTF-8\"\n self.out = stream\n self.begin_format()\n layout.accept(self)\n self.end_format()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_table_content(self, table):\n result = [[]]\n cols = table.cols\n for cell in self.compute_content(table):\n if cols == 0:\n result.append([])\n cols = table.cols\n cols -= 1\n result[-1].append(cell)\n # fill missing cells\n while len(result[-1]) < cols:\n result[-1].append(\"\")\n return result", "response": "trick to get table content without actually writing it\n return an aligned list of lists containing table cells values as string\n "} {"SOURCE": 
"codesearchnet", "instruction": "Write a Python 3 script for\ntricking to compute the formatting of children layout before actually writing it return an iterator on strings", "response": "def compute_content(self, layout):\n \"\"\"trick to compute the formatting of children layout before actually\n writing it\n\n return an iterator on strings (one for each child element)\n \"\"\"\n # Patch the underlying output stream with a fresh-generated stream,\n # which is used to store a temporary representation of a child\n # node.\n out = self.out\n try:\n for child in layout.children:\n stream = StringIO()\n self.out = stream\n child.accept(self)\n yield stream.getvalue()\n finally:\n self.out = out"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nwalks the AST to collect block level options line numbers.", "response": "def collect_block_lines(self, msgs_store, module_node):\n \"\"\"Walk the AST to collect block level options line numbers.\"\"\"\n for msg, lines in self._module_msgs_state.items():\n self._raw_module_msgs_state[msg] = lines.copy()\n orig_state = self._module_msgs_state.copy()\n self._module_msgs_state = {}\n self._suppression_mapping = {}\n self._effective_max_line_number = module_node.tolineno\n self._collect_block_lines(msgs_store, module_node, orig_state)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsets status for a given message at a given line", "response": "def set_msg_status(self, msg, line, status):\n \"\"\"Set status (enabled/disable) for a given message at a given line\"\"\"\n assert line > 0\n try:\n self._module_msgs_state[msg.msgid][line] = status\n except KeyError:\n self._module_msgs_state[msg.msgid] = {line: status}"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef handle_ignored_message(\n self, state_scope, msgid, line, node, args, confidence\n ): # pylint: 
disable=unused-argument\n \"\"\"Report an ignored message.\n\n state_scope is either MSG_STATE_SCOPE_MODULE or MSG_STATE_SCOPE_CONFIG,\n depending on whether the message was disabled locally in the module,\n or globally. The other arguments are the same as for add_message.\n \"\"\"\n if state_scope == MSG_STATE_SCOPE_MODULE:\n try:\n orig_line = self._suppression_mapping[(msgid, line)]\n self._ignored_msgs[(msgid, orig_line)].add(line)\n except KeyError:\n pass", "response": "Report an ignored message."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nregistering a report with the checker", "response": "def register_report(self, reportid, r_title, r_cb, checker):\n \"\"\"register a report\n\n reportid is the unique identifier for the report\n r_title the report's title\n r_cb the method to call to make the report\n checker is the checker defining the report\n \"\"\"\n reportid = reportid.upper()\n self._reports[checker].append((reportid, r_title, r_cb))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef enable_report(self, reportid):\n reportid = reportid.upper()\n self._reports_state[reportid] = True", "response": "enable the report with the given id"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef disable_report(self, reportid):\n reportid = reportid.upper()\n self._reports_state[reportid] = False", "response": "disable the report with the given id"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nadding some stats entries to the statistic dictionary", "response": "def add_stats(self, **kwargs):\n \"\"\"add some stats entries to the statistic dictionary\n raise an AssertionError if there is a key conflict\n \"\"\"\n for key, value in kwargs.items():\n if key[-1] == \"_\":\n key = key[:-1]\n assert key not in self.stats\n self.stats[key] = value\n return self.stats"} {"SOURCE": 
"codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the name of the property that the given node is a setter for.", "response": "def get_setters_property_name(node):\n \"\"\"Get the name of the property that the given node is a setter for.\n\n :param node: The node to get the property name for.\n :type node: str\n\n :rtype: str or None\n :returns: The name of the property that the node is a setter for,\n or None if one could not be found.\n \"\"\"\n decorators = node.decorators.nodes if node.decorators else []\n for decorator in decorators:\n if (\n isinstance(decorator, astroid.Attribute)\n and decorator.attrname == \"setter\"\n and isinstance(decorator.expr, astroid.Name)\n ):\n return decorator.expr.name\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget the property node for the given node.", "response": "def get_setters_property(node):\n \"\"\"Get the property node for the given setter node.\n\n :param node: The node to get the property for.\n :type node: astroid.FunctionDef\n\n :rtype: astroid.FunctionDef or None\n :returns: The node relating to the property of the given setter node,\n or None if one could not be found.\n \"\"\"\n property_ = None\n\n property_name = get_setters_property_name(node)\n class_node = utils.node_frame_class(node)\n if property_name and class_node:\n class_attrs = class_node.getattr(node.name)\n for attr in class_attrs:\n if utils.decorated_with_property(attr):\n property_ = attr\n break\n\n return property_"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef returns_something(return_node):\n returns = return_node.value\n\n if returns is None:\n return False\n\n return not (isinstance(returns, astroid.Const) and returns.value is None)", "response": "Check if a return node returns something other than None."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef 
possible_exc_types(node):\n excs = []\n if isinstance(node.exc, astroid.Name):\n inferred = utils.safe_infer(node.exc)\n if inferred:\n excs = [inferred.name]\n elif node.exc is None:\n handler = node.parent\n while handler and not isinstance(handler, astroid.ExceptHandler):\n handler = handler.parent\n\n if handler and handler.type:\n inferred_excs = astroid.unpack_infer(handler.type)\n excs = (exc.name for exc in inferred_excs if exc is not astroid.Uninferable)\n else:\n target = _get_raise_target(node)\n if isinstance(target, astroid.ClassDef):\n excs = [target.name]\n elif isinstance(target, astroid.FunctionDef):\n for ret in target.nodes_of_class(astroid.Return):\n if ret.frame() != target:\n # return from inner function - ignore it\n continue\n\n val = utils.safe_infer(ret.value)\n if (\n val\n and isinstance(val, (astroid.Instance, astroid.ClassDef))\n and utils.inherit_from_std_ex(val)\n ):\n excs.append(val.name)\n\n try:\n return {exc for exc in excs if not utils.node_ignores_exception(node, exc)}\n except astroid.InferenceError:\n return set()", "response": "Returns a list of possible exception types for the given raise node."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef register(linter):\n linter.register_checker(EncodingChecker(linter))\n linter.register_checker(ByIdManagedMessagesChecker(linter))", "response": "required method to auto register this checker"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ninspects the source file to find messages activated or deactivated by id.", "response": "def process_module(self, module):\n \"\"\"inspect the source file to find messages activated or deactivated by id.\"\"\"\n managed_msgs = MessagesHandlerMixIn.get_by_id_managed_msgs()\n for (mod_name, msg_id, msg_symbol, lineno, is_disabled) in managed_msgs:\n if mod_name == module.name:\n if is_disabled:\n txt = \"Id '{ident}' is used to disable '{symbol}' message 
emission\".format(\n ident=msg_id, symbol=msg_symbol\n )\n else:\n txt = \"Id '{ident}' is used to enable '{symbol}' message emission\".format(\n ident=msg_id, symbol=msg_symbol\n )\n self.add_message(\"use-symbolic-message-instead\", line=lineno, args=txt)\n MessagesHandlerMixIn.clear_by_id_managed_msgs()"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ninspect the source file to find encoding problem", "response": "def process_module(self, module):\n \"\"\"inspect the source file to find encoding problem\"\"\"\n if module.file_encoding:\n encoding = module.file_encoding\n else:\n encoding = \"ascii\"\n\n with module.stream() as stream:\n for lineno, line in enumerate(stream):\n self._check_encoding(lineno + 1, line, encoding)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ninspects the source to find fixme problems", "response": "def process_tokens(self, tokens):\n \"\"\"inspect the source to find fixme problems\"\"\"\n if not self.config.notes:\n return\n comments = (\n token_info for token_info in tokens if token_info.type == tokenize.COMMENT\n )\n for comment in comments:\n comment_text = comment.string[1:].lstrip() # trim '#' and whitespaces\n\n # handle pylint disable clauses\n disable_option_match = OPTION_RGX.search(comment_text)\n if disable_option_match:\n try:\n _, value = disable_option_match.group(1).split(\"=\", 1)\n values = [_val.strip().upper() for _val in value.split(\",\")]\n if set(values) & set(self.config.notes):\n continue\n except ValueError:\n self.add_message(\n \"bad-inline-option\",\n args=disable_option_match.group(1).strip(),\n line=comment.string,\n )\n continue\n\n # emit warnings if necessary\n match = self._fixme_pattern.search(\"#\" + comment_text.lower())\n if match:\n note = match.group(1)\n self.add_message(\n \"fixme\",\n col_offset=comment.string.lower().index(note.lower()),\n args=comment_text,\n line=comment.start[0],\n )"} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _is_from_future_import(stmt, name):\n try:\n module = stmt.do_import_module(stmt.modname)\n except astroid.AstroidBuildingException:\n return None\n\n for local_node in module.locals.get(name, []):\n if isinstance(local_node, astroid.ImportFrom) and local_node.modname == FUTURE:\n return True\n return None", "response": "Check if the name is a future import from another module."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef in_for_else_branch(parent, stmt):\n return isinstance(parent, astroid.For) and any(\n else_stmt.parent_of(stmt) or else_stmt == stmt for else_stmt in parent.orelse\n )", "response": "Returns True if stmt in inside the else branch for a parent For stmt."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef overridden_method(klass, name):\n try:\n parent = next(klass.local_attr_ancestors(name))\n except (StopIteration, KeyError):\n return None\n try:\n meth_node = parent[name]\n except KeyError:\n # We have found an ancestor defining but it's not in the local\n # dictionary. 
This may happen with astroid built from living objects.\n return None\n if isinstance(meth_node, astroid.FunctionDef):\n return meth_node\n return None", "response": "get overridden method if any"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns extra information to add to the message for unpacking - non - sequence and unbalanced - tuple - unpacking errors", "response": "def _get_unpacking_extra_info(node, infered):\n \"\"\"return extra information to add to the message for unpacking-non-sequence\n and unbalanced-tuple-unpacking errors\n \"\"\"\n more = \"\"\n infered_module = infered.root().name\n if node.root().name == infered_module:\n if node.lineno == infered.lineno:\n more = \" %s\" % infered.as_string()\n elif infered.lineno:\n more = \" defined at line %s\" % infered.lineno\n elif infered.lineno:\n more = \" defined at line %s of %s\" % (infered.lineno, infered_module)\n return more"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _detect_global_scope(node, frame, defframe):\n def_scope = scope = None\n if frame and frame.parent:\n scope = frame.parent.scope()\n if defframe and defframe.parent:\n def_scope = defframe.parent.scope()\n if isinstance(frame, astroid.FunctionDef):\n # If the parent of the current node is a\n # function, then it can be under its scope\n # (defined in, which doesn't concern us) or\n # the `->` part of annotations. The same goes\n # for annotations of function arguments, they'll have\n # their parent the Arguments node.\n if not isinstance(node.parent, (astroid.FunctionDef, astroid.Arguments)):\n return False\n elif any(\n not isinstance(f, (astroid.ClassDef, astroid.Module)) for f in (frame, defframe)\n ):\n # Not interested in other frames, since they are already\n # not in a global scope.\n return False\n\n break_scopes = []\n for s in (scope, def_scope):\n # Look for parent scopes. 
If there is anything different\n # than a module or a class scope, then they frames don't\n # share a global scope.\n parent_scope = s\n while parent_scope:\n if not isinstance(parent_scope, (astroid.ClassDef, astroid.Module)):\n break_scopes.append(parent_scope)\n break\n if parent_scope.parent:\n parent_scope = parent_scope.parent.scope()\n else:\n break\n if break_scopes and len(set(break_scopes)) != 1:\n # Store different scopes than expected.\n # If the stored scopes are, in fact, the very same, then it means\n # that the two frames (frame and defframe) shares the same scope,\n # and we could apply our lineno analysis over them.\n # For instance, this works when they are inside a function, the node\n # that uses a definition and the definition itself.\n return False\n # At this point, we are certain that frame and defframe shares a scope\n # and the definition of the first depends on the second.\n return frame.lineno < defframe.lineno", "response": "Detects if the given frames share a global scope."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ntrying to fix import with multiple dots.", "response": "def _fix_dot_imports(not_consumed):\n \"\"\" Try to fix imports with multiple dots, by returning a dictionary\n with the import names expanded. 
The function unflattens root imports,\n like 'xml' (when we have both 'xml.etree' and 'xml.sax'), to 'xml.etree'\n and 'xml.sax' respectively.\n \"\"\"\n # TODO: this should be improved in issue astroid #46\n names = {}\n for name, stmts in not_consumed.items():\n if any(\n isinstance(stmt, astroid.AssignName)\n and isinstance(stmt.assign_type(), astroid.AugAssign)\n for stmt in stmts\n ):\n continue\n for stmt in stmts:\n if not isinstance(stmt, (astroid.ImportFrom, astroid.Import)):\n continue\n for imports in stmt.names:\n second_name = None\n import_module_name = imports[0]\n if import_module_name == \"*\":\n # In case of wildcard imports,\n # pick the name from inside the imported module.\n second_name = name\n else:\n name_matches_dotted_import = False\n if (\n import_module_name.startswith(name)\n and import_module_name.find(\".\") > -1\n ):\n name_matches_dotted_import = True\n\n if name_matches_dotted_import or name in imports:\n # Most likely something like 'xml.etree',\n # which will appear in the .locals as 'xml'.\n # Only pick the name if it wasn't consumed.\n second_name = import_module_name\n if second_name and second_name not in names:\n names[second_name] = stmt\n return sorted(names.items(), key=lambda a: a[1].fromlineno)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _find_frame_imports(name, frame):\n imports = frame.nodes_of_class((astroid.Import, astroid.ImportFrom))\n for import_node in imports:\n for import_name, import_alias in import_node.names:\n # If the import uses an alias, check only that.\n # Otherwise, check only the import name.\n if import_alias:\n if import_alias == name:\n return True\n elif import_name and import_name == name:\n return True\n return None", "response": "Find the frame with the required\n name."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking if name_node has corresponding assign statement in same scope as 
name_node.", "response": "def _assigned_locally(name_node):\n \"\"\"\n Checks if name_node has corresponding assign statement in same scope\n \"\"\"\n assign_stmts = name_node.scope().nodes_of_class(astroid.AssignName)\n return any(a.name == name_node.name for a in assign_stmts)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef mark_as_consumed(self, name, new_node):\n self.consumed[name] = new_node\n del self.to_consume[name]", "response": "Mark the name as consumed and delete it from the to_consume dictionary"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef visit_module(self, node):\n self._to_consume = [NamesConsumer(node, \"module\")]\n self._postponed_evaluation_enabled = is_postponed_evaluation_enabled(node)\n\n for name, stmts in node.locals.items():\n if utils.is_builtin(name) and not utils.is_inside_except(stmts[0]):\n if self._should_ignore_redefined_builtin(stmts[0]) or name == \"__doc__\":\n continue\n self.add_message(\"redefined-builtin\", args=name, node=stmts[0])", "response": "update consumption analysis variable\n checks globals doesn t overrides builtins\n "} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef leave_module(self, node):\n assert len(self._to_consume) == 1\n not_consumed = self._to_consume.pop().to_consume\n # attempt to check for __all__ if defined\n if \"__all__\" in node.locals:\n self._check_all(node, not_consumed)\n\n # check for unused globals\n self._check_globals(not_consumed)\n\n # don't check unused imports in __init__ files\n if not self.config.init_import and node.package:\n return\n\n self._check_imports(not_consumed)", "response": "leave module check globals\n check imports\n "} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef visit_functiondef(self, node):\n 
self._to_consume.append(NamesConsumer(node, \"function\"))\n if not (\n self.linter.is_message_enabled(\"redefined-outer-name\")\n or self.linter.is_message_enabled(\"redefined-builtin\")\n ):\n return\n globs = node.root().globals\n for name, stmt in node.items():\n if utils.is_inside_except(stmt):\n continue\n if name in globs and not isinstance(stmt, astroid.Global):\n definition = globs[name][0]\n if (\n isinstance(definition, astroid.ImportFrom)\n and definition.modname == FUTURE\n ):\n # It is a __future__ directive, not a symbol.\n continue\n\n line = definition.fromlineno\n if not self._is_name_ignored(stmt, name):\n self.add_message(\n \"redefined-outer-name\", args=(name, line), node=stmt\n )\n\n elif utils.is_builtin(name) and not self._should_ignore_redefined_builtin(\n stmt\n ):\n # do not print Redefining builtin for additional builtins\n self.add_message(\"redefined-builtin\", args=name, node=stmt)", "response": "update consumption analysis variable and check locals\n "} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks if function is not used.", "response": "def leave_functiondef(self, node):\n \"\"\"leave function: check function's locals are consumed\"\"\"\n if node.type_comment_returns:\n self._store_type_annotation_node(node.type_comment_returns)\n if node.type_comment_args:\n for argument_annotation in node.type_comment_args:\n self._store_type_annotation_node(argument_annotation)\n\n not_consumed = self._to_consume.pop().to_consume\n if not (\n self.linter.is_message_enabled(\"unused-variable\")\n or self.linter.is_message_enabled(\"possibly-unused-variable\")\n or self.linter.is_message_enabled(\"unused-argument\")\n ):\n return\n\n # Don't check arguments of function which are only raising an exception.\n if utils.is_error(node):\n return\n\n # Don't check arguments of abstract methods or within an interface.\n is_method = node.is_method()\n if is_method and node.is_abstract():\n return\n\n 
global_names = _flattened_scope_names(node.nodes_of_class(astroid.Global))\n nonlocal_names = _flattened_scope_names(node.nodes_of_class(astroid.Nonlocal))\n for name, stmts in not_consumed.items():\n self._check_is_unused(name, node, stmts[0], global_names, nonlocal_names)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_global(self, node):\n frame = node.frame()\n if isinstance(frame, astroid.Module):\n self.add_message(\"global-at-module-level\", node=node)\n return\n\n module = frame.root()\n default_message = True\n locals_ = node.scope().locals\n for name in node.names:\n try:\n assign_nodes = module.getattr(name)\n except astroid.NotFoundError:\n # unassigned global, skip\n assign_nodes = []\n\n not_defined_locally_by_import = not any(\n isinstance(local, astroid.node_classes.Import)\n for local in locals_.get(name, ())\n )\n if not assign_nodes and not_defined_locally_by_import:\n self.add_message(\"global-variable-not-assigned\", args=name, node=node)\n default_message = False\n continue\n\n for anode in assign_nodes:\n if (\n isinstance(anode, astroid.AssignName)\n and anode.name in module.special_attributes\n ):\n self.add_message(\"redefined-builtin\", args=name, node=node)\n break\n if anode.frame() is module:\n # module level assignment\n break\n else:\n if not_defined_locally_by_import:\n # global undefined at the module scope\n self.add_message(\"global-variable-undefined\", args=name, node=node)\n default_message = False\n\n if default_message:\n self.add_message(\"global-statement\", node=node)", "response": "check names imported exists in the global scope"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _ignore_class_scope(self, node):\n # Detect if we are in a local class scope, as an assignment.\n # For example, the following is fair game.\n #\n # class A:\n # b = 1\n # c = lambda b=b: b * b\n #\n # class B:\n # tp = 1\n # def func(self, arg: 
tp):\n # ...\n # class C:\n # tp = 2\n # def func(self, arg=tp):\n # ...\n\n name = node.name\n frame = node.statement().scope()\n in_annotation_or_default = self._defined_in_function_definition(node, frame)\n if in_annotation_or_default:\n frame_locals = frame.parent.scope().locals\n else:\n frame_locals = frame.locals\n return not (\n (isinstance(frame, astroid.ClassDef) or in_annotation_or_default)\n and name in frame_locals\n )", "response": "Return True if the node is in local class scope False otherwise."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking modules attribute accesses", "response": "def visit_import(self, node):\n \"\"\"check modules attribute accesses\"\"\"\n if not self._analyse_fallback_blocks and utils.is_from_fallback_block(node):\n # No need to verify this, since ImportError is already\n # handled by the client code.\n return\n\n for name, _ in node.names:\n parts = name.split(\".\")\n try:\n module = next(_infer_name_module(node, parts[0]))\n except astroid.ResolveError:\n continue\n self._check_module_attrs(node, module, parts[1:])"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks modules attribute accesses", "response": "def visit_importfrom(self, node):\n \"\"\"check modules attribute accesses\"\"\"\n if not self._analyse_fallback_blocks and utils.is_from_fallback_block(node):\n # No need to verify this, since ImportError is already\n # handled by the client code.\n return\n\n name_parts = node.modname.split(\".\")\n try:\n module = node.do_import_module(name_parts[0])\n except astroid.AstroidBuildingException:\n return\n module = self._check_module_attrs(node, module, name_parts[1:])\n if not module:\n return\n for name, _ in node.names:\n if name == \"*\":\n continue\n self._check_module_attrs(node, module, name.split(\".\"))"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks unbalanced tuple unpacking 
for assignments and unpacking non - sequences as well as in case self / cls get assigned.", "response": "def visit_assign(self, node):\n \"\"\"Check unbalanced tuple unpacking for assignments\n and unpacking non-sequences as well as in case self/cls\n get assigned.\n \"\"\"\n self._check_self_cls_assign(node)\n if not isinstance(node.targets[0], (astroid.Tuple, astroid.List)):\n return\n\n targets = node.targets[0].itered()\n try:\n infered = utils.safe_infer(node.value)\n if infered is not None:\n self._check_unpacking(infered, node, targets)\n except astroid.InferenceError:\n return"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking that self and cls don t get assigned", "response": "def _check_self_cls_assign(self, node):\n \"\"\"Check that self/cls don't get assigned\"\"\"\n assign_names = {\n target.name\n for target in node.targets\n if isinstance(target, astroid.AssignName)\n }\n scope = node.scope()\n nonlocals_with_same_name = any(\n child\n for child in scope.body\n if isinstance(child, astroid.Nonlocal) and assign_names & set(child.names)\n )\n if nonlocals_with_same_name:\n scope = node.scope().parent.scope()\n\n if not (\n isinstance(scope, astroid.scoped_nodes.FunctionDef)\n and scope.is_method()\n and \"builtins.staticmethod\" not in scope.decoratornames()\n ):\n return\n argument_names = scope.argnames()\n if not argument_names:\n return\n self_cls_name = argument_names[0]\n target_assign_names = (\n target.name\n for target in node.targets\n if isinstance(target, astroid.node_classes.AssignName)\n )\n if self_cls_name in target_assign_names:\n self.add_message(\"self-cls-assignment\", node=node, args=(self_cls_name))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _check_unpacking(self, infered, node, targets):\n if utils.is_inside_abstract_class(node):\n return\n if utils.is_comprehension(node):\n return\n if infered is astroid.Uninferable:\n 
return\n if (\n isinstance(infered.parent, astroid.Arguments)\n and isinstance(node.value, astroid.Name)\n and node.value.name == infered.parent.vararg\n ):\n # Variable-length argument, we can't determine the length.\n return\n if isinstance(infered, (astroid.Tuple, astroid.List)):\n # attempt to check unpacking is properly balanced\n values = infered.itered()\n if len(targets) != len(values):\n # Check if we have starred nodes.\n if any(isinstance(target, astroid.Starred) for target in targets):\n return\n self.add_message(\n \"unbalanced-tuple-unpacking\",\n node=node,\n args=(\n _get_unpacking_extra_info(node, infered),\n len(targets),\n len(values),\n ),\n )\n # attempt to check unpacking may be possible (ie RHS is iterable)\n else:\n if not utils.is_iterable(infered):\n self.add_message(\n \"unpacking-non-sequence\",\n node=node,\n args=(_get_unpacking_extra_info(node, infered),),\n )", "response": "Checks if the node is unbalanced and unpacking non sequences."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _check_module_attrs(self, node, module, module_names):\n assert isinstance(module, astroid.Module), module\n while module_names:\n name = module_names.pop(0)\n if name == \"__dict__\":\n module = None\n break\n try:\n module = next(module.getattr(name)[0].infer())\n if module is astroid.Uninferable:\n return None\n except astroid.NotFoundError:\n if module.name in self._ignored_modules:\n return None\n self.add_message(\n \"no-name-in-module\", args=(name, module.name), node=node\n )\n return None\n except astroid.InferenceError:\n return None\n if module_names:\n # FIXME: other message if name is not the latest part of\n # module_names ?\n modname = module.name if module else \"__dict__\"\n self.add_message(\n \"no-name-in-module\", node=node, args=(\".\".join(module_names), modname)\n )\n return None\n if isinstance(module, astroid.Module):\n return module\n return None", "response": "check that the 
module_names are accessible through the module"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _check_metaclasses(self, node):\n consumed = [] # [(scope_locals, consumed_key)]\n\n for child_node in node.get_children():\n if isinstance(child_node, astroid.ClassDef):\n consumed.extend(self._check_classdef_metaclasses(child_node, node))\n\n # Pop the consumed items, in order to avoid having\n # unused-import and unused-variable false positives\n for scope_locals, name in consumed:\n scope_locals.pop(name, None)", "response": "Update consumption analysis for metaclasses."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef table_lines_from_stats(stats, _, columns):\n lines = []\n for m_type in columns:\n new = stats[m_type]\n new = \"%.3f\" % new if isinstance(new, float) else str(new)\n lines += (m_type.replace(\"_\", \" \"), new, \"NC\", \"NC\")\n return lines", "response": "get values listed in columns from stats and old_stats and a\n and return a formated list of values"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef ensure_scripts(linux_scripts):\n from distutils import util\n\n if util.get_platform()[:3] == \"win\":\n return linux_scripts + [script + \".bat\" for script in linux_scripts]\n return linux_scripts", "response": "Creates the proper script names required for each platform\n "} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn a list of subpackages for the given directory", "response": "def get_packages(directory, prefix):\n \"\"\"return a list of subpackages for the given directory\"\"\"\n result = []\n for package in os.listdir(directory):\n absfile = join(directory, package)\n if isdir(absfile):\n if exists(join(absfile, \"__init__.py\")):\n if prefix:\n result.append(\"%s.%s\" % (prefix, package))\n else:\n result.append(package)\n result += 
get_packages(absfile, result[-1])\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef run(self):\n install_lib.install_lib.run(self)\n # manually install included directories if any\n if include_dirs:\n for directory in include_dirs:\n dest = join(self.install_dir, directory)\n if sys.version_info >= (3, 0):\n exclude = {\"invalid_encoded_data*\", \"unknown_encoding*\"}\n else:\n exclude = set()\n shutil.rmtree(dest, ignore_errors=True)\n shutil.copytree(\n directory, dest, ignore=shutil.ignore_patterns(*exclude)\n )", "response": "run method that installs any of the include directories"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef stripped_lines(lines, ignore_comments, ignore_docstrings, ignore_imports):\n if ignore_imports:\n tree = astroid.parse(\"\".join(lines))\n node_is_import_by_lineno = (\n (node.lineno, isinstance(node, (astroid.Import, astroid.ImportFrom)))\n for node in tree.body\n )\n line_begins_import = {\n lineno: all(is_import for _, is_import in node_is_import_group)\n for lineno, node_is_import_group in groupby(\n node_is_import_by_lineno, key=lambda x: x[0]\n )\n }\n current_line_is_import = False\n\n strippedlines = []\n docstring = None\n for lineno, line in enumerate(lines, start=1):\n line = line.strip()\n if ignore_docstrings:\n if not docstring and any(\n line.startswith(i) for i in ['\"\"\"', \"'''\", 'r\"\"\"', \"r'''\"]\n ):\n docstring = line[:3]\n line = line[3:]\n if docstring:\n if line.endswith(docstring):\n docstring = None\n line = \"\"\n if ignore_imports:\n current_line_is_import = line_begins_import.get(\n lineno, current_line_is_import\n )\n if current_line_is_import:\n line = \"\"\n if ignore_comments:\n # XXX should use regex in checkers/format to avoid cutting\n # at a \"#\" in a string\n line = line.split(\"#\", 1)[0].strip()\n strippedlines.append(line)\n return strippedlines", 
"response": "return lines with leading and trailing whitespace and any ignored code\n features removed\n "} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nmake a layout with some stats about duplication", "response": "def report_similarities(sect, stats, old_stats):\n \"\"\"make a layout with some stats about duplication\"\"\"\n lines = [\"\", \"now\", \"previous\", \"difference\"]\n lines += table_lines_from_stats(\n stats, old_stats, (\"nb_duplicated_lines\", \"percent_duplicated_lines\")\n )\n sect.append(Table(children=lines, cols=4, rheaders=1, cheaders=1))"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nruns the standalone command line access point.", "response": "def Run(argv=None):\n \"\"\"standalone command line access point\"\"\"\n if argv is None:\n argv = sys.argv[1:]\n from getopt import getopt\n\n s_opts = \"hdi\"\n l_opts = (\n \"help\",\n \"duplicates=\",\n \"ignore-comments\",\n \"ignore-imports\",\n \"ignore-docstrings\",\n )\n min_lines = 4\n ignore_comments = False\n ignore_docstrings = False\n ignore_imports = False\n opts, args = getopt(argv, s_opts, l_opts)\n for opt, val in opts:\n if opt in (\"-d\", \"--duplicates\"):\n min_lines = int(val)\n elif opt in (\"-h\", \"--help\"):\n usage()\n elif opt in (\"-i\", \"--ignore-comments\"):\n ignore_comments = True\n elif opt in (\"--ignore-docstrings\",):\n ignore_docstrings = True\n elif opt in (\"--ignore-imports\",):\n ignore_imports = True\n if not args:\n usage(1)\n sim = Similar(min_lines, ignore_comments, ignore_docstrings, ignore_imports)\n for filename in args:\n with open(filename) as stream:\n sim.append_stream(filename, stream)\n sim.run()\n sys.exit(0)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nappends a file to search for similarities", "response": "def append_stream(self, streamid, stream, encoding=None):\n \"\"\"append a file to search for similarities\"\"\"\n if 
encoding is None:\n readlines = stream.readlines\n else:\n readlines = decoding_stream(stream, encoding).readlines\n try:\n self.linesets.append(\n LineSet(\n streamid,\n readlines(),\n self.ignore_comments,\n self.ignore_docstrings,\n self.ignore_imports,\n )\n )\n except UnicodeDecodeError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncompute similarities in appended files", "response": "def _compute_sims(self):\n \"\"\"compute similarities in appended files\"\"\"\n no_duplicates = defaultdict(list)\n for num, lineset1, idx1, lineset2, idx2 in self._iter_sims():\n duplicate = no_duplicates[num]\n for couples in duplicate:\n if (lineset1, idx1) in couples or (lineset2, idx2) in couples:\n couples.add((lineset1, idx1))\n couples.add((lineset2, idx2))\n break\n else:\n duplicate.append({(lineset1, idx1), (lineset2, idx2)})\n sims = []\n for num, ensembles in no_duplicates.items():\n for couples in ensembles:\n sims.append((num, couples))\n sims.sort()\n sims.reverse()\n return sims"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _display_sims(self, sims):\n nb_lignes_dupliquees = 0\n for num, couples in sims:\n print()\n print(num, \"similar lines in\", len(couples), \"files\")\n couples = sorted(couples)\n for lineset, idx in couples:\n print(\"==%s:%s\" % (lineset.name, idx))\n # pylint: disable=W0631\n for line in lineset._real_lines[idx : idx + num]:\n print(\" \", line.rstrip())\n nb_lignes_dupliquees += num * (len(couples) - 1)\n nb_total_lignes = sum([len(lineset) for lineset in self.linesets])\n print(\n \"TOTAL lines=%s duplicates=%s percent=%.2f\"\n % (\n nb_total_lignes,\n nb_lignes_dupliquees,\n nb_lignes_dupliquees * 100.0 / nb_total_lignes,\n )\n )", "response": "display computed similarities on stdout"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _find_common(self, 
lineset1, lineset2):\n lines1 = lineset1.enumerate_stripped\n lines2 = lineset2.enumerate_stripped\n find = lineset2.find\n index1 = 0\n min_lines = self.min_lines\n while index1 < len(lineset1):\n skip = 1\n num = 0\n for index2 in find(lineset1[index1]):\n non_blank = 0\n for num, ((_, line1), (_, line2)) in enumerate(\n zip(lines1(index1), lines2(index2))\n ):\n if line1 != line2:\n if non_blank > min_lines:\n yield num, lineset1, index1, lineset2, index2\n skip = max(skip, num)\n break\n if line1:\n non_blank += 1\n else:\n # we may have reach the end\n num += 1\n if non_blank > min_lines:\n yield num, lineset1, index1, lineset2, index2\n skip = max(skip, num)\n index1 += skip", "response": "find similarities in the two given linesets"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\niterate on similarities among all files", "response": "def _iter_sims(self):\n \"\"\"iterate on similarities among all files, by making a cartesian\n product\n \"\"\"\n for idx, lineset in enumerate(self.linesets[:-1]):\n for lineset2 in self.linesets[idx + 1 :]:\n for sim in self._find_common(lineset, lineset2):\n yield sim"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns an iterator on stripped lines starting from a given index. 
start_at is the index of the first line that starts from 0.", "response": "def enumerate_stripped(self, start_at=0):\n \"\"\"return an iterator on stripped lines, starting from a given index\n if specified, else 0\n \"\"\"\n idx = start_at\n if start_at:\n lines = self._stripped_lines[start_at:]\n else:\n lines = self._stripped_lines\n for line in lines:\n # if line:\n yield idx, line\n idx += 1"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates the index for this set", "response": "def _mk_index(self):\n \"\"\"create the index for this set\"\"\"\n index = defaultdict(list)\n for line_no, line in enumerate(self._stripped_lines):\n if line:\n index[line].append(line_no)\n return index"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef set_option(self, optname, value, action=None, optdict=None):\n BaseChecker.set_option(self, optname, value, action, optdict)\n if optname == \"min-similarity-lines\":\n self.min_lines = self.config.min_similarity_lines\n elif optname == \"ignore-comments\":\n self.ignore_comments = self.config.ignore_comments\n elif optname == \"ignore-docstrings\":\n self.ignore_docstrings = self.config.ignore_docstrings\n elif optname == \"ignore-imports\":\n self.ignore_imports = self.config.ignore_imports", "response": "set an option in the configuration"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef process_module(self, node):\n with node.stream() as stream:\n self.append_stream(self.linter.current_name, stream, node.file_encoding)", "response": "process a module by appending the stream to the current name"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncomputes and display similarities on closing ( i. e. end of parsing", "response": "def close(self):\n \"\"\"compute and display similarities on closing (i.e. 
end of parsing)\"\"\"\n total = sum(len(lineset) for lineset in self.linesets)\n duplicated = 0\n stats = self.stats\n for num, couples in self._compute_sims():\n msg = []\n for lineset, idx in couples:\n msg.append(\"==%s:%s\" % (lineset.name, idx))\n msg.sort()\n # pylint: disable=W0631\n for line in lineset._real_lines[idx : idx + num]:\n msg.append(line.rstrip())\n self.add_message(\"R0801\", args=(len(couples), \"\\n\".join(msg)))\n duplicated += num * (len(couples) - 1)\n stats[\"nb_duplicated_lines\"] = duplicated\n stats[\"percent_duplicated_lines\"] = total and duplicated * 100.0 / total"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if a definition signature is equivalent to a call.", "response": "def _definition_equivalent_to_call(definition, call):\n \"\"\"Check if a definition signature is equivalent to a call.\"\"\"\n if definition.kwargs:\n same_kw_variadics = definition.kwargs in call.starred_kws\n else:\n same_kw_variadics = not call.starred_kws\n if definition.varargs:\n same_args_variadics = definition.varargs in call.starred_args\n else:\n same_args_variadics = not call.starred_args\n same_kwonlyargs = all(kw in call.kws for kw in definition.kwonlyargs)\n same_args = definition.args == call.args\n\n no_additional_kwarg_arguments = True\n if call.kws:\n for keyword in call.kws:\n is_arg = keyword in call.args\n is_kwonly = keyword in definition.kwonlyargs\n if not is_arg and not is_kwonly:\n # Maybe this argument goes into **kwargs,\n # or it is an extraneous argument.\n # In any case, the signature is different than\n # the call site, which stops our search.\n no_additional_kwarg_arguments = False\n break\n\n return all(\n (\n same_args,\n same_kwonlyargs,\n same_args_variadics,\n same_kw_variadics,\n no_additional_kwarg_arguments,\n )\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if the attributes of node_a and node_b are equal.", 
"response": "def _check_arg_equality(node_a, node_b, attr_name):\n \"\"\"\n Check equality of nodes based on the comparison of their attributes named attr_name.\n\n Args:\n node_a (astroid.node): first node to compare.\n node_b (astroid.node): second node to compare.\n attr_name (str): name of the nodes attribute to use for comparison.\n\n Returns:\n bool: True if node_a.attr_name == node_b.attr_name, False otherwise.\n \"\"\"\n return getattr(node_a, attr_name) == getattr(node_b, attr_name)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _has_different_parameters_default_value(original, overridden):\n if original.args is None or overridden.args is None:\n return False\n\n all_args = chain(original.args, original.kwonlyargs)\n original_param_names = [param.name for param in all_args]\n default_missing = object()\n for param_name in original_param_names:\n try:\n original_default = original.default_value(param_name)\n except astroid.exceptions.NoDefault:\n original_default = default_missing\n try:\n overridden_default = overridden.default_value(param_name)\n except astroid.exceptions.NoDefault:\n overridden_default = default_missing\n\n default_list = [\n arg == default_missing for arg in (original_default, overridden_default)\n ]\n if any(default_list) and not all(default_list):\n # Only one arg has no default value\n return True\n\n astroid_type_compared_attr = {\n astroid.Const: \"value\",\n astroid.ClassDef: \"name\",\n astroid.Tuple: \"elts\",\n astroid.List: \"elts\",\n }\n handled_types = tuple(\n astroid_type for astroid_type in astroid_type_compared_attr\n )\n original_type = _get_node_type(original_default, handled_types)\n if original_type:\n # \u00a0We handle only astroid types that are inside the dict astroid_type_compared_attr\n if not isinstance(overridden_default, original_type):\n # \u00a0Two args with same name but different types\n return True\n if not _check_arg_equality(\n 
original_default,\n overridden_default,\n astroid_type_compared_attr[original_type],\n ):\n # Two args with same type but different values\n return True\n return False", "response": "Check if one of the original and overridden methods arguments have different default values"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ndetermining if two methods have different parameters.", "response": "def _different_parameters(original, overridden, dummy_parameter_regex):\n \"\"\"Determine if the two methods have different parameters\n\n They are considered to have different parameters if:\n\n * they have different positional parameters, including different names\n\n * one of the methods is having variadics, while the other is not\n\n * they have different keyword only parameters.\n\n \"\"\"\n original_parameters = _positional_parameters(original)\n overridden_parameters = _positional_parameters(overridden)\n\n different_positional = _has_different_parameters(\n original_parameters, overridden_parameters, dummy_parameter_regex\n )\n different_kwonly = _has_different_parameters(\n original.args.kwonlyargs, overridden.args.kwonlyargs, dummy_parameter_regex\n )\n if original.name in PYMETHODS:\n # Ignore the difference for special methods. 
If the parameter\n # numbers are different, then that is going to be caught by\n # unexpected-special-method-signature.\n # If the names are different, it doesn't matter, since they can't\n # be used as keyword arguments anyway.\n different_positional = different_kwonly = False\n\n # Both or none should have extra variadics, otherwise the method\n # loses or gains capabilities that are not reflected into the parent method,\n # leading to potential inconsistencies in the code.\n different_kwarg = (\n sum(1 for param in (original.args.kwarg, overridden.args.kwarg) if not param)\n == 1\n )\n different_vararg = (\n sum(1 for param in (original.args.vararg, overridden.args.vararg) if not param)\n == 1\n )\n\n return any(\n (different_positional, different_kwarg, different_vararg, different_kwonly)\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _called_in_methods(func, klass, methods):\n if not isinstance(func, astroid.FunctionDef):\n return False\n for method in methods:\n try:\n infered = klass.getattr(method)\n except astroid.NotFoundError:\n continue\n for infer_method in infered:\n for call in infer_method.nodes_of_class(astroid.Call):\n try:\n bound = next(call.func.infer())\n except (astroid.InferenceError, StopIteration):\n continue\n if not isinstance(bound, astroid.BoundMethod):\n continue\n func_obj = bound._proxied\n if isinstance(func_obj, astroid.UnboundMethod):\n func_obj = func_obj._proxied\n if func_obj.name == func.name:\n return True\n return False", "response": "Check if the given function was called in any of the given methods. 
Returns True if so False otherwise."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking if the given attribute name is a property in the given klass.", "response": "def _is_attribute_property(name, klass):\n \"\"\" Check if the given attribute *name* is a property\n in the given *klass*.\n\n It will look for `property` calls or for functions\n with the given name, decorated by `property` or `property`\n subclasses.\n Returns ``True`` if the name is a property in the given klass,\n ``False`` otherwise.\n \"\"\"\n\n try:\n attributes = klass.getattr(name)\n except astroid.NotFoundError:\n return False\n property_name = \"{}.property\".format(BUILTINS)\n for attr in attributes:\n if attr is astroid.Uninferable:\n continue\n try:\n infered = next(attr.infer())\n except astroid.InferenceError:\n continue\n if isinstance(infered, astroid.FunctionDef) and decorated_with_property(\n infered\n ):\n return True\n if infered.pytype() == property_name:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _safe_infer_call_result(node, caller, context=None):\n try:\n inferit = node.infer_call_result(caller, context=context)\n value = next(inferit)\n except astroid.InferenceError:\n return None # inference failed\n except StopIteration:\n return None # no values infered\n try:\n next(inferit)\n return None # there is ambiguity on the inferred node\n except astroid.InferenceError:\n return None # there is some kind of ambiguity\n except StopIteration:\n return value", "response": "Safely infer the return value of a function call."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _ancestors_to_call(klass_node, method=\"__init__\"):\n to_call = {}\n for base_node in klass_node.ancestors(recurs=False):\n try:\n to_call[base_node] = next(base_node.igetattr(method))\n except astroid.InferenceError:\n continue\n return 
to_call", "response": "Return a dictionary whose keys are the base classes providing\n the queried method and whose values are the inherited methods that should be called from the given method node\n "} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nrequire method to auto register this checker", "response": "def register(linter):\n \"\"\"required method to auto register this checker \"\"\"\n linter.register_checker(ClassChecker(linter))\n linter.register_checker(SpecialMethodsChecker(linter))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef set_accessed(self, node):\n\n frame = node_frame_class(node)\n if frame is None:\n # The node does not live in a class.\n return\n self._scopes[frame][node.attrname].append(node)", "response": "Set the given node as accessed."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef visit_classdef(self, node):\n self._check_bases_classes(node)\n # if not an exception or a metaclass\n if node.type == \"class\" and has_known_bases(node):\n try:\n node.local_attr(\"__init__\")\n except astroid.NotFoundError:\n self.add_message(\"no-init\", args=node, node=node)\n self._check_slots(node)\n self._check_proper_bases(node)\n self._check_consistent_mro(node)", "response": "check if class is not an exception or a metaclass"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _check_consistent_mro(self, node):\n try:\n node.mro()\n except InconsistentMroError:\n self.add_message(\"inconsistent-mro\", args=node.name, node=node)\n except DuplicateBasesError:\n self.add_message(\"duplicate-bases\", args=node.name, node=node)\n except NotImplementedError:\n # Old style class, there's no mro so don't do anything.\n pass", "response": "Detect whether a class has an inconsistent mro or duplicate bases."} {"SOURCE": "codesearchnet", "instruction": "Can you 
generate a brief explanation for the following Python 3 code\ndef _check_proper_bases(self, node):\n for base in node.bases:\n ancestor = safe_infer(base)\n if ancestor in (astroid.Uninferable, None):\n continue\n if isinstance(ancestor, astroid.Instance) and ancestor.is_subtype_of(\n \"%s.type\" % (BUILTINS,)\n ):\n continue\n\n if not isinstance(ancestor, astroid.ClassDef) or _is_invalid_base_class(\n ancestor\n ):\n self.add_message(\"inherit-non-class\", args=base.as_string(), node=node)\n\n if ancestor.name == object.__name__:\n self.add_message(\n \"useless-object-inheritance\", args=node.name, node=node\n )", "response": "Detects that a class inherits something which is not an instance or a type."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef leave_classdef(self, cnode):\n # check access to existent members on non metaclass classes\n if self._ignore_mixin and cnode.name[-5:].lower() == \"mixin\":\n # We are in a mixin class. 
No need to try to figure out if\n # something is missing, since it is most likely that it will\n # miss.\n return\n\n accessed = self._accessed.accessed(cnode)\n if cnode.type != \"metaclass\":\n self._check_accessed_members(cnode, accessed)\n # checks attributes are defined in an allowed method such as __init__\n if not self.linter.is_message_enabled(\"attribute-defined-outside-init\"):\n return\n defining_methods = self.config.defining_attr_methods\n current_module = cnode.root()\n for attr, nodes in cnode.instance_attrs.items():\n # skip nodes which are not in the current module and it may screw up\n # the output, while it's not worth it\n nodes = [\n n\n for n in nodes\n if not isinstance(n.statement(), (astroid.Delete, astroid.AugAssign))\n and n.root() is current_module\n ]\n if not nodes:\n continue # error detected by typechecking\n # check if any method attr is defined in is a defining method\n if any(node.frame().name in defining_methods for node in nodes):\n continue\n\n # check attribute is defined in a parent's __init__\n for parent in cnode.instance_attr_ancestors(attr):\n attr_defined = False\n # check if any parent method attr is defined in is a defining method\n for node in parent.instance_attrs[attr]:\n if node.frame().name in defining_methods:\n attr_defined = True\n if attr_defined:\n # we're done :)\n break\n else:\n # check attribute is defined as a class attribute\n try:\n cnode.local_attr(attr)\n except astroid.NotFoundError:\n for node in nodes:\n if node.frame().name not in defining_methods:\n # If the attribute was set by a call in any\n # of the defining methods, then don't emit\n # the warning.\n if _called_in_methods(\n node.frame(), cnode, defining_methods\n ):\n continue\n self.add_message(\n \"attribute-defined-outside-init\", args=attr, node=node\n )", "response": "close a class node check that instance attributes are defined in __init__ and check that all members of the class are in the same module as the class."} {"SOURCE": 
"codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef visit_functiondef(self, node):\n # ignore actual functions\n if not node.is_method():\n return\n\n self._check_useless_super_delegation(node)\n\n klass = node.parent.frame()\n self._meth_could_be_func = True\n # check first argument is self if this is actually a method\n self._check_first_arg_for_type(node, klass.type == \"metaclass\")\n if node.name == \"__init__\":\n self._check_init(node)\n return\n # check signature if the method overloads inherited method\n for overridden in klass.local_attr_ancestors(node.name):\n # get astroid for the searched method\n try:\n meth_node = overridden[node.name]\n except KeyError:\n # we have found the method but it's not in the local\n # dictionary.\n # This may happen with astroid build from living objects\n continue\n if not isinstance(meth_node, astroid.FunctionDef):\n continue\n self._check_signature(node, meth_node, \"overridden\", klass)\n break\n if node.decorators:\n for decorator in node.decorators.nodes:\n if isinstance(decorator, astroid.Attribute) and decorator.attrname in (\n \"getter\",\n \"setter\",\n \"deleter\",\n ):\n # attribute affectation will call this method, not hiding it\n return\n if isinstance(decorator, astroid.Name):\n if decorator.name == \"property\":\n # attribute affectation will either call a setter or raise\n # an attribute error, anyway not hiding the function\n return\n\n # Infer the decorator and see if it returns something useful\n inferred = safe_infer(decorator)\n if not inferred:\n return\n if isinstance(inferred, astroid.FunctionDef):\n # Okay, it's a decorator, let's see what it can infer.\n try:\n inferred = next(inferred.infer_call_result(inferred))\n except astroid.InferenceError:\n return\n try:\n if (\n isinstance(inferred, (astroid.Instance, astroid.ClassDef))\n and inferred.getattr(\"__get__\")\n and inferred.getattr(\"__set__\")\n ):\n return\n except astroid.AttributeInferenceError:\n 
pass\n\n # check if the method is hidden by an attribute\n try:\n overridden = klass.instance_attr(node.name)[0] # XXX\n overridden_frame = overridden.frame()\n if (\n isinstance(overridden_frame, astroid.FunctionDef)\n and overridden_frame.type == \"method\"\n ):\n overridden_frame = overridden_frame.parent.frame()\n if isinstance(overridden_frame, astroid.ClassDef) and klass.is_subtype_of(\n overridden_frame.qname()\n ):\n args = (overridden.root().name, overridden.fromlineno)\n self.add_message(\"method-hidden\", args=args, node=node)\n except astroid.NotFoundError:\n pass", "response": "Check a method definition node: verify its signature against any overridden method and make sure it is not hidden by an attribute."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if the given function node is an useless method override", "response": "def _check_useless_super_delegation(self, function):\n \"\"\"Check if the given function node is an useless method override\n\n We consider it *useless* if it uses the super() builtin, but having\n nothing additional whatsoever than not implementing the method at all.\n If the method uses super() to delegate an operation to the rest of the MRO,\n and if the method called is the same as the current one, the arguments\n passed to super() are the same as the parameters that were passed to\n this method, then the method could be removed altogether, by letting\n other implementation to take precedence.\n \"\"\"\n\n if (\n not function.is_method()\n # With decorators is a change of use\n or function.decorators\n ):\n return\n\n body = function.body\n if len(body) != 1:\n # Multiple statements, which means this overridden method\n # could do multiple things we are not aware of.\n return\n\n statement = body[0]\n if not isinstance(statement, (astroid.Expr, astroid.Return)):\n # Doing something else than what we are interested into.\n return\n\n call = statement.value\n if (\n not isinstance(call, astroid.Call)\n # Not a super() attribute access.\n or not 
isinstance(call.func, astroid.Attribute)\n ):\n return\n\n # Should be a super call.\n try:\n super_call = next(call.func.expr.infer())\n except astroid.InferenceError:\n return\n else:\n if not isinstance(super_call, objects.Super):\n return\n\n # The name should be the same.\n if call.func.attrname != function.name:\n return\n\n # Should be a super call with the MRO pointer being the\n # current class and the type being the current instance.\n current_scope = function.parent.scope()\n if (\n super_call.mro_pointer != current_scope\n or not isinstance(super_call.type, astroid.Instance)\n or super_call.type.name != current_scope.name\n ):\n return\n\n # \u00a0Check values of default args\n klass = function.parent.frame()\n meth_node = None\n for overridden in klass.local_attr_ancestors(function.name):\n # get astroid for the searched method\n try:\n meth_node = overridden[function.name]\n except KeyError:\n # we have found the method but it's not in the local\n # dictionary.\n # This may happen with astroid build from living objects\n continue\n if (\n not isinstance(meth_node, astroid.FunctionDef)\n # If the method have an ancestor which is not a\n # function then it is legitimate to redefine it\n or _has_different_parameters_default_value(\n meth_node.args, function.args\n )\n ):\n return\n break\n\n # Detect if the parameters are the same as the call's arguments.\n params = _signature_from_arguments(function.args)\n args = _signature_from_call(call)\n\n if meth_node is not None:\n\n def form_annotations(annotations):\n return [\n annotation.as_string() for annotation in filter(None, annotations)\n ]\n\n called_annotations = form_annotations(function.args.annotations)\n overridden_annotations = form_annotations(meth_node.args.annotations)\n if called_annotations and overridden_annotations:\n if called_annotations != overridden_annotations:\n return\n\n if _definition_equivalent_to_call(params, args):\n self.add_message(\n \"useless-super-delegation\", 
node=function, args=(function.name,)\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef leave_functiondef(self, node):\n if node.is_method():\n if node.args.args is not None:\n self._first_attrs.pop()\n if not self.linter.is_message_enabled(\"no-self-use\"):\n return\n class_node = node.parent.frame()\n if (\n self._meth_could_be_func\n and node.type == \"method\"\n and node.name not in PYMETHODS\n and not (\n node.is_abstract()\n or overrides_a_method(class_node, node.name)\n or decorated_with_property(node)\n or _has_bare_super_call(node)\n )\n ):\n self.add_message(\"no-self-use\", node=node)", "response": "check if this method couldn't be a function"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck if the getattr is an access to a class member and register it if so", "response": "def visit_attribute(self, node):\n \"\"\"check if the getattr is an access to a class member\n if so, register it. 
Also check for access to protected\n class member from outside its class (but ignore __special__\n methods)\n \"\"\"\n # Check self\n if self._uses_mandatory_method_param(node):\n self._accessed.set_accessed(node)\n return\n if not self.linter.is_message_enabled(\"protected-access\"):\n return\n\n self._check_protected_attribute_access(node)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks that the given AssignAttr node is defined in the class slots.", "response": "def _check_in_slots(self, node):\n \"\"\" Check that the given AssignAttr node\n is defined in the class slots.\n \"\"\"\n inferred = safe_infer(node.expr)\n if not isinstance(inferred, astroid.Instance):\n return\n\n klass = inferred._proxied\n if not has_known_bases(klass):\n return\n if \"__slots__\" not in klass.locals or not klass.newstyle:\n return\n\n slots = klass.slots()\n if slots is None:\n return\n # If any ancestor doesn't use slots, the slots\n # defined for this class are superfluous.\n if any(\n \"__slots__\" not in ancestor.locals and ancestor.name != \"object\"\n for ancestor in klass.ancestors()\n ):\n return\n\n if not any(slot.value == node.attrname for slot in slots):\n # If we have a '__dict__' in slots, then\n # assigning any name is valid.\n if not any(slot.value == \"__dict__\" for slot in slots):\n if _is_attribute_property(node.attrname, klass):\n # Properties circumvent the slots mechanism,\n # so we should not emit a warning for them.\n return\n if node.attrname in klass.locals and _has_data_descriptor(\n klass, node.attrname\n ):\n # Descriptors circumvent the slots mechanism as well.\n return\n if node.attrname == \"__class__\" and _has_same_layout_slots(\n slots, node.parent.value\n ):\n return\n self.add_message(\"assigning-non-slot\", args=(node.attrname,), node=node)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef 
_check_classmethod_declaration(self, node):\n if not isinstance(node.value, astroid.Call):\n return\n\n # check the function called is \"classmethod\" or \"staticmethod\"\n func = node.value.func\n if not isinstance(func, astroid.Name) or func.name not in (\n \"classmethod\",\n \"staticmethod\",\n ):\n return\n\n msg = (\n \"no-classmethod-decorator\"\n if func.name == \"classmethod\"\n else \"no-staticmethod-decorator\"\n )\n # assignment must be at a class scope\n parent_class = node.scope()\n if not isinstance(parent_class, astroid.ClassDef):\n return\n\n # Check if the arg passed to classmethod is a class member\n classmeth_arg = node.value.args[0]\n if not isinstance(classmeth_arg, astroid.Name):\n return\n\n method_name = classmeth_arg.name\n if any(method_name == member.name for member in parent_class.mymethods()):\n self.add_message(msg, node=node.targets[0])", "response": "Checks for uses of classmethod or staticmethod."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _check_protected_attribute_access(self, node):\n attrname = node.attrname\n\n if (\n is_attr_protected(attrname)\n and attrname not in self.config.exclude_protected\n ):\n\n klass = node_frame_class(node)\n\n # XXX infer to be more safe and less dirty ??\n # in classes, check we are not getting a parent method\n # through the class object or through super\n callee = node.expr.as_string()\n\n # We are not in a class, no remaining valid case\n if klass is None:\n self.add_message(\"protected-access\", node=node, args=attrname)\n return\n\n # If the expression begins with a call to super, that's ok.\n if (\n isinstance(node.expr, astroid.Call)\n and isinstance(node.expr.func, astroid.Name)\n and node.expr.func.name == \"super\"\n ):\n return\n\n # If the expression begins with a call to type(self), that's ok.\n if self._is_type_self_call(node.expr):\n return\n\n # We are in a class, one remaining valid cases, Klass._attr inside\n # Klass\n if not 
(callee == klass.name or callee in klass.basenames):\n # Detect property assignments in the body of the class.\n # This is acceptable:\n #\n # class A:\n # b = property(lambda: self._b)\n\n stmt = node.parent.statement()\n if (\n isinstance(stmt, astroid.Assign)\n and len(stmt.targets) == 1\n and isinstance(stmt.targets[0], astroid.AssignName)\n ):\n name = stmt.targets[0].name\n if _is_attribute_property(name, klass):\n return\n\n self.add_message(\"protected-access\", node=node, args=attrname)", "response": "Checks if the node is a protected attribute access."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef visit_name(self, node):\n if self._first_attrs and (\n node.name == self._first_attrs[-1] or not self._first_attrs[-1]\n ):\n self._meth_could_be_func = False", "response": "check if the name handle an access to a class member\n "} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncheck that all members of a node are defined", "response": "def _check_accessed_members(self, node, accessed):\n \"\"\"check that accessed members are defined\"\"\"\n # XXX refactor, probably much simpler now that E0201 is in type checker\n excs = (\"AttributeError\", \"Exception\", \"BaseException\")\n for attr, nodes in accessed.items():\n try:\n # is it a class attribute ?\n node.local_attr(attr)\n # yes, stop here\n continue\n except astroid.NotFoundError:\n pass\n # is it an instance attribute of a parent class ?\n try:\n next(node.instance_attr_ancestors(attr))\n # yes, stop here\n continue\n except StopIteration:\n pass\n # is it an instance attribute ?\n try:\n defstmts = node.instance_attr(attr)\n except astroid.NotFoundError:\n pass\n else:\n # filter out augment assignment nodes\n defstmts = [stmt for stmt in defstmts if stmt not in nodes]\n if not defstmts:\n # only augment assignment for this node, no-member should be\n # triggered by the typecheck checker\n continue\n # 
filter defstmts to only pick the first one when there are\n # several assignments in the same scope\n scope = defstmts[0].scope()\n defstmts = [\n stmt\n for i, stmt in enumerate(defstmts)\n if i == 0 or stmt.scope() is not scope\n ]\n # if there are still more than one, don't attempt to be smarter\n # than we can be\n if len(defstmts) == 1:\n defstmt = defstmts[0]\n # check that if the node is accessed in the same method as\n # it's defined, it's accessed after the initial assignment\n frame = defstmt.frame()\n lno = defstmt.fromlineno\n for _node in nodes:\n if (\n _node.frame() is frame\n and _node.fromlineno < lno\n and not astroid.are_exclusive(\n _node.statement(), defstmt, excs\n )\n ):\n self.add_message(\n \"access-member-before-definition\",\n node=_node,\n args=(attr, lno),\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _check_first_arg_for_type(self, node, metaclass=0):\n # don't care about functions with unknown argument (builtins)\n if node.args.args is None:\n return\n first_arg = node.args.args and node.argnames()[0]\n self._first_attrs.append(first_arg)\n first = self._first_attrs[-1]\n # static method\n if node.type == \"staticmethod\":\n if (\n first_arg == \"self\"\n or first_arg in self.config.valid_classmethod_first_arg\n or first_arg in self.config.valid_metaclass_classmethod_first_arg\n ):\n self.add_message(\"bad-staticmethod-argument\", args=first, node=node)\n return\n self._first_attrs[-1] = None\n # class / regular method with no args\n elif not node.args.args:\n self.add_message(\"no-method-argument\", node=node)\n # metaclass\n elif metaclass:\n # metaclass __new__ or classmethod\n if node.type == \"classmethod\":\n self._check_first_arg_config(\n first,\n self.config.valid_metaclass_classmethod_first_arg,\n node,\n \"bad-mcs-classmethod-argument\",\n node.name,\n )\n # metaclass regular method\n else:\n self._check_first_arg_config(\n first,\n 
self.config.valid_classmethod_first_arg,\n node,\n \"bad-mcs-method-argument\",\n node.name,\n )\n # regular class\n else:\n # class method\n if node.type == \"classmethod\" or node.name == \"__class_getitem__\":\n self._check_first_arg_config(\n first,\n self.config.valid_classmethod_first_arg,\n node,\n \"bad-classmethod-argument\",\n node.name,\n )\n # regular method without self as argument\n elif first != \"self\":\n self.add_message(\"no-self-argument\", node=node)", "response": "check the name of the first argument according to the method type (static, class, metaclass or regular method)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _check_bases_classes(self, node):\n\n def is_abstract(method):\n return method.is_abstract(pass_is_abstract=False)\n\n # check if this class abstract\n if class_is_abstract(node):\n return\n\n methods = sorted(\n unimplemented_abstract_methods(node, is_abstract).items(),\n key=lambda item: item[0],\n )\n for name, method in methods:\n owner = method.parent.frame()\n if owner is node:\n continue\n # owner is not this class, it must be a parent class\n # check that the ancestor's method is not abstract\n if name in node.locals:\n # it is redefined as an attribute or with a descriptor\n continue\n self.add_message(\"abstract-method\", node=node, args=(name, owner.name))", "response": "check that the given class node implements abstract methods from\n base classes\n "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking that the __init__ method calls super or an ancestor's __init__ method (super-init-not-called)", "response": "def _check_init(self, node):\n \"\"\"check that the __init__ method call super or ancestors'__init__\n method\n \"\"\"\n if not self.linter.is_message_enabled(\n \"super-init-not-called\"\n ) and not self.linter.is_message_enabled(\"non-parent-init-called\"):\n return\n klass_node = node.parent.frame()\n to_call = _ancestors_to_call(klass_node)\n not_called_yet = 
dict(to_call)\n for stmt in node.nodes_of_class(astroid.Call):\n expr = stmt.func\n if not isinstance(expr, astroid.Attribute) or expr.attrname != \"__init__\":\n continue\n # skip the test if using super\n if (\n isinstance(expr.expr, astroid.Call)\n and isinstance(expr.expr.func, astroid.Name)\n and expr.expr.func.name == \"super\"\n ):\n return\n try:\n for klass in expr.expr.infer():\n if klass is astroid.Uninferable:\n continue\n # The infered klass can be super(), which was\n # assigned to a variable and the `__init__`\n # was called later.\n #\n # base = super()\n # base.__init__(...)\n\n if (\n isinstance(klass, astroid.Instance)\n and isinstance(klass._proxied, astroid.ClassDef)\n and is_builtin_object(klass._proxied)\n and klass._proxied.name == \"super\"\n ):\n return\n if isinstance(klass, objects.Super):\n return\n try:\n del not_called_yet[klass]\n except KeyError:\n if klass not in to_call:\n self.add_message(\n \"non-parent-init-called\", node=expr, args=klass.name\n )\n except astroid.InferenceError:\n continue\n for klass, method in not_called_yet.items():\n cls = node_frame_class(method)\n if klass.name == \"object\" or (cls and cls.name == \"object\"):\n continue\n self.add_message(\"super-init-not-called\", args=klass.name, node=node)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck that the signature of the two given methods match the signature of the class", "response": "def _check_signature(self, method1, refmethod, class_type, cls):\n \"\"\"check that the signature of the two given methods match\n \"\"\"\n if not (\n isinstance(method1, astroid.FunctionDef)\n and isinstance(refmethod, astroid.FunctionDef)\n ):\n self.add_message(\n \"method-check-failed\", args=(method1, refmethod), node=method1\n )\n return\n\n instance = cls.instantiate_class()\n method1 = function_to_method(method1, instance)\n refmethod = function_to_method(refmethod, instance)\n\n # Don't care about functions with unknown argument 
(builtins).\n if method1.args.args is None or refmethod.args.args is None:\n return\n\n # Ignore private to class methods.\n if is_attr_private(method1.name):\n return\n # Ignore setters, they have an implicit extra argument,\n # which shouldn't be taken in consideration.\n if method1.decorators:\n for decorator in method1.decorators.nodes:\n if (\n isinstance(decorator, astroid.Attribute)\n and decorator.attrname == \"setter\"\n ):\n return\n\n if _different_parameters(\n refmethod, method1, dummy_parameter_regex=self._dummy_rgx\n ):\n self.add_message(\n \"arguments-differ\", args=(class_type, method1.name), node=method1\n )\n elif len(method1.args.defaults) < len(refmethod.args.defaults):\n self.add_message(\n \"signature-differs\", args=(class_type, method1.name), node=method1\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _is_mandatory_method_param(self, node):\n return (\n self._first_attrs\n and isinstance(node, astroid.Name)\n and node.name == self._first_attrs[-1]\n )", "response": "Check if node is a mandatory method parameter."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning an iterator which yields tuples of nodes inferred by the given statement.", "response": "def _annotated_unpack_infer(stmt, context=None):\n \"\"\"\n Recursively generate nodes inferred by the given statement.\n If the inferred value is a list or a tuple, recurse on the elements.\n Returns an iterator which yields tuples in the format\n ('original node', 'infered node').\n \"\"\"\n if isinstance(stmt, (astroid.List, astroid.Tuple)):\n for elt in stmt.elts:\n inferred = utils.safe_infer(elt)\n if inferred and inferred is not astroid.Uninferable:\n yield elt, inferred\n return\n for infered in stmt.infer(context):\n if infered is astroid.Uninferable:\n continue\n yield stmt, infered"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef 
_is_raising(body: typing.List) -> bool:\n for node in body:\n if isinstance(node, astroid.Raise):\n return True\n return False", "response": "Return true if the given statement node raise an exception"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nverifies that the exception context is properly set.", "response": "def _check_bad_exception_context(self, node):\n \"\"\"Verify that the exception context is properly set.\n\n An exception context can be only `None` or an exception.\n \"\"\"\n cause = utils.safe_infer(node.cause)\n if cause in (astroid.Uninferable, None):\n return\n\n if isinstance(cause, astroid.Const):\n if cause.value is not None:\n self.add_message(\"bad-exception-context\", node=node)\n elif not isinstance(cause, astroid.ClassDef) and not utils.inherit_from_std_ex(\n cause\n ):\n self.add_message(\"bad-exception-context\", node=node)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks for empty except clause and raise exceptions.", "response": "def visit_tryexcept(self, node):\n \"\"\"check for empty except\"\"\"\n self._check_try_except_raise(node)\n exceptions_classes = []\n nb_handlers = len(node.handlers)\n for index, handler in enumerate(node.handlers):\n if handler.type is None:\n if not _is_raising(handler.body):\n self.add_message(\"bare-except\", node=handler)\n\n # check if an \"except:\" is followed by some other\n # except\n if index < (nb_handlers - 1):\n msg = \"empty except clause should always appear last\"\n self.add_message(\"bad-except-order\", node=node, args=msg)\n\n elif isinstance(handler.type, astroid.BoolOp):\n self.add_message(\n \"binary-op-exception\", node=handler, args=handler.type.op\n )\n else:\n try:\n excs = list(_annotated_unpack_infer(handler.type))\n except astroid.InferenceError:\n continue\n\n for part, exc in excs:\n if exc is astroid.Uninferable:\n continue\n if isinstance(exc, astroid.Instance) and 
utils.inherit_from_std_ex(\n exc\n ):\n # pylint: disable=protected-access\n exc = exc._proxied\n\n self._check_catching_non_exception(handler, exc, part)\n\n if not isinstance(exc, astroid.ClassDef):\n continue\n\n exc_ancestors = [\n anc\n for anc in exc.ancestors()\n if isinstance(anc, astroid.ClassDef)\n ]\n\n for previous_exc in exceptions_classes:\n if previous_exc in exc_ancestors:\n msg = \"%s is an ancestor class of %s\" % (\n previous_exc.name,\n exc.name,\n )\n self.add_message(\n \"bad-except-order\", node=handler.type, args=msg\n )\n if (\n exc.name in self.config.overgeneral_exceptions\n and exc.root().name == utils.EXCEPTIONS_MODULE\n and not _is_raising(handler.body)\n ):\n self.add_message(\n \"broad-except\", args=exc.name, node=handler.type\n )\n\n if exc in exceptions_classes:\n self.add_message(\n \"duplicate-except\", args=exc.name, node=handler.type\n )\n\n exceptions_classes += [exc for _, exc in excs]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef visit_functiondef(self, node):\n # ignore actual functions or method within a new style class\n if not node.is_method():\n return\n klass = node.parent.frame()\n for stmt in node.nodes_of_class(astroid.Call):\n if node_frame_class(stmt) != node_frame_class(node):\n # Don't look down in other scopes.\n continue\n\n expr = stmt.func\n if not isinstance(expr, astroid.Attribute):\n continue\n\n call = expr.expr\n # skip the test if using super\n if not (\n isinstance(call, astroid.Call)\n and isinstance(call.func, astroid.Name)\n and call.func.name == \"super\"\n ):\n continue\n\n if not klass.newstyle and has_known_bases(klass):\n # super should not be used on an old style class\n continue\n else:\n # super first arg should be the class\n if not call.args:\n if sys.version_info[0] == 3:\n # unless Python 3\n continue\n else:\n self.add_message(\"missing-super-argument\", node=call)\n continue\n\n # calling super(type(self), 
self) can lead to recursion loop\n # in derived classes\n arg0 = call.args[0]\n if (\n isinstance(arg0, astroid.Call)\n and isinstance(arg0.func, astroid.Name)\n and arg0.func.name == \"type\"\n ):\n self.add_message(\"bad-super-call\", node=call, args=(\"type\",))\n continue\n\n # calling super(self.__class__, self) can lead to recursion loop\n # in derived classes\n if (\n len(call.args) >= 2\n and isinstance(call.args[1], astroid.Name)\n and call.args[1].name == \"self\"\n and isinstance(arg0, astroid.Attribute)\n and arg0.attrname == \"__class__\"\n ):\n self.add_message(\n \"bad-super-call\", node=call, args=(\"self.__class__\",)\n )\n continue\n\n try:\n supcls = call.args and next(call.args[0].infer(), None)\n except astroid.InferenceError:\n continue\n\n if klass is not supcls:\n name = None\n # if supcls is not Uninferable, then supcls was infered\n # and use its name. Otherwise, try to look\n # for call.args[0].name\n if supcls:\n name = supcls.name\n elif call.args and hasattr(call.args[0], \"name\"):\n name = call.args[0].name\n if name:\n self.add_message(\"bad-super-call\", node=call, args=(name,))", "response": "check use of super"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef display_reports(self, layout):\n self.section = 0\n if hasattr(layout, \"report_id\"):\n layout.children[0].children[0].data += \" (%s)\" % layout.report_id\n self._display(layout)", "response": "display results encapsulated in the layout tree"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if a class node is a typing. 
NamedTuple class", "response": "def _is_typing_namedtuple(node: astroid.ClassDef) -> bool:\n \"\"\"Check if a class node is a typing.NamedTuple class\"\"\"\n for base in node.ancestors():\n if base.qname() == TYPING_NAMEDTUPLE:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _is_enum_class(node: astroid.ClassDef) -> bool:\n for base in node.bases:\n try:\n inferred_bases = base.inferred()\n except astroid.InferenceError:\n continue\n\n for ancestor in inferred_bases:\n if not isinstance(ancestor, astroid.ClassDef):\n continue\n\n if ancestor.name == \"Enum\" and ancestor.root().name == \"enum\":\n return True\n\n return False", "response": "Checks if a class definition defines an Enum class."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _is_dataclass(node: astroid.ClassDef) -> bool:\n if not node.decorators:\n return False\n\n root_locals = node.root().locals\n for decorator in node.decorators.nodes:\n if isinstance(decorator, astroid.Call):\n decorator = decorator.func\n if not isinstance(decorator, (astroid.Name, astroid.Attribute)):\n continue\n if isinstance(decorator, astroid.Name):\n name = decorator.name\n else:\n name = decorator.attrname\n if name == DATACLASS_DECORATOR and DATACLASS_DECORATOR in root_locals:\n return True\n return False", "response": "Check if a given class node is a dataclass class."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncount the number of boolean expressions in BoolOp bool_op.", "response": "def _count_boolean_expressions(bool_op):\n \"\"\"Counts the number of boolean expressions in BoolOp `bool_op` (recursive)\n\n example: a and (b or c or (d and e)) ==> 5 boolean expressions\n \"\"\"\n nb_bool_expr = 0\n for bool_expr in bool_op.get_children():\n if isinstance(bool_expr, BoolOp):\n nb_bool_expr += 
_count_boolean_expressions(bool_expr)\n else:\n nb_bool_expr += 1\n return nb_bool_expr"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef visit_classdef(self, node):\n nb_parents = len(list(node.ancestors()))\n if nb_parents > self.config.max_parents:\n self.add_message(\n \"too-many-ancestors\",\n node=node,\n args=(nb_parents, self.config.max_parents),\n )\n\n if len(node.instance_attrs) > self.config.max_attributes:\n self.add_message(\n \"too-many-instance-attributes\",\n node=node,\n args=(len(node.instance_attrs), self.config.max_attributes),\n )", "response": "check size of inheritance hierarchy and number of instance attributes"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking the number of public methods of the current class.", "response": "def leave_classdef(self, node):\n \"\"\"check number of public methods\"\"\"\n my_methods = sum(\n 1 for method in node.mymethods() if not method.name.startswith(\"_\")\n )\n\n # Does the class contain less than n public methods ?\n # This checks only the methods defined in the current class,\n # since the user might not have control over the classes\n # from the ancestors. It avoids some false positives\n # for classes such as unittest.TestCase, which provides\n # a lot of assert methods. 
It doesn't make sense to warn\n # when the user subclasses TestCase to add his own tests.\n if my_methods > self.config.max_public_methods:\n self.add_message(\n \"too-many-public-methods\",\n node=node,\n args=(my_methods, self.config.max_public_methods),\n )\n\n # Stop here for exception, metaclass, interface classes and other\n # classes for which we don't need to count the methods.\n if (\n node.type != \"class\"\n or _is_enum_class(node)\n or _is_dataclass(node)\n or _is_typing_namedtuple(node)\n ):\n return\n\n # Does the class contain more than n public methods ?\n # This checks all the methods defined by ancestors and\n # by the current class.\n all_methods = _count_methods_in_class(node)\n if all_methods < self.config.min_public_methods:\n self.add_message(\n \"too-few-public-methods\",\n node=node,\n args=(all_methods, self.config.min_public_methods),\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks function name, docstring, arguments, redefinition, variable names, max locals", "response": "def visit_functiondef(self, node):\n \"\"\"check function name, docstring, arguments, redefinition,\n variable names, max locals\n \"\"\"\n # init branch and returns counters\n self._returns.append(0)\n # check number of arguments\n args = node.args.args\n ignored_argument_names = self._ignored_argument_names\n if args is not None:\n ignored_args_num = 0\n if ignored_argument_names:\n ignored_args_num = sum(\n 1 for arg in args if ignored_argument_names.match(arg.name)\n )\n\n argnum = len(args) - ignored_args_num\n if argnum > self.config.max_args:\n self.add_message(\n \"too-many-arguments\",\n node=node,\n args=(len(args), self.config.max_args),\n )\n else:\n ignored_args_num = 0\n # check number of local variables\n locnum = len(node.locals) - ignored_args_num\n if locnum > self.config.max_locals:\n self.add_message(\n \"too-many-locals\", node=node, args=(locnum, self.config.max_locals)\n )\n # init new statements 
counter\n self._stmts.append(1)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef leave_functiondef(self, node):\n returns = self._returns.pop()\n if returns > self.config.max_returns:\n self.add_message(\n \"too-many-return-statements\",\n node=node,\n args=(returns, self.config.max_returns),\n )\n branches = self._branches[node]\n if branches > self.config.max_branches:\n self.add_message(\n \"too-many-branches\",\n node=node,\n args=(branches, self.config.max_branches),\n )\n # check number of statements\n stmts = self._stmts.pop()\n if stmts > self.config.max_statements:\n self.add_message(\n \"too-many-statements\",\n node=node,\n args=(stmts, self.config.max_statements),\n )", "response": "check max returns, branches and statements when leaving a function definition"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nincrement the branches counter; node.orelse is the else clause of the try/except.", "response": "def visit_tryexcept(self, node):\n \"\"\"increments the branches counter\"\"\"\n branches = len(node.handlers)\n if node.orelse:\n branches += 1\n self._inc_branch(node, branches)\n self._inc_all_stmts(branches)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef visit_if(self, node):\n self._check_boolean_expressions(node)\n branches = 1\n # don't double count If nodes coming from some 'elif'\n if node.orelse and (len(node.orelse) > 1 or not isinstance(node.orelse[0], If)):\n branches += 1\n self._inc_branch(node, branches)\n self._inc_all_stmts(branches)", "response": "visit an If node: update the branches counter, check its boolean expressions, and count all statements in the branch."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _check_boolean_expressions(self, node):\n condition = node.test\n if not isinstance(condition, BoolOp):\n return\n nb_bool_expr = 
_count_boolean_expressions(condition)\n if nb_bool_expr > self.config.max_bool_expr:\n self.add_message(\n \"too-many-boolean-expressions\",\n node=condition,\n args=(nb_bool_expr, self.config.max_bool_expr),\n )", "response": "Go through the if node and count its boolean expressions when the condition is a BoolOp node."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef visit_while(self, node):\n branches = 1\n if node.orelse:\n branches += 1\n self._inc_branch(node, branches)", "response": "increments the branches counter"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _check_docstring(self, node):\n docstring = node.doc\n if not docstring:\n return\n\n start_line = node.lineno + 1\n\n # Go through lines of docstring\n for idx, line in enumerate(docstring.splitlines()):\n self._check_spelling(\"wrong-spelling-in-docstring\", line, start_line + idx)", "response": "check whether the node's docstring has any spelling errors"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef format(self, template):\n # For some reason, _asdict on derived namedtuples does not work with\n # Python 3.4. 
Needs some investigation.\n return template.format(**dict(zip(self._fields, self)))", "response": "Format the message according to the given template."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _is_len_call(node):\n return (\n isinstance(node, astroid.Call)\n and isinstance(node.func, astroid.Name)\n and node.func.name == \"len\"\n )", "response": "Checks if node is a len(...) call."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking if the given token is a trailing comma which trails an expression", "response": "def _is_trailing_comma(tokens, index):\n \"\"\"Check if the given token is a trailing comma\n\n :param tokens: Sequence of modules tokens\n :type tokens: list[tokenize.TokenInfo]\n :param int index: Index of token under check in tokens\n :returns: True if the token is a comma which trails an expression\n :rtype: bool\n \"\"\"\n token = tokens[index]\n if token.exact_type != tokenize.COMMA:\n return False\n # Must have remaining tokens on the same line such as NEWLINE\n left_tokens = itertools.islice(tokens, index + 1, None)\n same_line_remaining_tokens = list(\n itertools.takewhile(\n lambda other_token, _token=token: other_token.start[0] == _token.start[0],\n left_tokens,\n )\n )\n # Note: If the newline is tokenize.NEWLINE and not tokenize.NL\n # then the newline denotes the end of expression\n is_last_element = all(\n other_token.type in (tokenize.NEWLINE, tokenize.COMMENT)\n for other_token in same_line_remaining_tokens\n )\n if not same_line_remaining_tokens or not is_last_element:\n return False\n\n def get_curline_index_start():\n \"\"\"Get the index denoting the start of the current line\"\"\"\n for subindex, token in enumerate(reversed(tokens[:index])):\n # See Lib/tokenize.py and Lib/token.py in cpython for more info\n if token.type in (tokenize.NEWLINE, tokenize.NL):\n return index - subindex\n return 0\n\n curline_start = 
get_curline_index_start()\n expected_tokens = {\"return\", \"yield\"}\n for prevtoken in tokens[curline_start:index]:\n if \"=\" in prevtoken.string or prevtoken.string in expected_tokens:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef register(linter):\n linter.register_checker(RefactoringChecker(linter))\n linter.register_checker(NotChecker(linter))\n linter.register_checker(RecommandationChecker(linter))\n linter.register_checker(LenChecker(linter))", "response": "Required method to auto register this checker."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck if the given node is an actual elif node.", "response": "def _is_actual_elif(self, node):\n \"\"\"Check if the given node is an actual elif\n\n This is a problem we're having with the builtin ast module,\n which splits `elif` branches into a separate if statement.\n Unfortunately we need to know the exact type in certain\n cases.\n \"\"\"\n if isinstance(node.parent, astroid.If):\n orelse = node.parent.orelse\n # current if node must directly follow an \"else\"\n if orelse and orelse == [node]:\n if (node.lineno, node.col_offset) in self._elifs:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _check_simplifiable_if(self, node):\n\n if self._is_actual_elif(node):\n # Not interested in if statements with multiple branches.\n return\n if len(node.orelse) != 1 or len(node.body) != 1:\n return\n\n # Check if both branches can be reduced.\n first_branch = node.body[0]\n else_branch = node.orelse[0]\n if isinstance(first_branch, astroid.Return):\n if not isinstance(else_branch, astroid.Return):\n return\n first_branch_is_bool = self._is_bool_const(first_branch)\n else_branch_is_bool = self._is_bool_const(else_branch)\n reduced_to = \"'return bool(test)'\"\n elif 
isinstance(first_branch, astroid.Assign):\n if not isinstance(else_branch, astroid.Assign):\n return\n\n # Check if we assign to the same value\n first_branch_targets = [\n target.name\n for target in first_branch.targets\n if isinstance(target, astroid.AssignName)\n ]\n else_branch_targets = [\n target.name\n for target in else_branch.targets\n if isinstance(target, astroid.AssignName)\n ]\n if not first_branch_targets or not else_branch_targets:\n return\n if sorted(first_branch_targets) != sorted(else_branch_targets):\n return\n\n first_branch_is_bool = self._is_bool_const(first_branch)\n else_branch_is_bool = self._is_bool_const(else_branch)\n reduced_to = \"'var = bool(test)'\"\n else:\n return\n\n if not first_branch_is_bool or not else_branch_is_bool:\n return\n if not first_branch.value.value:\n # This is a case that can't be easily simplified and\n # if it can be simplified, it will usually result in a\n # code that's harder to understand and comprehend.\n # Let's take for instance `arg and arg <= 3`. 
This could theoretically be\n # reduced to `not arg or arg > 3`, but the net result is that now the\n # condition is harder to understand, because it requires understanding of\n # an extra clause:\n # * first, there is the negation of truthness with `not arg`\n # * the second clause is `arg > 3`, which occurs when arg has a\n # a truth value, but it implies that `arg > 3` is equivalent\n # with `arg and arg > 3`, which means that the user must\n # think about this assumption when evaluating `arg > 3`.\n # The original form is easier to grasp.\n return\n\n self.add_message(\"simplifiable-if-statement\", node=node, args=(reduced_to,))", "response": "Check if the given if node can be simplified."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _check_stop_iteration_inside_generator(self, node):\n frame = node.frame()\n if not isinstance(frame, astroid.FunctionDef) or not frame.is_generator():\n return\n if utils.node_ignores_exception(node, StopIteration):\n return\n if not node.exc:\n return\n exc = utils.safe_infer(node.exc)\n if exc is None or exc is astroid.Uninferable:\n return\n if self._check_exception_inherit_from_stopiteration(exc):\n self.add_message(\"stop-iteration-return\", node=node)", "response": "Check if an exception of type StopIteration is raised inside a generator"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _check_exception_inherit_from_stopiteration(exc):\n stopiteration_qname = \"{}.StopIteration\".format(utils.EXCEPTIONS_MODULE)\n return any(_class.qname() == stopiteration_qname for _class in exc.mro())", "response": "Return True if the exception node in argument inherit from StopIteration"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _check_raising_stopiteration_in_generator_next_call(self, node):\n\n def _looks_like_infinite_iterator(param):\n inferred 
= utils.safe_infer(param)\n if inferred:\n return inferred.qname() in KNOWN_INFINITE_ITERATORS\n return False\n\n if isinstance(node.func, astroid.Attribute):\n # A next() method, which is not what we want.\n return\n\n inferred = utils.safe_infer(node.func)\n if getattr(inferred, \"name\", \"\") == \"next\":\n frame = node.frame()\n # The next builtin can only have up to two\n # positional arguments and no keyword arguments\n has_sentinel_value = len(node.args) > 1\n if (\n isinstance(frame, astroid.FunctionDef)\n and frame.is_generator()\n and not has_sentinel_value\n and not utils.node_ignores_exception(node, StopIteration)\n and not _looks_like_infinite_iterator(node.args[0])\n ):\n self.add_message(\"stop-iteration-return\", node=node)", "response": "Check if a StopIteration exception is raised by a call to the next builtin inside a generator."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _check_nested_blocks(self, node):\n # only check block levels inside functions or methods\n if not isinstance(node.scope(), astroid.FunctionDef):\n return\n # messages are triggered on leaving the nested block. 
Here we save the\n # stack in case the current node isn't nested in the previous one\n nested_blocks = self._nested_blocks[:]\n if node.parent == node.scope():\n self._nested_blocks = [node]\n else:\n # go through ancestors from the most nested to the less\n for ancestor_node in reversed(self._nested_blocks):\n if ancestor_node == node.parent:\n break\n self._nested_blocks.pop()\n # if the node is an elif, this should not be another nesting level\n if isinstance(node, astroid.If) and self._is_actual_elif(node):\n if self._nested_blocks:\n self._nested_blocks.pop()\n self._nested_blocks.append(node)\n\n # send message only once per group of nested blocks\n if len(nested_blocks) > len(self._nested_blocks):\n self._emit_nested_blocks_message_if_needed(nested_blocks)", "response": "Update and check the number of nested blocks in the current node"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _duplicated_isinstance_types(node):\n duplicated_objects = set()\n all_types = collections.defaultdict(set)\n\n for call in node.values:\n if not isinstance(call, astroid.Call) or len(call.args) != 2:\n continue\n\n inferred = utils.safe_infer(call.func)\n if not inferred or not utils.is_builtin_object(inferred):\n continue\n\n if inferred.name != \"isinstance\":\n continue\n\n isinstance_object = call.args[0].as_string()\n isinstance_types = call.args[1]\n\n if isinstance_object in all_types:\n duplicated_objects.add(isinstance_object)\n\n if isinstance(isinstance_types, astroid.Tuple):\n elems = [\n class_type.as_string() for class_type in isinstance_types.itered()\n ]\n else:\n elems = [isinstance_types.as_string()]\n all_types[isinstance_object].update(elems)\n\n # Remove all keys which are not duplicated\n return {\n key: value for key, value in all_types.items() if key in duplicated_objects\n }", "response": "Get the duplicated types from the underlying isinstance calls."} {"SOURCE": "codesearchnet", "instruction": "Create a 
Python 3 function for\nchecking if isinstance calls which can be merged together.", "response": "def _check_consider_merging_isinstance(self, node):\n \"\"\"Check isinstance calls which can be merged together.\"\"\"\n if node.op != \"or\":\n return\n\n first_args = self._duplicated_isinstance_types(node)\n for duplicated_name, class_names in first_args.items():\n names = sorted(name for name in class_names)\n self.add_message(\n \"consider-merging-isinstance\",\n node=node,\n args=(duplicated_name, \", \".join(names)),\n )"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking if there is any chained comparison in the expression.", "response": "def _check_chained_comparison(self, node):\n \"\"\"Check if there is any chained comparison in the expression.\n\n Add a refactoring message if a boolOp contains comparison like a < b and b < c,\n which can be chained as a < b < c.\n\n Care is taken to avoid simplifying a < b < c and b < d.\n \"\"\"\n if node.op != \"and\" or len(node.values) < 2:\n return\n\n def _find_lower_upper_bounds(comparison_node, uses):\n left_operand = comparison_node.left\n for operator, right_operand in comparison_node.ops:\n for operand in (left_operand, right_operand):\n value = None\n if isinstance(operand, astroid.Name):\n value = operand.name\n elif isinstance(operand, astroid.Const):\n value = operand.value\n\n if value is None:\n continue\n\n if operator in (\"<\", \"<=\"):\n if operand is left_operand:\n uses[value][\"lower_bound\"].add(comparison_node)\n elif operand is right_operand:\n uses[value][\"upper_bound\"].add(comparison_node)\n elif operator in (\">\", \">=\"):\n if operand is left_operand:\n uses[value][\"upper_bound\"].add(comparison_node)\n elif operand is right_operand:\n uses[value][\"lower_bound\"].add(comparison_node)\n left_operand = right_operand\n\n uses = collections.defaultdict(\n lambda: {\"lower_bound\": set(), \"upper_bound\": set()}\n )\n for comparison_node in node.values:\n 
if isinstance(comparison_node, astroid.Compare):\n _find_lower_upper_bounds(comparison_node, uses)\n\n for _, bounds in uses.items():\n num_shared = len(bounds[\"lower_bound\"].intersection(bounds[\"upper_bound\"]))\n num_lower_bounds = len(bounds[\"lower_bound\"])\n num_upper_bounds = len(bounds[\"upper_bound\"])\n if num_shared < num_lower_bounds and num_shared < num_upper_bounds:\n self.add_message(\"chained-comparison\", node=node)\n break"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking if the augmented assignment is using join.", "response": "def _check_consider_using_join(self, aug_assign):\n \"\"\"\n We start with the augmented assignment and work our way upwards.\n Names of variables for nodes if match successful:\n result = '' # assign\n for number in ['1', '2', '3'] # for_loop\n result += number # aug_assign\n \"\"\"\n for_loop = aug_assign.parent\n if not isinstance(for_loop, astroid.For) or len(for_loop.body) > 1:\n return\n assign = for_loop.previous_sibling()\n if not isinstance(assign, astroid.Assign):\n return\n result_assign_names = {\n target.name\n for target in assign.targets\n if isinstance(target, astroid.AssignName)\n }\n\n is_concat_loop = (\n aug_assign.op == \"+=\"\n and isinstance(aug_assign.target, astroid.AssignName)\n and len(for_loop.body) == 1\n and aug_assign.target.name in result_assign_names\n and isinstance(assign.value, astroid.Const)\n and isinstance(assign.value.value, str)\n and isinstance(aug_assign.value, astroid.Name)\n and aug_assign.value.name == for_loop.target.name\n )\n if is_concat_loop:\n self.add_message(\"consider-using-join\", node=aug_assign)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns true if node is condition and true_value or false_value form.", "response": "def _is_and_or_ternary(node):\n \"\"\"\n Returns true if node is 'condition and true_value or false_value' form.\n\n All of: condition, true_value and 
false_value should not be a complex boolean expression\n \"\"\"\n return (\n isinstance(node, astroid.BoolOp)\n and node.op == \"or\"\n and len(node.values) == 2\n and isinstance(node.values[0], astroid.BoolOp)\n and not isinstance(node.values[1], astroid.BoolOp)\n and node.values[0].op == \"and\"\n and not isinstance(node.values[0].values[1], astroid.BoolOp)\n and len(node.values[0].values) == 2\n )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking that all return statements inside a function are consistent.", "response": "def _check_consistent_returns(self, node):\n \"\"\"Check that all return statements inside a function are consistent.\n\n Return statements are consistent if:\n - all returns are explicit and if there is no implicit return;\n - all returns are empty and if there is, possibly, an implicit return.\n\n Args:\n node (astroid.FunctionDef): the function holding the return statements.\n\n \"\"\"\n # explicit return statements are those with a not None value\n explicit_returns = [\n _node for _node in self._return_nodes[node.name] if _node.value is not None\n ]\n if not explicit_returns:\n return\n if len(explicit_returns) == len(\n self._return_nodes[node.name]\n ) and self._is_node_return_ended(node):\n return\n self.add_message(\"inconsistent-return-statements\", node=node)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _is_node_return_ended(self, node):\n # \u00a0Recursion base case\n if isinstance(node, astroid.Return):\n return True\n if isinstance(node, astroid.Call):\n try:\n funcdef_node = node.func.inferred()[0]\n if self._is_function_def_never_returning(funcdef_node):\n return True\n except astroid.InferenceError:\n pass\n # Avoid the check inside while loop as we don't know\n # \u00a0if they will be completed\n if isinstance(node, astroid.While):\n return True\n if isinstance(node, astroid.Raise):\n # a Raise statement doesn't need to 
end with a return statement\n # but if the exception raised is handled, then the handler has to\n # ends with a return statement\n if not node.exc:\n # Ignore bare raises\n return True\n if not utils.is_node_inside_try_except(node):\n # If the raise statement is not inside a try/except statement\n # \u00a0then the exception is raised and cannot be caught. No need\n # \u00a0to infer it.\n return True\n exc = utils.safe_infer(node.exc)\n if exc is None or exc is astroid.Uninferable:\n return False\n exc_name = exc.pytype().split(\".\")[-1]\n handlers = utils.get_exception_handlers(node, exc_name)\n handlers = list(handlers) if handlers is not None else []\n if handlers:\n # among all the handlers handling the exception at least one\n # must end with a return statement\n return any(\n self._is_node_return_ended(_handler) for _handler in handlers\n )\n # if no handlers handle the exception then it's ok\n return True\n if isinstance(node, astroid.If):\n # if statement is returning if there are exactly two return statements in its\n # \u00a0children : one for the body part, the other for the orelse part\n # Do not check if inner function definition are return ended.\n is_orelse_returning = any(\n self._is_node_return_ended(_ore)\n for _ore in node.orelse\n if not isinstance(_ore, astroid.FunctionDef)\n )\n is_if_returning = any(\n self._is_node_return_ended(_ifn)\n for _ifn in node.body\n if not isinstance(_ifn, astroid.FunctionDef)\n )\n return is_if_returning and is_orelse_returning\n # \u00a0recurses on the children of the node except for those which are except handler\n # because one cannot be sure that the handler will really be used\n return any(\n self._is_node_return_ended(_child)\n for _child in node.get_children()\n if not isinstance(_child, astroid.ExceptHandler)\n )", "response": "Checks if the node ends with an explicit return statement."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef 
_check_return_at_the_end(self, node):\n if len(self._return_nodes[node.name]) > 1:\n return\n if len(node.body) <= 1:\n return\n\n last = node.body[-1]\n if isinstance(last, astroid.Return):\n # e.g. \"return\"\n if last.value is None:\n self.add_message(\"useless-return\", node=node)\n # return None\"\n elif isinstance(last.value, astroid.Const) and (last.value.value is None):\n self.add_message(\"useless-return\", node=node)", "response": "Check for presence of a single return statement at the end of a function."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef visit_for(self, node):\n # Verify that we have a `range([start], len(...), [stop])` call and\n # that the object which is iterated is used as a subscript in the\n # body of the for.\n\n # Is it a proper range call?\n if not isinstance(node.iter, astroid.Call):\n return\n if not self._is_builtin(node.iter.func, \"range\"):\n return\n if len(node.iter.args) == 2 and not _is_constant_zero(node.iter.args[0]):\n return\n if len(node.iter.args) > 2:\n return\n\n # Is it a proper len call?\n if not isinstance(node.iter.args[-1], astroid.Call):\n return\n second_func = node.iter.args[-1].func\n if not self._is_builtin(second_func, \"len\"):\n return\n len_args = node.iter.args[-1].args\n if not len_args or len(len_args) != 1:\n return\n iterating_object = len_args[0]\n if not isinstance(iterating_object, astroid.Name):\n return\n # If we're defining __iter__ on self, enumerate won't work\n scope = node.scope()\n if iterating_object.name == \"self\" and scope.name == \"__iter__\":\n return\n\n # Verify that the body of the for loop uses a subscript\n # with the object that was iterated. 
This uses some heuristics\n # in order to make sure that the same object is used in the\n # for body.\n for child in node.body:\n for subscript in child.nodes_of_class(astroid.Subscript):\n if not isinstance(subscript.value, astroid.Name):\n continue\n if not isinstance(subscript.slice, astroid.Index):\n continue\n if not isinstance(subscript.slice.value, astroid.Name):\n continue\n if subscript.slice.value.name != node.target.name:\n continue\n if iterating_object.name != subscript.value.name:\n continue\n if subscript.value.scope() != node.scope():\n # Ignore this subscript if it's not in the same\n # scope. This means that in the body of the for\n # loop, another scope was created, where the same\n # name for the iterating object was used.\n continue\n self.add_message(\"consider-using-enumerate\", node=node)\n return", "response": "Emit a convention whenever range and len are used for indexing."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_unaryop(self, node):\n if (\n isinstance(node, astroid.UnaryOp)\n and node.op == \"not\"\n and _is_len_call(node.operand)\n ):\n self.add_message(\"len-as-condition\", node=node)", "response": "Check if the node is a not len call."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck if graphviz is available", "response": "def _check_graphviz_available(output_format):\n \"\"\"check if we need graphviz for different output format\"\"\"\n try:\n subprocess.call([\"dot\", \"-V\"], stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n except OSError:\n print(\n \"The output format '%s' is currently not available.\\n\"\n \"Please install 'Graphviz' to have other output formats \"\n \"than 'dot' or 'vcg'.\" % output_format\n )\n sys.exit(32)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking arguments and run project", "response": "def run(self, args):\n \"\"\"checking arguments and run project\"\"\"\n if not 
args:\n print(self.help())\n return 1\n # insert current working directory to the python path to recognize\n # dependencies to local modules even if cwd is not in the PYTHONPATH\n sys.path.insert(0, os.getcwd())\n try:\n project = project_from_files(\n args,\n project_name=self.config.project,\n black_list=self.config.black_list,\n )\n linker = Linker(project, tag=True)\n handler = DiadefsHandler(self.config)\n diadefs = handler.get_diadefs(project, linker)\n finally:\n sys.path.pop(0)\n\n if self.config.output_format == \"vcg\":\n writer.VCGWriter(self.config).write(diadefs)\n else:\n writer.DotWriter(self.config).write(diadefs)\n return 0"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef visit_tryexcept(self, node):\n for handler in node.handlers:\n if handler.type is None:\n continue\n if isinstance(handler.type, astroid.BoolOp):\n continue\n try:\n excs = list(_annotated_unpack_infer(handler.type))\n except astroid.InferenceError:\n continue\n\n handled_in_clause = []\n for part, exc in excs:\n if exc is astroid.Uninferable:\n continue\n if isinstance(exc, astroid.Instance) and utils.inherit_from_std_ex(exc):\n # pylint: disable=protected-access\n exc = exc._proxied\n\n if not isinstance(exc, astroid.ClassDef):\n continue\n\n exc_ancestors = [\n anc for anc in exc.ancestors() if isinstance(anc, astroid.ClassDef)\n ]\n\n for prev_part, prev_exc in handled_in_clause:\n prev_exc_ancestors = [\n anc\n for anc in prev_exc.ancestors()\n if isinstance(anc, astroid.ClassDef)\n ]\n if exc == prev_exc:\n self.add_message(\n \"overlapping-except\",\n node=handler.type,\n args=\"%s and %s are the same\"\n % (prev_part.as_string(), part.as_string()),\n )\n elif prev_exc in exc_ancestors or exc in prev_exc_ancestors:\n ancestor = part if exc in prev_exc_ancestors else prev_part\n descendant = part if prev_exc in exc_ancestors else prev_part\n self.add_message(\n \"overlapping-except\",\n node=handler.type,\n 
args=\"%s is an ancestor class of %s\"\n                            % (ancestor.as_string(), descendant.as_string()),\n                        )\n                handled_in_clause += [(part, exc)]", "response": "check for overlapping exception handlers"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef write(self, diadefs):\n        for diagram in diadefs:\n            basename = diagram.title.strip().replace(\" \", \"_\")\n            file_name = \"%s.%s\" % (basename, self.config.output_format)\n            self.set_printer(file_name, basename)\n            if diagram.TYPE == \"class\":\n                self.write_classes(diagram)\n            else:\n                self.write_packages(diagram)\n        self.close_graph()", "response": "write files for the current project according to diadefs"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write_packages(self, diagram):\n        # sorted to get predictable (hence testable) results\n        for i, obj in enumerate(sorted(diagram.modules(), key=lambda x: x.title)):\n            self.printer.emit_node(i, label=self.get_title(obj), shape=\"box\")\n            obj.fig_id = i\n        # package dependencies\n        for rel in diagram.get_relationships(\"depends\"):\n            self.printer.emit_edge(\n                rel.from_object.fig_id, rel.to_object.fig_id, **self.pkg_edges\n            )", "response": "write a package diagram"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef write_classes(self, diagram):\n        # sorted to get predictable (hence testable) results\n        for i, obj in enumerate(sorted(diagram.objects, key=lambda x: x.title)):\n            self.printer.emit_node(i, **self.get_values(obj))\n            obj.fig_id = i\n        # inheritance links\n        for rel in diagram.get_relationships(\"specialization\"):\n            self.printer.emit_edge(\n                rel.from_object.fig_id, rel.to_object.fig_id, **self.inh_edges\n            )\n        # implementation links\n        for rel in diagram.get_relationships(\"implements\"):\n            self.printer.emit_edge(\n                rel.from_object.fig_id, rel.to_object.fig_id, **self.imp_edges\n            )\n        # generate 
associations\n for rel in diagram.get_relationships(\"association\"):\n self.printer.emit_edge(\n rel.from_object.fig_id,\n rel.to_object.fig_id,\n label=rel.name,\n **self.association_edges\n )", "response": "write a class diagram"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef set_printer(self, file_name, basename):\n layout = dict(rankdir=\"BT\")\n self.printer = DotBackend(basename, additional_param=layout)\n self.file_name = file_name", "response": "initialize DotWriter and add options for layout.\n "} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget label and shape for classes. is the name of the class it contains all attributes and methods.", "response": "def get_values(self, obj):\n \"\"\"get label and shape for classes.\n\n The label contains all attributes and methods\n \"\"\"\n label = obj.title\n if obj.shape == \"interface\":\n label = \"\u00abinterface\u00bb\\\\n%s\" % label\n if not self.config.only_classnames:\n label = r\"%s|%s\\l|\" % (label, r\"\\l\".join(obj.attrs))\n for func in obj.methods:\n args = [arg.name for arg in func.args.args if arg.name != \"self\"]\n label = r\"%s%s(%s)\\l\" % (label, func.name, \", \".join(args))\n label = \"{%s}\" % label\n if is_exception(obj.node):\n return dict(fontcolor=\"red\", label=label, shape=\"record\")\n return dict(label=label, shape=\"record\")"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef set_printer(self, file_name, basename):\n self.graph_file = open(file_name, \"w+\")\n self.printer = VCGPrinter(self.graph_file)\n self.printer.open_graph(\n title=basename,\n layoutalgorithm=\"dfs\",\n late_edge_labels=\"yes\",\n port_sharing=\"no\",\n manhattan_edges=\"yes\",\n )\n self.printer.emit_node = self.printer.node\n self.printer.emit_edge = self.printer.edge", "response": "initialize VCGWriter for a UML graph"} {"SOURCE": "codesearchnet", 
"instruction": "Given the following Python 3 function, write the documentation\ndef get_values(self, obj):\n if is_exception(obj.node):\n label = r\"\\fb\\f09%s\\fn\" % obj.title\n else:\n label = r\"\\fb%s\\fn\" % obj.title\n if obj.shape == \"interface\":\n shape = \"ellipse\"\n else:\n shape = \"box\"\n if not self.config.only_classnames:\n attrs = obj.attrs\n methods = [func.name for func in obj.methods]\n # box width for UML like diagram\n maxlen = max(len(name) for name in [obj.title] + methods + attrs)\n line = \"_\" * (maxlen + 2)\n label = r\"%s\\n\\f%s\" % (label, line)\n for attr in attrs:\n label = r\"%s\\n\\f08%s\" % (label, attr)\n if attrs:\n label = r\"%s\\n\\f%s\" % (label, line)\n for func in methods:\n label = r\"%s\\n\\f10%s()\" % (label, func)\n return dict(label=label, shape=shape)", "response": "get label and shape for classes.\n "} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn True if the message may be emitted using the current interpreter", "response": "def may_be_emitted(self):\n \"\"\"return True if message may be emitted using the current interpreter\"\"\"\n if self.minversion is not None and self.minversion > sys.version_info:\n return False\n if self.maxversion is not None and self.maxversion <= sys.version_info:\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef format_help(self, checkerref=False):\n desc = self.descr\n if checkerref:\n desc += \" This message belongs to the %s checker.\" % self.checker.name\n title = self.msg\n if self.symbol:\n msgid = \"%s (%s)\" % (self.symbol, self.msgid)\n else:\n msgid = self.msgid\n if self.minversion or self.maxversion:\n restr = []\n if self.minversion:\n restr.append(\"< %s\" % \".\".join([str(n) for n in self.minversion]))\n if self.maxversion:\n restr.append(\">= %s\" % \".\".join([str(n) for n in self.maxversion]))\n restr = \" or \".join(restr)\n if 
checkerref:\n desc += \" It can't be emitted when using Python %s.\" % restr\n else:\n desc += \" This message can't be emitted when using Python %s.\" % restr\n desc = normalize_text(\" \".join(desc.split()), indent=\" \")\n if title != \"%s\":\n title = title.splitlines()[0]\n\n return \":%s: *%s*\\n%s\" % (msgid, title.rstrip(\" \"), desc)\n return \":%s:\\n%s\" % (msgid, desc)", "response": "return the help string for the given message id"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nprocess a module s content is accessible via node. stream", "response": "def process_module(self, node):\n \"\"\"process a module\n\n the module's content is accessible via node.stream() function\n \"\"\"\n with node.stream() as stream:\n for (lineno, line) in enumerate(stream):\n if line.rstrip().endswith(\"\\\\\"):\n self.add_message(\"backslash-line-continuation\", line=lineno)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nextracting the environment PYTHONPATH and appends the current sys. 
path to those.", "response": "def _get_env():\n \"\"\"Extracts the environment PYTHONPATH and appends the current sys.path to\n those.\"\"\"\n env = dict(os.environ)\n env[\"PYTHONPATH\"] = os.pathsep.join(sys.path)\n return env"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef lint(filename, options=()):\n # traverse downwards until we are out of a python package\n full_path = osp.abspath(filename)\n parent_path = osp.dirname(full_path)\n child_path = osp.basename(full_path)\n\n while parent_path != \"/\" and osp.exists(osp.join(parent_path, \"__init__.py\")):\n child_path = osp.join(osp.basename(parent_path), child_path)\n parent_path = osp.dirname(parent_path)\n\n # Start pylint\n # Ensure we use the python and pylint associated with the running epylint\n run_cmd = \"import sys; from pylint.lint import Run; Run(sys.argv[1:])\"\n cmd = (\n [sys.executable, \"-c\", run_cmd]\n + [\n \"--msg-template\",\n \"{path}:{line}: {category} ({msg_id}, {symbol}, {obj}) {msg}\",\n \"-r\",\n \"n\",\n child_path,\n ]\n + list(options)\n )\n process = Popen(\n cmd, stdout=PIPE, cwd=parent_path, env=_get_env(), universal_newlines=True\n )\n\n for line in process.stdout:\n # remove pylintrc warning\n if line.startswith(\"No config file found\"):\n continue\n\n # modify the file name thats output to reverse the path traversal we made\n parts = line.split(\":\")\n if parts and parts[0] == child_path:\n line = \":\".join([filename] + parts[1:])\n print(line, end=\" \")\n\n process.wait()\n return process.returncode", "response": "Pylint the given file and return a new Emacs object."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef py_run(command_options=\"\", return_std=False, stdout=None, stderr=None):\n # Detect if we use Python as executable or not, else default to `python`\n executable = sys.executable if \"python\" in sys.executable else \"python\"\n\n # Create 
command line to call pylint\n epylint_part = [executable, \"-c\", \"from pylint import epylint;epylint.Run()\"]\n options = shlex.split(command_options, posix=not sys.platform.startswith(\"win\"))\n cli = epylint_part + options\n\n # Providing standard output and/or error if not set\n if stdout is None:\n if return_std:\n stdout = PIPE\n else:\n stdout = sys.stdout\n if stderr is None:\n if return_std:\n stderr = PIPE\n else:\n stderr = sys.stderr\n # Call pylint in a subprocess\n process = Popen(\n cli,\n shell=False,\n stdout=stdout,\n stderr=stderr,\n env=_get_env(),\n universal_newlines=True,\n )\n proc_stdout, proc_stderr = process.communicate()\n # Return standard output and error\n if return_std:\n return StringIO(proc_stdout), StringIO(proc_stderr)\n return None", "response": "Run pylint from python\n "} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ntransforms filename into storedir basename and target.", "response": "def target_info_from_filename(filename):\n \"\"\"Transforms /some/path/foo.png into ('/some/path', 'foo.png', 'png').\"\"\"\n basename = osp.basename(filename)\n storedir = osp.dirname(osp.abspath(filename))\n target = filename.split(\".\")[-1]\n return storedir, basename, target"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngives a dictionary representing an ordered graph return a list of detected cycles", "response": "def get_cycles(graph_dict, vertices=None):\n \"\"\"given a dictionary representing an ordered graph (i.e. 
key are vertices\n and values is a list of destination vertices representing edges), return a\n list of detected cycles\n \"\"\"\n if not graph_dict:\n return ()\n result = []\n if vertices is None:\n vertices = graph_dict.keys()\n for vertice in vertices:\n _get_cycles(graph_dict, [], set(), result, vertice)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngenerating a graph file.", "response": "def generate(self, outputfile=None, dotfile=None, mapfile=None):\n \"\"\"Generates a graph file.\n\n :param str outputfile: filename and path [defaults to graphname.png]\n :param str dotfile: filename and path [defaults to graphname.dot]\n :param str mapfile: filename and path\n\n :rtype: str\n :return: a path to the generated file\n \"\"\"\n import subprocess # introduced in py 2.4\n\n name = self.graphname\n if not dotfile:\n # if 'outputfile' is a dot file use it as 'dotfile'\n if outputfile and outputfile.endswith(\".dot\"):\n dotfile = outputfile\n else:\n dotfile = \"%s.dot\" % name\n if outputfile is not None:\n storedir, _, target = target_info_from_filename(outputfile)\n if target != \"dot\":\n pdot, dot_sourcepath = tempfile.mkstemp(\".dot\", name)\n os.close(pdot)\n else:\n dot_sourcepath = osp.join(storedir, dotfile)\n else:\n target = \"png\"\n pdot, dot_sourcepath = tempfile.mkstemp(\".dot\", name)\n ppng, outputfile = tempfile.mkstemp(\".png\", name)\n os.close(pdot)\n os.close(ppng)\n pdot = codecs.open(dot_sourcepath, \"w\", encoding=\"utf8\")\n pdot.write(self.source)\n pdot.close()\n if target != \"dot\":\n use_shell = sys.platform == \"win32\"\n if mapfile:\n subprocess.call(\n [\n self.renderer,\n \"-Tcmapx\",\n \"-o\",\n mapfile,\n \"-T\",\n target,\n dot_sourcepath,\n \"-o\",\n outputfile,\n ],\n shell=use_shell,\n )\n else:\n subprocess.call(\n [self.renderer, \"-T\", target, dot_sourcepath, \"-o\", outputfile],\n shell=use_shell,\n )\n os.unlink(dot_sourcepath)\n return outputfile"} {"SOURCE": 
"codesearchnet", "instruction": "Create a Python 3 function to\nemit an edge from name1 to name2.", "response": "def emit_edge(self, name1, name2, **props):\n \"\"\"emit an edge from to .\n edge properties: see http://www.graphviz.org/doc/info/attrs.html\n \"\"\"\n attrs = ['%s=\"%s\"' % (prop, value) for prop, value in props.items()]\n n_from, n_to = normalize_node_id(name1), normalize_node_id(name2)\n self.emit(\"%s -> %s [%s];\" % (n_from, n_to, \", \".join(sorted(attrs))))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef emit_node(self, name, **props):\n attrs = ['%s=\"%s\"' % (prop, value) for prop, value in props.items()]\n self.emit(\"%s [%s];\" % (normalize_node_id(name), \", \".join(sorted(attrs))))", "response": "emit a node with given properties."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreport raw stats to the table", "response": "def report_raw_stats(sect, stats, _):\n \"\"\"calculate percentage of code / doc / comment / empty\n \"\"\"\n total_lines = stats[\"total_lines\"]\n if not total_lines:\n raise EmptyReportError()\n sect.description = \"%s lines have been analyzed\" % total_lines\n lines = (\"type\", \"number\", \"%\", \"previous\", \"difference\")\n for node_type in (\"code\", \"docstring\", \"comment\", \"empty\"):\n key = node_type + \"_lines\"\n total = stats[key]\n percent = float(total * 100) / total_lines\n lines += (node_type, str(total), \"%.2f\" % percent, \"NC\", \"NC\")\n sect.append(Table(children=lines, cols=5, rheaders=1))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn the line type", "response": "def get_type(tokens, start_index):\n \"\"\"return the line type : docstring, comment, code, empty\"\"\"\n i = start_index\n tok_type = tokens[i][0]\n start = tokens[i][2]\n pos = start\n line_type = None\n while i < len(tokens) and tokens[i][2][0] == start[0]:\n tok_type = tokens[i][0]\n pos = tokens[i][3]\n if 
line_type is None:\n if tok_type == tokenize.STRING:\n line_type = \"docstring_lines\"\n elif tok_type == tokenize.COMMENT:\n line_type = \"comment_lines\"\n elif tok_type in JUNK:\n pass\n else:\n line_type = \"code_lines\"\n i += 1\n if line_type is None:\n line_type = \"empty_lines\"\n elif i < len(tokens) and tokens[i][0] == tokenize.NEWLINE:\n i += 1\n return i, pos[0] - start[0] + 1, line_type"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nformatting an options section using as ReST formatted output", "response": "def _rest_format_section(stream, section, options, doc=None):\n \"\"\"format an options section using as ReST formatted output\"\"\"\n if section:\n print(\"%s\\n%s\" % (section, \"'\" * len(section)), file=stream)\n if doc:\n print(normalize_text(doc, line_len=79, indent=\"\"), file=stream)\n print(file=stream)\n for optname, optdict, value in options:\n help_opt = optdict.get(\"help\")\n print(\":%s:\" % optname, file=stream)\n if help_opt:\n help_opt = normalize_text(help_opt, line_len=79, indent=\" \")\n print(help_opt, file=stream)\n if value:\n value = str(_format_option_value(optdict, value))\n print(file=stream)\n print(\" Default: ``%s``\" % value.replace(\"`` \", \"```` ``\"), file=stream)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _register_by_id_managed_msg(self, msgid, line, is_disabled=True):\n try:\n message_definitions = self.msgs_store.get_message_definitions(msgid)\n for message_definition in message_definitions:\n if msgid == message_definition.msgid:\n MessagesHandlerMixIn.__by_id_managed_msgs.append(\n (\n self.current_name,\n message_definition.msgid,\n message_definition.symbol,\n line,\n is_disabled,\n )\n )\n except UnknownMessageError:\n pass", "response": "Register the message with the user if the msgid is a numeric one then inform the user that it could furnish instead a symbolic msgid."} {"SOURCE": "codesearchnet", 
"instruction": "Can you tell what is the following Python 3 function doing\ndef disable(self, msgid, scope=\"package\", line=None, ignore_unknown=False):\n self._set_msg_status(\n msgid, enable=False, scope=scope, line=line, ignore_unknown=ignore_unknown\n )\n self._register_by_id_managed_msg(msgid, line)", "response": "disable a managed message"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget the message symbol of the given message id", "response": "def _message_symbol(self, msgid):\n \"\"\"Get the message symbol of the given message id\n\n Return the original message id if the message does not\n exist.\n \"\"\"\n try:\n return [md.symbol for md in self.msgs_store.get_message_definitions(msgid)]\n except UnknownMessageError:\n return msgid"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_message_state_scope(self, msgid, line=None, confidence=UNDEFINED):\n if self.config.confidence and confidence.name not in self.config.confidence:\n return MSG_STATE_CONFIDENCE\n try:\n if line in self.file_state._module_msgs_state[msgid]:\n return MSG_STATE_SCOPE_MODULE\n except (KeyError, TypeError):\n return MSG_STATE_SCOPE_CONFIG\n return None", "response": "Returns the scope at which a message was enabled or disabled."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning true if the message is enabled for the given message id", "response": "def is_message_enabled(self, msg_descr, line=None, confidence=None):\n \"\"\"return true if the message associated to the given message id is\n enabled\n\n msgid may be either a numeric or symbolic message id.\n \"\"\"\n if self.config.confidence and confidence:\n if confidence.name not in self.config.confidence:\n return False\n try:\n message_definitions = self.msgs_store.get_message_definitions(msg_descr)\n msgids = [md.msgid for md in message_definitions]\n except 
UnknownMessageError:\n # The linter checks for messages that are not registered\n # due to version mismatch, just treat them as message IDs\n # for now.\n msgids = [msg_descr]\n for msgid in msgids:\n if self.is_one_message_enabled(msgid, line):\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding a message given by ID or name.", "response": "def add_message(\n self,\n msg_descr,\n line=None,\n node=None,\n args=None,\n confidence=UNDEFINED,\n col_offset=None,\n ):\n \"\"\"Adds a message given by ID or name.\n\n If provided, the message string is expanded using args.\n\n AST checkers must provide the node argument (but may optionally\n provide line if the line number is different), raw and token checkers\n must provide the line argument.\n \"\"\"\n message_definitions = self.msgs_store.get_message_definitions(msg_descr)\n for message_definition in message_definitions:\n self.add_one_message(\n message_definition, line, node, args, confidence, col_offset\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef print_full_documentation(self, stream=None):\n if not stream:\n stream = sys.stdout\n\n print(\"Pylint global options and switches\", file=stream)\n print(\"----------------------------------\", file=stream)\n print(\"\", file=stream)\n print(\"Pylint provides global options and switches.\", file=stream)\n print(\"\", file=stream)\n\n by_checker = {}\n for checker in self.get_checkers():\n if checker.name == \"master\":\n if checker.options:\n for section, options in checker.options_by_section():\n if section is None:\n title = \"General options\"\n else:\n title = \"%s options\" % section.capitalize()\n print(title, file=stream)\n print(\"~\" * len(title), file=stream)\n _rest_format_section(stream, None, options)\n print(\"\", file=stream)\n else:\n name = checker.name\n try:\n by_checker[name][\"options\"] += checker.options_and_values()\n 
by_checker[name][\"msgs\"].update(checker.msgs)\n by_checker[name][\"reports\"] += checker.reports\n except KeyError:\n by_checker[name] = {\n \"options\": list(checker.options_and_values()),\n \"msgs\": dict(checker.msgs),\n \"reports\": list(checker.reports),\n }\n\n print(\"Pylint checkers' options and switches\", file=stream)\n print(\"-------------------------------------\", file=stream)\n print(\"\", file=stream)\n print(\"Pylint checkers can provide three set of features:\", file=stream)\n print(\"\", file=stream)\n print(\"* options that control their execution,\", file=stream)\n print(\"* messages that they can raise,\", file=stream)\n print(\"* reports that they can generate.\", file=stream)\n print(\"\", file=stream)\n print(\"Below is a list of all checkers and their features.\", file=stream)\n print(\"\", file=stream)\n\n for checker, info in sorted(by_checker.items()):\n self._print_checker_doc(checker, info, stream=stream)", "response": "output a full documentation in ReST format"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning the length of the indentation on the given token s line.", "response": "def _get_indent_length(line):\n \"\"\"Return the length of the indentation on the given token's line.\"\"\"\n result = 0\n for char in line:\n if char == \" \":\n result += 1\n elif char == \"\\t\":\n result += _TAB_LENGTH\n else:\n break\n return result"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning a line with |s for each of the positions in the given lists.", "response": "def _get_indent_hint_line(bar_positions, bad_position):\n \"\"\"Return a line with |s for each of the positions in the given lists.\"\"\"\n if not bar_positions:\n return (\"\", \"\")\n # TODO tabs should not be replaced by some random (8) number of spaces\n bar_positions = [_get_indent_length(indent) for indent in bar_positions]\n bad_position = _get_indent_length(bad_position)\n delta_message = \"\"\n 
markers = [(pos, \"|\") for pos in bar_positions]\n if len(markers) == 1:\n # if we have only one marker we'll provide an extra hint on how to fix\n expected_position = markers[0][0]\n delta = abs(expected_position - bad_position)\n direction = \"add\" if expected_position > bad_position else \"remove\"\n delta_message = _CONTINUATION_HINT_MESSAGE % (\n direction,\n delta,\n \"s\" if delta > 1 else \"\",\n )\n markers.append((bad_position, \"^\"))\n markers.sort()\n line = [\" \"] * (markers[-1][0] + 1)\n for position, marker in markers:\n line[position] = marker\n return (\"\".join(line), delta_message)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns an indentation string for hanging indentation for the given token.", "response": "def token_indent(self, idx):\n \"\"\"Get an indentation string for hanging indentation, consisting of the line-indent plus\n a number of spaces to fill up to the column of this token.\n\n e.g. the token indent for foo\n in \"print(foo)\"\n is \" \"\n \"\"\"\n line_indent = self.line_indent(idx)\n return line_indent + \" \" * (self.start_col(idx) - len(line_indent))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef handle_line_start(self, pos):\n if self._line_start > -1:\n return\n\n check_token_position = pos\n if self._tokens.token(pos) == _ASYNC_TOKEN:\n check_token_position += 1\n self._is_block_opener = (\n self._tokens.token(check_token_position) in _CONTINUATION_BLOCK_OPENERS\n )\n self._line_start = pos", "response": "Record the first non - junk token at the start of a line."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the valid offsets for the token at the given position.", "response": "def get_valid_indentations(self, idx):\n \"\"\"Returns the valid offsets for the token at the given position.\"\"\"\n # The closing brace on a dict or the 'for' in a dict comprehension may\n # 
reset two indent levels because the dict value is ended implicitly\n stack_top = -1\n if (\n self._tokens.token(idx) in (\"}\", \"for\")\n and self._cont_stack[-1].token == \":\"\n ):\n stack_top = -2\n indent = self._cont_stack[stack_top]\n if self._tokens.token(idx) in _CLOSING_BRACKETS:\n valid_indentations = indent.valid_outdent_strings\n else:\n valid_indentations = indent.valid_continuation_strings\n return indent, valid_indentations.copy()"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _hanging_indent_after_bracket(self, bracket, position):\n indentation = self._tokens.line_indent(position)\n if (\n self._is_block_opener\n and self._continuation_string == self._block_indent_string\n ):\n return _ContinuedIndent(\n HANGING_BLOCK,\n bracket,\n position,\n _Indentations(indentation + self._continuation_string, indentation),\n _BeforeBlockIndentations(\n indentation + self._continuation_string,\n indentation + self._continuation_string * 2,\n ),\n )\n if bracket == \":\":\n # If the dict key was on the same line as the open brace, the new\n # correct indent should be relative to the key instead of the\n # current indent level\n paren_align = self._cont_stack[-1].valid_outdent_strings\n next_align = self._cont_stack[-1].valid_continuation_strings.copy()\n next_align_keys = list(next_align.keys())\n next_align[next_align_keys[0] + self._continuation_string] = True\n # Note that the continuation of\n # d = {\n # 'a': 'b'\n # 'c'\n # }\n # is handled by the special-casing for hanging continued string indents.\n return _ContinuedIndent(\n HANGING_DICT_VALUE, bracket, position, paren_align, next_align\n )\n return _ContinuedIndent(\n HANGING,\n bracket,\n position,\n _Indentations(indentation, indentation + self._continuation_string),\n _Indentations(indentation + self._continuation_string),\n )", "response": "Extracts indentation information for a hanging indent after a bracket."} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _continuation_inside_bracket(self, bracket, position):\n indentation = self._tokens.line_indent(position)\n token_indent = self._tokens.token_indent(position)\n next_token_indent = self._tokens.token_indent(position + 1)\n if (\n self._is_block_opener\n and next_token_indent == indentation + self._block_indent_string\n ):\n return _ContinuedIndent(\n CONTINUED_BLOCK,\n bracket,\n position,\n _Indentations(token_indent),\n _BeforeBlockIndentations(\n next_token_indent, next_token_indent + self._continuation_string\n ),\n )\n return _ContinuedIndent(\n CONTINUED,\n bracket,\n position,\n _Indentations(token_indent, next_token_indent),\n _Indentations(next_token_indent),\n )", "response": "Extracts indentation information for a continued indent."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\npush a new token onto the stack.", "response": "def push_token(self, token, position):\n \"\"\"Pushes a new token for continued indentation on the stack.\n\n Tokens that can modify continued indentation offsets are:\n * opening brackets\n * 'lambda'\n * : inside dictionaries\n\n push_token relies on the caller to filter out those\n interesting tokens.\n\n :param int token: The concrete token\n :param int position: The position of the token in the stream.\n \"\"\"\n if _token_followed_by_eol(self._tokens, position):\n self._cont_stack.append(self._hanging_indent_after_bracket(token, position))\n else:\n self._cont_stack.append(self._continuation_inside_bracket(token, position))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nprocess a new line if necessary", "response": "def new_line(self, tokens, line_end, line_start):\n \"\"\"a new line has been encountered, process it if necessary\"\"\"\n if _last_token_on_line_is(tokens, line_end, \";\"):\n self.add_message(\"unnecessary-semicolon\", 
line=tokens.start_line(line_end))\n\n line_num = tokens.start_line(line_start)\n line = tokens.line(line_start)\n if tokens.type(line_start) not in _JUNK_TOKENS:\n self._lines[line_num] = line.split(\"\\n\")[0]\n self.check_lines(line, line_num)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks that there are not unnecessary parens after a keyword.", "response": "def _check_keyword_parentheses(self, tokens, start):\n \"\"\"Check that there are not unnecessary parens after a keyword.\n\n Parens are unnecessary if there is exactly one balanced outer pair on a\n line, and it is followed by a colon, and contains no commas (i.e. is not a\n tuple).\n\n Args:\n tokens: list of Tokens; the entire list of Tokens.\n start: int; the position of the keyword in the token list.\n \"\"\"\n # If the next token is not a paren, we're fine.\n if self._inside_brackets(\":\") and tokens[start][1] == \"for\":\n self._pop_token()\n if tokens[start + 1][1] != \"(\":\n return\n\n found_and_or = False\n depth = 0\n keyword_token = str(tokens[start][1])\n line_num = tokens[start][2][0]\n\n for i in range(start, len(tokens) - 1):\n token = tokens[i]\n\n # If we hit a newline, then assume any parens were for continuation.\n if token[0] == tokenize.NL:\n return\n\n if token[1] == \"(\":\n depth += 1\n elif token[1] == \")\":\n depth -= 1\n if depth:\n continue\n # ')' can't happen after if (foo), since it would be a syntax error.\n if tokens[i + 1][1] in (\":\", \")\", \"]\", \"}\", \"in\") or tokens[i + 1][\n 0\n ] in (tokenize.NEWLINE, tokenize.ENDMARKER, tokenize.COMMENT):\n # The empty tuple () is always accepted.\n if i == start + 2:\n return\n if keyword_token == \"not\":\n if not found_and_or:\n self.add_message(\n \"superfluous-parens\", line=line_num, args=keyword_token\n )\n elif keyword_token in (\"return\", \"yield\"):\n self.add_message(\n \"superfluous-parens\", line=line_num, args=keyword_token\n )\n elif keyword_token not in 
self._keywords_with_parens:\n if not found_and_or:\n self.add_message(\n \"superfluous-parens\", line=line_num, args=keyword_token\n )\n return\n elif depth == 1:\n # This is a tuple, which is always acceptable.\n if token[1] == \",\":\n return\n # 'and' and 'or' are the only boolean operators with lower precedence\n # than 'not', so parens are only required when they are found.\n if token[1] in (\"and\", \"or\"):\n found_and_or = True\n # A yield inside an expression must always be in parentheses,\n # quit early without error.\n elif token[1] == \"yield\":\n return\n # A generator expression always has a 'for' token in it, and\n # the 'for' token is only legal inside parens when it is in a\n # generator expression. The parens are necessary here, so bail\n # without an error.\n elif token[1] == \"for\":\n return"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nextending check of PEP - 484 type hint presence", "response": "def _has_valid_type_annotation(self, tokens, i):\n \"\"\"Extended check of PEP-484 type hint presence\"\"\"\n if not self._inside_brackets(\"(\"):\n return False\n # token_info\n # type string start end line\n # 0 1 2 3 4\n bracket_level = 0\n for token in tokens[i - 1 :: -1]:\n if token[1] == \":\":\n return True\n if token[1] == \"(\":\n return False\n if token[1] == \"]\":\n bracket_level += 1\n elif token[1] == \"[\":\n bracket_level -= 1\n elif token[1] == \",\":\n if not bracket_level:\n return False\n elif token[1] in (\".\", \"...\"):\n continue\n elif token[0] not in (tokenize.NAME, tokenize.STRING, tokenize.NL):\n return False\n return False"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck the spacing of a single equals sign.", "response": "def _check_equals_spacing(self, tokens, i):\n \"\"\"Check the spacing of a single equals sign.\"\"\"\n if self._has_valid_type_annotation(tokens, i):\n self._check_space(tokens, i, (_MUST, _MUST))\n elif self._inside_brackets(\"(\") or 
self._inside_brackets(\"lambda\"):\n self._check_space(tokens, i, (_MUST_NOT, _MUST_NOT))\n else:\n self._check_space(tokens, i, (_MUST, _MUST))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck that a binary operator is surrounded by exactly one space.", "response": "def _check_surrounded_by_space(self, tokens, i):\n \"\"\"Check that a binary operator is surrounded by exactly one space.\"\"\"\n self._check_space(tokens, i, (_MUST, _MUST))"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nprocesses tokens and searches for bad indentation, overlong lines and bad constructs.", "response": "def process_tokens(self, tokens):\n \"\"\"process tokens and search for :\n\n _ non strict indentation (i.e. not always using the <indent> parameter as\n indent unit)\n _ too long lines (i.e. longer than <max_chars>)\n _ optionally bad construct (if given, bad_construct must be a compiled\n regular expression).\n \"\"\"\n self._bracket_stack = [None]\n indents = [0]\n check_equal = False\n line_num = 0\n self._lines = {}\n self._visited_lines = {}\n token_handlers = self._prepare_token_dispatcher()\n self._last_line_ending = None\n last_blank_line_num = 0\n\n self._current_line = ContinuedLineState(tokens, self.config)\n for idx, (tok_type, token, start, _, line) in enumerate(tokens):\n if start[0] != line_num:\n line_num = start[0]\n # A tokenizer oddity: if an indented line contains a multi-line\n # docstring, the line member of the INDENT token does not contain\n # the full line; therefore we check the next token on the line.\n if tok_type == tokenize.INDENT:\n self.new_line(TokenWrapper(tokens), idx - 1, idx + 1)\n else:\n self.new_line(TokenWrapper(tokens), idx - 1, idx)\n\n if tok_type == tokenize.NEWLINE:\n # a program statement, or ENDMARKER, will eventually follow,\n # after some (possibly empty) run of tokens of the form\n # (NL | COMMENT)* (INDENT | DEDENT+)?\n # If an INDENT appears, setting check_equal is wrong, and will\n # be undone 
when we see the INDENT.\n check_equal = True\n self._process_retained_warnings(TokenWrapper(tokens), idx)\n self._current_line.next_logical_line()\n self._check_line_ending(token, line_num)\n elif tok_type == tokenize.INDENT:\n check_equal = False\n self.check_indent_level(token, indents[-1] + 1, line_num)\n indents.append(indents[-1] + 1)\n elif tok_type == tokenize.DEDENT:\n # there's nothing we need to check here! what's important is\n # that when the run of DEDENTs ends, the indentation of the\n # program statement (or ENDMARKER) that triggered the run is\n # equal to what's left at the top of the indents stack\n check_equal = True\n if len(indents) > 1:\n del indents[-1]\n elif tok_type == tokenize.NL:\n if not line.strip(\"\\r\\n\"):\n last_blank_line_num = line_num\n self._check_continued_indentation(TokenWrapper(tokens), idx + 1)\n self._current_line.next_physical_line()\n elif tok_type not in (tokenize.COMMENT, tokenize.ENCODING):\n self._current_line.handle_line_start(idx)\n # This is the first concrete token following a NEWLINE, so it\n # must be the first token of the next program statement, or an\n # ENDMARKER; the \"line\" argument exposes the leading whitespace\n # for this statement; in the case of ENDMARKER, line is an empty\n # string, so will properly match the empty string with which the\n # \"indents\" stack was seeded\n if check_equal:\n check_equal = False\n self.check_indent_level(line, indents[-1], line_num)\n\n if tok_type == tokenize.NUMBER and token.endswith(\"l\"):\n self.add_message(\"lowercase-l-suffix\", line=line_num)\n\n try:\n handler = token_handlers[token]\n except KeyError:\n pass\n else:\n handler(tokens, idx)\n\n line_num -= 1 # to be ok with \"wc -l\"\n if line_num > self.config.max_module_lines:\n # Get the line where the too-many-lines (or its message id)\n # was disabled or default to 1.\n message_definition = self.linter.msgs_store.get_message_definitions(\n \"too-many-lines\"\n )[0]\n names = (message_definition.msgid, 
\"too-many-lines\")\n line = next(filter(None, map(self.linter._pragma_lineno.get, names)), 1)\n self.add_message(\n \"too-many-lines\",\n args=(line_num, self.config.max_module_lines),\n line=line,\n )\n\n # See if there are any trailing lines. Do not complain about empty\n # files like __init__.py markers.\n if line_num == last_blank_line_num and line_num > 0:\n self.add_message(\"trailing-newlines\", line=line_num)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef visit_default(self, node):\n if not node.is_statement:\n return\n if not node.root().pure_python:\n return # XXX block visit of child nodes\n prev_sibl = node.previous_sibling()\n if prev_sibl is not None:\n prev_line = prev_sibl.fromlineno\n else:\n # The line on which a finally: occurs in a try/finally\n # is not directly represented in the AST. We infer it\n # by taking the last line of the body and adding 1, which\n # should be the line of finally:\n if (\n isinstance(node.parent, nodes.TryFinally)\n and node in node.parent.finalbody\n ):\n prev_line = node.parent.body[0].tolineno + 1\n else:\n prev_line = node.parent.statement().fromlineno\n line = node.fromlineno\n assert line, node\n if prev_line == line and self._visited_lines.get(line) != 2:\n self._check_multi_statement_line(node, line)\n return\n if line in self._visited_lines:\n return\n try:\n tolineno = node.blockstart_tolineno\n except AttributeError:\n tolineno = node.tolineno\n assert tolineno, node\n lines = []\n for line in range(line, tolineno + 1):\n self._visited_lines[line] = 1\n try:\n lines.append(self._lines[line].rstrip())\n except KeyError:\n lines.append(\"\")", "response": "check the line number and check it if not yet done"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck for lines containing multiple statements.", "response": "def _check_multi_statement_line(self, node, line):\n \"\"\"Check for lines containing multiple 
statements.\"\"\"\n # Do not warn about multiple nested context managers\n # in with statements.\n if isinstance(node, nodes.With):\n return\n # For try... except... finally..., the two nodes\n # appear to be on the same line due to how the AST is built.\n if isinstance(node, nodes.TryExcept) and isinstance(\n node.parent, nodes.TryFinally\n ):\n return\n if (\n isinstance(node.parent, nodes.If)\n and not node.parent.orelse\n and self.config.single_line_if_stmt\n ):\n return\n if (\n isinstance(node.parent, nodes.ClassDef)\n and len(node.parent.body) == 1\n and self.config.single_line_class_stmt\n ):\n return\n self.add_message(\"multiple-statements\", node=node)\n self._visited_lines[line] = 2"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks lines have less than a maximum number of characters", "response": "def check_lines(self, lines, i):\n \"\"\"check lines have less than a maximum number of characters\n \"\"\"\n max_chars = self.config.max_line_length\n ignore_long_line = self.config.ignore_long_lines\n\n def check_line(line, i):\n if not line.endswith(\"\\n\"):\n self.add_message(\"missing-final-newline\", line=i)\n else:\n # exclude \\f (formfeed) from the rstrip\n stripped_line = line.rstrip(\"\\t\\n\\r\\v \")\n if not stripped_line and _EMPTY_LINE in self.config.no_space_check:\n # allow empty lines\n pass\n elif line[len(stripped_line) :] not in (\"\\n\", \"\\r\\n\"):\n self.add_message(\n \"trailing-whitespace\", line=i, col_offset=len(stripped_line)\n )\n # Don't count excess whitespace in the line length.\n line = stripped_line\n mobj = OPTION_RGX.search(line)\n if mobj and \"=\" in line:\n front_of_equal, _, back_of_equal = mobj.group(1).partition(\"=\")\n if front_of_equal.strip() == \"disable\":\n if \"line-too-long\" in {\n _msg_id.strip() for _msg_id in back_of_equal.split(\",\")\n }:\n return None\n line = line.rsplit(\"#\", 1)[0].rstrip()\n\n if len(line) > max_chars and not 
ignore_long_line.search(line):\n self.add_message(\"line-too-long\", line=i, args=(len(line), max_chars))\n return i + 1\n\n unsplit_ends = {\n \"\\v\",\n \"\\x0b\",\n \"\\f\",\n \"\\x0c\",\n \"\\x1c\",\n \"\\x1d\",\n \"\\x1e\",\n \"\\x85\",\n \"\\u2028\",\n \"\\u2029\",\n }\n unsplit = []\n for line in lines.splitlines(True):\n if line[-1] in unsplit_ends:\n unsplit.append(line)\n continue\n\n if unsplit:\n unsplit.append(line)\n line = \"\".join(unsplit)\n unsplit = []\n\n i = check_line(line, i)\n if i is None:\n break\n\n if unsplit:\n check_line(\"\".join(unsplit), i)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck the indent level of the string", "response": "def check_indent_level(self, string, expected, line_num):\n \"\"\"return the indent level of the string\n \"\"\"\n indent = self.config.indent_string\n if indent == \"\\\\t\": # \\t is not interpreted in the configuration file\n indent = \"\\t\"\n level = 0\n unit_size = len(indent)\n while string[:unit_size] == indent:\n string = string[unit_size:]\n level += 1\n suppl = \"\"\n while string and string[0] in \" \\t\":\n if string[0] != indent[0]:\n if string[0] == \"\\t\":\n args = (\"tab\", \"space\")\n else:\n args = (\"space\", \"tab\")\n self.add_message(\"mixed-indentation\", args=args, line=line_num)\n return level\n suppl += string[0]\n string = string[1:]\n if level != expected or suppl:\n i_type = \"spaces\"\n if indent[0] == \"\\t\":\n i_type = \"tabs\"\n self.add_message(\n \"bad-indentation\",\n line=line_num,\n args=(level * unit_size + len(suppl), i_type, expected * unit_size),\n )\n return None"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _in_iterating_context(node):\n parent = node.parent\n # Since a call can't be the loop variant we only need to know if the node's\n # parent is a 'for' loop to know it's being used as the iterator for the\n # loop.\n if isinstance(parent, astroid.For):\n return True\n # 
Need to make sure the use of the node is in the iterator part of the\n # comprehension.\n if isinstance(parent, astroid.Comprehension):\n if parent.iter == node:\n return True\n # Various built-ins can take in an iterable or list and lead to the same\n # value.\n elif isinstance(parent, astroid.Call):\n if isinstance(parent.func, astroid.Name):\n parent_scope = parent.func.lookup(parent.func.name)[0]\n if _is_builtin(parent_scope) and parent.func.name in _ACCEPTS_ITERATOR:\n return True\n elif isinstance(parent.func, astroid.Attribute):\n if parent.func.attrname in ATTRIBUTES_ACCEPTS_ITERATOR:\n return True\n\n inferred = utils.safe_infer(parent.func)\n if inferred:\n if inferred.qname() in _BUILTIN_METHOD_ACCEPTS_ITERATOR:\n return True\n root = inferred.root()\n if root and root.name == \"itertools\":\n return True\n # If the call is in an unpacking, there's no need to warn,\n # since it can be considered iterating.\n elif isinstance(parent, astroid.Assign) and isinstance(\n parent.targets[0], (astroid.List, astroid.Tuple)\n ):\n if len(parent.targets[0].elts) > 1:\n return True\n # If the call is in a containment check, we consider that to\n # be an iterating context\n elif (\n isinstance(parent, astroid.Compare)\n and len(parent.ops) == 1\n and parent.ops[0][0] == \"in\"\n ):\n return True\n # Also if it's an `yield from`, that's fair\n elif isinstance(parent, astroid.YieldFrom):\n return True\n if isinstance(parent, astroid.Starred):\n return True\n return False", "response": "Check if the node is being used as an iterator."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _is_conditional_import(node):\n parent = node.parent\n return isinstance(\n parent, (astroid.TryExcept, astroid.ExceptHandler, astroid.If, astroid.IfExp)\n )", "response": "Checks if an import node is in the context of a conditional."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the 
following Python 3 function\ndef visit_name(self, node):\n found_node, _ = node.lookup(node.name)\n if not _is_builtin(found_node):\n return\n if node.name not in self._bad_builtins:\n return\n if node_ignores_exception(node) or isinstance(\n find_try_except_wrapper_node(node), astroid.ExceptHandler\n ):\n return\n\n message = node.name.lower() + \"-builtin\"\n self.add_message(message, node=node)", "response": "Detect when a bad built - in is referenced."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_subscript(self, node):\n try:\n for inferred in node.value.infer():\n if not isinstance(inferred, astroid.Instance):\n continue\n if utils.inherit_from_std_ex(inferred):\n self.add_message(\"indexing-exception\", node=node)\n except astroid.InferenceError:\n return", "response": "Look for indexing exceptions."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nlooking for removed attributes", "response": "def visit_attribute(self, node):\n \"\"\"Look for removed attributes\"\"\"\n if node.attrname == \"xreadlines\":\n self.add_message(\"xreadlines-attribute\", node=node)\n return\n\n exception_message = \"message\"\n try:\n for inferred in node.expr.infer():\n if isinstance(inferred, astroid.Instance) and utils.inherit_from_std_ex(\n inferred\n ):\n if node.attrname == exception_message:\n\n # Exceptions with .message clearly defined are an exception\n if exception_message in inferred.instance_attrs:\n continue\n self.add_message(\"exception-message-attribute\", node=node)\n if isinstance(inferred, astroid.Module):\n self._warn_if_deprecated(\n node, inferred.name, {node.attrname}, report_on_modules=False\n )\n except astroid.InferenceError:\n return"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef visit_excepthandler(self, node):\n\n def _is_used_in_except_block(node):\n scope = node.scope()\n current = node\n while (\n current\n 
and current != scope\n and not isinstance(current, astroid.ExceptHandler)\n ):\n current = current.parent\n return isinstance(current, astroid.ExceptHandler) and current.type != node\n\n if isinstance(node.name, (astroid.Tuple, astroid.List)):\n self.add_message(\"unpacking-in-except\", node=node)\n return\n\n if not node.name:\n return\n\n # Find any names\n scope = node.parent.scope()\n scope_names = scope.nodes_of_class(astroid.Name, skip_klass=astroid.FunctionDef)\n scope_names = list(scope_names)\n potential_leaked_names = [\n scope_name\n for scope_name in scope_names\n if scope_name.name == node.name.name\n and scope_name.lineno > node.lineno\n and not _is_used_in_except_block(scope_name)\n ]\n reassignments_for_same_name = {\n assign_name.lineno\n for assign_name in scope.nodes_of_class(\n astroid.AssignName, skip_klass=astroid.FunctionDef\n )\n if assign_name.name == node.name.name\n }\n for leaked_name in potential_leaked_names:\n if any(\n node.lineno < elem < leaked_name.lineno\n for elem in reassignments_for_same_name\n ):\n continue\n self.add_message(\"exception-escape\", node=leaked_name)", "response": "Visit an except handler block and check for exception unpacking."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nvisiting a raise statement and check for raising strings or old - raise - syntax.", "response": "def visit_raise(self, node):\n \"\"\"Visit a raise statement and check for raising\n strings or old-raise-syntax.\n \"\"\"\n\n # Ignore empty raise.\n if node.exc is None:\n return\n expr = node.exc\n if self._check_raise_value(node, expr):\n return\n try:\n value = next(astroid.unpack_infer(expr))\n except astroid.InferenceError:\n return\n self._check_raise_value(node, value)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef find_pylintrc():\n # is there a pylint rc file in the current directory ?\n if os.path.exists(\"pylintrc\"):\n 
return os.path.abspath(\"pylintrc\")\n if os.path.exists(\".pylintrc\"):\n return os.path.abspath(\".pylintrc\")\n if os.path.isfile(\"__init__.py\"):\n curdir = os.path.abspath(os.getcwd())\n while os.path.isfile(os.path.join(curdir, \"__init__.py\")):\n curdir = os.path.abspath(os.path.join(curdir, \"..\"))\n if os.path.isfile(os.path.join(curdir, \"pylintrc\")):\n return os.path.join(curdir, \"pylintrc\")\n if os.path.isfile(os.path.join(curdir, \".pylintrc\")):\n return os.path.join(curdir, \".pylintrc\")\n if \"PYLINTRC\" in os.environ and os.path.exists(os.environ[\"PYLINTRC\"]):\n pylintrc = os.environ[\"PYLINTRC\"]\n else:\n user_home = os.path.expanduser(\"~\")\n if user_home in (\"~\", \"/root\"):\n pylintrc = \".pylintrc\"\n else:\n pylintrc = os.path.join(user_home, \".pylintrc\")\n if not os.path.isfile(pylintrc):\n pylintrc = os.path.join(user_home, \".config\", \"pylintrc\")\n if not os.path.isfile(pylintrc):\n if os.path.isfile(\"/etc/pylintrc\"):\n pylintrc = \"/etc/pylintrc\"\n else:\n pylintrc = None\n return pylintrc", "response": "search the pylint rc file and return its path if it find it else None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _validate(value, optdict, name=\"\"):\n try:\n _type = optdict[\"type\"]\n except KeyError:\n # FIXME\n return value\n return _call_validator(_type, optdict, name, value)", "response": "validate a value for an option according to its type"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\npatch OptionParser. 
expand_default with custom behaviour", "response": "def _expand_default(self, option):\n \"\"\"Patch OptionParser.expand_default with custom behaviour\n\n This will handle defaults to avoid overriding values in the\n configuration file.\n \"\"\"\n if self.parser is None or not self.default_tag:\n return option.help\n optname = option._long_opts[0][2:]\n try:\n provider = self.parser.options_manager._all_options[optname]\n except KeyError:\n value = None\n else:\n optdict = provider.get_option_def(optname)\n optname = provider.option_attrname(optname, optdict)\n value = getattr(provider.config, optname, optdict)\n value = utils._format_option_value(optdict, value)\n if value is optparse.NO_DEFAULT or not value:\n value = self.NO_DEFAULT_VALUE\n return option.help.replace(self.default_tag, str(value))"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nregister an options provider", "response": "def register_options_provider(self, provider, own_group=True):\n \"\"\"register an options provider\"\"\"\n assert provider.priority <= 0, \"provider's priority can't be >= 0\"\n for i in range(len(self.options_providers)):\n if provider.priority > self.options_providers[i].priority:\n self.options_providers.insert(i, provider)\n break\n else:\n self.options_providers.append(provider)\n non_group_spec_options = [\n option for option in provider.options if \"group\" not in option[1]\n ]\n groups = getattr(provider, \"option_groups\", ())\n if own_group and non_group_spec_options:\n self.add_option_group(\n provider.name.upper(),\n provider.__doc__,\n non_group_spec_options,\n provider,\n )\n else:\n for opt, optdict in non_group_spec_options:\n self.add_optik_option(provider, self.cmdline_parser, opt, optdict)\n for gname, gdoc in groups:\n gname = gname.upper()\n goptions = [\n option\n for option in provider.options\n if option[1].get(\"group\", \"\").upper() == gname\n ]\n self.add_option_group(gname, gdoc, goptions, provider)"} {"SOURCE": 
"codesearchnet", "instruction": "Implement a Python 3 function for\ngetting our personal option definition and return a suitable form for optik", "response": "def optik_option(self, provider, opt, optdict):\n \"\"\"get our personal option definition and return a suitable form for\n use with optik/optparse\n \"\"\"\n optdict = copy.copy(optdict)\n if \"action\" in optdict:\n self._nocallback_options[provider] = opt\n else:\n optdict[\"action\"] = \"callback\"\n optdict[\"callback\"] = self.cb_set_provider_option\n # default is handled here and *must not* be given to optik if you\n # want the whole machinery to work\n if \"default\" in optdict:\n if (\n \"help\" in optdict\n and optdict.get(\"default\") is not None\n and optdict[\"action\"] not in (\"store_true\", \"store_false\")\n ):\n optdict[\"help\"] += \" [current: %default]\"\n del optdict[\"default\"]\n args = [\"--\" + str(opt)]\n if \"short\" in optdict:\n self._short_options[optdict[\"short\"]] = opt\n args.append(\"-\" + optdict[\"short\"])\n del optdict[\"short\"]\n # cleanup option definition dict before giving it to optik\n for key in list(optdict.keys()):\n if key not in self._optik_option_attrs:\n optdict.pop(key)\n return args, optdict"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nset option on the correct option provider", "response": "def global_set_option(self, opt, value):\n \"\"\"set option on the correct option provider\"\"\"\n self._all_options[opt].set_option(opt, value)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef generate_config(self, stream=None, skipsections=(), encoding=None):\n options_by_section = {}\n sections = []\n for provider in self.options_providers:\n for section, options in provider.options_by_section():\n if section is None:\n section = provider.name\n if section in skipsections:\n continue\n options = [\n (n, d, v)\n for (n, d, v) in options\n if d.get(\"type\") is not 
None and not d.get(\"deprecated\")\n ]\n if not options:\n continue\n if section not in sections:\n sections.append(section)\n alloptions = options_by_section.setdefault(section, [])\n alloptions += options\n stream = stream or sys.stdout\n printed = False\n for section in sections:\n if printed:\n print(\"\\n\", file=stream)\n utils.format_section(\n stream, section.upper(), sorted(options_by_section[section])\n )\n printed = True", "response": "write a configuration file according to the current configuration"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef read_config_file(self, config_file=None, verbose=None):\n helplevel = 1\n while helplevel <= self._maxlevel:\n opt = \"-\".join([\"long\"] * helplevel) + \"-help\"\n if opt in self._all_options:\n break # already processed\n # pylint: disable=unused-argument\n def helpfunc(option, opt, val, p, level=helplevel):\n print(self.help(level))\n sys.exit(0)\n\n helpmsg = \"%s verbose help.\" % \" \".join([\"more\"] * helplevel)\n optdict = {\"action\": \"callback\", \"callback\": helpfunc, \"help\": helpmsg}\n provider = self.options_providers[0]\n self.add_optik_option(provider, self.cmdline_parser, opt, optdict)\n provider.options += ((opt, optdict),)\n helplevel += 1\n if config_file is None:\n config_file = self.config_file\n if config_file is not None:\n config_file = os.path.expanduser(config_file)\n if not os.path.exists(config_file):\n raise IOError(\"The config file {:s} doesn't exist!\".format(config_file))\n\n use_config_file = config_file and os.path.exists(config_file)\n if use_config_file:\n parser = self.cfgfile_parser\n\n # Use this encoding in order to strip the BOM marker, if any.\n with io.open(config_file, \"r\", encoding=\"utf_8_sig\") as fp:\n parser.read_file(fp)\n\n # normalize sections'title\n for sect, values in list(parser._sections.items()):\n if not sect.isupper() and values:\n parser._sections[sect.upper()] = values\n\n if not verbose:\n 
return\n\n if use_config_file:\n msg = \"Using config file {}\".format(os.path.abspath(config_file))\n else:\n msg = \"No config file found, using default configuration\"\n print(msg, file=sys.stderr)", "response": "read the configuration file and dispatch the values to each options provider."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef load_config_file(self):\n parser = self.cfgfile_parser\n for section in parser.sections():\n for option, value in parser.items(section):\n try:\n self.global_set_option(option, value)\n except (KeyError, optparse.OptionError):\n # TODO handle here undeclared options appearing in the config file\n continue", "response": "dispatch values previously read from a configuration file to each\n options provider"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\noverride configuration according to command line parameters and return additional arguments", "response": "def load_command_line_configuration(self, args=None):\n \"\"\"Override configuration according to command line parameters\n\n return additional arguments\n \"\"\"\n with _patch_optparse():\n if args is None:\n args = sys.argv[1:]\n else:\n args = list(args)\n (options, args) = self.cmdline_parser.parse_args(args=args)\n for provider in self._nocallback_options:\n config = provider.config\n for attr in config.__dict__.keys():\n value = getattr(options, attr, None)\n if value is None:\n continue\n setattr(config, attr, value)\n return args"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef add_help_section(self, title, description, level=0):\n group = optparse.OptionGroup(\n self.cmdline_parser, title=title.capitalize(), description=description\n )\n group.level = level\n self._maxlevel = max(self._maxlevel, level)\n self.cmdline_parser.add_option_group(group)", "response": "add a dummy option section for help 
purpose"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the usage string for available options", "response": "def help(self, level=0):\n \"\"\"return the usage string for available options \"\"\"\n self.cmdline_parser.formatter.output_level = level\n with _patch_optparse():\n return self.cmdline_parser.format_help()"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ninitializes the provider using default values", "response": "def load_defaults(self):\n \"\"\"initialize the provider using default values\"\"\"\n for opt, optdict in self.options:\n action = optdict.get(\"action\")\n if action != \"callback\":\n # callback action have no default\n if optdict is None:\n optdict = self.get_option_def(opt)\n default = optdict.get(\"default\")\n self.set_option(opt, default, action, optdict)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the config attribute corresponding to opt", "response": "def option_attrname(self, opt, optdict=None):\n \"\"\"get the config attribute corresponding to opt\"\"\"\n if optdict is None:\n optdict = self.get_option_def(opt)\n return optdict.get(\"dest\", opt.replace(\"-\", \"_\"))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the dictionary defining an option given its name", "response": "def get_option_def(self, opt):\n \"\"\"return the dictionary defining an option given its name\"\"\"\n assert self.options\n for option in self.options:\n if option[0] == opt:\n return option[1]\n raise optparse.OptionError(\n \"no such option %s in section %r\" % (opt, self.name), opt\n )"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef options_by_section(self):\n sections = {}\n for optname, optdict in self.options:\n sections.setdefault(optdict.get(\"group\"), []).append(\n (optname, optdict, self.option_value(optname))\n )\n if 
None in sections:\n yield None, sections.pop(None)\n for section, options in sorted(sections.items()):\n yield section.upper(), options", "response": "return an iterator on options grouped by section"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndetermines if a BoundMethod AST node represents a method call.", "response": "def is_method_call(func, types=(), methods=()):\n \"\"\"Determines if a BoundMethod node represents a method call.\n\n Args:\n func (astroid.BoundMethod): The BoundMethod AST node to check.\n types (Optional[String]): Optional sequence of caller type names to restrict check.\n methods (Optional[String]): Optional sequence of method names to restrict check.\n\n Returns:\n bool: true if the node represents a method call for the given type and\n method names, False otherwise.\n \"\"\"\n return (\n isinstance(func, astroid.BoundMethod)\n and isinstance(func.bound, astroid.Instance)\n and (func.bound.name in types if types else True)\n and (func.name in methods if methods else True)\n )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking if a node represents a string with complex formatting specs.", "response": "def is_complex_format_str(node):\n \"\"\"Checks if node represents a string with complex formatting specs.\n\n Args:\n node (astroid.node_classes.NodeNG): AST node to check\n Returns:\n bool: True if inferred string uses complex formatting, False otherwise\n \"\"\"\n inferred = utils.safe_infer(node)\n if inferred is None or not isinstance(inferred.value, str):\n return True\n try:\n parsed = list(string.Formatter().parse(inferred.value))\n except ValueError:\n # This format string is invalid\n return False\n for _, _, format_spec, _ in parsed:\n if format_spec:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef visit_module(self, node): # pylint: disable=unused-argument\n # 
The code being checked can just as easily \"import logging as foo\",\n # so it is necessary to process the imports and store in this field\n # what name the logging module is actually given.\n self._logging_names = set()\n logging_mods = self.config.logging_modules\n\n self._format_style = self.config.logging_format_style\n self._logging_modules = set(logging_mods)\n self._from_imports = {}\n for logging_mod in logging_mods:\n parts = logging_mod.rsplit(\".\", 1)\n if len(parts) > 1:\n self._from_imports[parts[0]] = parts[1]", "response": "Clears any state left in this checker from last module checked."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks to see if a module uses a non - Python logging module.", "response": "def visit_importfrom(self, node):\n \"\"\"Checks to see if a module uses a non-Python logging module.\"\"\"\n try:\n logging_name = self._from_imports[node.modname]\n for module, as_name in node.names:\n if module == logging_name:\n self._logging_names.add(as_name or module)\n except KeyError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck to see if this module uses Python s built - in logging.", "response": "def visit_import(self, node):\n \"\"\"Checks to see if this module uses Python's built-in logging.\"\"\"\n for module, as_name in node.names:\n if module in self._logging_modules:\n self._logging_names.add(as_name or module)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef visit_call(self, node):\n\n def is_logging_name():\n return (\n isinstance(node.func, astroid.Attribute)\n and isinstance(node.func.expr, astroid.Name)\n and node.func.expr.name in self._logging_names\n )\n\n def is_logger_class():\n try:\n for inferred in node.func.infer():\n if isinstance(inferred, astroid.BoundMethod):\n parent = inferred._proxied.parent\n if isinstance(parent, astroid.ClassDef) and (\n 
parent.qname() == \"logging.Logger\"\n or any(\n ancestor.qname() == \"logging.Logger\"\n for ancestor in parent.ancestors()\n )\n ):\n return True, inferred._proxied.name\n except astroid.exceptions.InferenceError:\n pass\n return False, None\n\n if is_logging_name():\n name = node.func.attrname\n else:\n result, name = is_logger_class()\n if not result:\n return\n self._check_log_method(node, name)", "response": "Checks if a call is to logging methods."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _check_log_method(self, node, name):\n if name == \"log\":\n if node.starargs or node.kwargs or len(node.args) < 2:\n # Either a malformed call, star args, or double-star args. Beyond\n # the scope of this checker.\n return\n format_pos = 1\n elif name in CHECKED_CONVENIENCE_FUNCTIONS:\n if node.starargs or node.kwargs or not node.args:\n # Either no args, star args, or double-star args. Beyond the\n # scope of this checker.\n return\n format_pos = 0\n else:\n return\n\n if isinstance(node.args[format_pos], astroid.BinOp):\n binop = node.args[format_pos]\n emit = binop.op == \"%\"\n if binop.op == \"+\":\n total_number_of_strings = sum(\n 1\n for operand in (binop.left, binop.right)\n if self._is_operand_literal_str(utils.safe_infer(operand))\n )\n emit = total_number_of_strings > 0\n if emit:\n self.add_message(\"logging-not-lazy\", node=node)\n elif isinstance(node.args[format_pos], astroid.Call):\n self._check_call_func(node.args[format_pos])\n elif isinstance(node.args[format_pos], astroid.Const):\n self._check_format_string(node, format_pos)\n elif isinstance(\n node.args[format_pos], (astroid.FormattedValue, astroid.JoinedStr)\n ):\n self.add_message(\"logging-fstring-interpolation\", node=node)", "response": "Checks calls to logging. log."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck that function call is not format_string. 
format.", "response": "def _check_call_func(self, node):\n \"\"\"Checks that function call is not format_string.format().\n\n Args:\n node (astroid.node_classes.Call):\n Call AST node to be checked.\n \"\"\"\n func = utils.safe_infer(node.func)\n types = (\"str\", \"unicode\")\n methods = (\"format\",)\n if is_method_call(func, types, methods) and not is_complex_format_str(\n func.bound\n ):\n self.add_message(\"logging-format-interpolation\", node=node)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _check_format_string(self, node, format_arg):\n num_args = _count_supplied_tokens(node.args[format_arg + 1 :])\n if not num_args:\n # If no args were supplied the string is not interpolated and can contain\n # formatting characters - it's used verbatim. Don't check any further.\n return\n format_string = node.args[format_arg].value\n if not isinstance(format_string, str):\n # If the log format is constant non-string (e.g. logging.debug(5)),\n # ensure there are no arguments.\n required_num_args = 0\n else:\n try:\n if self._format_style == \"old\":\n keyword_args, required_num_args, _, _ = utils.parse_format_string(\n format_string\n )\n if keyword_args:\n # Keyword checking on logging strings is complicated by\n # special keywords - out of scope.\n return\n elif self._format_style == \"new\":\n keyword_arguments, implicit_pos_args, explicit_pos_args = utils.parse_format_method_string(\n format_string\n )\n\n keyword_args_cnt = len(\n set(k for k, l in keyword_arguments if not isinstance(k, int))\n )\n required_num_args = (\n keyword_args_cnt + implicit_pos_args + explicit_pos_args\n )\n except utils.UnsupportedFormatCharacter as ex:\n char = format_string[ex.index]\n self.add_message(\n \"logging-unsupported-format\",\n node=node,\n args=(char, ord(char), ex.index),\n )\n return\n except utils.IncompleteFormatString:\n self.add_message(\"logging-format-truncated\", node=node)\n return\n if num_args > 
required_num_args:\n self.add_message(\"logging-too-many-args\", node=node)\n elif num_args < required_num_args:\n self.add_message(\"logging-too-few-args\", node=node)", "response": "Checks that the format string tokens match the supplied arguments."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _redefines_import(node):\n current = node\n while current and not isinstance(current.parent, astroid.ExceptHandler):\n current = current.parent\n if not current or not utils.error_of_type(current.parent, ImportError):\n return False\n try_block = current.parent.parent\n for import_node in try_block.nodes_of_class((astroid.ImportFrom, astroid.Import)):\n for name, alias in import_node.names:\n if alias:\n if alias == node.name:\n return True\n elif name == node.name:\n return True\n return False", "response": "Detects that the given node is inside an anonymized import from the tryexcept body and redefines an import from the tryexcept body. 
Returns True if the node redefines an import, False otherwise."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef in_loop(node):\n parent = node.parent\n while parent is not None:\n if isinstance(\n parent,\n (\n astroid.For,\n astroid.ListComp,\n astroid.SetComp,\n astroid.DictComp,\n astroid.GeneratorExp,\n ),\n ):\n return True\n parent = parent.parent\n return False", "response": "return True if the node is inside a kind of for loop"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns true if the object is an element of the nested list or of one of its nested sub-lists", "response": "def in_nested_list(nested_list, obj):\n \"\"\"return true if the object is an element of the nested list or of a\n nested sub-list\n \"\"\"\n for elmt in nested_list:\n if isinstance(elmt, (list, tuple)):\n if in_nested_list(elmt, obj):\n return True\n elif elmt == obj:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the loop node that holds the break node in arguments.", "response": "def _get_break_loop_node(break_node):\n \"\"\"\n Returns the loop node that holds the break node in arguments.\n\n Args:\n break_node (astroid.Break): the break node of interest.\n\n Returns:\n astroid.For or astroid.While: the loop node holding the break node.\n \"\"\"\n loop_nodes = (astroid.For, astroid.While)\n parent = break_node.parent\n while not isinstance(parent, loop_nodes) or break_node in getattr(\n parent, \"orelse\", []\n ):\n break_node = parent\n parent = parent.parent\n if parent is None:\n break\n return parent"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn True if the inspected loop node may exit early through a break statement.", "response": "def _loop_exits_early(loop):\n \"\"\"\n Returns true if a loop may end up in a break statement.\n\n Args:\n loop (astroid.For, astroid.While): the loop 
node inspected.\n\n Returns:\n bool: True if the loop may ends up in a break statement, False otherwise.\n \"\"\"\n loop_nodes = (astroid.For, astroid.While)\n definition_nodes = (astroid.FunctionDef, astroid.ClassDef)\n inner_loop_nodes = [\n _node\n for _node in loop.nodes_of_class(loop_nodes, skip_klass=definition_nodes)\n if _node != loop\n ]\n return any(\n _node\n for _node in loop.nodes_of_class(astroid.Break, skip_klass=definition_nodes)\n if _get_break_loop_node(_node) not in inner_loop_nodes\n )"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn a tuple of property classes and names.", "response": "def _get_properties(config):\n \"\"\"Returns a tuple of property classes and names.\n\n Property classes are fully qualified, such as 'abc.abstractproperty' and\n property names are the actual names, such as 'abstract_property'.\n \"\"\"\n property_classes = {BUILTIN_PROPERTY}\n property_names = set() # Not returning 'property', it has its own check.\n if config is not None:\n property_classes.update(config.property_classes)\n property_names.update(\n (prop.rsplit(\".\", 1)[-1] for prop in config.property_classes)\n )\n return property_classes, property_names"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _determine_function_name_type(node, config=None):\n property_classes, property_names = _get_properties(config)\n if not node.is_method():\n return \"function\"\n if node.decorators:\n decorators = node.decorators.nodes\n else:\n decorators = []\n for decorator in decorators:\n # If the function is a property (decorated with @property\n # or @abc.abstractproperty), the name type is 'attr'.\n if isinstance(decorator, astroid.Name) or (\n isinstance(decorator, astroid.Attribute)\n and decorator.attrname in property_names\n ):\n infered = utils.safe_infer(decorator)\n if infered and infered.qname() in property_classes:\n return \"attr\"\n # If the 
function is decorated using the prop_method.{setter,getter}\n # form, treat it like an attribute as well.\n elif isinstance(decorator, astroid.Attribute) and decorator.attrname in (\n \"setter\",\n \"deleter\",\n ):\n return \"attr\"\n return \"method\"", "response": "Determine the name type of a function."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nmakes a report of the percentage of node types documented and the percentage with a bad name.", "response": "def report_by_type_stats(sect, stats, _):\n \"\"\"make a report of\n\n * percentage of different types documented\n * percentage of different types with a bad name\n \"\"\"\n # percentage of different types documented and/or with a bad name\n nice_stats = {}\n for node_type in (\"module\", \"class\", \"method\", \"function\"):\n try:\n total = stats[node_type]\n except KeyError:\n raise exceptions.EmptyReportError()\n nice_stats[node_type] = {}\n if total != 0:\n try:\n documented = total - stats[\"undocumented_\" + node_type]\n percent = (documented * 100.0) / total\n nice_stats[node_type][\"percent_documented\"] = \"%.2f\" % percent\n except KeyError:\n nice_stats[node_type][\"percent_documented\"] = \"NC\"\n try:\n percent = (stats[\"badname_\" + node_type] * 100.0) / total\n nice_stats[node_type][\"percent_badname\"] = \"%.2f\" % percent\n except KeyError:\n nice_stats[node_type][\"percent_badname\"] = \"NC\"\n lines = (\"type\", \"number\", \"old number\", \"difference\", \"%documented\", \"%badname\")\n for node_type in (\"module\", \"class\", \"method\", \"function\"):\n new = stats[node_type]\n lines += (\n node_type,\n str(new),\n \"NC\",\n \"NC\",\n nice_stats[node_type].get(\"percent_documented\", \"0\"),\n nice_stats[node_type].get(\"percent_badname\", \"0\"),\n )\n sect.append(reporter_nodes.Table(children=lines, cols=6, rheaders=1))"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef redefined_by_decorator(node):\n if 
node.decorators:\n for decorator in node.decorators.nodes:\n if (\n isinstance(decorator, astroid.Attribute)\n and getattr(decorator.expr, \"name\", None) == node.name\n ):\n return True\n return False", "response": "Return True if the object is a method redefined via decorator."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nanswer: is this a call with exactly 1 positional argument?", "response": "def _is_one_arg_pos_call(call):\n \"\"\"Is this a call with exactly 1 argument,\n where that argument is positional?\n \"\"\"\n return isinstance(call, astroid.Call) and len(call.args) == 1 and not call.keywords"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nimplements the required method to auto-register this checker", "response": "def register(linter):\n \"\"\"required method to auto register this checker\"\"\"\n linter.register_checker(BasicErrorChecker(linter))\n linter.register_checker(BasicChecker(linter))\n linter.register_checker(NameChecker(linter))\n linter.register_checker(DocStringChecker(linter))\n linter.register_checker(PassChecker(linter))\n linter.register_checker(ComparisonChecker(linter))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_starred(self, node):\n if isinstance(node.parent, astroid.Call):\n # f(*args) is converted to Call(args=[Starred]), so ignore\n # them for this check.\n return\n if PY35 and isinstance(\n node.parent, (astroid.List, astroid.Tuple, astroid.Set, astroid.Dict)\n ):\n # PEP 448 unpacking.\n return\n\n stmt = node.statement()\n if not isinstance(stmt, astroid.Assign):\n return\n\n if stmt.value is node or stmt.value.parent_of(node):\n self.add_message(\"star-needs-assignment-target\", node=node)", "response": "Check that a Starred expression is used in an assignment target."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _check_nonlocal_and_global(self, node):\n\n def 
same_scope(current):\n return current.scope() is node\n\n from_iter = itertools.chain.from_iterable\n nonlocals = set(\n from_iter(\n child.names\n for child in node.nodes_of_class(astroid.Nonlocal)\n if same_scope(child)\n )\n )\n\n if not nonlocals:\n return\n\n global_vars = set(\n from_iter(\n child.names\n for child in node.nodes_of_class(astroid.Global)\n if same_scope(child)\n )\n )\n for name in nonlocals.intersection(global_vars):\n self.add_message(\"nonlocal-and-global\", args=(name,), node=node)", "response": "Check that a name is both nonlocal and global."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks use of the non-existent ++ and -- operators", "response": "def visit_unaryop(self, node):\n \"\"\"check use of the non-existent ++ and -- operators\"\"\"\n if (\n (node.op in \"+-\")\n and isinstance(node.operand, astroid.UnaryOp)\n and (node.operand.op == node.op)\n ):\n self.add_message(\"nonexistent-operator\", node=node, args=node.op * 2)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef visit_call(self, node):\n try:\n for inferred in node.func.infer():\n self._check_inferred_class_is_abstract(inferred, node)\n except astroid.InferenceError:\n return", "response": "Check instantiating abstract class with abc.ABCMeta as metaclass."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks that any loop with an else clause has a break statement.", "response": "def _check_else_on_loop(self, node):\n \"\"\"Check that any loop with an else clause has a break statement.\"\"\"\n if node.orelse and not _loop_exits_early(node):\n self.add_message(\n \"useless-else-on-loop\",\n node=node,\n # This is not optimal, but the line previous\n # to the first statement in the else clause\n # will usually be the one that contains the else:.\n line=node.orelse[0].lineno - 1,\n )"} {"SOURCE": 
"codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _check_in_loop(self, node, node_name):\n _node = node.parent\n while _node:\n if isinstance(_node, (astroid.For, astroid.While)):\n if node not in _node.orelse:\n return\n\n if isinstance(_node, (astroid.ClassDef, astroid.FunctionDef)):\n break\n if (\n isinstance(_node, astroid.TryFinally)\n and node in _node.finalbody\n and isinstance(node, astroid.Continue)\n ):\n self.add_message(\"continue-in-finally\", node=node)\n\n _node = _node.parent\n\n self.add_message(\"not-in-loop\", node=node, args=node_name)", "response": "check that a node is inside a for or while loop"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking if a function or method or class name is redefined.", "response": "def _check_redefinition(self, redeftype, node):\n \"\"\"check for redefinition of a function / method / class name\"\"\"\n parent_frame = node.parent.frame()\n defined_self = parent_frame[node.name]\n if defined_self is not node and not astroid.are_exclusive(node, defined_self):\n\n # Additional checks for methods which are not considered\n # redefined, since they are already part of the base API.\n if (\n isinstance(parent_frame, astroid.ClassDef)\n and node.name in REDEFINABLE_METHODS\n ):\n return\n\n dummy_variables_rgx = lint_utils.get_global_option(\n self, \"dummy-variables-rgx\", default=None\n )\n if dummy_variables_rgx and dummy_variables_rgx.match(node.name):\n return\n self.add_message(\n \"function-redefined\",\n node=node,\n args=(redeftype, defined_self.fromlineno),\n )"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef open(self):\n self._tryfinallys = []\n self.stats = self.linter.add_stats(module=0, function=0, method=0, class_=0)", "response": "initialize visit variables and statistics"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the 
following Python 3 code\ndef visit_expr(self, node):\n expr = node.value\n if isinstance(expr, astroid.Const) and isinstance(expr.value, str):\n # treat string statement in a separated message\n # Handle PEP-257 attribute docstrings.\n # An attribute docstring is defined as being a string right after\n # an assignment at the module level, class level or __init__ level.\n scope = expr.scope()\n if isinstance(\n scope, (astroid.ClassDef, astroid.Module, astroid.FunctionDef)\n ):\n if isinstance(scope, astroid.FunctionDef) and scope.name != \"__init__\":\n pass\n else:\n sibling = expr.previous_sibling()\n if (\n sibling is not None\n and sibling.scope() is scope\n and isinstance(sibling, (astroid.Assign, astroid.AnnAssign))\n ):\n return\n self.add_message(\"pointless-string-statement\", node=node)\n return\n\n # Ignore if this is :\n # * a direct function call\n # * the unique child of a try/except body\n # * a yield statement\n # * an ellipsis (which can be used on Python 3 instead of pass)\n # warn W0106 if we have any underlying function call (we can't predict\n # side effects), else pointless-statement\n if isinstance(\n expr, (astroid.Yield, astroid.Await, astroid.Ellipsis, astroid.Call)\n ) or (\n isinstance(node.parent, astroid.TryExcept) and node.parent.body == [node]\n ):\n return\n if any(expr.nodes_of_class(astroid.Call)):\n self.add_message(\n \"expression-not-assigned\", node=node, args=expr.as_string()\n )\n else:\n self.add_message(\"pointless-statement\", node=node)", "response": "check for various kind of statements without effect"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef visit_lambda(self, node):\n # if the body of the lambda is a call expression with the same\n # argument list as the lambda itself, then the lambda is\n # possibly unnecessary and at least suspicious.\n if node.args.defaults:\n # If the arguments of the lambda include defaults, then a\n # judgment cannot be made because there 
is no way to check\n # that the defaults defined by the lambda are the same as\n # the defaults defined by the function called in the body\n # of the lambda.\n return\n call = node.body\n if not isinstance(call, astroid.Call):\n # The body of the lambda must be a function call expression\n # for the lambda to be unnecessary.\n return\n if isinstance(node.body.func, astroid.Attribute) and isinstance(\n node.body.func.expr, astroid.Call\n ):\n # Chained call, the intermediate call might\n # return something else (but we don't check that, yet).\n return\n\n call_site = CallSite.from_call(call)\n ordinary_args = list(node.args.args)\n new_call_args = list(self._filter_vararg(node, call.args))\n if node.args.kwarg:\n if self._has_variadic_argument(call.kwargs, node.args.kwarg):\n return\n\n if node.args.vararg:\n if self._has_variadic_argument(call.starargs, node.args.vararg):\n return\n elif call.starargs:\n return\n\n if call.keywords:\n # Look for additional keyword arguments that are not part\n # of the lambda's signature\n lambda_kwargs = {keyword.name for keyword in node.args.defaults}\n if len(lambda_kwargs) != len(call_site.keyword_arguments):\n # Different lengths, so probably not identical\n return\n if set(call_site.keyword_arguments).difference(lambda_kwargs):\n return\n\n # The \"ordinary\" arguments must be in a correspondence such that:\n # ordinary_args[i].name == call.args[i].name.\n if len(ordinary_args) != len(new_call_args):\n return\n for arg, passed_arg in zip(ordinary_args, new_call_args):\n if not isinstance(passed_arg, astroid.Name):\n return\n if arg.name != passed_arg.name:\n return\n\n self.add_message(\"unnecessary-lambda\", line=node.fromlineno, node=node)", "response": "Check whether or not the lambda is possibly unnecessary and at least suspicious."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef visit_functiondef(self, node):\n self.stats[node.is_method() and \"method\" or \"function\"] += 1\n 
self._check_dangerous_default(node)", "response": "check function name, docstring, arguments, redefinition, variable names, max locals\n "} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef visit_return(self, node):\n self._check_unreachable(node)\n # Is it inside final body of a try...finally bloc ?\n self._check_not_in_finally(node, \"return\", (astroid.FunctionDef,))", "response": "Check for unreachable code after the return and for a return inside a try...finally block"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking if the node is a break statement.", "response": "def visit_break(self, node):\n \"\"\"1 - check is the node has a right sibling (if so, that's some\n unreachable code)\n 2 - check is the node is inside the finally clause of a try...finally\n block\n \"\"\"\n # 1 - Is it right sibling ?\n self._check_unreachable(node)\n # 2 - Is it inside final body of a try...finally bloc ?\n self._check_not_in_finally(node, \"break\", (astroid.For, astroid.While))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef visit_assert(self, node):\n if (\n node.fail is None\n and isinstance(node.test, astroid.Tuple)\n and len(node.test.elts) == 2\n ):\n self.add_message(\"assert-on-tuple\", node=node)", "response": "check the use of an assert statement on a tuple"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking duplicate key in dictionary", "response": "def visit_dict(self, node):\n \"\"\"check duplicate key in dictionary\"\"\"\n keys = set()\n for k, _ in node.items:\n if isinstance(k, astroid.Const):\n key = k.value\n if key in keys:\n self.add_message(\"duplicate-key\", node=node, args=key)\n keys.add(key)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _check_not_in_finally(self, node, node_name, breaker_classes=()):\n # if self._tryfinallys is empty, we're not in a 
try...finally block\n if not self._tryfinallys:\n return\n # the node could be a grand-grand...-children of the try...finally\n _parent = node.parent\n _node = node\n while _parent and not isinstance(_parent, breaker_classes):\n if hasattr(_parent, \"finalbody\") and _node in _parent.finalbody:\n self.add_message(\"lost-exception\", node=node, args=node_name)\n return\n _node = _parent\n _parent = _node.parent", "response": "check that a node is not inside a finally clause."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncheck that the argument to reversed is a sequence", "response": "def _check_reversed(self, node):\n \"\"\" check that the argument to `reversed` is a sequence \"\"\"\n try:\n argument = utils.safe_infer(utils.get_argument_from_call(node, position=0))\n except utils.NoSuchArgumentError:\n pass\n else:\n if argument is astroid.Uninferable:\n return\n if argument is None:\n # Nothing was infered.\n # Try to see if we have iter().\n if isinstance(node.args[0], astroid.Call):\n try:\n func = next(node.args[0].func.infer())\n except astroid.InferenceError:\n return\n if getattr(\n func, \"name\", None\n ) == \"iter\" and utils.is_builtin_object(func):\n self.add_message(\"bad-reversed-sequence\", node=node)\n return\n\n if isinstance(argument, (astroid.List, astroid.Tuple)):\n return\n\n if isinstance(argument, astroid.Instance):\n if argument._proxied.name == \"dict\" and utils.is_builtin_object(\n argument._proxied\n ):\n self.add_message(\"bad-reversed-sequence\", node=node)\n return\n if any(\n ancestor.name == \"dict\" and utils.is_builtin_object(ancestor)\n for ancestor in argument._proxied.ancestors()\n ):\n # Mappings aren't accepted by reversed(), unless\n # they provide explicitly a __reversed__ method.\n try:\n argument.locals[REVERSED_PROTOCOL_METHOD]\n except KeyError:\n self.add_message(\"bad-reversed-sequence\", node=node)\n return\n\n if hasattr(argument, \"getattr\"):\n # everything else is not 
a proper sequence for reversed()\n for methods in REVERSED_METHODS:\n for meth in methods:\n try:\n argument.getattr(meth)\n except astroid.NotFoundError:\n break\n else:\n break\n else:\n self.add_message(\"bad-reversed-sequence\", node=node)\n else:\n self.add_message(\"bad-reversed-sequence\", node=node)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking if the name is valid for assignment", "response": "def visit_assignname(self, node):\n \"\"\"check module level assigned names\"\"\"\n self._check_assign_to_new_keyword_violation(node.name, node)\n frame = node.frame()\n assign_type = node.assign_type()\n if isinstance(assign_type, astroid.Comprehension):\n self._check_name(\"inlinevar\", node.name, node)\n elif isinstance(frame, astroid.Module):\n if isinstance(assign_type, astroid.Assign) and not in_loop(assign_type):\n if isinstance(utils.safe_infer(assign_type.value), astroid.ClassDef):\n self._check_name(\"class\", node.name, node)\n else:\n if not _redefines_import(node):\n # Don't emit if the name redefines an import\n # in an ImportError except handler.\n self._check_name(\"const\", node.name, node)\n elif isinstance(assign_type, astroid.ExceptHandler):\n self._check_name(\"variable\", node.name, node)\n elif isinstance(frame, astroid.FunctionDef):\n # global introduced variable aren't in the function locals\n if node.name in frame and node.name not in frame.argnames():\n if not _redefines_import(node):\n self._check_name(\"variable\", node.name, node)\n elif isinstance(frame, astroid.ClassDef):\n if not list(frame.local_attr_ancestors(node.name)):\n self._check_name(\"class_attribute\", node.name, node)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _recursive_check_names(self, args, node):\n for arg in args:\n if isinstance(arg, astroid.AssignName):\n self._check_name(\"argument\", arg.name, node)\n else:\n self._recursive_check_names(arg.elts, node)", 
"response": "check names in a possibly recursive list "} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _check_name(self, node_type, name, node, confidence=interfaces.HIGH):\n\n def _should_exempt_from_invalid_name(node):\n if node_type == \"variable\":\n inferred = utils.safe_infer(node)\n if isinstance(inferred, astroid.ClassDef):\n return True\n return False\n\n if utils.is_inside_except(node):\n clobbering, _ = utils.clobber_in_except(node)\n if clobbering:\n return\n if name in self.config.good_names:\n return\n if name in self.config.bad_names:\n self.stats[\"badname_\" + node_type] += 1\n self.add_message(\"blacklisted-name\", node=node, args=name)\n return\n regexp = self._name_regexps[node_type]\n match = regexp.match(name)\n\n if _is_multi_naming_match(match, node_type, confidence):\n name_group = self._find_name_group(node_type)\n bad_name_group = self._bad_names.setdefault(name_group, {})\n warnings = bad_name_group.setdefault(match.lastgroup, [])\n warnings.append((node, node_type, name, confidence))\n\n if match is None and not _should_exempt_from_invalid_name(node):\n self._raise_name_warning(node, node_type, name, confidence)", "response": "check for a name using the type s regexp"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _check_docstring(\n self, node_type, node, report_missing=True, confidence=interfaces.HIGH\n ):\n \"\"\"check the node has a non empty docstring\"\"\"\n docstring = node.doc\n if docstring is None:\n if not report_missing:\n return\n lines = utils.get_node_last_lineno(node) - node.lineno\n\n if node_type == \"module\" and not lines:\n # If the module has no body, there's no reason\n # to require a docstring.\n return\n max_lines = self.config.docstring_min_length\n\n if node_type != \"module\" and max_lines > -1 and lines < max_lines:\n return\n self.stats[\"undocumented_\" + node_type] += 1\n if (\n 
node.body\n and isinstance(node.body[0], astroid.Expr)\n and isinstance(node.body[0].value, astroid.Call)\n ):\n # Most likely a string with a format call. Let's see.\n func = utils.safe_infer(node.body[0].value.func)\n if isinstance(func, astroid.BoundMethod) and isinstance(\n func.bound, astroid.Instance\n ):\n # Strings in Python 3, others in Python 2.\n if PY3K and func.bound.name == \"str\":\n return\n if func.bound.name in (\"str\", \"unicode\", \"bytes\"):\n return\n self.add_message(\n \"missing-docstring\", node=node, args=(node_type,), confidence=confidence\n )\n elif not docstring.strip():\n self.stats[\"undocumented_\" + node_type] += 1\n self.add_message(\n \"empty-docstring\", node=node, args=(node_type,), confidence=confidence\n )", "response": "check the node has a non empty docstring"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncheck if we compare to a literal.", "response": "def _check_literal_comparison(self, literal, node):\n \"\"\"Check if we compare to a literal, which is usually what we do not want to do.\"\"\"\n nodes = (astroid.List, astroid.Tuple, astroid.Dict, astroid.Set)\n is_other_literal = isinstance(literal, nodes)\n is_const = False\n if isinstance(literal, astroid.Const):\n if isinstance(literal.value, bool) or literal.value is None:\n # Not interested in this values.\n return\n is_const = isinstance(literal.value, (bytes, str, int, float))\n\n if is_const or is_other_literal:\n self.add_message(\"literal-comparison\", node=node)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck if identifier is compared against itself.", "response": "def _check_logical_tautology(self, node):\n \"\"\"Check if identifier is compared against itself.\n :param node: Compare node\n :type node: astroid.node_classes.Compare\n :Example:\n val = 786\n if val == val: # [comparison-with-itself]\n pass\n \"\"\"\n left_operand = node.left\n right_operand = node.ops[0][1]\n operator = 
node.ops[0][0]\n if isinstance(left_operand, astroid.Const) and isinstance(\n right_operand, astroid.Const\n ):\n left_operand = left_operand.value\n right_operand = right_operand.value\n elif isinstance(left_operand, astroid.Name) and isinstance(\n right_operand, astroid.Name\n ):\n left_operand = left_operand.name\n right_operand = right_operand.name\n\n if left_operand == right_operand:\n suggestion = \"%s %s %s\" % (left_operand, operator, right_operand)\n self.add_message(\"comparison-with-itself\", node=node, args=(suggestion,))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck for expressions like type x == Y.", "response": "def _check_type_x_is_y(self, node, left, operator, right):\n \"\"\"Check for expressions like type(x) == Y.\"\"\"\n left_func = utils.safe_infer(left.func)\n if not (\n isinstance(left_func, astroid.ClassDef) and left_func.qname() == TYPE_QNAME\n ):\n return\n\n if operator in (\"is\", \"is not\") and _is_one_arg_pos_call(right):\n right_func = utils.safe_infer(right.func)\n if (\n isinstance(right_func, astroid.ClassDef)\n and right_func.qname() == TYPE_QNAME\n ):\n # type(x) == type(a)\n right_arg = utils.safe_infer(right.args[0])\n if not isinstance(right_arg, LITERAL_NODE_TYPES):\n # not e.g. 
type(x) == type([])\n return\n self.add_message(\"unidiomatic-typecheck\", node=node)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncreate the subgraphs representing any if and for statements", "response": "def _subgraph(self, node, name, extra_blocks=()):\n \"\"\"create the subgraphs representing any `if` and `for` statements\"\"\"\n if self.graph is None:\n # global loop\n self.graph = PathGraph(node)\n self._subgraph_parse(node, node, extra_blocks)\n self.graphs[\"%s%s\" % (self.classname, name)] = self.graph\n self.reset()\n else:\n self._append_node(node)\n self._subgraph_parse(node, node, extra_blocks)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _subgraph_parse(\n self, node, pathnode, extra_blocks\n ): # pylint: disable=unused-argument\n \"\"\"parse the body and any `else` block of `if` and `for` statements\"\"\"\n loose_ends = []\n self.tail = node\n self.dispatch_list(node.body)\n loose_ends.append(self.tail)\n for extra in extra_blocks:\n self.tail = node\n self.dispatch_list(extra.body)\n loose_ends.append(self.tail)\n if node.orelse:\n self.tail = node\n self.dispatch_list(node.orelse)\n loose_ends.append(self.tail)\n else:\n loose_ends.append(node)\n if node:\n bottom = \"%s\" % self._bottom_counter\n self._bottom_counter += 1\n for le in loose_ends:\n self.graph.connect(le, bottom)\n self.tail = bottom", "response": "parse the body and any else block of if and for statements"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nvisit an astroid. 
Module node to check its complexity rating and add a message if it is greater than the max_complexity stored from options", "response": "def visit_module(self, node):\n \"\"\"visit an astroid.Module node to check its complexity rating and\n add a message if it is greater than max_complexity stored from options\"\"\"\n visitor = PathGraphingAstVisitor()\n for child in node.body:\n visitor.preorder(child, visitor)\n for graph in visitor.graphs.values():\n complexity = graph.complexity()\n node = graph.root\n if hasattr(node, \"name\"):\n node_name = \"'%s'\" % node.name\n else:\n node_name = \"This '%s'\" % node.__class__.__name__.lower()\n if complexity <= self.config.max_complexity:\n continue\n self.add_message(\n \"too-complex\", node=node, confidence=HIGH, args=(node_name, complexity)\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_checker(self, checker):\n # XXX : should be possible to merge needed_checkers and add_checker\n vcids = set()\n lcids = set()\n visits = self.visit_events\n leaves = self.leave_events\n for member in dir(checker):\n cid = member[6:]\n if cid == \"default\":\n continue\n if member.startswith(\"visit_\"):\n v_meth = getattr(checker, member)\n # don't use visit_methods with no activated message:\n if self._is_method_enabled(v_meth):\n visits[cid].append(v_meth)\n vcids.add(cid)\n elif member.startswith(\"leave_\"):\n l_meth = getattr(checker, member)\n # don't use leave_methods with no activated message:\n if self._is_method_enabled(l_meth):\n leaves[cid].append(l_meth)\n lcids.add(cid)\n visit_default = getattr(checker, \"visit_default\", None)\n if visit_default:\n for cls in nodes.ALL_NODE_CLASSES:\n cid = cls.__name__.lower()\n if cid not in vcids:\n visits[cid].append(visit_default)", "response": "walk through the checker's dir and collect visit and leave methods"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncalling visit events of astroid checkers for the given node 
then leave events", "response": "def walk(self, astroid):\n \"\"\"call visit events of astroid checkers for the given node, recurse on\n its children, then leave events.\n \"\"\"\n cid = astroid.__class__.__name__.lower()\n\n # Detect if the node is a new name for a deprecated alias.\n # In this case, favour the methods for the deprecated\n # alias if any, in order to maintain backwards\n # compatibility.\n visit_events = self.visit_events.get(cid, ())\n leave_events = self.leave_events.get(cid, ())\n\n if astroid.is_statement:\n self.nbstatements += 1\n # generate events for this node on each checker\n for cb in visit_events or ():\n cb(astroid)\n # recurse on children\n for child in astroid.get_children():\n self.walk(child)\n for cb in leave_events or ():\n cb(astroid)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_relationship(self, from_object, to_object, relation_type, name=None):\n rel = Relationship(from_object, to_object, relation_type, name)\n self.relationships.setdefault(relation_type, []).append(rel)", "response": "add a relationship to the diagram"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_relationship(self, from_object, relation_type):\n for rel in self.relationships.get(relation_type, ()):\n if rel.from_object is from_object:\n return rel\n raise KeyError(relation_type)", "response": "return a relationship, raise KeyError if not found"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn visible attributes, possibly with class name", "response": "def get_attrs(self, node):\n \"\"\"return visible attributes, possibly with class name\"\"\"\n attrs = []\n properties = [\n (n, m)\n for n, m in node.items()\n if isinstance(m, astroid.FunctionDef) and decorated_with_property(m)\n ]\n for node_name, associated_nodes in (\n list(node.instance_attrs_type.items())\n + 
list(node.locals_type.items())\n + properties\n ):\n if not self.show_attr(node_name):\n continue\n names = self.class_names(associated_nodes)\n if names:\n node_name = \"%s : %s\" % (node_name, \", \".join(names))\n attrs.append(node_name)\n return sorted(attrs)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a diagram object", "response": "def add_object(self, title, node):\n \"\"\"create a diagram object\n \"\"\"\n assert node not in self._nodes\n ent = DiagramEntity(title, node)\n self._nodes[node] = ent\n self.objects.append(ent)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef class_names(self, nodes):\n names = []\n for node in nodes:\n if isinstance(node, astroid.Instance):\n node = node._proxied\n if (\n isinstance(node, astroid.ClassDef)\n and hasattr(node, \"name\")\n and not self.has_node(node)\n ):\n if node.name not in names:\n node_name = node.name\n names.append(node_name)\n return names", "response": "return class names if needed in the diagram"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef classes(self):\n return [o for o in self.objects if isinstance(o.node, astroid.ClassDef)]", "response": "return all class nodes in the diagram"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning a class by its name, raising KeyError if not found", "response": "def classe(self, name):\n \"\"\"return a class by its name, raise KeyError if not found\n \"\"\"\n for klass in self.classes():\n if klass.node.name == name:\n return klass\n raise KeyError(name)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nextracting the relationships between nodes in the diagram", "response": "def extract_relationships(self):\n \"\"\"extract relationships between nodes in the diagram\n \"\"\"\n for obj in self.classes():\n node = obj.node\n obj.attrs = 
self.get_attrs(node)\n obj.methods = self.get_methods(node)\n # shape\n if is_interface(node):\n obj.shape = \"interface\"\n else:\n obj.shape = \"class\"\n # inheritance link\n for par_node in node.ancestors(recurs=False):\n try:\n par_obj = self.object_from_node(par_node)\n self.add_relationship(obj, par_obj, \"specialization\")\n except KeyError:\n continue\n # implements link\n for impl_node in node.implements:\n try:\n impl_obj = self.object_from_node(impl_node)\n self.add_relationship(obj, impl_obj, \"implements\")\n except KeyError:\n continue\n # associations link\n for name, values in list(node.instance_attrs_type.items()) + list(\n node.locals_type.items()\n ):\n for value in values:\n if value is astroid.Uninferable:\n continue\n if isinstance(value, astroid.Instance):\n value = value._proxied\n try:\n associated_obj = self.object_from_node(value)\n self.add_relationship(associated_obj, obj, \"association\", name)\n except KeyError:\n continue"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef modules(self):\n return [o for o in self.objects if isinstance(o.node, astroid.Module)]", "response": "return all module nodes in the diagram"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning a module by its name raise KeyError if not found", "response": "def module(self, name):\n \"\"\"return a module by its name, raise KeyError if not found\n \"\"\"\n for mod in self.modules():\n if mod.node.name == name:\n return mod\n raise KeyError(name)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_module(self, name, node):\n for mod in self.modules():\n mod_name = mod.node.name\n if mod_name == name:\n return mod\n # search for fullname of relative import modules\n package = node.root().name\n if mod_name == \"%s.%s\" % (package, name):\n return mod\n if mod_name == \"%s.%s\" % (package.rsplit(\".\", 1)[0], name):\n return mod\n raise 
KeyError(name)", "response": "return a module by its name, looking also for relative imports; raise KeyError if not found"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding dependencies created by from-imports", "response": "def add_from_depend(self, node, from_module):\n \"\"\"add dependencies created by from-imports\n \"\"\"\n mod_name = node.root().name\n obj = self.module(mod_name)\n if from_module not in obj.node.depends:\n obj.node.depends.append(from_module)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef extract_relationships(self):\n ClassDiagram.extract_relationships(self)\n for obj in self.classes():\n # ownership\n try:\n mod = self.object_from_node(obj.node.root())\n self.add_relationship(obj, mod, \"ownership\")\n except KeyError:\n continue\n for obj in self.modules():\n obj.shape = \"package\"\n # dependencies\n for dep_name in obj.node.depends:\n try:\n dep = self.get_module(dep_name, obj.node)\n except KeyError:\n continue\n self.add_relationship(obj, dep, \"depends\")", "response": "extract relationships between classes and modules in the diagram"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nvisit function and method definitions (def)?", "response": "def visit_functiondef(self, node):\n \"\"\"Called for function and method definitions (def).\n\n :param node: Node for a function or method definition in the AST\n :type node: :class:`astroid.scoped_nodes.Function`\n \"\"\"\n node_doc = utils.docstringify(node.doc, self.config.default_docstring_type)\n self.check_functiondef_params(node, node_doc)\n self.check_functiondef_returns(node, node_doc)\n self.check_functiondef_yields(node, node_doc)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking that all parameters in a function, method or class constructor are documented in the docstring.", "response": "def check_arguments_in_docstring(\n
self, doc, arguments_node, warning_node, accept_no_param_doc=None\n ):\n \"\"\"Check that all parameters in a function, method or class constructor\n on the one hand and the parameters mentioned in the parameter\n documentation (e.g. the Sphinx tags 'param' and 'type') on the other\n hand are consistent with each other.\n\n * Undocumented parameters except 'self' are noticed.\n * Undocumented parameter types except for 'self' and the ``*``\n and ``**`` parameters are noticed.\n * Parameters mentioned in the parameter documentation that don't or no\n longer exist in the function parameter list are noticed.\n * If the text \"For the parameters, see\" or \"For the other parameters,\n see\" (ignoring additional whitespace) is mentioned in the docstring,\n missing parameter documentation is tolerated.\n * If there's no Sphinx style, Google style or NumPy style parameter\n documentation at all, i.e. ``:param`` is never mentioned etc., the\n checker assumes that the parameters are documented in another format\n and the absence is tolerated.\n\n :param doc: Docstring for the function, method or class.\n :type doc: str\n\n :param arguments_node: Arguments node for the function, method or\n class constructor.\n :type arguments_node: :class:`astroid.scoped_nodes.Arguments`\n\n :param warning_node: The node to assign the warnings to\n :type warning_node: :class:`astroid.scoped_nodes.Node`\n\n :param accept_no_param_doc: Whether or not to allow no parameters\n to be documented.\n If None then this value is read from the configuration.\n :type accept_no_param_doc: bool or None\n \"\"\"\n # Tolerate missing param or type declarations if there is a link to\n # another method carrying the same name.\n if not doc.doc:\n return\n\n if accept_no_param_doc is None:\n accept_no_param_doc = self.config.accept_no_param_doc\n tolerate_missing_params = doc.params_documented_elsewhere()\n\n # Collect the function arguments.\n expected_argument_names = {arg.name for arg in 
arguments_node.args}\n expected_argument_names.update(arg.name for arg in arguments_node.kwonlyargs)\n not_needed_type_in_docstring = self.not_needed_param_in_docstring.copy()\n\n if arguments_node.vararg is not None:\n expected_argument_names.add(arguments_node.vararg)\n not_needed_type_in_docstring.add(arguments_node.vararg)\n if arguments_node.kwarg is not None:\n expected_argument_names.add(arguments_node.kwarg)\n not_needed_type_in_docstring.add(arguments_node.kwarg)\n params_with_doc, params_with_type = doc.match_param_docs()\n\n # Tolerate no parameter documentation at all.\n if not params_with_doc and not params_with_type and accept_no_param_doc:\n tolerate_missing_params = True\n\n def _compare_missing_args(found_argument_names, message_id, not_needed_names):\n \"\"\"Compare the found argument names with the expected ones and\n generate a message if there are arguments missing.\n\n :param set found_argument_names: argument names found in the\n docstring\n\n :param str message_id: pylint message id\n\n :param not_needed_names: names that may be omitted\n :type not_needed_names: set of str\n \"\"\"\n if not tolerate_missing_params:\n missing_argument_names = (\n expected_argument_names - found_argument_names\n ) - not_needed_names\n if missing_argument_names:\n self.add_message(\n message_id,\n args=(\", \".join(sorted(missing_argument_names)),),\n node=warning_node,\n )\n\n def _compare_different_args(found_argument_names, message_id, not_needed_names):\n \"\"\"Compare the found argument names with the expected ones and\n generate a message if there are extra arguments found.\n\n :param set found_argument_names: argument names found in the\n docstring\n\n :param str message_id: pylint message id\n\n :param not_needed_names: names that may be omitted\n :type not_needed_names: set of str\n \"\"\"\n differing_argument_names = (\n (expected_argument_names ^ found_argument_names)\n - not_needed_names\n - expected_argument_names\n )\n\n if 
differing_argument_names:\n self.add_message(\n message_id,\n args=(\", \".join(sorted(differing_argument_names)),),\n node=warning_node,\n )\n\n _compare_missing_args(\n params_with_doc, \"missing-param-doc\", self.not_needed_param_in_docstring\n )\n\n for index, arg_name in enumerate(arguments_node.args):\n if arguments_node.annotations[index]:\n params_with_type.add(arg_name.name)\n\n _compare_missing_args(\n params_with_type, \"missing-type-doc\", not_needed_type_in_docstring\n )\n\n _compare_different_args(\n params_with_doc, \"differing-param-doc\", self.not_needed_param_in_docstring\n )\n _compare_different_args(\n params_with_type, \"differing-type-doc\", not_needed_type_in_docstring\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _add_raise_message(self, missing_excs, node):\n if node.is_abstract():\n try:\n missing_excs.remove(\"NotImplementedError\")\n except KeyError:\n pass\n\n if not missing_excs:\n return\n\n self.add_message(\n \"missing-raises-doc\", args=(\", \".join(sorted(missing_excs)),), node=node\n )", "response": "Adds a message to the log for missing exceptions."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nbinding a provider to a Flask application instance that can use the grant token for the current user.", "response": "def bind_cache_grant(app, provider, current_user, config_prefix='OAUTH2'):\n \"\"\"Configures an :class:`OAuth2Provider` instance to use various caching\n systems to get and set the grant token. 
This removes the need to\n register :func:`grantgetter` and :func:`grantsetter` yourself.\n\n :param app: Flask application instance\n :param provider: :class:`OAuth2Provider` instance\n :param current_user: function that returns an :class:`User` object\n :param config_prefix: prefix for config\n\n A usage example::\n\n oauth = OAuth2Provider(app)\n app.config.update({'OAUTH2_CACHE_TYPE': 'redis'})\n\n bind_cache_grant(app, oauth, current_user)\n\n You can define which cache system you would like to use by setting the\n following configuration option::\n\n OAUTH2_CACHE_TYPE = 'null' // memcache, simple, redis, filesystem\n\n For more information on the supported cache systems please visit:\n `Cache `_\n \"\"\"\n cache = Cache(app, config_prefix)\n\n @provider.grantsetter\n def create_grant(client_id, code, request, *args, **kwargs):\n \"\"\"Sets the grant token with the configured cache system\"\"\"\n grant = Grant(\n cache,\n client_id=client_id,\n code=code['code'],\n redirect_uri=request.redirect_uri,\n scopes=request.scopes,\n user=current_user(),\n )\n log.debug(\"Set Grant Token with key %s\" % grant.key)\n cache.set(grant.key, dict(grant))\n\n @provider.grantgetter\n def get(client_id, code):\n \"\"\"Gets the grant token with the configured cache system\"\"\"\n grant = Grant(cache, client_id=client_id, code=code)\n ret = cache.get(grant.key)\n if not ret:\n log.debug(\"Grant Token not found with key %s\" % grant.key)\n return None\n log.debug(\"Grant Token found with key %s\" % grant.key)\n for k, v in ret.items():\n setattr(grant, k, v)\n return grant"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef bind_sqlalchemy(provider, session, user=None, client=None,\n token=None, grant=None, current_user=None):\n \"\"\"Configures the given :class:`OAuth2Provider` instance with the\n required getters and setters for persistence with SQLAlchemy.\n\n An example of using all models::\n\n oauth = 
OAuth2Provider(app)\n\n bind_sqlalchemy(oauth, session, user=User, client=Client,\n token=Token, grant=Grant, current_user=current_user)\n\n You can omit any model if you wish to register the functions yourself.\n It is also possible to override the functions by registering them\n afterwards::\n\n oauth = OAuth2Provider(app)\n\n bind_sqlalchemy(oauth, session, user=User, client=Client, token=Token)\n\n @oauth.grantgetter\n def get_grant(client_id, code):\n pass\n\n @oauth.grantsetter\n def set_grant(client_id, code, request, *args, **kwargs):\n pass\n\n # register tokensetter with oauth but keeping the tokengetter\n # registered by `SQLAlchemyBinding`\n # You would only do this for the token and grant since user and client\n # only have getters\n @oauth.tokensetter\n def set_token(token, request, *args, **kwargs):\n pass\n\n Note that current_user is only required if you're using SQLAlchemy\n for grant caching. If you're using another caching system with\n GrantCacheBinding instead, omit current_user.\n\n :param provider: :class:`OAuth2Provider` instance\n :param session: A :class:`Session` object\n :param user: :class:`User` model\n :param client: :class:`Client` model\n :param token: :class:`Token` model\n :param grant: :class:`Grant` model\n :param current_user: function that returns a :class:`User` object\n \"\"\"\n if user:\n user_binding = UserBinding(user, session)\n provider.usergetter(user_binding.get)\n\n if client:\n client_binding = ClientBinding(client, session)\n provider.clientgetter(client_binding.get)\n\n if token:\n token_binding = TokenBinding(token, session, current_user)\n provider.tokengetter(token_binding.get)\n provider.tokensetter(token_binding.set)\n\n if grant:\n if not current_user:\n raise ValueError(('`current_user` is required'\n 'for Grant Binding'))\n grant_binding = GrantBinding(grant, session, current_user)\n provider.grantgetter(grant_binding.get)\n provider.grantsetter(grant_binding.set)", "response": "Configures the given 
OAuth2Provider instance with the getters and setters required for persistence with SQLAlchemy."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef delete(self):\n log.debug(\n \"Deleting grant %s for client %s\" % (self.code, self.client_id)\n )\n self._cache.delete(self.key)\n return None", "response": "Removes itself from the cache"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndetermines which method to use to get the query object", "response": "def query(self):\n \"\"\"Determines which method to use to get the query object\"\"\"\n if hasattr(self.model, 'query'):\n return self.model.query\n else:\n return self.session.query(self.model)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the User object for the given username and password; returns None if the user isn't found or the password doesn't match", "response": "def get(self, username, password, *args, **kwargs):\n \"\"\"Returns the User object\n\n Returns None if the user isn't found or the passwords don't match\n\n :param username: username of the user\n :param password: password of the user\n \"\"\"\n user = self.query.filter_by(username=username).first()\n if user and user.check_password(password):\n return user\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn a Token object with the given access token or refresh token", "response": "def get(self, access_token=None, refresh_token=None):\n \"\"\"returns a Token object with the given access token or refresh token\n\n :param access_token: User's access token\n :param refresh_token: User's refresh token\n \"\"\"\n if access_token:\n return self.query.filter_by(access_token=access_token).first()\n elif refresh_token:\n return self.query.filter_by(refresh_token=refresh_token).first()\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 
function that can\ncreate a Token object and removes all expired tokens that belong to the user.", "response": "def set(self, token, request, *args, **kwargs):\n \"\"\"Creates a Token object and removes all expired tokens that belong\n to the user\n\n :param token: token object\n :param request: OAuthlib request object\n \"\"\"\n if hasattr(request, 'user') and request.user:\n user = request.user\n elif self.current_user:\n # for implicit token\n user = self.current_user()\n\n client = request.client\n\n tokens = self.query.filter_by(\n client_id=client.client_id,\n user_id=user.id).all()\n if tokens:\n for tk in tokens:\n self.session.delete(tk)\n self.session.commit()\n\n expires_in = token.get('expires_in')\n expires = datetime.utcnow() + timedelta(seconds=expires_in)\n\n tok = self.model(**token)\n tok.expires = expires\n tok.client_id = client.client_id\n tok.user_id = user.id\n\n self.session.add(tok)\n self.session.commit()\n return tok"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef set(self, client_id, code, request, *args, **kwargs):\n expires = datetime.utcnow() + timedelta(seconds=100)\n grant = self.model(\n client_id=request.client.client_id,\n code=code['code'],\n redirect_uri=request.redirect_uri,\n scope=' '.join(request.scopes),\n user=self.current_user(),\n expires=expires\n )\n self.session.add(grant)\n\n self.session.commit()", "response": "Creates a Grant object with the given params and saves it in the session"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get(self, client_id, code):\n return self.query.filter_by(client_id=client_id, code=code).first()", "response": "Get the Grant object with the given client ID and code"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nparse the response returned by the OAuthRemoteApp. 
http_request method.", "response": "def parse_response(resp, content, strict=False, content_type=None):\n \"\"\"Parse the response returned by :meth:`OAuthRemoteApp.http_request`.\n\n :param resp: response of http_request\n :param content: content of the response\n :param strict: strict mode for form urlencoded content\n :param content_type: assign a content type manually\n \"\"\"\n if not content_type:\n content_type = resp.headers.get('content-type', 'application/json')\n ct, options = parse_options_header(content_type)\n\n if ct in ('application/json', 'text/javascript'):\n if not content:\n return {}\n return json.loads(content)\n\n if ct in ('application/xml', 'text/xml'):\n return get_etree().fromstring(content)\n\n if ct != 'application/x-www-form-urlencoded' and strict:\n return content\n charset = options.get('charset', 'utf-8')\n return url_decode(content, charset=charset).to_dict()"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nmake request parameters right.", "response": "def prepare_request(uri, headers=None, data=None, method=None):\n \"\"\"Make request parameters right.\"\"\"\n if headers is None:\n headers = {}\n\n if data and not method:\n method = 'POST'\n elif not method:\n method = 'GET'\n\n if method == 'GET' and data:\n uri = add_params_to_uri(uri, data)\n data = None\n\n return uri, headers, data, method"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nregistering a new remote application.", "response": "def remote_app(self, name, register=True, **kwargs):\n \"\"\"Registers a new remote application.\n\n :param name: the name of the remote application\n :param register: whether the remote app will be registered\n\n Find more parameters from :class:`OAuthRemoteApp`.\n \"\"\"\n remote = OAuthRemoteApp(self, name, **kwargs)\n if register:\n assert name not in self.remote_apps\n self.remote_apps[name] = remote\n return remote"} {"SOURCE": "codesearchnet", "instruction": 
"Write a Python 3 function that can\nsend a request to the remote server with OAuth tokens attached.", "response": "def request(self, url, data=None, headers=None, format='urlencoded',\n method='GET', content_type=None, token=None):\n \"\"\"\n Sends a request to the remote server with OAuth tokens attached.\n\n :param data: the data to be sent to the server.\n :param headers: an optional dictionary of headers.\n :param format: the format for the `data`. Can be `urlencoded` for\n URL encoded data or `json` for JSON.\n :param method: the HTTP request method to use.\n :param content_type: an optional content type. If a content type\n is provided, the data is passed as it, and\n the `format` is ignored.\n :param token: an optional token to pass, if it is None, token will\n be generated by tokengetter.\n \"\"\"\n\n headers = dict(headers or {})\n if token is None:\n token = self.get_request_token()\n\n client = self.make_client(token)\n url = self.expand_url(url)\n if method == 'GET':\n assert format == 'urlencoded'\n if data:\n url = add_params_to_uri(url, data)\n data = None\n else:\n if content_type is None:\n data, content_type = encode_request_data(data, format)\n if content_type is not None:\n headers['Content-Type'] = content_type\n\n if self.request_token_url:\n # oauth1\n uri, headers, body = client.sign(\n url, http_method=method, body=data, headers=headers\n )\n else:\n # oauth2\n uri, headers, body = client.add_token(\n url, http_method=method, body=data, headers=headers\n )\n\n if hasattr(self, 'pre_request'):\n # This is designed for some rubbish services like weibo.\n # Since they don't follow the standards, we need to\n # change the uri, headers, or body.\n uri, headers, body = self.pre_request(uri, headers, body)\n\n if body:\n data = to_bytes(body, self.encoding)\n else:\n data = None\n resp, content = self.http_request(\n uri, headers, data=to_bytes(body, self.encoding), method=method\n )\n return OAuthResponse(resp, content, self.content_type)"} 
{"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef authorize(self, callback=None, state=None, **kwargs):\n params = dict(self.request_token_params) or {}\n params.update(**kwargs)\n\n if self.request_token_url:\n token = self.generate_request_token(callback)[0]\n url = '%s?oauth_token=%s' % (\n self.expand_url(self.authorize_url), url_quote(token)\n )\n if params:\n url += '&' + url_encode(params)\n else:\n assert callback is not None, 'Callback is required for OAuth2'\n\n client = self.make_client()\n\n if 'scope' in params:\n scope = params.pop('scope')\n else:\n scope = None\n\n if isinstance(scope, str):\n # oauthlib need unicode\n scope = _encode(scope, self.encoding)\n\n if 'state' in params:\n if not state:\n state = params.pop('state')\n else:\n # remove state in params\n params.pop('state')\n\n if callable(state):\n # state can be function for generate a random string\n state = state()\n\n session['%s_oauthredir' % self.name] = callback\n url = client.prepare_request_uri(\n self.expand_url(self.authorize_url),\n redirect_uri=callback,\n scope=scope,\n state=state,\n **params\n )\n return redirect(url)", "response": "Returns a redirect response to the remote authorization URL with the signed callback given."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nhandling an oauth1 authorization response.", "response": "def handle_oauth1_response(self, args):\n \"\"\"Handles an oauth1 authorization response.\"\"\"\n client = self.make_client()\n client.verifier = args.get('oauth_verifier')\n tup = session.get('%s_oauthtok' % self.name)\n if not tup:\n raise OAuthException(\n 'Token not found, maybe you disabled cookie',\n type='token_not_found'\n )\n client.resource_owner_key = tup[0]\n client.resource_owner_secret = tup[1]\n\n uri, headers, data = client.sign(\n self.expand_url(self.access_token_url),\n _encode(self.access_token_method)\n )\n headers.update(self._access_token_headers)\n\n 
resp, content = self.http_request(\n uri, headers, to_bytes(data, self.encoding),\n method=self.access_token_method\n )\n data = parse_response(resp, content)\n if resp.code not in (200, 201):\n raise OAuthException(\n 'Invalid response from %s' % self.name,\n type='invalid_response', data=data\n )\n return data"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nhandle an oauth2 response.", "response": "def handle_oauth2_response(self, args):\n \"\"\"Handles an oauth2 authorization response.\"\"\"\n\n client = self.make_client()\n remote_args = {\n 'code': args.get('code'),\n 'client_secret': self.consumer_secret,\n 'redirect_uri': session.get('%s_oauthredir' % self.name)\n }\n log.debug('Prepare oauth2 remote args %r', remote_args)\n remote_args.update(self.access_token_params)\n headers = copy(self._access_token_headers)\n if self.access_token_method == 'POST':\n headers.update({'Content-Type': 'application/x-www-form-urlencoded'})\n body = client.prepare_request_body(**remote_args)\n resp, content = self.http_request(\n self.expand_url(self.access_token_url),\n headers=headers,\n data=to_bytes(body, self.encoding),\n method=self.access_token_method,\n )\n elif self.access_token_method == 'GET':\n qs = client.prepare_request_body(**remote_args)\n url = self.expand_url(self.access_token_url)\n url += ('?' 
in url and '&' or '?') + qs\n resp, content = self.http_request(\n url,\n headers=headers,\n method=self.access_token_method,\n )\n else:\n raise OAuthException(\n 'Unsupported access_token_method: %s' %\n self.access_token_method\n )\n\n data = parse_response(resp, content, content_type=self.content_type)\n if resp.code not in (200, 201):\n raise OAuthException(\n 'Invalid response from %s' % self.name,\n type='invalid_response', data=data\n )\n return data"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef authorized_response(self, args=None):\n if args is None:\n args = request.args\n if 'oauth_verifier' in args:\n data = self.handle_oauth1_response(args)\n elif 'code' in args:\n data = self.handle_oauth2_response(args)\n else:\n data = self.handle_unknown_response()\n\n # free request token\n session.pop('%s_oauthtok' % self.name, None)\n session.pop('%s_oauthredir' % self.name, None)\n return data", "response": "Handles authorization response smartly."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nhandle an OAuth callback. .. versionchanged:: 0.7 @authorized_handler is deprecated in favor of authorized_response.", "response": "def authorized_handler(self, f):\n \"\"\"Handles an OAuth callback.\n\n .. 
versionchanged:: 0.7\n @authorized_handler is deprecated in favor of authorized_response.\n \"\"\"\n @wraps(f)\n def decorated(*args, **kwargs):\n log.warn(\n '@authorized_handler is deprecated in favor of '\n 'authorized_response'\n )\n data = self.authorized_response()\n return f(*((data,) + args), **kwargs)\n return decorated"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _hash_token(application, token):\n if isinstance(token, dict):\n hashed_token = tuple(sorted(token.items()))\n elif isinstance(token, tuple):\n hashed_token = token\n else:\n raise TypeError('%r is unknown type of token' % token)\n\n return (application.__class__.__name__, application.name, hashed_token)", "response": "Creates a hashable object for given token then we could use it as a\n dictionary key."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns a new OAuth session with the return value of .", "response": "def client(self):\n \"\"\"The lazy-created OAuth session with the return value of\n :meth:`tokengetter`.\n\n :returns: The OAuth session instance or ``None`` while token missing.\n \"\"\"\n token = self.obtain_token()\n if token is None:\n raise AccessTokenNotFound\n return self._make_client_with_token(token)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _make_client_with_token(self, token):\n cached_clients = getattr(self, 'clients', None)\n hashed_token = _hash_token(self, token)\n\n if cached_clients and hashed_token in cached_clients:\n return cached_clients[hashed_token]\n\n client = self.make_client(token) # implemented in subclasses\n if cached_clients:\n cached_clients[hashed_token] = client\n\n return client", "response": "Uses cached client or create new one with specific token."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef 
make_client(self, token):\n if isinstance(token, dict):\n access_token = token['oauth_token']\n access_token_secret = token['oauth_token_secret']\n else:\n access_token, access_token_secret = token\n return self.make_oauth_session(\n resource_owner_key=access_token,\n resource_owner_secret=access_token_secret)", "response": "Creates a client with specific access token pair."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef insecure_transport(self):\n origin = os.environ.get('OAUTHLIB_INSECURE_TRANSPORT')\n if current_app.debug or current_app.testing:\n try:\n os.environ['OAUTHLIB_INSECURE_TRANSPORT'] = '1'\n yield\n finally:\n if origin:\n os.environ['OAUTHLIB_INSECURE_TRANSPORT'] = origin\n else:\n os.environ.pop('OAUTHLIB_INSECURE_TRANSPORT', None)\n else:\n if origin:\n warnings.warn(\n 'OAUTHLIB_INSECURE_TRANSPORT has been found in os.environ '\n 'but the app is not running in debug mode or testing mode.'\n ' It may put you in danger of the Man-in-the-middle attack'\n ' while using OAuth 2.', RuntimeWarning)\n yield", "response": "Creates a context to enable insecure transport."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef server(self):\n if hasattr(self, '_validator'):\n return Server(self._validator)\n\n if hasattr(self, '_clientgetter') and \\\n hasattr(self, '_tokengetter') and \\\n hasattr(self, '_tokensetter') and \\\n hasattr(self, '_noncegetter') and \\\n hasattr(self, '_noncesetter') and \\\n hasattr(self, '_grantgetter') and \\\n hasattr(self, '_grantsetter') and \\\n hasattr(self, '_verifiergetter') and \\\n hasattr(self, '_verifiersetter'):\n\n validator = OAuth1RequestValidator(\n clientgetter=self._clientgetter,\n tokengetter=self._tokengetter,\n tokensetter=self._tokensetter,\n grantgetter=self._grantgetter,\n grantsetter=self._grantsetter,\n noncegetter=self._noncegetter,\n noncesetter=self._noncesetter,\n 
verifiergetter=self._verifiergetter,\n verifiersetter=self._verifiersetter,\n config=self.app.config,\n )\n\n self._validator = validator\n server = Server(validator)\n if self.app.testing:\n # It will always be false, since the redirect_uri\n # didn't match when doing the testing\n server._check_signature = lambda *args, **kwargs: True\n return server\n raise RuntimeError(\n 'application not bound to required getters and setters'\n )", "response": "Returns the server object for the current application."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef request_token_handler(self, f):\n @wraps(f)\n def decorated(*args, **kwargs):\n server = self.server\n uri, http_method, body, headers = extract_params()\n credentials = f(*args, **kwargs)\n try:\n ret = server.create_request_token_response(\n uri, http_method, body, headers, credentials)\n return create_response(*ret)\n except errors.OAuth1Error as e:\n return _error_response(e)\n return decorated", "response": "Decorator that wraps a function to return a dictionary or None as\n "} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nprotect resource with specified scopes.", "response": "def require_oauth(self, *realms, **kwargs):\n \"\"\"Protect resource with specified scopes.\"\"\"\n def wrapper(f):\n @wraps(f)\n def decorated(*args, **kwargs):\n for func in self._before_request_funcs:\n func()\n\n if hasattr(request, 'oauth') and request.oauth:\n return f(*args, **kwargs)\n\n server = self.server\n uri, http_method, body, headers = extract_params()\n try:\n valid, req = server.validate_protected_resource_request(\n uri, http_method, body, headers, realms\n )\n except Exception as e:\n log.warn('Exception: %r', e)\n e.urlencoded = urlencode([('error', 'unknown')])\n e.status_code = 400\n return _error_response(e)\n for func in self._after_request_funcs:\n valid, req = func(valid, req)\n\n if not valid:\n return abort(401)\n # 
alias user for convenience\n req.user = req.access_token.user\n request.oauth = req\n return f(*args, **kwargs)\n return decorated\n return wrapper"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting the client secret for a given client key.", "response": "def get_client_secret(self, client_key, request):\n \"\"\"Get client secret.\n\n The client object must have a ``client_secret`` attribute.\n \"\"\"\n log.debug('Get client secret of %r', client_key)\n if not request.client:\n request.client = self._clientgetter(client_key=client_key)\n if request.client:\n return request.client.client_secret\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget the request token secret for the given token.", "response": "def get_request_token_secret(self, client_key, token, request):\n \"\"\"Get request token secret.\n\n The request token object should have a ``secret`` attribute.\n \"\"\"\n log.debug('Get request token secret of %r for %r',\n token, client_key)\n tok = request.request_token or self._grantgetter(token=token)\n if tok and tok.client_key == client_key:\n request.request_token = tok\n return tok.secret\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_access_token_secret(self, client_key, token, request):\n log.debug('Get access token secret of %r for %r',\n token, client_key)\n tok = request.access_token or self._tokengetter(\n client_key=client_key,\n token=token,\n )\n if tok:\n request.access_token = tok\n return tok.secret\n return None", "response": "Get the secret of an access token."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_default_realms(self, client_key, request):\n log.debug('Get realms for %r', client_key)\n\n if not request.client:\n request.client = self._clientgetter(client_key=client_key)\n\n client = request.client\n if hasattr(client, 
'default_realms'):\n return client.default_realms\n return []", "response": "Get the default realms of the client."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_realms(self, token, request):\n log.debug('Get realms of %r', token)\n tok = request.request_token or self._grantgetter(token=token)\n if not tok:\n return []\n request.request_token = tok\n if hasattr(tok, 'realms'):\n return tok.realms or []\n return []", "response": "Returns a list of realms for the given request token."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the redirect uri for this request token.", "response": "def get_redirect_uri(self, token, request):\n \"\"\"Redirect uri for this request token.\"\"\"\n log.debug('Get redirect uri of %r', token)\n tok = request.request_token or self._grantgetter(token=token)\n return tok.redirect_uri"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nretrieve a previously stored client provided RSA key.", "response": "def get_rsa_key(self, client_key, request):\n \"\"\"Retrieves a previously stored client provided RSA key.\"\"\"\n if not request.client:\n request.client = self._clientgetter(client_key=client_key)\n if hasattr(request.client, 'rsa_key'):\n return request.client.rsa_key\n return None"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nvalidates that the client key is valid.", "response": "def validate_client_key(self, client_key, request):\n \"\"\"Validates that supplied client key.\"\"\"\n log.debug('Validate client key for %r', client_key)\n if not request.client:\n request.client = self._clientgetter(client_key=client_key)\n if request.client:\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nvalidates request token is available for client.", "response": "def validate_request_token(self, client_key, 
token, request):\n \"\"\"Validates request token is available for client.\"\"\"\n log.debug('Validate request token %r for %r',\n token, client_key)\n tok = request.request_token or self._grantgetter(token=token)\n if tok and tok.client_key == client_key:\n request.request_token = tok\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef validate_access_token(self, client_key, token, request):\n log.debug('Validate access token %r for %r',\n token, client_key)\n tok = request.access_token or self._tokengetter(\n client_key=client_key,\n token=token,\n )\n if tok:\n request.access_token = tok\n return True\n return False", "response": "Validates that the given token is available for the given client."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef validate_timestamp_and_nonce(self, client_key, timestamp, nonce,\n request, request_token=None,\n access_token=None):\n \"\"\"Validate the timestamp and nonce is used or not.\"\"\"\n log.debug('Validate timestamp and nonce %r', client_key)\n nonce_exists = self._noncegetter(\n client_key=client_key, timestamp=timestamp,\n nonce=nonce, request_token=request_token,\n access_token=access_token\n )\n if nonce_exists:\n return False\n self._noncesetter(\n client_key=client_key, timestamp=timestamp,\n nonce=nonce, request_token=request_token,\n access_token=access_token\n )\n return True", "response": "Validate the timestamp and nonce is used or not."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nvalidates if the redirect_uri is allowed by the client.", "response": "def validate_redirect_uri(self, client_key, redirect_uri, request):\n \"\"\"Validate if the redirect_uri is allowed by the client.\"\"\"\n log.debug('Validate redirect_uri %r for %r', redirect_uri, client_key)\n if not request.client:\n request.client = self._clientgetter(client_key=client_key)\n if not 
request.client:\n return False\n if not request.client.redirect_uris and redirect_uri is None:\n return True\n request.redirect_uri = redirect_uri\n return redirect_uri in request.client.redirect_uris"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking if the token has permission on those realms.", "response": "def validate_realms(self, client_key, token, request, uri=None,\n realms=None):\n \"\"\"Check if the token has permission on those realms.\"\"\"\n log.debug('Validate realms %r for %r', realms, client_key)\n if request.access_token:\n tok = request.access_token\n else:\n tok = self._tokengetter(client_key=client_key, token=token)\n request.access_token = tok\n if not tok:\n return False\n return set(tok.realms).issuperset(set(realms))"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nvalidates that a verifier exists for a given client key and token.", "response": "def validate_verifier(self, client_key, token, verifier, request):\n \"\"\"Validate verifier exists.\"\"\"\n log.debug('Validate verifier %r for %r', verifier, client_key)\n data = self._verifiergetter(verifier=verifier, token=token)\n if not data:\n return False\n if not hasattr(data, 'user'):\n log.debug('Verifier should have a user attribute')\n return False\n request.user = data.user\n if hasattr(data, 'client_key'):\n return data.client_key == client_key\n return True"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nverify that the request token exists.", "response": "def verify_request_token(self, token, request):\n \"\"\"Verify that the request token exists.\"\"\"\n log.debug('Verify request token %r', token)\n tok = request.request_token or self._grantgetter(token=token)\n if tok:\n request.request_token = tok\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nverifies if the realms match the requested realms.", 
"response": "def verify_realms(self, token, realms, request):\n \"\"\"Verify if the realms match the requested realms.\"\"\"\n log.debug('Verify realms %r', realms)\n tok = request.request_token or self._grantgetter(token=token)\n if not tok:\n return False\n\n request.request_token = tok\n if not hasattr(tok, 'realms'):\n # realms not enabled\n return True\n return set(tok.realms) == set(realms)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsaves an access token to database.", "response": "def save_access_token(self, token, request):\n \"\"\"Save access token to database.\n\n A tokensetter is required, which accepts a token and request\n parameters::\n\n def tokensetter(token, request):\n access_token = Token(\n client=request.client,\n user=request.user,\n token=token['oauth_token'],\n secret=token['oauth_token_secret'],\n realms=token['oauth_authorized_realms'],\n )\n return access_token.save()\n \"\"\"\n log.debug('Save access token %r', token)\n self._tokensetter(token, request)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsave a request token to database.", "response": "def save_request_token(self, token, request):\n \"\"\"Save request token to database.\n\n A grantsetter is required, which accepts a token and request\n parameters::\n\n def grantsetter(token, request):\n grant = Grant(\n token=token['oauth_token'],\n secret=token['oauth_token_secret'],\n client=request.client,\n redirect_uri=oauth.redirect_uri,\n realms=request.realms,\n )\n return grant.save()\n \"\"\"\n log.debug('Save request token %r', token)\n self._grantsetter(token, request)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsaving verifier to database.", "response": "def save_verifier(self, token, verifier, request):\n \"\"\"Save verifier to database.\n\n A verifiersetter is required. 
It would be better to combine request\n token and verifier together::\n\n def verifiersetter(token, verifier, request):\n tok = Grant.query.filter_by(token=token).first()\n tok.verifier = verifier['oauth_verifier']\n tok.user = get_current_user()\n return tok.save()\n\n .. admonition:: Note:\n\n A user is required on verifier, remember to attach current\n user to verifier.\n \"\"\"\n log.debug('Save verifier %r for %r', verifier, token)\n self._verifiersetter(\n token=token, verifier=verifier, request=request\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef server(self):\n expires_in = self.app.config.get('OAUTH2_PROVIDER_TOKEN_EXPIRES_IN')\n token_generator = self.app.config.get(\n 'OAUTH2_PROVIDER_TOKEN_GENERATOR', None\n )\n if token_generator and not callable(token_generator):\n token_generator = import_string(token_generator)\n\n refresh_token_generator = self.app.config.get(\n 'OAUTH2_PROVIDER_REFRESH_TOKEN_GENERATOR', None\n )\n if refresh_token_generator and not callable(refresh_token_generator):\n refresh_token_generator = import_string(refresh_token_generator)\n\n if hasattr(self, '_validator'):\n return Server(\n self._validator,\n token_expires_in=expires_in,\n token_generator=token_generator,\n refresh_token_generator=refresh_token_generator,\n )\n\n if hasattr(self, '_clientgetter') and \\\n hasattr(self, '_tokengetter') and \\\n hasattr(self, '_tokensetter') and \\\n hasattr(self, '_grantgetter') and \\\n hasattr(self, '_grantsetter'):\n\n usergetter = None\n if hasattr(self, '_usergetter'):\n usergetter = self._usergetter\n\n validator_class = self._validator_class\n if validator_class is None:\n validator_class = OAuth2RequestValidator\n validator = validator_class(\n clientgetter=self._clientgetter,\n tokengetter=self._tokengetter,\n grantgetter=self._grantgetter,\n usergetter=usergetter,\n tokensetter=self._tokensetter,\n grantsetter=self._grantsetter,\n )\n 
self._validator = validator\n return Server(\n validator,\n token_expires_in=expires_in,\n token_generator=token_generator,\n refresh_token_generator=refresh_token_generator,\n )\n raise RuntimeError('application not bound to required getters')", "response": "Return a new instance of the server class."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef verify_request(self, scopes):\n uri, http_method, body, headers = extract_params()\n return self.server.verify_request(\n uri, http_method, body, headers, scopes\n )", "response": "Verify current request, get the oauth data.\n\n If you can't use the ``require_oauth`` decorator, you can fetch\n the data in your request body::\n\n def your_handler():\n valid, req = oauth.verify_request(['email'])\n if valid:\n return jsonify(user=req.user)\n return jsonify(status='error')"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef token_handler(self, f):\n @wraps(f)\n def decorated(*args, **kwargs):\n server = self.server\n uri, http_method, body, headers = extract_params()\n credentials = f(*args, **kwargs) or {}\n log.debug('Fetched extra credentials, %r.', credentials)\n ret = server.create_token_response(\n uri, http_method, body, headers, credentials\n )\n return create_response(*ret)\n return decorated", "response": "Decorator to decorate a function to return a dictionary or None as\n is the extra credentials for creating the token response."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef require_oauth(self, *scopes):\n def wrapper(f):\n @wraps(f)\n def decorated(*args, **kwargs):\n for func in self._before_request_funcs:\n func()\n\n if hasattr(request, 'oauth') and request.oauth:\n return f(*args, **kwargs)\n\n valid, req = self.verify_request(scopes)\n\n for func in self._after_request_funcs:\n valid, req = func(valid, req)\n\n if not valid:\n if 
self._invalid_response:\n return self._invalid_response(req)\n return abort(401)\n request.oauth = req\n return f(*args, **kwargs)\n return decorated\n return wrapper", "response": "Protect resource with specified scopes."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning client credentials based on the current request.", "response": "def _get_client_creds_from_request(self, request):\n \"\"\"Return client credentials based on the current request.\n\n According to the rfc6749, client MAY use the HTTP Basic authentication\n scheme as defined in [RFC2617] to authenticate with the authorization\n server. The client identifier is encoded using the\n \"application/x-www-form-urlencoded\" encoding algorithm per Appendix B,\n and the encoded value is used as the username; the client password is\n encoded using the same algorithm and used as the password. The\n authorization server MUST support the HTTP Basic authentication scheme\n for authenticating clients that were issued a client password.\n See `Section 2.3.1`_.\n\n .. 
_`Section 2.3.1`: https://tools.ietf.org/html/rfc6749#section-2.3.1\n \"\"\"\n if request.client_id is not None:\n return request.client_id, request.client_secret\n\n auth = request.headers.get('Authorization')\n # If Werkzeug successfully parsed the Authorization header,\n # `extract_params` helper will replace the header with a parsed dict,\n # otherwise, there is nothing useful in the header and we just skip it.\n if isinstance(auth, dict):\n return auth['username'], auth['password']\n\n return None, None"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ndetermine if client authentication is required for current request.", "response": "def client_authentication_required(self, request, *args, **kwargs):\n \"\"\"Determine if client authentication is required for current request.\n\n According to the rfc6749, client authentication is required in the\n following cases:\n\n Resource Owner Password Credentials Grant: see `Section 4.3.2`_.\n Authorization Code Grant: see `Section 4.1.3`_.\n Refresh Token Grant: see `Section 6`_.\n\n .. _`Section 4.3.2`: http://tools.ietf.org/html/rfc6749#section-4.3.2\n .. _`Section 4.1.3`: http://tools.ietf.org/html/rfc6749#section-4.1.3\n .. 
_`Section 6`: http://tools.ietf.org/html/rfc6749#section-6\n \"\"\"\n def is_confidential(client):\n if hasattr(client, 'is_confidential'):\n return client.is_confidential\n client_type = getattr(client, 'client_type', None)\n if client_type:\n return client_type == 'confidential'\n return True\n\n grant_types = ('password', 'authorization_code', 'refresh_token')\n client_id, _ = self._get_client_creds_from_request(request)\n if client_id and request.grant_type in grant_types:\n client = self._clientgetter(client_id)\n if client:\n return is_confidential(client)\n return False"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef authenticate_client(self, request, *args, **kwargs):\n client_id, client_secret = self._get_client_creds_from_request(request)\n log.debug('Authenticate client %r', client_id)\n\n client = self._clientgetter(client_id)\n if not client:\n log.debug('Authenticate client failed, client not found.')\n return False\n\n request.client = client\n\n # http://tools.ietf.org/html/rfc6749#section-2\n # The client MAY omit the parameter if the client secret is an empty string.\n if hasattr(client, 'client_secret') and client.client_secret != client_secret:\n log.debug('Authenticate client failed, secret not match.')\n return False\n\n log.debug('Authenticate client success.')\n return True", "response": "Authenticate the client by other means."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nauthenticating a non-confidential client.", "response": "def authenticate_client_id(self, client_id, request, *args, **kwargs):\n \"\"\"Authenticate a non-confidential client.\n\n :param client_id: Client ID of the non-confidential client\n :param request: The Request object passed by oauthlib\n \"\"\"\n if client_id is None:\n client_id, _ = self._get_client_creds_from_request(request)\n\n log.debug('Authenticate client %r.', client_id)\n client = request.client or 
self._clientgetter(client_id)\n if not client:\n log.debug('Authenticate failed, client not found.')\n return False\n\n # attach client on request for convenience\n request.client = client\n return True"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nconfirm that a client is authorized to redirect to a given redirect_uri.", "response": "def confirm_redirect_uri(self, client_id, code, redirect_uri, client,\n *args, **kwargs):\n \"\"\"Ensure client is authorized to redirect to the redirect_uri.\n\n This method is used in the authorization code grant flow. It will\n compare redirect_uri and the one in grant token strictly, you can\n add a `validate_redirect_uri` function on grant for a customized\n validation.\n \"\"\"\n client = client or self._clientgetter(client_id)\n log.debug('Confirm redirect uri for client %r and code %r.',\n client.client_id, code)\n grant = self._grantgetter(client_id=client.client_id, code=code)\n if not grant:\n log.debug('Grant not found.')\n return False\n if hasattr(grant, 'validate_redirect_uri'):\n return grant.validate_redirect_uri(redirect_uri)\n log.debug('Compare redirect uri for grant %r and %r.',\n grant.redirect_uri, redirect_uri)\n\n testing = 'OAUTHLIB_INSECURE_TRANSPORT' in os.environ\n if testing and redirect_uri is None:\n # For testing\n return True\n\n return grant.redirect_uri == redirect_uri"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_original_scopes(self, refresh_token, request, *args, **kwargs):\n log.debug('Obtaining scope of refreshed token.')\n tok = self._tokengetter(refresh_token=refresh_token)\n return tok.scopes", "response": "Get the list of scopes associated with the refresh token."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef confirm_scopes(self, refresh_token, scopes, request, *args, **kwargs):\n if not scopes:\n log.debug('Scope omitted 
for refresh token %r', refresh_token)\n return True\n log.debug('Confirm scopes %r for refresh token %r',\n scopes, refresh_token)\n tok = self._tokengetter(refresh_token=refresh_token)\n return set(tok.scopes) == set(scopes)", "response": "Ensures the requested scope is equal to the scope originally granted by the resource owner."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the default redirect_uri for the given client.", "response": "def get_default_redirect_uri(self, client_id, request, *args, **kwargs):\n \"\"\"Default redirect_uri for the given client.\"\"\"\n request.client = request.client or self._clientgetter(client_id)\n redirect_uri = request.client.default_redirect_uri\n log.debug('Found default redirect uri %r', redirect_uri)\n return redirect_uri"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_default_scopes(self, client_id, request, *args, **kwargs):\n request.client = request.client or self._clientgetter(client_id)\n scopes = request.client.default_scopes\n log.debug('Found default scopes %r', scopes)\n return scopes", "response": "Get the default scopes for the given client."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef invalidate_authorization_code(self, client_id, code, request,\n *args, **kwargs):\n \"\"\"Invalidate an authorization code after use.\n\n We keep the temporary code in a grant, which has a `delete`\n function to destroy itself.\n \"\"\"\n log.debug('Destroy grant token for client %r, %r', client_id, code)\n grant = self._grantgetter(client_id=client_id, code=code)\n if grant:\n grant.delete()", "response": "Invalidate an authorization code after use."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\npersist the authorization code for the given client.", "response": "def save_authorization_code(self, client_id, code, request,\n 
*args, **kwargs):\n \"\"\"Persist the authorization code.\"\"\"\n log.debug(\n 'Persist authorization code %r for client %r',\n code, client_id\n )\n request.client = request.client or self._clientgetter(client_id)\n self._grantsetter(client_id, code, request, *args, **kwargs)\n return request.client.default_redirect_uri"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef save_bearer_token(self, token, request, *args, **kwargs):\n log.debug('Save bearer token %r', token)\n self._tokensetter(token, request, *args, **kwargs)\n return request.client.default_redirect_uri", "response": "Save the bearer token."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nvalidating the bearer token.", "response": "def validate_bearer_token(self, token, scopes, request):\n \"\"\"Validate access token.\n\n :param token: A string of random characters\n :param scopes: A list of scopes\n :param request: The Request object passed by oauthlib\n\n The validation validates:\n\n 1) if the token is available\n 2) if the token has expired\n 3) if the scopes are available\n \"\"\"\n log.debug('Validate bearer token %r', token)\n tok = self._tokengetter(access_token=token)\n if not tok:\n msg = 'Bearer token not found.'\n request.error_message = msg\n log.debug(msg)\n return False\n\n # validate expires\n if tok.expires is not None and \\\n datetime.datetime.utcnow() > tok.expires:\n msg = 'Bearer token is expired.'\n request.error_message = msg\n log.debug(msg)\n return False\n\n # validate scopes\n if scopes and not set(tok.scopes) & set(scopes):\n msg = 'Bearer token scope not valid.'\n request.error_message = msg\n log.debug(msg)\n return False\n\n request.access_token = tok\n request.user = tok.user\n request.scopes = scopes\n\n if hasattr(tok, 'client'):\n request.client = tok.client\n elif hasattr(tok, 'client_id'):\n request.client = self._clientgetter(tok.client_id)\n return True"} 
{"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef validate_client_id(self, client_id, request, *args, **kwargs):\n log.debug('Validate client %r', client_id)\n client = request.client or self._clientgetter(client_id)\n if client:\n # attach client to request object\n request.client = client\n return True\n return False", "response": "Ensure client_id belongs to a valid and active client."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef validate_code(self, client_id, code, client, request, *args, **kwargs):\n client = client or self._clientgetter(client_id)\n log.debug(\n 'Validate code for client %r and code %r', client.client_id, code\n )\n grant = self._grantgetter(client_id=client.client_id, code=code)\n if not grant:\n log.debug('Grant not found.')\n return False\n if hasattr(grant, 'expires') and \\\n datetime.datetime.utcnow() > grant.expires:\n log.debug('Grant is expired.')\n return False\n\n request.state = kwargs.get('state')\n request.user = grant.user\n request.scopes = grant.scopes\n return True", "response": "Ensure the grant code is valid."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nvalidate that the client is authorized to use the grant type requested.", "response": "def validate_grant_type(self, client_id, grant_type, client, request,\n *args, **kwargs):\n \"\"\"Ensure the client is authorized to use the grant type requested.\n\n It will allow any of the four grant types (`authorization_code`,\n `password`, `client_credentials`, `refresh_token`) by default.\n Implemented `allowed_grant_types` for client object to authorize\n the request.\n\n It is suggested that `allowed_grant_types` should contain at least\n `authorization_code` and `refresh_token`.\n \"\"\"\n if self._usergetter is None and grant_type == 'password':\n log.debug('Password credential authorization is disabled.')\n return 
False\n\n default_grant_types = (\n 'authorization_code', 'password',\n 'client_credentials', 'refresh_token',\n )\n\n # Grant type is allowed if it is part of the 'allowed_grant_types'\n # of the selected client or if it is one of the default grant types\n if hasattr(client, 'allowed_grant_types'):\n if grant_type not in client.allowed_grant_types:\n return False\n else:\n if grant_type not in default_grant_types:\n return False\n\n if grant_type == 'client_credentials':\n if not hasattr(client, 'user'):\n log.debug('Client should have a user property')\n return False\n request.user = client.user\n\n return True"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef validate_redirect_uri(self, client_id, redirect_uri, request,\n *args, **kwargs):\n \"\"\"Ensure client is authorized to redirect to the redirect_uri.\n\n This method is used in the authorization code grant flow and also\n in implicit grant flow. It will detect if redirect_uri in client's\n redirect_uris strictly, you can add a `validate_redirect_uri`\n function on grant for a customized validation.\n \"\"\"\n request.client = request.client or self._clientgetter(client_id)\n client = request.client\n if hasattr(client, 'validate_redirect_uri'):\n return client.validate_redirect_uri(redirect_uri)\n return redirect_uri in client.redirect_uris", "response": "Validate that the client is authorized to redirect to the redirect_uri."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nensures the token is valid and belongs to the client", "response": "def validate_refresh_token(self, refresh_token, client, request,\n *args, **kwargs):\n \"\"\"Ensure the token is valid and belongs to the client\n\n This method is used by the authorization code grant indirectly by\n issuing refresh tokens, resource owner password credentials grant\n (also indirectly) and the refresh token grant.\n \"\"\"\n\n token = 
self._tokengetter(refresh_token=refresh_token)\n\n if token and token.client_id == client.client_id:\n # Make sure the request object contains user and client_id\n request.client_id = token.client_id\n request.user = token.user\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef validate_response_type(self, client_id, response_type, client, request,\n *args, **kwargs):\n \"\"\"Ensure client is authorized to use the response type requested.\n\n It will allow any of the two (`code`, `token`) response types by\n default. Implemented `allowed_response_types` for client object\n to authorize the request.\n \"\"\"\n if response_type not in ('code', 'token'):\n return False\n\n if hasattr(client, 'allowed_response_types'):\n return response_type in client.allowed_response_types\n return True", "response": "Ensure client is authorized to use the response type requested."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nensures the client is authorized to access requested scopes.", "response": "def validate_scopes(self, client_id, scopes, client, request,\n *args, **kwargs):\n \"\"\"Ensure the client is authorized access to requested scopes.\"\"\"\n if hasattr(client, 'validate_scopes'):\n return client.validate_scopes(scopes)\n return set(client.default_scopes).issuperset(set(scopes))"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef validate_user(self, username, password, client, request,\n *args, **kwargs):\n \"\"\"Ensure the username and password is valid.\n\n Attach user object on request for later using.\n \"\"\"\n log.debug('Validating username %r and its password', username)\n if self._usergetter is not None:\n user = self._usergetter(\n username, password, client, request, *args, **kwargs\n )\n if user:\n request.user = user\n return True\n return False\n 
log.debug('Password credential authorization is disabled.')\n return False", "response": "Validate the username and password."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef revoke_token(self, token, token_type_hint, request, *args, **kwargs):\n if token_type_hint:\n tok = self._tokengetter(**{token_type_hint: token})\n else:\n tok = self._tokengetter(access_token=token)\n if not tok:\n tok = self._tokengetter(refresh_token=token)\n\n if tok:\n request.client_id = tok.client_id\n request.user = tok.user\n tok.delete()\n return True\n\n msg = 'Invalid token supplied.'\n log.debug(msg)\n request.error_message = msg\n return False", "response": "Revoke an access or refresh token."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef json_to_dict(x):\n '''OAuthResponse class can't parse the JSON data with content-type\n- text/html and because of a rubbish api, we can't just tell flask-oauthlib to treat it as json.'''\n if x.find(b'callback') > -1:\n # the rubbish api (https://graph.qq.com/oauth2.0/authorize) is handled here as special case\n pos_lb = x.find(b'{')\n pos_rb = x.find(b'}')\n x = x[pos_lb:pos_rb + 1]\n\n try:\n if type(x) != str: # Py3k\n x = x.decode('utf-8')\n return json.loads(x, encoding='utf-8')\n except:\n return x", "response": "Parse a JSON response for OAuthResponse, which cannot handle JSON data served with content-type text/html; because of the non-standard API, flask-oauthlib cannot simply be told to treat the response as JSON."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nupdate some required parameters for OAuth2.
0 API calls", "response": "def update_qq_api_request_data(data={}):\n '''Update some required parameters for OAuth2.0 API calls'''\n defaults = {\n 'openid': session.get('qq_openid'),\n 'access_token': session.get('qq_token')[0],\n 'oauth_consumer_key': QQ_APP_ID,\n }\n defaults.update(data)\n return defaults"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nchange the authorization header for weibo.", "response": "def change_weibo_header(uri, headers, body):\n \"\"\"Since weibo is a rubbish server, it does not follow the standard,\n we need to change the authorization header for it.\"\"\"\n auth = headers.get('Authorization')\n if auth:\n auth = auth.replace('Bearer', 'OAuth2')\n headers['Authorization'] = auth\n return uri, headers, body"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef register_to(self, oauth, name=None, **kwargs):\n kwargs = self._process_kwargs(\n name=(name or self.default_name), **kwargs)\n return oauth.remote_app(**kwargs)", "response": "Creates a remote app and registers it."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef create(self, oauth, **kwargs):\n kwargs = self._process_kwargs(\n name=self.default_name, register=False, **kwargs)\n return oauth.remote_app(**kwargs)", "response": "Creates a remote app only."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _get_uri_from_request(request):\n uri = request.base_url\n if request.query_string:\n uri += '?' + request.query_string.decode('utf-8')\n return uri", "response": "Returns the uri returned from request. 
"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nmake sure text is bytes type.", "response": "def to_bytes(text, encoding='utf-8'):\n \"\"\"Make sure text is bytes type.\"\"\"\n if not text:\n return text\n if not isinstance(text, bytes_type):\n text = text.encode(encoding)\n return text"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate response class for Flask.", "response": "def create_response(headers, body, status):\n \"\"\"Create response class for Flask.\"\"\"\n response = Response(body or '')\n for k, v in headers.items():\n response.headers[str(k)] = v\n\n response.status_code = status\n return response"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns a SimpleCache instance with the specified parameters.", "response": "def _simple(self, **kwargs):\n \"\"\"Returns a :class:`SimpleCache` instance\n\n .. warning::\n\n This cache system might not be thread safe.
Use with caution.\n \"\"\"\n kwargs.update(dict(threshold=self._config('threshold', 500)))\n return SimpleCache(**kwargs)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _memcache(self, **kwargs):\n kwargs.update(dict(\n servers=self._config('MEMCACHED_SERVERS', None),\n key_prefix=self._config('key_prefix', None),\n ))\n return MemcachedCache(**kwargs)", "response": "Returns a MemcachedCache instance"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _redis(self, **kwargs):\n kwargs.update(dict(\n host=self._config('REDIS_HOST', 'localhost'),\n port=self._config('REDIS_PORT', 6379),\n password=self._config('REDIS_PASSWORD', None),\n db=self._config('REDIS_DB', 0),\n key_prefix=self._config('KEY_PREFIX', None),\n ))\n return RedisCache(**kwargs)", "response": "Returns a RedisCache instance with the given kwargs"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _filesystem(self, **kwargs):\n kwargs.update(dict(\n threshold=self._config('threshold', 500),\n ))\n return FileSystemCache(self._config('dir', None), **kwargs)", "response": "Returns a :class:`FileSystemCache` instance"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_cached_clients():\n if OAuth.state_key not in current_app.extensions:\n raise RuntimeError('%r is not initialized.'
% current_app)\n state = current_app.extensions[OAuth.state_key]\n return state.cached_clients", "response": "Gets the cached clients dictionary in current context."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef add_remote_app(self, remote_app, name=None, **kwargs):\n if name is None:\n name = remote_app.name\n if name != remote_app.name or kwargs:\n remote_app = copy.copy(remote_app)\n remote_app.name = name\n vars(remote_app).update(kwargs)\n if not hasattr(remote_app, 'clients'):\n remote_app.clients = cached_clients\n self.remote_apps[name] = remote_app\n return remote_app", "response": "Adds a remote application and applies custom attributes on it."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef remote_app(self, name, version=None, **kwargs):\n if version is None:\n if 'request_token_url' in kwargs:\n version = '1'\n else:\n version = '2'\n if version == '1':\n remote_app = OAuth1Application(name, clients=cached_clients)\n elif version == '2':\n remote_app = OAuth2Application(name, clients=cached_clients)\n else:\n raise ValueError('unkonwn version %r' % version)\n return self.add_remote_app(remote_app, **kwargs)", "response": "Creates and adds a remote application."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncalling the method repeatedly such that it will raise an exception.", "response": "def check_exception(self):\n \"\"\"\n Call the method repeatedly such that it will raise an exception.\n \"\"\"\n for i in xrange(self.iterations):\n cert = X509()\n try:\n cert.get_pubkey()\n except Error:\n pass"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncall the method repeatedly such that it will return a PKey object.", "response": "def check_success(self):\n \"\"\"\n Call the method repeatedly such that it will return a PKey object.\n \"\"\"\n small = xrange(3)\n for i 
in xrange(self.iterations):\n key = PKey()\n key.generate_key(TYPE_DSA, 256)\n for i in small:\n cert = X509()\n cert.set_pubkey(key)\n for i in small:\n cert.get_pubkey()"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncalls the function with an encrypted PEM and a passphrase callback.", "response": "def check_load_privatekey_callback(self):\n \"\"\"\n Call the function with an encrypted PEM and a passphrase callback.\n \"\"\"\n for i in xrange(self.iterations * 10):\n load_privatekey(\n FILETYPE_PEM, self.ENCRYPTED_PEM, lambda *args: \"hello, secret\")"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef check_load_privatekey_callback_incorrect(self):\n for i in xrange(self.iterations * 10):\n try:\n load_privatekey(\n FILETYPE_PEM, self.ENCRYPTED_PEM,\n lambda *args: \"hello, public\")\n except Error:\n pass", "response": "Call the function with an encrypted PEM and a passphrase callback that returns an incorrect passphrase."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncalling the private key loader with a passphrase callback which returns a non-string.", "response": "def check_load_privatekey_callback_wrong_type(self):\n \"\"\"\n Call the function with an encrypted PEM and a passphrase callback which\n returns a non-string.\n \"\"\"\n for i in xrange(self.iterations * 10):\n try:\n load_privatekey(\n FILETYPE_PEM, self.ENCRYPTED_PEM,\n lambda *args: {})\n except ValueError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef check_get_revoked(self):\n crl = CRL()\n for i in xrange(100):\n crl.add_revoked(Revoked())\n for i in xrange(self.iterations):\n crl.get_revoked()", "response": "Create a CRL object with 100 Revoked objects then call the get_revoked method repeatedly."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncopying an empty Revoked object repeatedly.", "response": "def
check_X509_REVOKED_dup(self):\n \"\"\"\n Copy an empty Revoked object repeatedly. The copy is not garbage\n collected, therefore it needs to be manually freed.\n \"\"\"\n for i in xrange(self.iterations * 100):\n revoked_copy = _X509_REVOKED_dup(Revoked()._revoked)\n _lib.X509_REVOKED_free(revoked_copy)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef check_to_EC_KEY(self):\n curves = get_elliptic_curves()\n if curves:\n curve = next(iter(curves))\n for i in xrange(self.iterations * 1000):\n curve._to_EC_KEY()", "response": "Repeatedly create an EC_KEY * from an _EllipticCurve."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef main():\n if len(argv) < 2:\n print('Usage: %s ' % (argv[0],))\n return 1\n\n client = socket()\n\n print('Connecting...', end=\"\")\n stdout.flush()\n client.connect(('127.0.0.1', 8443))\n print('connected', client.getpeername())\n\n client_ssl = Connection(Context(TLSv1_METHOD), client)\n client_ssl.set_connect_state()\n client_ssl.set_tlsext_host_name(argv[1])\n client_ssl.do_handshake()\n print('Server subject is', client_ssl.get_peer_certificate().get_subject())\n client_ssl.close()", "response": "Connect to an SNI-enabled server and request the specific hostname given by argv[1]."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef createKeyPair(type, bits):\n pkey = crypto.PKey()\n pkey.generate_key(type, bits)\n return pkey", "response": "Create a public/private key pair."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncreates a certificate request object.", "response": "def createCertRequest(pkey, digest=\"sha256\", **name):\n \"\"\"\n Create a certificate request.\n\n Arguments: pkey - The key to associate with the request\n digest - Digestion method to use for signing, default is sha256\n **name - The name of the
subject of the request, possible\n arguments are:\n C - Country name\n ST - State or province name\n L - Locality name\n O - Organization name\n OU - Organizational unit name\n CN - Common name\n emailAddress - E-mail address\n Returns: The certificate request in an X509Req object\n \"\"\"\n req = crypto.X509Req()\n subj = req.get_subject()\n\n for key, value in name.items():\n setattr(subj, key, value)\n\n req.set_pubkey(pkey)\n req.sign(pkey, digest)\n return req"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngenerating a certificate given a certificate request and a private key.", "response": "def createCertificate(req, issuerCertKey, serial, validityPeriod,\n digest=\"sha256\"):\n \"\"\"\n Generate a certificate given a certificate request.\n\n Arguments: req - Certificate request to use\n issuerCert - The certificate of the issuer\n issuerKey - The private key of the issuer\n serial - Serial number for the certificate\n notBefore - Timestamp (relative to now) when the certificate\n starts being valid\n notAfter - Timestamp (relative to now) when the certificate\n stops being valid\n digest - Digest method to use for signing, default is sha256\n Returns: The signed certificate in an X509 object\n \"\"\"\n issuerCert, issuerKey = issuerCertKey\n notBefore, notAfter = validityPeriod\n cert = crypto.X509()\n cert.set_serial_number(serial)\n cert.gmtime_adj_notBefore(notBefore)\n cert.gmtime_adj_notAfter(notAfter)\n cert.set_issuer(issuerCert.get_subject())\n cert.set_subject(req.get_subject())\n cert.set_pubkey(req.get_pubkey())\n cert.sign(issuerKey, digest)\n return cert"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _make_requires(flag, error):\n def _requires_decorator(func):\n if not flag:\n @wraps(func)\n def explode(*args, **kwargs):\n raise NotImplementedError(error)\n return explode\n else:\n return func\n\n return _requires_decorator", "response": 
"Creates a decorator that makes functions relying on OpenSSL functionality not present in this build raise NotImplementedError."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nload trusted certificates from a file and return the set of trusted certificates.", "response": "def load_verify_locations(self, cafile, capath=None):\n \"\"\"\n Let SSL know where we can find trusted certificates for the certificate\n chain. Note that the certificates have to be in PEM format.\n\n If capath is passed, it must be a directory prepared using the\n ``c_rehash`` tool included with OpenSSL. Either, but not both, of\n *pemfile* or *capath* may be :data:`None`.\n\n :param cafile: In which file we can find the certificates (``bytes`` or\n ``unicode``).\n :param capath: In which directory we can find the certificates\n (``bytes`` or ``unicode``).\n\n :return: None\n \"\"\"\n if cafile is None:\n cafile = _ffi.NULL\n else:\n cafile = _path_string(cafile)\n\n if capath is None:\n capath = _ffi.NULL\n else:\n capath = _path_string(capath)\n\n load_result = _lib.SSL_CTX_load_verify_locations(\n self._context, cafile, capath\n )\n if not load_result:\n _raise_current_error()"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef set_passwd_cb(self, callback, userdata=None):\n if not callable(callback):\n raise TypeError(\"callback must be callable\")\n\n self._passphrase_helper = self._wrap_callback(callback)\n self._passphrase_callback = self._passphrase_helper.callback\n _lib.SSL_CTX_set_default_passwd_cb(\n self._context, self._passphrase_callback)\n self._passphrase_userdata = userdata", "response": "Set the passphrase callback.
This function will be called when the passphrase is loaded."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nset the default path for the certificate files that are used for verification purposes.", "response": "def set_default_verify_paths(self):\n \"\"\"\n Specify that the platform provided CA certificates are to be used for\n verification purposes. This method has some caveats related to the\n binary wheels that cryptography (pyOpenSSL's primary dependency) ships:\n\n * macOS will only load certificates using this method if the user has\n the ``openssl@1.1`` `Homebrew `_ formula installed\n in the default location.\n * Windows will not work.\n * manylinux1 cryptography wheels will work on most common Linux\n distributions in pyOpenSSL 17.1.0 and above. pyOpenSSL detects the\n manylinux1 wheel and attempts to load roots via a fallback path.\n\n :return: None\n \"\"\"\n # SSL_CTX_set_default_verify_paths will attempt to load certs from\n # both a cafile and capath that are set at compile time. However,\n # it will first check environment variables and, if present, load\n # those paths instead\n set_result = _lib.SSL_CTX_set_default_verify_paths(self._context)\n _openssl_assert(set_result == 1)\n # After attempting to set default_verify_paths we need to know whether\n # to go down the fallback path.\n # First we'll check to see if any env vars have been set. 
If so,\n # we won't try to do anything else because the user has set the path\n # themselves.\n dir_env_var = _ffi.string(\n _lib.X509_get_default_cert_dir_env()\n ).decode(\"ascii\")\n file_env_var = _ffi.string(\n _lib.X509_get_default_cert_file_env()\n ).decode(\"ascii\")\n if not self._check_env_vars_set(dir_env_var, file_env_var):\n default_dir = _ffi.string(_lib.X509_get_default_cert_dir())\n default_file = _ffi.string(_lib.X509_get_default_cert_file())\n # Now we check to see if the default_dir and default_file are set\n # to the exact values we use in our manylinux1 builds. If they are\n # then we know to load the fallbacks\n if (\n default_dir == _CRYPTOGRAPHY_MANYLINUX1_CA_DIR and\n default_file == _CRYPTOGRAPHY_MANYLINUX1_CA_FILE\n ):\n # This is manylinux1, let's load our fallback paths\n self._fallback_default_verify_paths(\n _CERTIFICATE_FILE_LOCATIONS,\n _CERTIFICATE_PATH_LOCATIONS\n )"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncheck to see if the default cert dir or file environment vars are present.", "response": "def _check_env_vars_set(self, dir_env_var, file_env_var):\n \"\"\"\n Check to see if the default cert dir/file environment vars are present.\n\n :return: bool\n \"\"\"\n return (\n os.environ.get(file_env_var) is not None or\n os.environ.get(dir_env_var) is not None\n )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ndefault verify paths are based on the compiled version of OpenSSL. However, when pyca/cryptography is compiled as a manylinux1 wheel that compiled location can potentially be wrong. So, like Go, we will try a predefined set of paths and attempt to load roots from there. 
:return: None", "response": "def _fallback_default_verify_paths(self, file_path, dir_path):\n \"\"\"\n Default verify paths are based on the compiled version of OpenSSL.\n However, when pyca/cryptography is compiled as a manylinux1 wheel\n that compiled location can potentially be wrong. So, like Go, we\n will try a predefined set of paths and attempt to load roots\n from there.\n\n :return: None\n \"\"\"\n for cafile in file_path:\n if os.path.isfile(cafile):\n self.load_verify_locations(cafile)\n break\n\n for capath in dir_path:\n if os.path.isdir(capath):\n self.load_verify_locations(None, capath)\n break"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nloading a certificate chain from a file.", "response": "def use_certificate_chain_file(self, certfile):\n \"\"\"\n Load a certificate chain from a file.\n\n :param certfile: The name of the certificate chain file (``bytes`` or\n ``unicode``). Must be PEM encoded.\n\n :return: None\n \"\"\"\n certfile = _path_string(certfile)\n\n result = _lib.SSL_CTX_use_certificate_chain_file(\n self._context, certfile\n )\n if not result:\n _raise_current_error()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nload a certificate from a file.", "response": "def use_certificate_file(self, certfile, filetype=FILETYPE_PEM):\n \"\"\"\n Load a certificate from a file\n\n :param certfile: The name of the certificate file (``bytes`` or\n ``unicode``).\n :param filetype: (optional) The encoding of the file, which is either\n :const:`FILETYPE_PEM` or :const:`FILETYPE_ASN1`. 
The default is\n :const:`FILETYPE_PEM`.\n\n :return: None\n \"\"\"\n certfile = _path_string(certfile)\n if not isinstance(filetype, integer_types):\n raise TypeError(\"filetype must be an integer\")\n\n use_result = _lib.SSL_CTX_use_certificate_file(\n self._context, certfile, filetype\n )\n if not use_result:\n _raise_current_error()"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef use_certificate(self, cert):\n if not isinstance(cert, X509):\n raise TypeError(\"cert must be an X509 instance\")\n\n use_result = _lib.SSL_CTX_use_certificate(self._context, cert._x509)\n if not use_result:\n _raise_current_error()", "response": "Load a certificate from a X509 object."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_extra_chain_cert(self, certobj):\n if not isinstance(certobj, X509):\n raise TypeError(\"certobj must be an X509 instance\")\n\n copy = _lib.X509_dup(certobj._x509)\n add_result = _lib.SSL_CTX_add_extra_chain_cert(self._context, copy)\n if not add_result:\n # TODO: This is untested.\n _lib.X509_free(copy)\n _raise_current_error()", "response": "Add certificate to the chain."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nloads a private key from a file.", "response": "def use_privatekey_file(self, keyfile, filetype=_UNSPECIFIED):\n \"\"\"\n Load a private key from a file\n\n :param keyfile: The name of the key file (``bytes`` or ``unicode``)\n :param filetype: (optional) The encoding of the file, which is either\n :const:`FILETYPE_PEM` or :const:`FILETYPE_ASN1`. 
The default is\n :const:`FILETYPE_PEM`.\n\n :return: None\n \"\"\"\n keyfile = _path_string(keyfile)\n\n if filetype is _UNSPECIFIED:\n filetype = FILETYPE_PEM\n elif not isinstance(filetype, integer_types):\n raise TypeError(\"filetype must be an integer\")\n\n use_result = _lib.SSL_CTX_use_PrivateKey_file(\n self._context, keyfile, filetype)\n if not use_result:\n self._raise_passphrase_exception()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nloading a private key from a PKey object.", "response": "def use_privatekey(self, pkey):\n \"\"\"\n Load a private key from a PKey object\n\n :param pkey: The PKey object\n :return: None\n \"\"\"\n if not isinstance(pkey, PKey):\n raise TypeError(\"pkey must be a PKey instance\")\n\n use_result = _lib.SSL_CTX_use_PrivateKey(self._context, pkey._pkey)\n if not use_result:\n self._raise_passphrase_exception()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef load_client_ca(self, cafile):\n ca_list = _lib.SSL_load_client_CA_file(\n _text_to_bytes_and_warn(\"cafile\", cafile)\n )\n _openssl_assert(ca_list != _ffi.NULL)\n _lib.SSL_CTX_set_client_CA_list(self._context, ca_list)", "response": "Load the trusted certificates from a PEM-formatted file."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nsets the session id to buf within which a session can be reused for the current Context object.", "response": "def set_session_id(self, buf):\n \"\"\"\n Set the session id to *buf* within which a session can be reused for\n this Context object.
This is needed when doing session resumption,\n because there is no way for a stored session to know which Context\n object it is associated with.\n\n :param bytes buf: The session id.\n\n :returns: None\n \"\"\"\n buf = _text_to_bytes_and_warn(\"buf\", buf)\n _openssl_assert(\n _lib.SSL_CTX_set_session_id_context(\n self._context,\n buf,\n len(buf),\n ) == 1\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsets the behavior of the session cache used by all connections using this Context.", "response": "def set_session_cache_mode(self, mode):\n \"\"\"\n Set the behavior of the session cache used by all connections using\n this Context. The previously set mode is returned. See\n :const:`SESS_CACHE_*` for details about particular modes.\n\n :param mode: One or more of the SESS_CACHE_* flags (combine using\n bitwise or)\n :returns: The previously set caching mode.\n\n .. versionadded:: 0.14\n \"\"\"\n if not isinstance(mode, integer_types):\n raise TypeError(\"mode must be an integer\")\n\n return _lib.SSL_CTX_set_session_cache_mode(self._context, mode)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsetting the verification flags for this Context object to mode and specifying that the given callback should be used for verification callbacks.", "response": "def set_verify(self, mode, callback):\n \"\"\"\n Set the verification flags for this Context object to *mode* and specify\n that *callback* should be used for verification callbacks.\n\n :param mode: The verify mode, this should be one of\n :const:`VERIFY_NONE` and :const:`VERIFY_PEER`. If\n :const:`VERIFY_PEER` is used, *mode* can be OR:ed with\n :const:`VERIFY_FAIL_IF_NO_PEER_CERT` and\n :const:`VERIFY_CLIENT_ONCE` to further control the behaviour.\n :param callback: The Python callback to use.
This should take five\n arguments: A Connection object, an X509 object, and three integer\n variables, which are in turn potential error number, error depth\n and return code. *callback* should return True if verification\n passes and False otherwise.\n :return: None\n\n See SSL_CTX_set_verify(3SSL) for further details.\n \"\"\"\n if not isinstance(mode, integer_types):\n raise TypeError(\"mode must be an integer\")\n\n if not callable(callback):\n raise TypeError(\"callback must be callable\")\n\n self._verify_helper = _VerifyHelper(callback)\n self._verify_callback = self._verify_helper.callback\n _lib.SSL_CTX_set_verify(self._context, mode, self._verify_callback)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef set_verify_depth(self, depth):\n if not isinstance(depth, integer_types):\n raise TypeError(\"depth must be an integer\")\n\n _lib.SSL_CTX_set_verify_depth(self._context, depth)", "response": "Sets the maximum depth for the certificate chain verification that shall\n be allowed for this Context object."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef load_tmp_dh(self, dhfile):\n dhfile = _path_string(dhfile)\n\n bio = _lib.BIO_new_file(dhfile, b\"r\")\n if bio == _ffi.NULL:\n _raise_current_error()\n bio = _ffi.gc(bio, _lib.BIO_free)\n\n dh = _lib.PEM_read_bio_DHparams(bio, _ffi.NULL, _ffi.NULL, _ffi.NULL)\n dh = _ffi.gc(dh, _lib.DH_free)\n _lib.SSL_CTX_set_tmp_dh(self._context, dh)", "response": "Load the parameters for Ephemeral Diffie - Hellman."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsets the list of ciphers to be used in this context.", "response": "def set_cipher_list(self, cipher_list):\n \"\"\"\n Set the list of ciphers to be used in this context.\n\n See the OpenSSL manual for more information (e.g.\n :manpage:`ciphers(1)`).\n\n :param bytes cipher_list: An OpenSSL cipher 
string.\n :return: None\n \"\"\"\n cipher_list = _text_to_bytes_and_warn(\"cipher_list\", cipher_list)\n\n if not isinstance(cipher_list, bytes):\n raise TypeError(\"cipher_list must be a byte string.\")\n\n _openssl_assert(\n _lib.SSL_CTX_set_cipher_list(self._context, cipher_list) == 1\n )\n # In OpenSSL 1.1.1 setting the cipher list will always return TLS 1.3\n # ciphers even if you pass an invalid cipher. Applications (like\n # Twisted) have tests that depend on an error being raised if an\n # invalid cipher string is passed, but without the following check\n # for the TLS 1.3 specific cipher suites it would never error.\n tmpconn = Connection(self, None)\n if (\n tmpconn.get_cipher_list() == [\n 'TLS_AES_256_GCM_SHA384',\n 'TLS_CHACHA20_POLY1305_SHA256',\n 'TLS_AES_128_GCM_SHA256'\n ]\n ):\n raise Error(\n [\n (\n 'SSL routines',\n 'SSL_CTX_set_cipher_list',\n 'no cipher match',\n ),\n ],\n )"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef set_client_ca_list(self, certificate_authorities):\n name_stack = _lib.sk_X509_NAME_new_null()\n _openssl_assert(name_stack != _ffi.NULL)\n\n try:\n for ca_name in certificate_authorities:\n if not isinstance(ca_name, X509Name):\n raise TypeError(\n \"client CAs must be X509Name objects, not %s \"\n \"objects\" % (\n type(ca_name).__name__,\n )\n )\n copy = _lib.X509_NAME_dup(ca_name._name)\n _openssl_assert(copy != _ffi.NULL)\n push_result = _lib.sk_X509_NAME_push(name_stack, copy)\n if not push_result:\n _lib.X509_NAME_free(copy)\n _raise_current_error()\n except Exception:\n _lib.sk_X509_NAME_free(name_stack)\n raise\n\n _lib.SSL_CTX_set_client_CA_list(self._context, name_stack)", "response": "Sets the list of preferred client certificate signers for this server."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef add_client_ca(self, certificate_authority):\n if not isinstance(certificate_authority, 
X509):\n raise TypeError(\"certificate_authority must be an X509 instance\")\n\n add_result = _lib.SSL_CTX_add_client_CA(\n self._context, certificate_authority._x509)\n _openssl_assert(add_result == 1)", "response": "Add the client CA certificate to the list of preferred signers for this context."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef set_timeout(self, timeout):\n if not isinstance(timeout, integer_types):\n raise TypeError(\"timeout must be an integer\")\n\n return _lib.SSL_CTX_set_timeout(self._context, timeout)", "response": "Sets the timeout for newly created sessions for this Context object to a given number of seconds."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef set_info_callback(self, callback):\n @wraps(callback)\n def wrapper(ssl, where, return_code):\n callback(Connection._reverse_mapping[ssl], where, return_code)\n self._info_callback = _ffi.callback(\n \"void (*)(const SSL *, int, int)\", wrapper)\n _lib.SSL_CTX_set_info_callback(self._context, self._info_callback)", "response": "Set an information callback that is invoked as connection state changes occur."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_cert_store(self):\n store = _lib.SSL_CTX_get_cert_store(self._context)\n if store == _ffi.NULL:\n # TODO: This is untested.\n return None\n\n pystore = X509Store.__new__(X509Store)\n pystore._store = store\n return pystore", "response": "Get the certificate store for the context."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nadds options via bitmask; options set before are not cleared.", "response": "def set_options(self, options):\n \"\"\"\n Add options.
Options set before are not cleared!\n This method should be used with the :const:`OP_*` constants.\n\n :param options: The options to add.\n :return: The new option bitmask.\n \"\"\"\n if not isinstance(options, integer_types):\n raise TypeError(\"options must be an integer\")\n\n return _lib.SSL_CTX_set_options(self._context, options)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsetting the mode of the current context.", "response": "def set_mode(self, mode):\n \"\"\"\n Add modes via bitmask. Modes set before are not cleared! This method\n should be used with the :const:`MODE_*` constants.\n\n :param mode: The mode to add.\n :return: The new mode bitmask.\n \"\"\"\n if not isinstance(mode, integer_types):\n raise TypeError(\"mode must be an integer\")\n\n return _lib.SSL_CTX_set_mode(self._context, mode)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef set_tlsext_servername_callback(self, callback):\n @wraps(callback)\n def wrapper(ssl, alert, arg):\n callback(Connection._reverse_mapping[ssl])\n return 0\n\n self._tlsext_servername_callback = _ffi.callback(\n \"int (*)(SSL *, int *, void *)\", wrapper)\n _lib.SSL_CTX_set_tlsext_servername_callback(\n self._context, self._tlsext_servername_callback)", "response": "Set a callback function to be invoked when the client specifies a server name."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef set_tlsext_use_srtp(self, profiles):\n if not isinstance(profiles, bytes):\n raise TypeError(\"profiles must be a byte string.\")\n\n _openssl_assert(\n _lib.SSL_CTX_set_tlsext_use_srtp(self._context, profiles) == 0\n )", "response": "Enable support for negotiating SRTP keying material."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef set_npn_advertise_callback(self, callback):\n _warn_npn()\n self._npn_advertise_helper = _NpnAdvertiseHelper(callback)\n 
self._npn_advertise_callback = self._npn_advertise_helper.callback\n _lib.SSL_CTX_set_next_protos_advertised_cb(\n self._context, self._npn_advertise_callback, _ffi.NULL)", "response": "Set the callback function that will be invoked when offering Next Protocol Negotiation as a server."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsetting the callback function that will be invoked when a server offers Next Protocol Negotiation options.", "response": "def set_npn_select_callback(self, callback):\n \"\"\"\n Specify a callback function that will be called when a server offers\n Next Protocol Negotiation options.\n\n :param callback: The callback function. It will be invoked with two\n arguments: the Connection, and a list of offered protocols as\n bytestrings, e.g. ``[b'http/1.1', b'spdy/2']``. It should return\n one of those bytestrings, the chosen protocol.\n\n .. versionadded:: 0.15\n \"\"\"\n _warn_npn()\n self._npn_select_helper = _NpnSelectHelper(callback)\n self._npn_select_callback = self._npn_select_helper.callback\n _lib.SSL_CTX_set_next_proto_select_cb(\n self._context, self._npn_select_callback, _ffi.NULL)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsets the protocols that the client is prepared to speak after the TLS connection has been negotiated using the Application Layer Protocol Negotiation.", "response": "def set_alpn_protos(self, protos):\n \"\"\"\n Specify the protocols that the client is prepared to speak after the\n TLS connection has been negotiated using Application Layer Protocol\n Negotiation.\n\n :param protos: A list of the protocols to be offered to the server.\n This list should be a Python list of bytestrings representing the\n protocols to offer, e.g. 
``[b'http/1.1', b'spdy/2']``.\n \"\"\"\n # Take the list of protocols and join them together, prefixing them\n # with their lengths.\n protostr = b''.join(\n chain.from_iterable((int2byte(len(p)), p) for p in protos)\n )\n\n # Build a C string from the list. We don't need to save this off\n # because OpenSSL immediately copies the data out.\n input_str = _ffi.new(\"unsigned char[]\", protostr)\n _lib.SSL_CTX_set_alpn_protos(self._context, input_str, len(protostr))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef set_alpn_select_callback(self, callback):\n self._alpn_select_helper = _ALPNSelectHelper(callback)\n self._alpn_select_callback = self._alpn_select_helper.callback\n _lib.SSL_CTX_set_alpn_select_cb(\n self._context, self._alpn_select_callback, _ffi.NULL)", "response": "Sets the callback function that will be invoked on the server when the client offers protocols using ALPN."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _set_ocsp_callback(self, helper, data):\n self._ocsp_helper = helper\n self._ocsp_callback = helper.callback\n if data is None:\n self._ocsp_data = _ffi.NULL\n else:\n self._ocsp_data = _ffi.new_handle(data)\n\n rc = _lib.SSL_CTX_set_tlsext_status_cb(\n self._context, self._ocsp_callback\n )\n _openssl_assert(rc == 1)\n rc = _lib.SSL_CTX_set_tlsext_status_arg(self._context, self._ocsp_data)\n _openssl_assert(rc == 1)", "response": "Set the OCSP callback."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef set_ocsp_server_callback(self, callback, data=None):\n helper = _OCSPServerCallbackHelper(callback)\n self._set_ocsp_callback(helper, data)", "response": "Set a callback on the server side to supply OCSP data to be stapled to the TLS handshake."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef set_ocsp_client_callback(self, 
callback, data=None):\n helper = _OCSPClientCallbackHelper(callback)\n self._set_ocsp_callback(helper, data)", "response": "Set a callback to validate OCSP data stapled to the TLS handshake on the client side."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef set_context(self, context):\n if not isinstance(context, Context):\n raise TypeError(\"context must be a Context instance\")\n\n _lib.SSL_set_SSL_CTX(self._ssl, context._context)\n self._context = context", "response": "Switch this connection to a new session context."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nretrieve the servername extension value if provided in the client hello message or None if there isn't one.", "response": "def get_servername(self):\n \"\"\"\n Retrieve the servername extension value if provided in the client hello\n message, or None if there wasn't one.\n\n :return: A byte string giving the server name or :data:`None`.\n\n .. versionadded:: 0.13\n \"\"\"\n name = _lib.SSL_get_servername(\n self._ssl, _lib.TLSEXT_NAMETYPE_host_name\n )\n if name == _ffi.NULL:\n return None\n\n return _ffi.string(name)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef set_tlsext_host_name(self, name):\n if not isinstance(name, bytes):\n raise TypeError(\"name must be a byte string\")\n elif b\"\\0\" in name:\n raise TypeError(\"name must not contain NUL byte\")\n\n # XXX I guess this can fail sometimes?\n _lib.SSL_set_tlsext_host_name(self._ssl, name)", "response": "Set the value of the servername extension to send in the client hello."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsending data on the socket.", "response": "def send(self, buf, flags=0):\n \"\"\"\n Send data on the connection. 
NOTE: If you get one of the WantRead,\n WantWrite or WantX509Lookup exceptions on this, you have to call the\n method again with the SAME buffer.\n\n :param buf: The string, buffer or memoryview to send\n :param flags: (optional) Included for compatibility with the socket\n API, the value is ignored\n :return: The number of bytes written\n \"\"\"\n # Backward compatibility\n buf = _text_to_bytes_and_warn(\"buf\", buf)\n\n if isinstance(buf, memoryview):\n buf = buf.tobytes()\n if isinstance(buf, _buffer):\n buf = str(buf)\n if not isinstance(buf, bytes):\n raise TypeError(\"data must be a memoryview, buffer or byte string\")\n if len(buf) > 2147483647:\n raise ValueError(\"Cannot send more than 2**31-1 bytes at once.\")\n\n result = _lib.SSL_write(self._ssl, buf, len(buf))\n self._raise_ssl_error(self._ssl, result)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef sendall(self, buf, flags=0):\n buf = _text_to_bytes_and_warn(\"buf\", buf)\n\n if isinstance(buf, memoryview):\n buf = buf.tobytes()\n if isinstance(buf, _buffer):\n buf = str(buf)\n if not isinstance(buf, bytes):\n raise TypeError(\"buf must be a memoryview, buffer or byte string\")\n\n left_to_send = len(buf)\n total_sent = 0\n data = _ffi.new(\"char[]\", buf)\n\n while left_to_send:\n # SSL_write's num arg is an int,\n # so we cannot send more than 2**31-1 bytes at once.\n result = _lib.SSL_write(\n self._ssl,\n data + total_sent,\n min(left_to_send, 2147483647)\n )\n self._raise_ssl_error(self._ssl, result)\n total_sent += result\n left_to_send -= result", "response": "Send all data to the socket."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef recv(self, bufsiz, flags=None):\n buf = _no_zero_allocator(\"char[]\", bufsiz)\n if flags is not None and flags & socket.MSG_PEEK:\n result = _lib.SSL_peek(self._ssl, buf, bufsiz)\n else:\n result = 
_lib.SSL_read(self._ssl, buf, bufsiz)\n self._raise_ssl_error(self._ssl, result)\n return _ffi.buffer(buf, result)[:]", "response": "Receive data on the connection."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreceiving data on the connection and copy it directly into the provided buffer.", "response": "def recv_into(self, buffer, nbytes=None, flags=None):\n \"\"\"\n Receive data on the connection and copy it directly into the provided\n buffer, rather than creating a new string.\n\n :param buffer: The buffer to copy into.\n :param nbytes: (optional) The maximum number of bytes to read into the\n buffer. If not present, defaults to the size of the buffer. If\n larger than the size of the buffer, is reduced to the size of the\n buffer.\n :param flags: (optional) The only supported flag is ``MSG_PEEK``,\n all other flags are ignored.\n :return: The number of bytes read into the buffer.\n \"\"\"\n if nbytes is None:\n nbytes = len(buffer)\n else:\n nbytes = min(nbytes, len(buffer))\n\n # We need to create a temporary buffer. This is annoying, it would be\n # better if we could pass memoryviews straight into the SSL_read call,\n # but right now we can't. Revisit this if CFFI gets that ability.\n buf = _no_zero_allocator(\"char[]\", nbytes)\n if flags is not None and flags & socket.MSG_PEEK:\n result = _lib.SSL_peek(self._ssl, buf, nbytes)\n else:\n result = _lib.SSL_read(self._ssl, buf, nbytes)\n self._raise_ssl_error(self._ssl, result)\n\n # This strange line is all to avoid a memory copy. The buffer protocol\n # should allow us to assign a CFFI buffer to the LHS of this line, but\n # on CPython 3.3+ that segfaults. 
As a workaround, we can temporarily\n # wrap it in a memoryview.\n buffer[:result] = memoryview(_ffi.buffer(buf, result))\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the contents of a memory BIO into a string.", "response": "def bio_read(self, bufsiz):\n \"\"\"\n If the Connection was created with a memory BIO, this method can be\n used to read bytes from the write end of that memory BIO. Many\n Connection methods will add bytes which must be read in this manner or\n the buffer will eventually fill up and the Connection will be able to\n take no further actions.\n\n :param bufsiz: The maximum number of bytes to read\n :return: The string read.\n \"\"\"\n if self._from_ssl is None:\n raise TypeError(\"Connection sock was not None\")\n\n if not isinstance(bufsiz, integer_types):\n raise TypeError(\"bufsiz must be an integer\")\n\n buf = _no_zero_allocator(\"char[]\", bufsiz)\n result = _lib.BIO_read(self._from_ssl, buf, bufsiz)\n if result <= 0:\n self._handle_bio_errors(self._from_ssl, result)\n\n return _ffi.buffer(buf, result)[:]"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef bio_write(self, buf):\n buf = _text_to_bytes_and_warn(\"buf\", buf)\n\n if self._into_ssl is None:\n raise TypeError(\"Connection sock was not None\")\n\n result = _lib.BIO_write(self._into_ssl, buf, len(buf))\n if result <= 0:\n self._handle_bio_errors(self._into_ssl, result)\n return result", "response": "Write a string to the current memory BIO."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef renegotiate(self):\n if not self.renegotiate_pending():\n _openssl_assert(_lib.SSL_renegotiate(self._ssl) == 1)\n return True\n return False", "response": "Renegotiate the session, returning True if the renegotiation was started and False if one is already pending."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 
code\ndef do_handshake(self):\n result = _lib.SSL_do_handshake(self._ssl)\n self._raise_ssl_error(self._ssl, result)", "response": "Perform an SSL handshake."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nconnecting to a remote address and set up SSL on the socket.", "response": "def connect(self, addr):\n \"\"\"\n Call the :meth:`connect` method of the underlying socket and set up SSL\n on the socket, using the :class:`Context` object supplied to this\n :class:`Connection` object at creation.\n\n :param addr: A remote address\n :return: What the socket's connect method returns\n \"\"\"\n _lib.SSL_set_connect_state(self._ssl)\n return self._socket.connect(addr)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nconnect to the given address and set up SSL on the socket.", "response": "def connect_ex(self, addr):\n \"\"\"\n Call the :meth:`connect_ex` method of the underlying socket and set up\n SSL on the socket, using the Context object supplied to this Connection\n object at creation. Note that if the :meth:`connect_ex` method of the\n socket doesn't return 0, SSL won't be initialized.\n\n :param addr: A remote address\n :return: What the socket's connect_ex method returns\n \"\"\"\n connect_ex = self._socket.connect_ex\n self.set_connect_state()\n return connect_ex(addr)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef accept(self):\n client, addr = self._socket.accept()\n conn = Connection(self._context, client)\n conn.set_accept_state()\n return (conn, addr)", "response": "Call the accept method of the underlying socket and set up SSL\n on the returned socket."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsend a shutdown message to the Connection.", "response": "def shutdown(self):\n \"\"\"\n Send the shutdown message to the Connection.\n\n :return: True if the shutdown completed successfully (i.e. 
both sides\n have sent closure alerts), False otherwise (in which case you\n call :meth:`recv` or :meth:`send` when the connection becomes\n readable/writeable).\n \"\"\"\n result = _lib.SSL_shutdown(self._ssl)\n if result < 0:\n self._raise_ssl_error(self._ssl, result)\n elif result > 0:\n return True\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_cipher_list(self):\n ciphers = []\n for i in count():\n result = _lib.SSL_get_cipher_list(self._ssl, i)\n if result == _ffi.NULL:\n break\n ciphers.append(_native(_ffi.string(result)))\n return ciphers", "response": "Retrieve the list of native cipher strings used by the Connection object."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget the list of certificates that are suggested for client authentication.", "response": "def get_client_ca_list(self):\n \"\"\"\n Get CAs whose certificates are suggested for client authentication.\n\n :return: If this is a server connection, the list of certificate\n authorities that will be sent or has been sent to the client, as\n controlled by this :class:`Connection`'s :class:`Context`.\n\n If this is a client connection, the list will be empty until the\n connection with the server is established.\n\n .. 
versionadded:: 0.10\n \"\"\"\n ca_names = _lib.SSL_get_client_CA_list(self._ssl)\n if ca_names == _ffi.NULL:\n # TODO: This is untested.\n return []\n\n result = []\n for i in range(_lib.sk_X509_NAME_num(ca_names)):\n name = _lib.sk_X509_NAME_value(ca_names, i)\n copy = _lib.X509_NAME_dup(name)\n _openssl_assert(copy != _ffi.NULL)\n\n pyname = X509Name.__new__(X509Name)\n pyname._name = _ffi.gc(copy, _lib.X509_NAME_free)\n result.append(pyname)\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef set_shutdown(self, state):\n if not isinstance(state, integer_types):\n raise TypeError(\"state must be an integer\")\n\n _lib.SSL_set_shutdown(self._ssl, state)", "response": "Sets the shutdown state of the SSL connection."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nretrieve the random value used with the server hello message.", "response": "def server_random(self):\n \"\"\"\n Retrieve the random value used with the server hello message.\n\n :return: A string representing the state\n \"\"\"\n session = _lib.SSL_get_session(self._ssl)\n if session == _ffi.NULL:\n return None\n length = _lib.SSL_get_server_random(self._ssl, _ffi.NULL, 0)\n assert length > 0\n outp = _no_zero_allocator(\"unsigned char[]\", length)\n _lib.SSL_get_server_random(self._ssl, outp, length)\n return _ffi.buffer(outp, length)[:]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nretrieving the random value used with the client hello message.", "response": "def client_random(self):\n \"\"\"\n Retrieve the random value used with the client hello message.\n\n :return: A string representing the state\n \"\"\"\n session = _lib.SSL_get_session(self._ssl)\n if session == _ffi.NULL:\n return None\n\n length = _lib.SSL_get_client_random(self._ssl, _ffi.NULL, 0)\n assert length > 0\n outp = _no_zero_allocator(\"unsigned char[]\", length)\n _lib.SSL_get_client_random(self._ssl, 
outp, length)\n return _ffi.buffer(outp, length)[:]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nretrieving the master key for this session.", "response": "def master_key(self):\n \"\"\"\n Retrieve the value of the master key for this session.\n\n :return: A string representing the state\n \"\"\"\n session = _lib.SSL_get_session(self._ssl)\n if session == _ffi.NULL:\n return None\n\n length = _lib.SSL_SESSION_get_master_key(session, _ffi.NULL, 0)\n assert length > 0\n outp = _no_zero_allocator(\"unsigned char[]\", length)\n _lib.SSL_SESSION_get_master_key(session, outp, length)\n return _ffi.buffer(outp, length)[:]"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef export_keying_material(self, label, olen, context=None):\n outp = _no_zero_allocator(\"unsigned char[]\", olen)\n context_buf = _ffi.NULL\n context_len = 0\n use_context = 0\n if context is not None:\n context_buf = context\n context_len = len(context)\n use_context = 1\n success = _lib.SSL_export_keying_material(self._ssl, outp, olen,\n label, len(label),\n context_buf, context_len,\n use_context)\n _openssl_assert(success == 1)\n return _ffi.buffer(outp, olen)[:]", "response": "Exports the keying material for application use."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_certificate(self):\n cert = _lib.SSL_get_certificate(self._ssl)\n if cert != _ffi.NULL:\n _lib.X509_up_ref(cert)\n return X509._from_raw_x509_ptr(cert)\n return None", "response": "Retrieve the local certificate."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nretrieving the other side's certificate", "response": "def get_peer_certificate(self):\n \"\"\"\n Retrieve the other side's certificate (if any)\n\n :return: The peer's certificate\n \"\"\"\n cert = _lib.SSL_get_peer_certificate(self._ssl)\n if cert != _ffi.NULL:\n return X509._from_raw_x509_ptr(cert)\n 
return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nretrieving the other side's certificate chain.", "response": "def get_peer_cert_chain(self):\n \"\"\"\n Retrieve the other side's certificate chain (if any)\n\n :return: A list of X509 instances giving the peer's certificate chain,\n or None if it does not have one.\n \"\"\"\n cert_stack = _lib.SSL_get_peer_cert_chain(self._ssl)\n if cert_stack == _ffi.NULL:\n return None\n\n result = []\n for i in range(_lib.sk_X509_num(cert_stack)):\n # TODO could incref instead of dup here\n cert = _lib.X509_dup(_lib.sk_X509_value(cert_stack, i))\n pycert = X509._from_raw_x509_ptr(cert)\n result.append(pycert)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn the currently used SSL session.", "response": "def get_session(self):\n \"\"\"\n Returns the Session currently used.\n\n :return: An instance of :class:`OpenSSL.SSL.Session` or\n :obj:`None` if no session exists.\n\n .. versionadded:: 0.14\n \"\"\"\n session = _lib.SSL_get1_session(self._ssl)\n if session == _ffi.NULL:\n return None\n\n pysession = Session.__new__(Session)\n pysession._session = _ffi.gc(session, _lib.SSL_SESSION_free)\n return pysession"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef set_session(self, session):\n if not isinstance(session, Session):\n raise TypeError(\"session must be a Session instance\")\n\n result = _lib.SSL_set_session(self._ssl, session._session)\n if not result:\n _raise_current_error()", "response": "Sets the session to be used when the TLS connection is established."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_finished_message(self, function):\n # The OpenSSL documentation says nothing about what might happen if the\n # count argument given is zero. 
Specifically, it doesn't say whether\n # the output buffer may be NULL in that case or not. Inspection of the\n # implementation reveals that it calls memcpy() unconditionally.\n # Section 7.1.4, paragraph 1 of the C standard suggests that\n # memcpy(NULL, source, 0) is not guaranteed to produce defined (let\n # alone desirable) behavior (though it probably does on just about\n # every implementation...)\n #\n # Allocate a tiny buffer to pass in (instead of just passing NULL as\n # one might expect) for the initial call so as to be safe against this\n # potentially undefined behavior.\n empty = _ffi.new(\"char[]\", 0)\n size = function(self._ssl, empty, 0)\n if size == 0:\n # No Finished message so far.\n return None\n\n buf = _no_zero_allocator(\"char[]\", size)\n function(self._ssl, buf, size)\n return _ffi.buffer(buf, size)[:]", "response": "Internal helper to get the finished message from the peer."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_cipher_name(self):\n cipher = _lib.SSL_get_current_cipher(self._ssl)\n if cipher == _ffi.NULL:\n return None\n else:\n name = _ffi.string(_lib.SSL_CIPHER_get_name(cipher))\n return name.decode(\"utf-8\")", "response": "Returns the name of the currently used cipher."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_cipher_bits(self):\n cipher = _lib.SSL_get_current_cipher(self._ssl)\n if cipher == _ffi.NULL:\n return None\n else:\n return _lib.SSL_CIPHER_get_bits(cipher, _ffi.NULL)", "response": "Returns the number of secret bits of the currently used cipher."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning the protocol version of the currently used cipher.", "response": "def get_cipher_version(self):\n \"\"\"\n Obtain the protocol version of the currently used cipher.\n\n :returns: The protocol name of the currently used cipher\n or :obj:`None` if no 
connection has been established.\n :rtype: :class:`unicode` or :class:`NoneType`\n\n .. versionadded:: 0.15\n \"\"\"\n cipher = _lib.SSL_get_current_cipher(self._ssl)\n if cipher == _ffi.NULL:\n return None\n else:\n version = _ffi.string(_lib.SSL_CIPHER_get_version(cipher))\n return version.decode(\"utf-8\")"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_protocol_version_name(self):\n version = _ffi.string(_lib.SSL_get_version(self._ssl))\n return version.decode(\"utf-8\")", "response": "Retrieves the protocol version of the current connection."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_next_proto_negotiated(self):\n _warn_npn()\n data = _ffi.new(\"unsigned char **\")\n data_len = _ffi.new(\"unsigned int *\")\n\n _lib.SSL_get0_next_proto_negotiated(self._ssl, data, data_len)\n\n return _ffi.buffer(data[0], data_len[0])[:]", "response": "Get the next protocol negotiated by NPN."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsetting the ALPN protocol list for the client.", "response": "def set_alpn_protos(self, protos):\n \"\"\"\n Specify the client's ALPN protocol list.\n\n These protocols are offered to the server during protocol negotiation.\n\n :param protos: A list of the protocols to be offered to the server.\n This list should be a Python list of bytestrings representing the\n protocols to offer, e.g. ``[b'http/1.1', b'spdy/2']``.\n \"\"\"\n # Take the list of protocols and join them together, prefixing them\n # with their lengths.\n protostr = b''.join(\n chain.from_iterable((int2byte(len(p)), p) for p in protos)\n )\n\n # Build a C string from the list. 
We don't need to save this off\n # because OpenSSL immediately copies the data out.\n input_str = _ffi.new(\"unsigned char[]\", protostr)\n _lib.SSL_set_alpn_protos(self._ssl, input_str, len(protostr))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the protocol that was negotiated by ALPN.", "response": "def get_alpn_proto_negotiated(self):\n \"\"\"\n Get the protocol that was negotiated by ALPN.\n\n :returns: A bytestring of the protocol name. If no protocol has been\n negotiated yet, returns an empty string.\n \"\"\"\n data = _ffi.new(\"unsigned char **\")\n data_len = _ffi.new(\"unsigned int *\")\n\n _lib.SSL_get0_alpn_selected(self._ssl, data, data_len)\n\n if not data_len:\n return b''\n\n return _ffi.buffer(data[0], data_len[0])[:]"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef request_ocsp(self):\n rc = _lib.SSL_set_tlsext_status_type(\n self._ssl, _lib.TLSEXT_STATUSTYPE_ocsp\n )\n _openssl_assert(rc == 1)", "response": "Request OCSP data from the server."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nallocates a new OpenSSL memory BIO.", "response": "def _new_mem_buf(buffer=None):\n \"\"\"\n Allocate a new OpenSSL memory BIO.\n\n Arrange for the garbage collector to clean it up automatically.\n\n :param buffer: None or some bytes to use to put into the BIO so that they\n can be read out.\n \"\"\"\n if buffer is None:\n bio = _lib.BIO_new(_lib.BIO_s_mem())\n free = _lib.BIO_free\n else:\n data = _ffi.new(\"char[]\", buffer)\n bio = _lib.BIO_new_mem_buf(data, len(buffer))\n\n # Keep the memory alive as long as the bio is alive!\n def free(bio, ref=data):\n return _lib.BIO_free(bio)\n\n _openssl_assert(bio != _ffi.NULL)\n\n bio = _ffi.gc(bio, free)\n return bio"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _bio_to_string(bio):\n result_buffer = _ffi.new('char**')\n 
buffer_length = _lib.BIO_get_mem_data(bio, result_buffer)\n return _ffi.buffer(result_buffer[0], buffer_length)[:]", "response": "Converts an OpenSSL BIO object into a Python byte string."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsetting the ASN1 time value for the current ASN1_TIME object.", "response": "def _set_asn1_time(boundary, when):\n \"\"\"\n Set the time value of an ASN1 time object.\n\n @param boundary: An ASN1_TIME pointer (or an object safely\n castable to that type) which will have its value set.\n @param when: A string representation of the desired time value.\n\n @raise TypeError: If C{when} is not a L{bytes} string.\n @raise ValueError: If C{when} does not represent a time in the required\n format.\n @raise RuntimeError: If the time value cannot be set for some other\n (unspecified) reason.\n \"\"\"\n if not isinstance(when, bytes):\n raise TypeError(\"when must be a byte string\")\n\n set_result = _lib.ASN1_TIME_set_string(boundary, when)\n if set_result == 0:\n raise ValueError(\"Invalid string\")"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nretrieving the time value of an ASN1 time object.", "response": "def _get_asn1_time(timestamp):\n \"\"\"\n Retrieve the time value of an ASN1 time object.\n\n @param timestamp: An ASN1_GENERALIZEDTIME* (or an object safely castable to\n that type) from which the time value will be retrieved.\n\n @return: The time value from C{timestamp} as a L{bytes} string in a certain\n format. 
Or C{None} if the object contains no time value.\n \"\"\"\n string_timestamp = _ffi.cast('ASN1_STRING*', timestamp)\n if _lib.ASN1_STRING_length(string_timestamp) == 0:\n return None\n elif (\n _lib.ASN1_STRING_type(string_timestamp) == _lib.V_ASN1_GENERALIZEDTIME\n ):\n return _ffi.string(_lib.ASN1_STRING_data(string_timestamp))\n else:\n generalized_timestamp = _ffi.new(\"ASN1_GENERALIZEDTIME**\")\n _lib.ASN1_TIME_to_generalizedtime(timestamp, generalized_timestamp)\n if generalized_timestamp[0] == _ffi.NULL:\n # This may happen:\n # - if timestamp was not an ASN1_TIME\n # - if allocating memory for the ASN1_GENERALIZEDTIME failed\n # - if a copy of the time data from timestamp cannot be made for\n # the newly allocated ASN1_GENERALIZEDTIME\n #\n # These are difficult to test. cffi enforces the ASN1_TIME type.\n # Memory allocation failures are a pain to trigger\n # deterministically.\n _untested_error(\"ASN1_TIME_to_generalizedtime\")\n else:\n string_timestamp = _ffi.cast(\n \"ASN1_STRING*\", generalized_timestamp[0])\n string_data = _lib.ASN1_STRING_data(string_timestamp)\n string_result = _ffi.string(string_data)\n _lib.ASN1_GENERALIZEDTIME_free(generalized_timestamp[0])\n return string_result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn a single OpenSSL curve object selected by name.", "response": "def get_elliptic_curve(name):\n \"\"\"\n Return a single curve object selected by name.\n\n See :py:func:`get_elliptic_curves` for information about curve objects.\n\n :param name: The OpenSSL short name identifying the curve object to\n retrieve.\n :type name: :py:class:`unicode`\n\n If the named curve is not supported then :py:class:`ValueError` is raised.\n \"\"\"\n for curve in get_elliptic_curves():\n if curve.name == name:\n return curve\n raise ValueError(\"unknown curve name\", name)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef load_certificate(type, 
buffer):\n if isinstance(buffer, _text_type):\n buffer = buffer.encode(\"ascii\")\n\n bio = _new_mem_buf(buffer)\n\n if type == FILETYPE_PEM:\n x509 = _lib.PEM_read_bio_X509(bio, _ffi.NULL, _ffi.NULL, _ffi.NULL)\n elif type == FILETYPE_ASN1:\n x509 = _lib.d2i_X509_bio(bio, _ffi.NULL)\n else:\n raise ValueError(\n \"type argument must be FILETYPE_PEM or FILETYPE_ASN1\")\n\n if x509 == _ffi.NULL:\n _raise_current_error()\n\n return X509._from_raw_x509_ptr(x509)", "response": "Load a certificate from the string buffer encoded with the type FILETYPE_PEM or FILETYPE_ASN1."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ndump the certificate into a buffer encoded with the type *type*.", "response": "def dump_certificate(type, cert):\n \"\"\"\n Dump the certificate *cert* into a buffer string encoded with the type\n *type*.\n\n :param type: The file type (one of FILETYPE_PEM, FILETYPE_ASN1, or\n FILETYPE_TEXT)\n :param cert: The certificate to dump\n :return: The buffer with the dumped certificate in\n \"\"\"\n bio = _new_mem_buf()\n\n if type == FILETYPE_PEM:\n result_code = _lib.PEM_write_bio_X509(bio, cert._x509)\n elif type == FILETYPE_ASN1:\n result_code = _lib.i2d_X509_bio(bio, cert._x509)\n elif type == FILETYPE_TEXT:\n result_code = _lib.X509_print_ex(bio, cert._x509, 0, 0)\n else:\n raise ValueError(\n \"type argument must be FILETYPE_PEM, FILETYPE_ASN1, or \"\n \"FILETYPE_TEXT\")\n\n assert result_code == 1\n return _bio_to_string(bio)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef dump_publickey(type, pkey):\n bio = _new_mem_buf()\n if type == FILETYPE_PEM:\n write_bio = _lib.PEM_write_bio_PUBKEY\n elif type == FILETYPE_ASN1:\n write_bio = _lib.i2d_PUBKEY_bio\n else:\n raise ValueError(\"type argument must be FILETYPE_PEM or FILETYPE_ASN1\")\n\n result_code = write_bio(bio, pkey._pkey)\n if result_code != 1: # pragma: no cover\n _raise_current_error()\n\n return 
_bio_to_string(bio)", "response": "Dump a public key to a buffer."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ndump the private key into a buffer of the specified type.", "response": "def dump_privatekey(type, pkey, cipher=None, passphrase=None):\n \"\"\"\n Dump the private key *pkey* into a buffer string encoded with the type\n *type*. Optionally (if *type* is :const:`FILETYPE_PEM`) encrypting it\n using *cipher* and *passphrase*.\n\n :param type: The file type (one of :const:`FILETYPE_PEM`,\n :const:`FILETYPE_ASN1`, or :const:`FILETYPE_TEXT`)\n :param PKey pkey: The PKey to dump\n :param cipher: (optional) if encrypted PEM format, the cipher to use\n :param passphrase: (optional) if encrypted PEM format, this can be either\n the passphrase to use, or a callback for providing the passphrase.\n\n :return: The buffer with the dumped key in\n :rtype: bytes\n \"\"\"\n bio = _new_mem_buf()\n\n if not isinstance(pkey, PKey):\n raise TypeError(\"pkey must be a PKey\")\n\n if cipher is not None:\n if passphrase is None:\n raise TypeError(\n \"if a value is given for cipher \"\n \"one must also be given for passphrase\")\n cipher_obj = _lib.EVP_get_cipherbyname(_byte_string(cipher))\n if cipher_obj == _ffi.NULL:\n raise ValueError(\"Invalid cipher name\")\n else:\n cipher_obj = _ffi.NULL\n\n helper = _PassphraseHelper(type, passphrase)\n if type == FILETYPE_PEM:\n result_code = _lib.PEM_write_bio_PrivateKey(\n bio, pkey._pkey, cipher_obj, _ffi.NULL, 0,\n helper.callback, helper.callback_args)\n helper.raise_if_problem()\n elif type == FILETYPE_ASN1:\n result_code = _lib.i2d_PrivateKey_bio(bio, pkey._pkey)\n elif type == FILETYPE_TEXT:\n if _lib.EVP_PKEY_id(pkey._pkey) != _lib.EVP_PKEY_RSA:\n raise TypeError(\"Only RSA keys are supported for FILETYPE_TEXT\")\n\n rsa = _ffi.gc(\n _lib.EVP_PKEY_get1_RSA(pkey._pkey),\n _lib.RSA_free\n )\n result_code = _lib.RSA_print(bio, rsa, 0)\n else:\n raise ValueError(\n \"type argument must be 
FILETYPE_PEM, FILETYPE_ASN1, or \"\n \"FILETYPE_TEXT\")\n\n _openssl_assert(result_code != 0)\n\n return _bio_to_string(bio)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nloads a public key from a buffer.", "response": "def load_publickey(type, buffer):\n \"\"\"\n Load a public key from a buffer.\n\n :param type: The file type (one of :data:`FILETYPE_PEM`,\n :data:`FILETYPE_ASN1`).\n :param buffer: The buffer the key is stored in.\n :type buffer: A Python string object, either unicode or bytestring.\n :return: The PKey object.\n :rtype: :class:`PKey`\n \"\"\"\n if isinstance(buffer, _text_type):\n buffer = buffer.encode(\"ascii\")\n\n bio = _new_mem_buf(buffer)\n\n if type == FILETYPE_PEM:\n evp_pkey = _lib.PEM_read_bio_PUBKEY(\n bio, _ffi.NULL, _ffi.NULL, _ffi.NULL)\n elif type == FILETYPE_ASN1:\n evp_pkey = _lib.d2i_PUBKEY_bio(bio, _ffi.NULL)\n else:\n raise ValueError(\"type argument must be FILETYPE_PEM or FILETYPE_ASN1\")\n\n if evp_pkey == _ffi.NULL:\n _raise_current_error()\n\n pkey = PKey.__new__(PKey)\n pkey._pkey = _ffi.gc(evp_pkey, _lib.EVP_PKEY_free)\n pkey._only_public = True\n return pkey"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nloads a private key from the string buffer encoded with the type .", "response": "def load_privatekey(type, buffer, passphrase=None):\n \"\"\"\n Load a private key (PKey) from the string *buffer* encoded with the type\n *type*.\n\n :param type: The file type (one of FILETYPE_PEM, FILETYPE_ASN1)\n :param buffer: The buffer the key is stored in\n :param passphrase: (optional) if encrypted PEM format, this can be\n either the passphrase to use, or a callback for\n providing the passphrase.\n\n :return: The PKey object\n \"\"\"\n if isinstance(buffer, _text_type):\n buffer = buffer.encode(\"ascii\")\n\n bio = _new_mem_buf(buffer)\n\n helper = _PassphraseHelper(type, passphrase)\n if type == FILETYPE_PEM:\n evp_pkey = 
_lib.PEM_read_bio_PrivateKey(\n bio, _ffi.NULL, helper.callback, helper.callback_args)\n helper.raise_if_problem()\n elif type == FILETYPE_ASN1:\n evp_pkey = _lib.d2i_PrivateKey_bio(bio, _ffi.NULL)\n else:\n raise ValueError(\"type argument must be FILETYPE_PEM or FILETYPE_ASN1\")\n\n if evp_pkey == _ffi.NULL:\n _raise_current_error()\n\n pkey = PKey.__new__(PKey)\n pkey._pkey = _ffi.gc(evp_pkey, _lib.EVP_PKEY_free)\n return pkey"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ndump the certificate request req into a buffer of the specified type.", "response": "def dump_certificate_request(type, req):\n \"\"\"\n Dump the certificate request *req* into a buffer string encoded with the\n type *type*.\n\n :param type: The file type (one of FILETYPE_PEM, FILETYPE_ASN1)\n :param req: The certificate request to dump\n :return: The buffer with the dumped certificate request in\n \"\"\"\n bio = _new_mem_buf()\n\n if type == FILETYPE_PEM:\n result_code = _lib.PEM_write_bio_X509_REQ(bio, req._req)\n elif type == FILETYPE_ASN1:\n result_code = _lib.i2d_X509_REQ_bio(bio, req._req)\n elif type == FILETYPE_TEXT:\n result_code = _lib.X509_REQ_print_ex(bio, req._req, 0, 0)\n else:\n raise ValueError(\n \"type argument must be FILETYPE_PEM, FILETYPE_ASN1, or \"\n \"FILETYPE_TEXT\"\n )\n\n _openssl_assert(result_code != 0)\n\n return _bio_to_string(bio)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef load_certificate_request(type, buffer):\n if isinstance(buffer, _text_type):\n buffer = buffer.encode(\"ascii\")\n\n bio = _new_mem_buf(buffer)\n\n if type == FILETYPE_PEM:\n req = _lib.PEM_read_bio_X509_REQ(bio, _ffi.NULL, _ffi.NULL, _ffi.NULL)\n elif type == FILETYPE_ASN1:\n req = _lib.d2i_X509_REQ_bio(bio, _ffi.NULL)\n else:\n raise ValueError(\"type argument must be FILETYPE_PEM or FILETYPE_ASN1\")\n\n _openssl_assert(req != _ffi.NULL)\n\n x509req = X509Req.__new__(X509Req)\n 
x509req._req = _ffi.gc(req, _lib.X509_REQ_free)\n return x509req", "response": "Load a certificate request from the string buffer encoded with\n the type *type*."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef sign(pkey, data, digest):\n data = _text_to_bytes_and_warn(\"data\", data)\n\n digest_obj = _lib.EVP_get_digestbyname(_byte_string(digest))\n if digest_obj == _ffi.NULL:\n raise ValueError(\"No such digest method\")\n\n md_ctx = _lib.Cryptography_EVP_MD_CTX_new()\n md_ctx = _ffi.gc(md_ctx, _lib.Cryptography_EVP_MD_CTX_free)\n\n _lib.EVP_SignInit(md_ctx, digest_obj)\n _lib.EVP_SignUpdate(md_ctx, data, len(data))\n\n length = _lib.EVP_PKEY_size(pkey._pkey)\n _openssl_assert(length > 0)\n signature_buffer = _ffi.new(\"unsigned char[]\", length)\n signature_length = _ffi.new(\"unsigned int *\")\n final_result = _lib.EVP_SignFinal(\n md_ctx, signature_buffer, signature_length, pkey._pkey)\n _openssl_assert(final_result == 1)\n\n return _ffi.buffer(signature_buffer, signature_length[0])[:]", "response": "Sign a data string using the given key and message digest."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef verify(cert, signature, data, digest):\n data = _text_to_bytes_and_warn(\"data\", data)\n\n digest_obj = _lib.EVP_get_digestbyname(_byte_string(digest))\n if digest_obj == _ffi.NULL:\n raise ValueError(\"No such digest method\")\n\n pkey = _lib.X509_get_pubkey(cert._x509)\n _openssl_assert(pkey != _ffi.NULL)\n pkey = _ffi.gc(pkey, _lib.EVP_PKEY_free)\n\n md_ctx = _lib.Cryptography_EVP_MD_CTX_new()\n md_ctx = _ffi.gc(md_ctx, _lib.Cryptography_EVP_MD_CTX_free)\n\n _lib.EVP_VerifyInit(md_ctx, digest_obj)\n _lib.EVP_VerifyUpdate(md_ctx, data, len(data))\n verify_result = _lib.EVP_VerifyFinal(\n md_ctx, signature, len(signature), pkey\n )\n\n if verify_result != 1:\n _raise_current_error()", "response": "Verify the signature for a 
data string."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef dump_crl(type, crl):\n bio = _new_mem_buf()\n\n if type == FILETYPE_PEM:\n ret = _lib.PEM_write_bio_X509_CRL(bio, crl._crl)\n elif type == FILETYPE_ASN1:\n ret = _lib.i2d_X509_CRL_bio(bio, crl._crl)\n elif type == FILETYPE_TEXT:\n ret = _lib.X509_CRL_print(bio, crl._crl)\n else:\n raise ValueError(\n \"type argument must be FILETYPE_PEM, FILETYPE_ASN1, or \"\n \"FILETYPE_TEXT\")\n\n assert ret == 1\n return _bio_to_string(bio)", "response": "Dump a certificate revocation list to a buffer."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nloads a certificate revocation list from a string buffer.", "response": "def load_crl(type, buffer):\n \"\"\"\n Load Certificate Revocation List (CRL) data from a string *buffer*\n encoded with the type *type*.\n\n :param type: The file type (one of FILETYPE_PEM, FILETYPE_ASN1)\n :param buffer: The buffer the CRL is stored in\n\n :return: The CRL object\n \"\"\"\n if isinstance(buffer, _text_type):\n buffer = buffer.encode(\"ascii\")\n\n bio = _new_mem_buf(buffer)\n\n if type == FILETYPE_PEM:\n crl = _lib.PEM_read_bio_X509_CRL(bio, _ffi.NULL, _ffi.NULL, _ffi.NULL)\n elif type == FILETYPE_ASN1:\n crl = _lib.d2i_X509_CRL_bio(bio, _ffi.NULL)\n else:\n raise ValueError(\"type argument must be FILETYPE_PEM or FILETYPE_ASN1\")\n\n if crl == _ffi.NULL:\n _raise_current_error()\n\n result = CRL.__new__(CRL)\n result._crl = _ffi.gc(crl, _lib.X509_CRL_free)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nloads the PKCS7 data from the string buffer encoded with the type *type*.", "response": "def load_pkcs7_data(type, buffer):\n \"\"\"\n Load pkcs7 data from the string *buffer* encoded with the type\n *type*.\n\n :param type: The file type (one of FILETYPE_PEM or FILETYPE_ASN1)\n :param buffer: The buffer with 
the pkcs7 data.\n :return: The PKCS7 object\n \"\"\"\n if isinstance(buffer, _text_type):\n buffer = buffer.encode(\"ascii\")\n\n bio = _new_mem_buf(buffer)\n\n if type == FILETYPE_PEM:\n pkcs7 = _lib.PEM_read_bio_PKCS7(bio, _ffi.NULL, _ffi.NULL, _ffi.NULL)\n elif type == FILETYPE_ASN1:\n pkcs7 = _lib.d2i_PKCS7_bio(bio, _ffi.NULL)\n else:\n raise ValueError(\"type argument must be FILETYPE_PEM or FILETYPE_ASN1\")\n\n if pkcs7 == _ffi.NULL:\n _raise_current_error()\n\n pypkcs7 = PKCS7.__new__(PKCS7)\n pypkcs7._pkcs7 = _ffi.gc(pkcs7, _lib.PKCS7_free)\n return pypkcs7"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef load_pkcs12(buffer, passphrase=None):\n passphrase = _text_to_bytes_and_warn(\"passphrase\", passphrase)\n\n if isinstance(buffer, _text_type):\n buffer = buffer.encode(\"ascii\")\n\n bio = _new_mem_buf(buffer)\n\n # Use null passphrase if passphrase is None or empty string. With PKCS#12\n # password based encryption no password and a zero length password are two\n # different things, but OpenSSL implementation will try both to figure out\n # which one works.\n if not passphrase:\n passphrase = _ffi.NULL\n\n p12 = _lib.d2i_PKCS12_bio(bio, _ffi.NULL)\n if p12 == _ffi.NULL:\n _raise_current_error()\n p12 = _ffi.gc(p12, _lib.PKCS12_free)\n\n pkey = _ffi.new(\"EVP_PKEY**\")\n cert = _ffi.new(\"X509**\")\n cacerts = _ffi.new(\"Cryptography_STACK_OF_X509**\")\n\n parse_result = _lib.PKCS12_parse(p12, passphrase, pkey, cert, cacerts)\n if not parse_result:\n _raise_current_error()\n\n cacerts = _ffi.gc(cacerts[0], _lib.sk_X509_free)\n\n # openssl 1.0.0 sometimes leaves an X509_check_private_key error in the\n # queue for no particular reason. This error isn't interesting to anyone\n # outside this function. It's not even interesting to us. 
Get rid of it.\n try:\n _raise_current_error()\n except Error:\n pass\n\n if pkey[0] == _ffi.NULL:\n pykey = None\n else:\n pykey = PKey.__new__(PKey)\n pykey._pkey = _ffi.gc(pkey[0], _lib.EVP_PKEY_free)\n\n if cert[0] == _ffi.NULL:\n pycert = None\n friendlyname = None\n else:\n pycert = X509._from_raw_x509_ptr(cert[0])\n\n friendlyname_length = _ffi.new(\"int*\")\n friendlyname_buffer = _lib.X509_alias_get0(\n cert[0], friendlyname_length\n )\n friendlyname = _ffi.buffer(\n friendlyname_buffer, friendlyname_length[0]\n )[:]\n if friendlyname_buffer == _ffi.NULL:\n friendlyname = None\n\n pycacerts = []\n for i in range(_lib.sk_X509_num(cacerts)):\n x509 = _lib.sk_X509_value(cacerts, i)\n pycacert = X509._from_raw_x509_ptr(x509)\n pycacerts.append(pycacert)\n if not pycacerts:\n pycacerts = None\n\n pkcs12 = PKCS12.__new__(PKCS12)\n pkcs12._pkey = pykey\n pkcs12._cert = pycert\n pkcs12._cacerts = pycacerts\n pkcs12._friendlyname = friendlyname\n return pkcs12", "response": "Load the PKCS12 data from the given buffer."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nexports as a cryptography key.", "response": "def to_cryptography_key(self):\n \"\"\"\n Export as a ``cryptography`` key.\n\n :rtype: One of ``cryptography``'s `key interfaces`_.\n\n .. _key interfaces: https://cryptography.io/en/latest/hazmat/\\\n primitives/asymmetric/rsa/#key-interfaces\n\n .. 
versionadded:: 16.1.0\n \"\"\"\n backend = _get_backend()\n if self._only_public:\n return backend._evp_pkey_to_public_key(self._pkey)\n else:\n return backend._evp_pkey_to_private_key(self._pkey)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef from_cryptography_key(cls, crypto_key):\n pkey = cls()\n if not isinstance(crypto_key, (rsa.RSAPublicKey, rsa.RSAPrivateKey,\n dsa.DSAPublicKey, dsa.DSAPrivateKey)):\n raise TypeError(\"Unsupported key type\")\n\n pkey._pkey = crypto_key._evp_pkey\n if isinstance(crypto_key, (rsa.RSAPublicKey, dsa.DSAPublicKey)):\n pkey._only_public = True\n pkey._initialized = True\n return pkey", "response": "Constructs a new PKey object based on a cryptography key."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef generate_key(self, type, bits):\n if not isinstance(type, int):\n raise TypeError(\"type must be an integer\")\n\n if not isinstance(bits, int):\n raise TypeError(\"bits must be an integer\")\n\n if type == TYPE_RSA:\n if bits <= 0:\n raise ValueError(\"Invalid number of bits\")\n\n # TODO Check error return\n exponent = _lib.BN_new()\n exponent = _ffi.gc(exponent, _lib.BN_free)\n _lib.BN_set_word(exponent, _lib.RSA_F4)\n\n rsa = _lib.RSA_new()\n\n result = _lib.RSA_generate_key_ex(rsa, bits, exponent, _ffi.NULL)\n _openssl_assert(result == 1)\n\n result = _lib.EVP_PKEY_assign_RSA(self._pkey, rsa)\n _openssl_assert(result == 1)\n\n elif type == TYPE_DSA:\n dsa = _lib.DSA_new()\n _openssl_assert(dsa != _ffi.NULL)\n\n dsa = _ffi.gc(dsa, _lib.DSA_free)\n res = _lib.DSA_generate_parameters_ex(\n dsa, bits, _ffi.NULL, 0, _ffi.NULL, _ffi.NULL, _ffi.NULL\n )\n _openssl_assert(res == 1)\n\n _openssl_assert(_lib.DSA_generate_key(dsa) == 1)\n _openssl_assert(_lib.EVP_PKEY_set1_DSA(self._pkey, dsa) == 1)\n else:\n raise Error(\"No such key type\")\n\n self._initialized = True", "response": "Generates a key pair of the 
given type with the given number of bits."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks the consistency of an RSA private key.", "response": "def check(self):\n \"\"\"\n Check the consistency of an RSA private key.\n\n This is the Python equivalent of OpenSSL's ``RSA_check_key``.\n\n :return: ``True`` if key is consistent.\n\n :raise OpenSSL.crypto.Error: if the key is inconsistent.\n\n :raise TypeError: if the key is of a type which cannot be checked.\n Only RSA keys can currently be checked.\n \"\"\"\n if self._only_public:\n raise TypeError(\"public key only\")\n\n if _lib.EVP_PKEY_type(self.type()) != _lib.EVP_PKEY_RSA:\n raise TypeError(\"key type unsupported\")\n\n rsa = _lib.EVP_PKEY_get1_RSA(self._pkey)\n rsa = _ffi.gc(rsa, _lib.RSA_free)\n result = _lib.RSA_check_key(rsa)\n if result:\n return True\n _raise_current_error()"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _load_elliptic_curves(cls, lib):\n num_curves = lib.EC_get_builtin_curves(_ffi.NULL, 0)\n builtin_curves = _ffi.new('EC_builtin_curve[]', num_curves)\n # The return value on this call should be num_curves again. We\n # could check it to make sure but if it *isn't* then.. what could\n # we do? Abort the whole process, I suppose...? 
-exarkun\n lib.EC_get_builtin_curves(builtin_curves, num_curves)\n return set(\n cls.from_nid(lib, c.nid)\n for c in builtin_curves)", "response": "Load the elliptic curves supported by OpenSSL."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _get_elliptic_curves(cls, lib):\n if cls._curves is None:\n cls._curves = cls._load_elliptic_curves(lib)\n return cls._curves", "response": "Get, cache, and return the elliptic curves supported by OpenSSL."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef from_nid(cls, lib, nid):\n return cls(lib, nid, _ffi.string(lib.OBJ_nid2sn(nid)).decode(\"ascii\"))", "response": "Instantiate a new _EllipticCurve object associated with the given OpenSSL NID."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _to_EC_KEY(self):\n key = self._lib.EC_KEY_new_by_curve_name(self._nid)\n return _ffi.gc(key, _lib.EC_KEY_free)", "response": "Create an OpenSSL EC_KEY structure initialized to use this curve."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn the DER encoding of this name.", "response": "def der(self):\n \"\"\"\n Return the DER encoding of this name.\n\n :return: The DER encoded form of this name.\n :rtype: :py:class:`bytes`\n \"\"\"\n result_buffer = _ffi.new('unsigned char**')\n encode_result = _lib.i2d_X509_NAME(self._name, result_buffer)\n _openssl_assert(encode_result >= 0)\n\n string_result = _ffi.buffer(result_buffer[0], encode_result)[:]\n _lib.OPENSSL_free(result_buffer[0])\n return string_result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the components of this name as a list of 2-tuples.", "response": "def get_components(self):\n \"\"\"\n Returns the components of this name, as a sequence of 2-tuples.\n\n :return: The components of this name.\n :rtype: :py:class:`list` of ``name, 
value`` tuples.\n \"\"\"\n result = []\n for i in range(_lib.X509_NAME_entry_count(self._name)):\n ent = _lib.X509_NAME_get_entry(self._name, i)\n\n fname = _lib.X509_NAME_ENTRY_get_object(ent)\n fval = _lib.X509_NAME_ENTRY_get_data(ent)\n\n nid = _lib.OBJ_obj2nid(fname)\n name = _lib.OBJ_nid2sn(nid)\n\n # ffi.string does not handle strings containing NULL bytes\n # (which may have been generated by old, broken software)\n value = _ffi.buffer(_lib.ASN1_STRING_data(fval),\n _lib.ASN1_STRING_length(fval))[:]\n result.append((_ffi.string(name), value))\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_short_name(self):\n obj = _lib.X509_EXTENSION_get_object(self._extension)\n nid = _lib.OBJ_obj2nid(obj)\n return _ffi.string(_lib.OBJ_nid2sn(nid))", "response": "Returns the short type name of this X. 509 extension."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the data of the X509 extension encoded as ASN. 1.", "response": "def get_data(self):\n \"\"\"\n Returns the data of the X509 extension, encoded as ASN.1.\n\n :return: The ASN.1 encoded data of this X509 extension.\n :rtype: :py:data:`bytes`\n\n .. versionadded:: 0.12\n \"\"\"\n octet_result = _lib.X509_EXTENSION_get_data(self._extension)\n string_result = _ffi.cast('ASN1_STRING*', octet_result)\n char_result = _lib.ASN1_STRING_data(string_result)\n result_length = _lib.ASN1_STRING_length(string_result)\n return _ffi.buffer(char_result, result_length)[:]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nexport as a cryptography certificate signing request.", "response": "def to_cryptography(self):\n \"\"\"\n Export as a ``cryptography`` certificate signing request.\n\n :rtype: ``cryptography.x509.CertificateSigningRequest``\n\n .. 
versionadded:: 17.1.0\n \"\"\"\n from cryptography.hazmat.backends.openssl.x509 import (\n _CertificateSigningRequest\n )\n backend = _get_backend()\n return _CertificateSigningRequest(backend, self._req)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nconstructs a new X509Req object based on a cryptography. x509. CertificateSigningRequest object.", "response": "def from_cryptography(cls, crypto_req):\n \"\"\"\n Construct based on a ``cryptography`` *crypto_req*.\n\n :param crypto_req: A ``cryptography`` X.509 certificate signing request\n :type crypto_req: ``cryptography.x509.CertificateSigningRequest``\n\n :rtype: X509Req\n\n .. versionadded:: 17.1.0\n \"\"\"\n if not isinstance(crypto_req, x509.CertificateSigningRequest):\n raise TypeError(\"Must be a certificate signing request\")\n\n req = cls()\n req._req = crypto_req._x509_req\n return req"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nset the public key of the certificate signing request.", "response": "def set_pubkey(self, pkey):\n \"\"\"\n Set the public key of the certificate signing request.\n\n :param pkey: The public key to use.\n :type pkey: :py:class:`PKey`\n\n :return: ``None``\n \"\"\"\n set_result = _lib.X509_REQ_set_pubkey(self._req, pkey._pkey)\n _openssl_assert(set_result == 1)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_pubkey(self):\n pkey = PKey.__new__(PKey)\n pkey._pkey = _lib.X509_REQ_get_pubkey(self._req)\n _openssl_assert(pkey._pkey != _ffi.NULL)\n pkey._pkey = _ffi.gc(pkey._pkey, _lib.EVP_PKEY_free)\n pkey._only_public = True\n return pkey", "response": "Get the public key of the certificate signing request."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsets the version subfield of the certificate request.", "response": "def set_version(self, version):\n \"\"\"\n Set the version subfield (RFC 
2459, section 4.1.2.1) of the certificate\n request.\n\n :param int version: The version number.\n :return: ``None``\n \"\"\"\n set_result = _lib.X509_REQ_set_version(self._req, version)\n _openssl_assert(set_result == 1)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_subject(self):\n name = X509Name.__new__(X509Name)\n name._name = _lib.X509_REQ_get_subject_name(self._req)\n _openssl_assert(name._name != _ffi.NULL)\n\n # The name is owned by the X509Req structure. As long as the X509Name\n # Python object is alive, keep the X509Req Python object alive.\n name._owner = self\n\n return name", "response": "Returns the subject of this certificate signing request."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef add_extensions(self, extensions):\n stack = _lib.sk_X509_EXTENSION_new_null()\n _openssl_assert(stack != _ffi.NULL)\n\n stack = _ffi.gc(stack, _lib.sk_X509_EXTENSION_free)\n\n for ext in extensions:\n if not isinstance(ext, X509Extension):\n raise ValueError(\"One of the elements is not an X509Extension\")\n\n # TODO push can fail (here and elsewhere)\n _lib.sk_X509_EXTENSION_push(stack, ext._extension)\n\n add_result = _lib.X509_REQ_add_extensions(self._req, stack)\n _openssl_assert(add_result == 1)", "response": "Add extensions to the certificate signing request."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the X.509 extensions in the certificate signing request.", "response": "def get_extensions(self):\n \"\"\"\n Get X.509 extensions in the certificate signing request.\n\n :return: The X.509 extensions in this request.\n :rtype: :py:class:`list` of :py:class:`X509Extension` objects.\n\n .. 
versionadded:: 0.15\n \"\"\"\n exts = []\n native_exts_obj = _lib.X509_REQ_get_extensions(self._req)\n for i in range(_lib.sk_X509_EXTENSION_num(native_exts_obj)):\n ext = X509Extension.__new__(X509Extension)\n ext._extension = _lib.sk_X509_EXTENSION_value(native_exts_obj, i)\n exts.append(ext)\n return exts"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef verify(self, pkey):\n if not isinstance(pkey, PKey):\n raise TypeError(\"pkey must be a PKey instance\")\n\n result = _lib.X509_REQ_verify(self._req, pkey._pkey)\n if result <= 0:\n _raise_current_error()\n\n return result", "response": "Verifies the signature on this certificate signing request."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nexport as a cryptography. x509. Certificate.", "response": "def to_cryptography(self):\n \"\"\"\n Export as a ``cryptography`` certificate.\n\n :rtype: ``cryptography.x509.Certificate``\n\n .. versionadded:: 17.1.0\n \"\"\"\n from cryptography.hazmat.backends.openssl.x509 import _Certificate\n backend = _get_backend()\n return _Certificate(backend, self._x509)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef from_cryptography(cls, crypto_cert):\n if not isinstance(crypto_cert, x509.Certificate):\n raise TypeError(\"Must be a certificate\")\n\n cert = cls()\n cert._x509 = crypto_cert._x509\n return cert", "response": "Constructs a new X509 certificate object from a cryptography. x509. Certificate object."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsets the version number of the certificate.", "response": "def set_version(self, version):\n \"\"\"\n Set the version number of the certificate. Note that the\n version value is zero-based, eg. 
a value of 0 is V1.\n\n :param version: The version number of the certificate.\n :type version: :py:class:`int`\n\n :return: ``None``\n \"\"\"\n if not isinstance(version, int):\n raise TypeError(\"version must be an integer\")\n\n _lib.X509_set_version(self._x509, version)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_pubkey(self):\n pkey = PKey.__new__(PKey)\n pkey._pkey = _lib.X509_get_pubkey(self._x509)\n if pkey._pkey == _ffi.NULL:\n _raise_current_error()\n pkey._pkey = _ffi.gc(pkey._pkey, _lib.EVP_PKEY_free)\n pkey._only_public = True\n return pkey", "response": "Get the public key of the certificate."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef set_pubkey(self, pkey):\n if not isinstance(pkey, PKey):\n raise TypeError(\"pkey must be a PKey instance\")\n\n set_result = _lib.X509_set_pubkey(self._x509, pkey._pkey)\n _openssl_assert(set_result == 1)", "response": "Set the public key of the certificate."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef sign(self, pkey, digest):\n if not isinstance(pkey, PKey):\n raise TypeError(\"pkey must be a PKey instance\")\n\n if pkey._only_public:\n raise ValueError(\"Key only has public part\")\n\n if not pkey._initialized:\n raise ValueError(\"Key is uninitialized\")\n\n evp_md = _lib.EVP_get_digestbyname(_byte_string(digest))\n if evp_md == _ffi.NULL:\n raise ValueError(\"No such digest method\")\n\n sign_result = _lib.X509_sign(self._x509, pkey._pkey, evp_md)\n _openssl_assert(sign_result > 0)", "response": "Sign the certificate with this key and the given message digest."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget the signature algorithm used in the certificate.", "response": "def get_signature_algorithm(self):\n \"\"\"\n Return the signature algorithm used in the certificate.\n\n 
:return: The name of the algorithm.\n :rtype: :py:class:`bytes`\n\n :raises ValueError: If the signature algorithm is undefined.\n\n .. versionadded:: 0.13\n \"\"\"\n algor = _lib.X509_get0_tbs_sigalg(self._x509)\n nid = _lib.OBJ_obj2nid(algor.algorithm)\n if nid == _lib.NID_undef:\n raise ValueError(\"Undefined signature algorithm\")\n return _ffi.string(_lib.OBJ_nid2ln(nid))"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns the digest of the object.", "response": "def digest(self, digest_name):\n \"\"\"\n Return the digest of the X509 object.\n\n :param digest_name: The name of the digest algorithm to use.\n :type digest_name: :py:class:`bytes`\n\n :return: The digest of the object, formatted as\n :py:const:`b\":\"`-delimited hex pairs.\n :rtype: :py:class:`bytes`\n \"\"\"\n digest = _lib.EVP_get_digestbyname(_byte_string(digest_name))\n if digest == _ffi.NULL:\n raise ValueError(\"No such digest method\")\n\n result_buffer = _ffi.new(\"unsigned char[]\", _lib.EVP_MAX_MD_SIZE)\n result_length = _ffi.new(\"unsigned int[]\", 1)\n result_length[0] = len(result_buffer)\n\n digest_result = _lib.X509_digest(\n self._x509, digest, result_buffer, result_length)\n _openssl_assert(digest_result == 1)\n\n return b\":\".join([\n b16encode(ch).upper() for ch\n in _ffi.buffer(result_buffer, result_length[0])])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nset the serial number of the certificate.", "response": "def set_serial_number(self, serial):\n \"\"\"\n Set the serial number of the certificate.\n\n :param serial: The new serial number.\n :type serial: :py:class:`int`\n\n :return: :py:data`None`\n \"\"\"\n if not isinstance(serial, _integer_types):\n raise TypeError(\"serial must be an integer\")\n\n hex_serial = hex(serial)[2:]\n if not isinstance(hex_serial, bytes):\n hex_serial = hex_serial.encode('ascii')\n\n bignum_serial = _ffi.new(\"BIGNUM**\")\n\n # BN_hex2bn stores the result in &bignum. 
Unless it doesn't feel like\n # it. If bignum is still NULL after this call, then the return value\n # is actually the result. I hope. -exarkun\n small_serial = _lib.BN_hex2bn(bignum_serial, hex_serial)\n\n if bignum_serial[0] == _ffi.NULL:\n set_result = _lib.ASN1_INTEGER_set(\n _lib.X509_get_serialNumber(self._x509), small_serial)\n if set_result:\n # TODO Not tested\n _raise_current_error()\n else:\n asn1_serial = _lib.BN_to_ASN1_INTEGER(bignum_serial[0], _ffi.NULL)\n _lib.BN_free(bignum_serial[0])\n if asn1_serial == _ffi.NULL:\n # TODO Not tested\n _raise_current_error()\n asn1_serial = _ffi.gc(asn1_serial, _lib.ASN1_INTEGER_free)\n set_result = _lib.X509_set_serialNumber(self._x509, asn1_serial)\n _openssl_assert(set_result == 1)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_serial_number(self):\n asn1_serial = _lib.X509_get_serialNumber(self._x509)\n bignum_serial = _lib.ASN1_INTEGER_to_BN(asn1_serial, _ffi.NULL)\n try:\n hex_serial = _lib.BN_bn2hex(bignum_serial)\n try:\n hexstring_serial = _ffi.string(hex_serial)\n serial = int(hexstring_serial, 16)\n return serial\n finally:\n _lib.OPENSSL_free(hex_serial)\n finally:\n _lib.BN_free(bignum_serial)", "response": "Return the serial number of this certificate."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nadjusting the time stamp on which the certificate stops being valid.", "response": "def gmtime_adj_notAfter(self, amount):\n \"\"\"\n Adjust the time stamp on which the certificate stops being valid.\n\n :param int amount: The number of seconds by which to adjust the\n timestamp.\n :return: ``None``\n \"\"\"\n if not isinstance(amount, int):\n raise TypeError(\"amount must be an integer\")\n\n notAfter = _lib.X509_get_notAfter(self._x509)\n _lib.X509_gmtime_adj(notAfter, amount)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef gmtime_adj_notBefore(self, 
amount):\n if not isinstance(amount, int):\n raise TypeError(\"amount must be an integer\")\n\n notBefore = _lib.X509_get_notBefore(self._x509)\n _lib.X509_gmtime_adj(notBefore, amount)", "response": "Adjust the timestamp on which the certificate starts being valid."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck whether the certificate has expired.", "response": "def has_expired(self):\n \"\"\"\n Check whether the certificate has expired.\n\n :return: ``True`` if the certificate has expired, ``False`` otherwise.\n :rtype: bool\n \"\"\"\n time_string = _native(self.get_notAfter())\n not_after = datetime.datetime.strptime(time_string, \"%Y%m%d%H%M%SZ\")\n\n return not_after < datetime.datetime.utcnow()"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_issuer(self):\n name = self._get_name(_lib.X509_get_issuer_name)\n self._issuer_invalidator.add(name)\n return name", "response": "Returns the issuer of this certificate."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef set_issuer(self, issuer):\n self._set_name(_lib.X509_set_issuer_name, issuer)\n self._issuer_invalidator.clear()", "response": "Sets the issuer of this certificate."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_subject(self):\n name = self._get_name(_lib.X509_get_subject_name)\n self._subject_invalidator.add(name)\n return name", "response": "Returns the subject of this certificate."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsets the subject of this certificate.", "response": "def set_subject(self, subject):\n \"\"\"\n Set the subject of this certificate.\n\n :param subject: The subject.\n :type subject: :py:class:`X509Name`\n\n :return: ``None``\n \"\"\"\n self._set_name(_lib.X509_set_subject_name, subject)\n 
self._subject_invalidator.clear()"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef add_extensions(self, extensions):\n for ext in extensions:\n if not isinstance(ext, X509Extension):\n raise ValueError(\"One of the elements is not an X509Extension\")\n\n add_result = _lib.X509_add_ext(self._x509, ext._extension, -1)\n if not add_result:\n _raise_current_error()", "response": "Add the extensions to the certificate."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_extension(self, index):\n ext = X509Extension.__new__(X509Extension)\n ext._extension = _lib.X509_get_ext(self._x509, index)\n if ext._extension == _ffi.NULL:\n raise IndexError(\"extension index out of bounds\")\n\n extension = _lib.X509_EXTENSION_dup(ext._extension)\n ext._extension = _ffi.gc(extension, _lib.X509_EXTENSION_free)\n return ext", "response": "Get a specific extension of the certificate by index."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef add_cert(self, cert):\n if not isinstance(cert, X509):\n raise TypeError()\n\n # As of OpenSSL 1.1.0i adding the same cert to the store more than\n # once doesn't cause an error. 
Accordingly, this code now silences\n # the error for OpenSSL < 1.1.0i as well.\n if _lib.X509_STORE_add_cert(self._store, cert._x509) == 0:\n code = _lib.ERR_peek_error()\n err_reason = _lib.ERR_GET_REASON(code)\n _openssl_assert(\n err_reason == _lib.X509_R_CERT_ALREADY_IN_HASH_TABLE\n )\n _lib.ERR_clear_error()", "response": "Adds a trusted certificate to the store."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nadd a certificate revocation list to this store.", "response": "def add_crl(self, crl):\n \"\"\"\n Add a certificate revocation list to this store.\n\n The certificate revocation lists added to a store will only be used if\n the associated flags are configured to check certificate revocation\n lists.\n\n .. versionadded:: 16.1.0\n\n :param CRL crl: The certificate revocation list to add to this store.\n :return: ``None`` if the certificate revocation list was added\n successfully.\n \"\"\"\n _openssl_assert(_lib.X509_STORE_add_crl(self._store, crl._crl) != 0)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef set_time(self, vfy_time):\n param = _lib.X509_VERIFY_PARAM_new()\n param = _ffi.gc(param, _lib.X509_VERIFY_PARAM_free)\n\n _lib.X509_VERIFY_PARAM_set_time(param, int(vfy_time.strftime('%s')))\n _openssl_assert(_lib.X509_STORE_set1_param(self._store, param) != 0)", "response": "Sets the time for which the certificates are verified."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _init(self):\n ret = _lib.X509_STORE_CTX_init(\n self._store_ctx, self._store._store, self._cert._x509, _ffi.NULL\n )\n if ret <= 0:\n _raise_current_error()", "response": "Initialize the store context for subsequent verification operations."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nconverts an OpenSSL native context error failure into a Python Exception.", 
"response": "def _exception_from_context(self):\n \"\"\"\n Convert an OpenSSL native context error failure into a Python\n exception.\n\n When a call to native OpenSSL X509_verify_cert fails, additional\n information about the failure can be obtained from the store context.\n \"\"\"\n errors = [\n _lib.X509_STORE_CTX_get_error(self._store_ctx),\n _lib.X509_STORE_CTX_get_error_depth(self._store_ctx),\n _native(_ffi.string(_lib.X509_verify_cert_error_string(\n _lib.X509_STORE_CTX_get_error(self._store_ctx)))),\n ]\n # A context error should always be associated with a certificate, so we\n # expect this call to never return :class:`None`.\n _x509 = _lib.X509_STORE_CTX_get_current_cert(self._store_ctx)\n _cert = _lib.X509_dup(_x509)\n pycert = X509._from_raw_x509_ptr(_cert)\n return X509StoreContextError(errors, pycert)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nverifies a certificate in a context.", "response": "def verify_certificate(self):\n \"\"\"\n Verify a certificate in a context.\n\n .. versionadded:: 0.15\n\n :raises X509StoreContextError: If an error occurred when validating a\n certificate in the context. 
Sets ``certificate`` attribute to\n indicate which certificate caused the error.\n \"\"\"\n # Always re-initialize the store context in case\n # :meth:`verify_certificate` is called multiple times.\n #\n # :meth:`_init` is called in :meth:`__init__` so _cleanup is called\n # before _init to ensure memory is not leaked.\n self._cleanup()\n self._init()\n ret = _lib.X509_verify_cert(self._store_ctx)\n self._cleanup()\n if ret <= 0:\n raise self._exception_from_context()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsetting the serial number of the revoked certificate entry.", "response": "def set_serial(self, hex_str):\n \"\"\"\n Set the serial number.\n\n The serial number is formatted as a hexadecimal number encoded in\n ASCII.\n\n :param bytes hex_str: The new serial number.\n\n :return: ``None``\n \"\"\"\n bignum_serial = _ffi.gc(_lib.BN_new(), _lib.BN_free)\n bignum_ptr = _ffi.new(\"BIGNUM**\")\n bignum_ptr[0] = bignum_serial\n bn_result = _lib.BN_hex2bn(bignum_ptr, hex_str)\n if not bn_result:\n raise ValueError(\"bad hex string\")\n\n asn1_serial = _ffi.gc(\n _lib.BN_to_ASN1_INTEGER(bignum_serial, _ffi.NULL),\n _lib.ASN1_INTEGER_free)\n _lib.X509_REVOKED_set_serialNumber(self._revoked, asn1_serial)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting the serial number of the revoked certificate entry.", "response": "def get_serial(self):\n \"\"\"\n Get the serial number.\n\n The serial number is formatted as a hexadecimal number encoded in\n ASCII.\n\n :return: The serial number.\n :rtype: bytes\n \"\"\"\n bio = _new_mem_buf()\n\n asn1_int = _lib.X509_REVOKED_get0_serialNumber(self._revoked)\n _openssl_assert(asn1_int != _ffi.NULL)\n result = _lib.i2a_ASN1_INTEGER(bio, asn1_int)\n _openssl_assert(result >= 0)\n return _bio_to_string(bio)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef set_reason(self, reason):\n if reason is None:\n 
self._delete_reason()\n elif not isinstance(reason, bytes):\n raise TypeError(\"reason must be None or a byte string\")\n else:\n reason = reason.lower().replace(b' ', b'')\n reason_code = [r.lower() for r in self._crl_reasons].index(reason)\n\n new_reason_ext = _lib.ASN1_ENUMERATED_new()\n _openssl_assert(new_reason_ext != _ffi.NULL)\n new_reason_ext = _ffi.gc(new_reason_ext, _lib.ASN1_ENUMERATED_free)\n\n set_result = _lib.ASN1_ENUMERATED_set(new_reason_ext, reason_code)\n _openssl_assert(set_result != _ffi.NULL)\n\n self._delete_reason()\n add_result = _lib.X509_REVOKED_add1_ext_i2d(\n self._revoked, _lib.NID_crl_reason, new_reason_ext, 0, 0)\n _openssl_assert(add_result == 1)", "response": "Sets the revocation reason of this entry."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget the reason of this revocation.", "response": "def get_reason(self):\n \"\"\"\n Get the reason of this revocation.\n\n :return: The reason, or ``None`` if there is none.\n :rtype: bytes or NoneType\n\n .. 
seealso::\n\n :meth:`all_reasons`, which gives you a list of all supported\n reasons this method might return.\n \"\"\"\n for i in range(_lib.X509_REVOKED_get_ext_count(self._revoked)):\n ext = _lib.X509_REVOKED_get_ext(self._revoked, i)\n obj = _lib.X509_EXTENSION_get_object(ext)\n if _lib.OBJ_obj2nid(obj) == _lib.NID_crl_reason:\n bio = _new_mem_buf()\n\n print_result = _lib.X509V3_EXT_print(bio, ext, 0, 0)\n if not print_result:\n print_result = _lib.M_ASN1_OCTET_STRING_print(\n bio, _lib.X509_EXTENSION_get_data(ext)\n )\n _openssl_assert(print_result != 0)\n\n return _bio_to_string(bio)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsetting the revocation date.", "response": "def set_rev_date(self, when):\n \"\"\"\n Set the revocation timestamp.\n\n :param bytes when: The timestamp of the revocation,\n as ASN.1 TIME.\n :return: ``None``\n \"\"\"\n dt = _lib.X509_REVOKED_get0_revocationDate(self._revoked)\n return _set_asn1_time(dt, when)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nexporting as a cryptography. x509. CertificateRevocationList.", "response": "def to_cryptography(self):\n \"\"\"\n Export as a ``cryptography`` CRL.\n\n :rtype: ``cryptography.x509.CertificateRevocationList``\n\n .. versionadded:: 17.1.0\n \"\"\"\n from cryptography.hazmat.backends.openssl.x509 import (\n _CertificateRevocationList\n )\n backend = _get_backend()\n return _CertificateRevocationList(backend, self._crl)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nconstructs a CRL object from a cryptography. x509. CertificateRevocationList.", "response": "def from_cryptography(cls, crypto_crl):\n \"\"\"\n Construct based on a ``cryptography`` *crypto_crl*.\n\n :param crypto_crl: A ``cryptography`` certificate revocation list\n :type crypto_crl: ``cryptography.x509.CertificateRevocationList``\n\n :rtype: CRL\n\n .. 
versionadded:: 17.1.0\n \"\"\"\n if not isinstance(crypto_crl, x509.CertificateRevocationList):\n raise TypeError(\"Must be a certificate revocation list\")\n\n crl = cls()\n crl._crl = crypto_crl._x509_crl\n return crl"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_revoked(self):\n results = []\n revoked_stack = _lib.X509_CRL_get_REVOKED(self._crl)\n for i in range(_lib.sk_X509_REVOKED_num(revoked_stack)):\n revoked = _lib.sk_X509_REVOKED_value(revoked_stack, i)\n revoked_copy = _lib.Cryptography_X509_REVOKED_dup(revoked)\n pyrev = Revoked.__new__(Revoked)\n pyrev._revoked = _ffi.gc(revoked_copy, _lib.X509_REVOKED_free)\n results.append(pyrev)\n if results:\n return tuple(results)", "response": "Return the revoked entries of this CRL."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding a revocation to the CRL structure.", "response": "def add_revoked(self, revoked):\n \"\"\"\n Add a revoked (by value not reference) to the CRL structure\n\n This revocation will be added by value, not by reference. That\n means it's okay to mutate it after adding: it won't affect\n this CRL.\n\n :param Revoked revoked: The new revocation.\n :return: ``None``\n \"\"\"\n copy = _lib.Cryptography_X509_REVOKED_dup(revoked._revoked)\n _openssl_assert(copy != _ffi.NULL)\n\n add_result = _lib.X509_CRL_add0_revoked(self._crl, copy)\n _openssl_assert(add_result != 0)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngets the CRL's issuer.", "response": "def get_issuer(self):\n \"\"\"\n Get the CRL's issuer.\n\n .. 
versionadded:: 16.1.0\n\n :rtype: X509Name\n \"\"\"\n _issuer = _lib.X509_NAME_dup(_lib.X509_CRL_get_issuer(self._crl))\n _openssl_assert(_issuer != _ffi.NULL)\n _issuer = _ffi.gc(_issuer, _lib.X509_NAME_free)\n issuer = X509Name.__new__(X509Name)\n issuer._name = _issuer\n return issuer"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef sign(self, issuer_cert, issuer_key, digest):\n digest_obj = _lib.EVP_get_digestbyname(digest)\n _openssl_assert(digest_obj != _ffi.NULL)\n _lib.X509_CRL_set_issuer_name(\n self._crl, _lib.X509_get_subject_name(issuer_cert._x509))\n _lib.X509_CRL_sort(self._crl)\n result = _lib.X509_CRL_sign(self._crl, issuer_key._pkey, digest_obj)\n _openssl_assert(result != 0)", "response": "Signs the CRL with the given certificate and private key."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef export(self, cert, key, type=FILETYPE_PEM, days=100,\n digest=_UNSPECIFIED):\n \"\"\"\n Export the CRL as a string.\n\n :param X509 cert: The certificate used to sign the CRL.\n :param PKey key: The key used to sign the CRL.\n :param int type: The export format, either :data:`FILETYPE_PEM`,\n :data:`FILETYPE_ASN1`, or :data:`FILETYPE_TEXT`.\n :param int days: The number of days until the next update of this CRL.\n :param bytes digest: The name of the message digest to use (eg\n ``b\"sha256\"``).\n :rtype: bytes\n \"\"\"\n\n if not isinstance(cert, X509):\n raise TypeError(\"cert must be an X509 instance\")\n if not isinstance(key, PKey):\n raise TypeError(\"key must be a PKey instance\")\n if not isinstance(type, int):\n raise TypeError(\"type must be an integer\")\n\n if digest is _UNSPECIFIED:\n raise TypeError(\"digest must be provided\")\n\n digest_obj = _lib.EVP_get_digestbyname(digest)\n if digest_obj == _ffi.NULL:\n raise ValueError(\"No such digest method\")\n\n bio = _lib.BIO_new(_lib.BIO_s_mem())\n 
_openssl_assert(bio != _ffi.NULL)\n\n # A scratch time object to give different values to different CRL\n # fields\n sometime = _lib.ASN1_TIME_new()\n _openssl_assert(sometime != _ffi.NULL)\n\n _lib.X509_gmtime_adj(sometime, 0)\n _lib.X509_CRL_set_lastUpdate(self._crl, sometime)\n\n _lib.X509_gmtime_adj(sometime, days * 24 * 60 * 60)\n _lib.X509_CRL_set_nextUpdate(self._crl, sometime)\n\n _lib.X509_CRL_set_issuer_name(\n self._crl, _lib.X509_get_subject_name(cert._x509)\n )\n\n sign_result = _lib.X509_CRL_sign(self._crl, key._pkey, digest_obj)\n if not sign_result:\n _raise_current_error()\n\n return dump_crl(type, self)", "response": "Exports the CRL as a string."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the type name of the PKCS7 structure", "response": "def get_type_name(self):\n \"\"\"\n Returns the type name of the PKCS7 structure\n\n :return: A string with the typename\n \"\"\"\n nid = _lib.OBJ_obj2nid(self._pkcs7.type)\n string_type = _lib.OBJ_nid2sn(nid)\n return _ffi.string(string_type)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsets the certificate in the PKCS #12 structure.", "response": "def set_certificate(self, cert):\n \"\"\"\n Set the certificate in the PKCS #12 structure.\n\n :param cert: The new certificate, or :py:const:`None` to unset it.\n :type cert: :py:class:`X509` or :py:const:`None`\n\n :return: ``None``\n \"\"\"\n if not isinstance(cert, X509):\n raise TypeError(\"cert must be an X509 instance\")\n self._cert = cert"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nset the private key portion of the PKCS #12 structure.", "response": "def set_privatekey(self, pkey):\n \"\"\"\n Set the private key portion of the PKCS #12 structure.\n\n :param pkey: The new private key, or :py:const:`None` to unset it.\n :type pkey: :py:class:`PKey` or :py:const:`None`\n\n :return: ``None``\n \"\"\"\n if not isinstance(pkey, 
PKey):\n raise TypeError(\"pkey must be a PKey instance\")\n self._pkey = pkey"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreplaces or set the CA certificates within the PKCS12 object.", "response": "def set_ca_certificates(self, cacerts):\n \"\"\"\n Replace or set the CA certificates within the PKCS12 object.\n\n :param cacerts: The new CA certificates, or :py:const:`None` to unset\n them.\n :type cacerts: An iterable of :py:class:`X509` or :py:const:`None`\n\n :return: ``None``\n \"\"\"\n if cacerts is None:\n self._cacerts = None\n else:\n cacerts = list(cacerts)\n for cert in cacerts:\n if not isinstance(cert, X509):\n raise TypeError(\n \"iterable must only contain X509 instances\"\n )\n self._cacerts = cacerts"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef set_friendlyname(self, name):\n if name is None:\n self._friendlyname = None\n elif not isinstance(name, bytes):\n raise TypeError(\n \"name must be a byte string or None (not %r)\" % (name,)\n )\n self._friendlyname = name", "response": "Set the friendly name in the PKCS #12 structure."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndumping a PKCS12 object as a string.", "response": "def export(self, passphrase=None, iter=2048, maciter=1):\n \"\"\"\n Dump a PKCS12 object as a string.\n\n For more information, see the :c:func:`PKCS12_create` man page.\n\n :param passphrase: The passphrase used to encrypt the structure. 
Unlike\n some other passphrase arguments, this *must* be a string, not a\n callback.\n :type passphrase: :py:data:`bytes`\n\n :param iter: Number of times to repeat the encryption step.\n :type iter: :py:data:`int`\n\n :param maciter: Number of times to repeat the MAC step.\n :type maciter: :py:data:`int`\n\n :return: The string representation of the PKCS #12 structure.\n :rtype:\n \"\"\"\n passphrase = _text_to_bytes_and_warn(\"passphrase\", passphrase)\n\n if self._cacerts is None:\n cacerts = _ffi.NULL\n else:\n cacerts = _lib.sk_X509_new_null()\n cacerts = _ffi.gc(cacerts, _lib.sk_X509_free)\n for cert in self._cacerts:\n _lib.sk_X509_push(cacerts, cert._x509)\n\n if passphrase is None:\n passphrase = _ffi.NULL\n\n friendlyname = self._friendlyname\n if friendlyname is None:\n friendlyname = _ffi.NULL\n\n if self._pkey is None:\n pkey = _ffi.NULL\n else:\n pkey = self._pkey._pkey\n\n if self._cert is None:\n cert = _ffi.NULL\n else:\n cert = self._cert._x509\n\n pkcs12 = _lib.PKCS12_create(\n passphrase, friendlyname, pkey, cert, cacerts,\n _lib.NID_pbe_WithSHA1And3_Key_TripleDES_CBC,\n _lib.NID_pbe_WithSHA1And3_Key_TripleDES_CBC,\n iter, maciter, 0)\n if pkcs12 == _ffi.NULL:\n _raise_current_error()\n pkcs12 = _ffi.gc(pkcs12, _lib.PKCS12_free)\n\n bio = _new_mem_buf()\n _lib.i2d_PKCS12_bio(bio, pkcs12)\n return _bio_to_string(bio)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsigning the certificate request with this key and the given message digest.", "response": "def sign(self, pkey, digest):\n \"\"\"\n Sign the certificate request with this key and digest type.\n\n :param pkey: The private key to sign with.\n :type pkey: :py:class:`PKey`\n\n :param digest: The message digest to use.\n :type digest: :py:class:`bytes`\n\n :return: ``None``\n \"\"\"\n if pkey._only_public:\n raise ValueError(\"Key has only public part\")\n\n if not pkey._initialized:\n raise ValueError(\"Key is uninitialized\")\n\n digest_obj = 
_lib.EVP_get_digestbyname(_byte_string(digest))\n if digest_obj == _ffi.NULL:\n raise ValueError(\"No such digest method\")\n\n sign_result = _lib.NETSCAPE_SPKI_sign(\n self._spki, pkey._pkey, digest_obj\n )\n _openssl_assert(sign_result > 0)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef verify(self, key):\n answer = _lib.NETSCAPE_SPKI_verify(self._spki, key._pkey)\n if answer <= 0:\n _raise_current_error()\n return True", "response": "Verifies a signature on a certificate request."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ngenerate a base64 encoded representation of this SPKI object.", "response": "def b64_encode(self):\n \"\"\"\n Generate a base64 encoded representation of this SPKI object.\n\n :return: The base64 encoded string.\n :rtype: :py:class:`bytes`\n \"\"\"\n encoded = _lib.NETSCAPE_SPKI_b64_encode(self._spki)\n result = _ffi.string(encoded)\n _lib.OPENSSL_free(encoded)\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_pubkey(self):\n pkey = PKey.__new__(PKey)\n pkey._pkey = _lib.NETSCAPE_SPKI_get_pubkey(self._spki)\n _openssl_assert(pkey._pkey != _ffi.NULL)\n pkey._pkey = _ffi.gc(pkey._pkey, _lib.EVP_PKEY_free)\n pkey._only_public = True\n return pkey", "response": "Get the public key of this certificate."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef set_pubkey(self, pkey):\n set_result = _lib.NETSCAPE_SPKI_set_pubkey(self._spki, pkey._pkey)\n _openssl_assert(set_result == 1)", "response": "Set the public key of the certificate\n ."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef exception_from_error_queue(exception_type):\n errors = []\n\n while True:\n error = lib.ERR_get_error()\n if error == 0:\n break\n errors.append((\n 
text(lib.ERR_lib_error_string(error)),\n text(lib.ERR_func_error_string(error)),\n text(lib.ERR_reason_error_string(error))))\n\n raise exception_type(errors)", "response": "Convert an OpenSSL library failure into a Python exception."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef native(s):\n if not isinstance(s, (binary_type, text_type)):\n raise TypeError(\"%r is neither bytes nor unicode\" % s)\n if PY3:\n if isinstance(s, binary_type):\n return s.decode(\"utf-8\")\n else:\n if isinstance(s, text_type):\n return s.encode(\"utf-8\")\n return s", "response": "Convert a string to the native version of the object."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef path_string(s):\n if isinstance(s, binary_type):\n return s\n elif isinstance(s, text_type):\n return s.encode(sys.getfilesystemencoding())\n else:\n raise TypeError(\"Path must be represented as bytes or unicode string\")", "response": "Converts a Python string to a bytes string identifying the same\n path and which can be passed into OpenSSL API accepting a filename."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef text_to_bytes_and_warn(label, obj):\n if isinstance(obj, text_type):\n warnings.warn(\n _TEXT_WARNING.format(label),\n category=DeprecationWarning,\n stacklevel=3\n )\n return obj.encode('utf-8')\n return obj", "response": "Convert a text string to bytes and emit a warning if it is not."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef add(buffer, entropy):\n if not isinstance(buffer, bytes):\n raise TypeError(\"buffer must be a byte string\")\n\n if not isinstance(entropy, int):\n raise TypeError(\"entropy must be an integer\")\n\n _lib.RAND_add(buffer, len(buffer), entropy)", "response": "Add a random bytes from a byte string into the CSPRNG state."} {"SOURCE": 
"codesearchnet", "instruction": "Create a Python 3 function for\naccepting a connection and returning a new SSLWrapper object", "response": "def accept(self):\n \"\"\"\n This is the other part of the shutdown() workaround.\n Since servers create new sockets, we have to infect\n them with our magic. :)\n \"\"\"\n c, a = self.__dict__[\"conn\"].accept()\n return (SSLWrapper(c), a)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef setup(self):\n self.connection = self.request # for doPOST\n self.rfile = socket._fileobject(self.request, \"rb\", self.rbufsize)\n self.wfile = socket._fileobject(self.request, \"wb\", self.wbufsize)", "response": "Set up the file-like objects for the SSL connection."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nrun an SNI-enabled server which selects between a few certificates in a C{dict} based on the handshake request it receives from a client.", "response": "def main():\n \"\"\"\n Run an SNI-enabled server which selects between a few certificates in a\n C{dict} based on the handshake request it receives from a client.\n \"\"\"\n port = socket()\n port.setsockopt(SOL_SOCKET, SO_REUSEADDR, 1)\n port.bind(('', 8443))\n port.listen(3)\n\n print('Accepting...', end=\"\")\n stdout.flush()\n server, addr = port.accept()\n print('accepted', addr)\n\n server_context = Context(TLSv1_METHOD)\n server_context.set_tlsext_servername_callback(pick_certificate)\n\n server_ssl = Connection(server_context, server)\n server_ssl.set_accept_state()\n server_ssl.do_handshake()\n server.close()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _print_token_factory(col):\n def _helper(msg):\n style = style_from_dict({\n Token.Color: col,\n })\n tokens = [\n (Token.Color, msg)\n ]\n print_tokens(tokens, style=style)\n\n def _helper_no_terminal(msg):\n # workaround if we have no 
terminal\n print(msg)\n if sys.stdout.isatty():\n return _helper\n else:\n return _helper_no_terminal", "response": "Internal helper to provide color names."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_service_metadata(self):\n return {\n 'import_labels_as_tags':\n self.config.get('import_labels_as_tags', False, asbool),\n 'label_template':\n self.config.get('label_template', DEFAULT_LABEL_TEMPLATE),\n }", "response": "Return extra config options to be passed to the TrelloIssue class"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns a list of dicts representing issues from a remote service.", "response": "def issues(self):\n \"\"\"\n Returns a list of dicts representing issues from a remote service.\n \"\"\"\n for board in self.get_boards():\n for lst in self.get_lists(board['id']):\n listextra = dict(boardname=board['name'], listname=lst['name'])\n for card in self.get_cards(lst['id']):\n issue = self.get_issue_for_record(card, extra=listextra)\n issue.update_extra({\"annotations\": self.annotations(card)})\n yield issue"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_boards(self):\n if 'include_boards' in self.config:\n for boardid in self.config.get('include_boards', to_type=aslist):\n # Get the board name\n yield self.api_request(\n \"/1/boards/{id}\".format(id=boardid), fields='name')\n else:\n boards = self.api_request(\"/1/members/me/boards\", fields='name')\n for board in boards:\n yield board", "response": "Get the list of boards to pull cards from."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a list of the filtered lists for the given board", "response": "def get_lists(self, board):\n \"\"\"\n Returns a list of the filtered lists for the given board\n This filters the trello lists according to the configuration values 
of\n trello.include_lists and trello.exclude_lists.\n \"\"\"\n lists = self.api_request(\n \"/1/boards/{board_id}/lists/open\".format(board_id=board),\n fields='name')\n\n include_lists = self.config.get('include_lists', to_type=aslist)\n if include_lists:\n lists = [l for l in lists if l['name'] in include_lists]\n\n exclude_lists = self.config.get('exclude_lists', to_type=aslist)\n if exclude_lists:\n lists = [l for l in lists if l['name'] not in exclude_lists]\n\n return lists"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn an iterator for the cards in a given list filtered by the configuration values of only_if_assigned and also_unassigned.", "response": "def get_cards(self, list_id):\n \"\"\" Returns an iterator for the cards in a given list, filtered\n according to configuration values of trello.only_if_assigned and\n trello.also_unassigned \"\"\"\n params = {'fields': 'name,idShort,shortLink,shortUrl,url,labels,due'}\n member = self.config.get('only_if_assigned', None)\n unassigned = self.config.get('also_unassigned', False, asbool)\n if member is not None:\n params['members'] = 'true'\n params['member_fields'] = 'username'\n cards = self.api_request(\n \"/1/lists/{list_id}/cards/open\".format(list_id=list_id),\n **params)\n for card in cards:\n if (member is None\n or member in [m['username'] for m in card['members']]\n or (unassigned and not card['members'])):\n yield card"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning an iterator for the comments on a certain card.", "response": "def get_comments(self, card_id):\n \"\"\" Returns an iterator for the comments on a certain card. 
\"\"\"\n params = {'filter': 'commentCard', 'memberCreator_fields': 'username'}\n comments = self.api_request(\n \"/1/cards/{card_id}/actions\".format(card_id=card_id),\n **params)\n for comment in comments:\n assert comment['type'] == 'commentCard'\n yield comment"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef api_request(self, url, **params):\n params['key'] = self.config.get('api_key'),\n params['token'] = self.config.get('token'),\n url = \"https://api.trello.com\" + url\n return self.json_response(requests.get(url, params=params))", "response": "Make a Trello API request. This takes a URL path and keyword arguments, adds the API key and token from the configuration, and returns the parsed JSON response of a GET request."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting all the issues for a given repository.", "response": "def get_issues(self, repo, keys):\n \"\"\" Grab all the issues \"\"\"\n key1, key2 = keys\n key3 = key1[:-1] # Just the singular form of key1\n\n url = self.base_url + \"/api/0/\" + repo + \"/\" + key1\n response = self.session.get(url, params=dict(status='Open'))\n\n if not bool(response):\n error = response.json()\n code = error['error_code']\n if code == 'ETRACKERDISABLED':\n return []\n else:\n raise IOError('Failed to talk to %r %r' % (url, error))\n\n issues = []\n for result in response.json()[key2]:\n idx = six.text_type(result['id'])\n result['html_url'] = \"/\".join([self.base_url, repo, key3, idx])\n issues.append((repo, result))\n\n return issues"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nyield the tasks assigned to the user for a given project", "response": "def get_issue_generator(self, user_id, project_id, project_name):\n \"\"\"\n Approach:\n\n 1. Get user ID from bugwarriorrc file\n 2. Get list of tickets from /user-tasks for a given project\n 3. 
For each ticket/task returned from #2, get ticket/task info and\n check if logged-in user is primary (look at `is_owner` and\n `user_id`)\n \"\"\"\n\n user_tasks_data = self.call_api(\n \"/projects/\" + six.text_type(project_id) + \"/user-tasks\")\n\n for key, task in enumerate(user_tasks_data):\n\n assigned_task = self.get_task_dict(project_id, key, task)\n\n if assigned_task:\n log.debug(\n \" Adding '\" + assigned_task['description'] +\n \"' to task list.\")\n yield assigned_task"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nbuild the full url to the API endpoint", "response": "def _api_url(self, path, **context):\n \"\"\" Build the full url to the API endpoint \"\"\"\n if self.host == 'github.com':\n baseurl = \"https://api.github.com\"\n else:\n baseurl = \"https://{}/api/v3\".format(self.host)\n return baseurl + path.format(**context)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nrun a generic issue / PR query", "response": "def get_query(self, query):\n \"\"\"Run a generic issue/PR query\"\"\"\n url = self._api_url(\n \"/search/issues?q={query}&per_page=100\", query=query)\n return self._getter(url, subkey='items')"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _getter(self, url, subkey=None):\n\n kwargs = {}\n if 'basic' in self.auth:\n kwargs['auth'] = self.auth['basic']\n\n results = []\n link = dict(next=url)\n\n while 'next' in link:\n response = self.session.get(link['next'], **kwargs)\n\n # Warn about the mis-leading 404 error code. See:\n # https://github.com/ralphbean/bugwarrior/issues/374\n if response.status_code == 404 and 'token' in self.auth:\n log.warn(\"A '404' from github may indicate an auth \"\n \"failure. 
Make sure both that your token is correct \"\n \"and that it has 'public_repo' and not 'public \"\n \"access' rights.\")\n\n json_res = self.json_response(response)\n\n if subkey is not None:\n json_res = json_res[subkey]\n\n results += json_res\n\n link = self._link_field_to_dict(response.headers.get('link', None))\n\n return results", "response": "Internal method to retrieve the list of all the related resources."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _link_field_to_dict(field):\n\n if not field:\n return dict()\n\n return dict([\n (\n part.split('; ')[1][5:-1],\n part.split('; ')[0][1:-1],\n ) for part in field.split(', ')\n ])", "response": "Utility for ripping apart GitHub's Link header field."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_owned_repo_issues(self, tag):\n issues = {}\n for issue in self.client.get_issues(*tag.split('/')):\n issues[issue['url']] = (tag, issue)\n return issues", "response": "Get all the issues owned by the repo with the given tag."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting all issues matching a github query", "response": "def get_query(self, query):\n \"\"\" Grab all issues matching a github query \"\"\"\n issues = {}\n for issue in self.client.get_query(query):\n url = issue['html_url']\n try:\n repo = self.get_repository_from_issue(issue)\n except ValueError as e:\n log.critical(e)\n else:\n issues[url] = (repo, issue)\n return issues"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _reqs(self, tag):\n return [\n (tag, i) for i in\n self.client.get_pulls(*tag.split('/'))\n ]", "response": "Get all the pull requests for a given tag"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _aggregate_issues(conf, main_section, target, queue, service_name):\n\n 
start = time.time()\n\n try:\n service = get_service(service_name)(conf, main_section, target)\n issue_count = 0\n for issue in service.issues():\n queue.put(issue)\n issue_count += 1\n except SystemExit as e:\n log.critical(str(e))\n queue.put((SERVICE_FINISHED_ERROR, (target, e)))\n except BaseException as e:\n if hasattr(e, 'request') and e.request:\n # Exceptions raised by requests library have the HTTP request\n # object stored as attribute. The request can have hooks attached\n # to it, and we need to remove them, as there can be unpickleable\n # methods. There is no one left to call these hooks anyway.\n e.request.hooks = {}\n log.exception(\"Worker for [%s] failed: %s\" % (target, e))\n queue.put((SERVICE_FINISHED_ERROR, (target, e)))\n else:\n queue.put((SERVICE_FINISHED_OK, (target, issue_count, )))\n finally:\n duration = time.time() - start\n log.info(\"Done with [%s] in %fs\" % (target, duration))", "response": "This worker function is separated out from the main aggregate_issues function only so that we can use multiprocessing on it."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nyielding all issues from every target in the config.", "response": "def aggregate_issues(conf, main_section, debug):\n \"\"\" Return all issues from every target. 
\"\"\"\n log.info(\"Starting to aggregate remote issues.\")\n\n # Create and call service objects for every target in the config\n targets = aslist(conf.get(main_section, 'targets'))\n\n queue = multiprocessing.Queue()\n\n log.info(\"Spawning %i workers.\" % len(targets))\n processes = []\n\n if debug:\n for target in targets:\n _aggregate_issues(\n conf,\n main_section,\n target,\n queue,\n conf.get(target, 'service')\n )\n else:\n for target in targets:\n proc = multiprocessing.Process(\n target=_aggregate_issues,\n args=(conf, main_section, target, queue, conf.get(target, 'service'))\n )\n proc.start()\n processes.append(proc)\n\n # Sleep for 1 second here to try and avoid a race condition where\n # all N workers start up and ask the gpg-agent process for\n # information at the same time. This causes gpg-agent to fumble\n # and tell some of our workers some incomplete things.\n time.sleep(1)\n\n currently_running = len(targets)\n while currently_running > 0:\n issue = queue.get(True)\n if isinstance(issue, tuple):\n completion_type, args = issue\n if completion_type == SERVICE_FINISHED_ERROR:\n target, e = args\n log.info(\"Terminating workers\")\n for process in processes:\n process.terminate()\n raise RuntimeError(\n \"critical error in target '{}'\".format(target))\n currently_running -= 1\n continue\n yield issue\n\n log.info(\"Done aggregating remote issues.\")"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn a main config value or default if it does not exist.", "response": "def _get_config_or_default(self, key, default, as_type=lambda x: x):\n \"\"\"Return a main config value, or default if it does not exist.\"\"\"\n\n if self.main_config.has_option(self.main_section, key):\n return as_type(self.main_config.get(self.main_section, key))\n return default"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_templates(self):\n templates = {}\n for key in 
six.iterkeys(Task.FIELDS):\n template_key = '%s_template' % key\n if template_key in self.config:\n templates[key] = self.config.get(template_key)\n return templates", "response": "Get any defined templates for the Taskwarrior record."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nvalidate generic options for a particular target", "response": "def validate_config(cls, service_config, target):\n \"\"\" Validate generic options for a particular target \"\"\"\n if service_config.has_option(target, 'only_if_assigned'):\n die(\"[%s] has an 'only_if_assigned' option. Should be \"\n \"'%s.only_if_assigned'.\" % (target, cls.CONFIG_PREFIX))\n if service_config.has_option(target, 'also_unassigned'):\n die(\"[%s] has an 'also_unassigned' option. Should be \"\n \"'%s.also_unassigned'.\" % (target, cls.CONFIG_PREFIX))\n if service_config.has_option(target, 'default_priority'):\n die(\"[%s] has a 'default_priority' option. Should be \"\n \"'%s.default_priority'.\" % (target, cls.CONFIG_PREFIX))\n if service_config.has_option(target, 'add_tags'):\n die(\"[%s] has an 'add_tags' option. 
Should be \"\n \"'%s.add_tags'.\" % (target, cls.CONFIG_PREFIX))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn True if the issue in question should be included.", "response": "def include(self, issue):\n \"\"\" Return true if the issue in question should be included \"\"\"\n only_if_assigned = self.config.get('only_if_assigned', None)\n\n if only_if_assigned:\n owner = self.get_owner(issue)\n include_owners = [only_if_assigned]\n\n if self.config.get('also_unassigned', None, asbool):\n include_owners.append(None)\n\n return owner in include_owners\n\n only_if_author = self.config.get('only_if_author', None)\n\n if only_if_author:\n return self.get_author(issue) == only_if_author\n\n return True"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef make_table(grid):\n cell_width = 2 + max(\n reduce(\n lambda x, y: x+y, [[len(item) for item in row] for row in grid], []\n )\n )\n num_cols = len(grid[0])\n rst = table_div(num_cols, cell_width, 0)\n header_flag = 1\n for row in grid:\n rst = rst + '| ' + '| '.join(\n [normalize_cell(x, cell_width-1) for x in row]\n ) + '|\\n'\n rst = rst + table_div(num_cols, cell_width, header_flag)\n header_flag = 0\n return rst", "response": "Make an RST-compatible table from a list of items"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nretrieving the sensitive password for a service by a given username.", "response": "def get_service_password(service, username, oracle=None, interactive=False):\n \"\"\"\n Retrieve the sensitive password for a service by:\n\n * retrieving password from a secure store (@oracle:use_keyring, default)\n * asking the password from the user (@oracle:ask_password, interactive)\n * executing a command and using the output as password\n (@oracle:eval:<command>)\n\n Note that the keyring may or may not be locked\n which requires that the user provides a password (interactive mode).\n\n :param service: Service 
name, may be key into secure store (as string).\n :param username: Username for the service (as string).\n :param oracle: Hint which password oracle strategy to use.\n :return: Retrieved password (as string)\n\n .. seealso::\n https://bitbucket.org/kang/python-keyring-lib\n \"\"\"\n import getpass\n\n password = None\n if not oracle or oracle == \"@oracle:use_keyring\":\n keyring = get_keyring()\n password = keyring.get_password(service, username)\n if interactive and password is None:\n # -- LEARNING MODE: Password is not stored in keyring yet.\n oracle = \"@oracle:ask_password\"\n password = get_service_password(service, username,\n oracle, interactive=True)\n if password:\n keyring.set_password(service, username, password)\n elif interactive and oracle == \"@oracle:ask_password\":\n prompt = \"%s password: \" % service\n password = getpass.getpass(prompt)\n elif oracle.startswith('@oracle:eval:'):\n command = oracle[13:]\n return oracle_eval(command)\n\n if password is None:\n die(\"MISSING PASSWORD: oracle='%s', interactive=%s for service=%s\" %\n (oracle, interactive, service))\n return password"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef oracle_eval(command):\n p = subprocess.Popen(\n command, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n p.wait()\n if p.returncode == 0:\n return p.stdout.readline().strip().decode('utf-8')\n else:\n die(\n \"Error retrieving password: `{command}` returned '{error}'\".format(\n command=command, error=p.stderr.read().strip()))", "response": "Retrieve the password from the given command"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_config_path():\n if os.environ.get(BUGWARRIORRC):\n return os.environ[BUGWARRIORRC]\n xdg_config_home = (\n os.environ.get('XDG_CONFIG_HOME') or os.path.expanduser('~/.config'))\n xdg_config_dirs = (\n (os.environ.get('XDG_CONFIG_DIRS') or 
'/etc/xdg').split(':'))\n paths = [\n os.path.join(xdg_config_home, 'bugwarrior', 'bugwarriorrc'),\n os.path.expanduser(\"~/.bugwarriorrc\")]\n paths += [\n os.path.join(d, 'bugwarrior', 'bugwarriorrc') for d in xdg_config_dirs]\n for path in paths:\n if os.path.exists(path):\n return path\n return paths[0]", "response": "Determines the path to the config file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nexpand environment variables and user home (~) in log.file and return it as a relative path.", "response": "def fix_logging_path(config, main_section):\n \"\"\"\n Expand environment variables and user home (~) in the log.file and return\n as relative path.\n \"\"\"\n log_file = config.get(main_section, 'log.file')\n if log_file:\n log_file = os.path.expanduser(os.path.expandvars(log_file))\n if os.path.isabs(log_file):\n log_file = os.path.relpath(log_file)\n return log_file"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn an integer value from an option in section.", "response": "def getint(self, section, option):\n \"\"\" Accepts both integers and empty values. 
\"\"\"\n try:\n return super(BugwarriorConfigParser, self).getint(section, option)\n except ValueError:\n if self.get(section, option) == u'':\n return None\n else:\n raise ValueError(\n \"{section}.{option} must be an integer or empty.\".format(\n section=section, option=option))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget a bug attribute.", "response": "def _get_bug_attr(bug, attr):\n \"\"\"Default longdescs/flags case to [] since they may not be present.\"\"\"\n if attr in (\"longdescs\", \"flags\"):\n return getattr(bug, attr, [])\n return getattr(bug, attr)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef pull(dry_run, flavor, interactive, debug):\n\n try:\n main_section = _get_section_name(flavor)\n config = _try_load_config(main_section, interactive)\n\n lockfile_path = os.path.join(get_data_path(config, main_section),\n 'bugwarrior.lockfile')\n lockfile = PIDLockFile(lockfile_path)\n lockfile.acquire(timeout=10)\n try:\n # Get all the issues. This can take a while.\n issue_generator = aggregate_issues(config, main_section, debug)\n\n # Stuff them in the taskwarrior db as necessary\n synchronize(issue_generator, config, main_section, dry_run)\n finally:\n lockfile.release()\n except LockTimeout:\n log.critical(\n 'Your taskrc repository is currently locked. '\n 'Remove the file at %s if you are sure no other '\n 'bugwarrior processes are currently running.' 
% (\n lockfile_path\n )\n )\n except RuntimeError as e:\n log.exception(\"Aborted (%s)\" % e)", "response": "Pulls down tasks from forges and adds them to your taskwarrior database."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_data(self, url):\n return self.json_response(requests.get(url, **self.requests_kwargs))", "response": "Perform a request to the fully qualified url and return json."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_collection(self, url):\n url = self.BASE_API2 + url\n while url is not None:\n response = self.get_data(url)\n for value in response['values']:\n yield value\n url = response.get('next', None)", "response": "Returns an iterator that lazily goes through all the values in the object collection."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef hamdist(str1, str2):\n diffs = 0\n for ch1, ch2 in zip(str1, str2):\n if ch1 != ch2:\n diffs += 1\n return diffs", "response": "Count the number of differences between equal length strings str1 and str2"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef find_local_uuid(tw, keys, issue, legacy_matching=False):\n if not issue['description']:\n raise ValueError('Issue %s has no description.' 
% issue)\n\n possibilities = set([])\n\n if legacy_matching:\n legacy_description = issue.get_default_description().rsplit('..', 1)[0]\n # Furthermore, we have to kill off any single quotes which break in\n # task-2.4.x, as much as it saddens me.\n legacy_description = legacy_description.split(\"'\")[0]\n results = tw.filter_tasks({\n 'description.startswith': legacy_description,\n 'or': [\n ('status', 'pending'),\n ('status', 'waiting'),\n ],\n })\n possibilities = possibilities | set([\n task['uuid'] for task in results\n ])\n\n for service, key_list in six.iteritems(keys):\n if any([key in issue for key in key_list]):\n results = tw.filter_tasks({\n 'and': [(\"%s.is\" % key, issue[key]) for key in key_list],\n 'or': [\n ('status', 'pending'),\n ('status', 'waiting'),\n ],\n })\n possibilities = possibilities | set([\n task['uuid'] for task in results\n ])\n\n if len(possibilities) == 1:\n return possibilities.pop()\n\n if len(possibilities) > 1:\n raise MultipleMatches(\n \"Issue %s matched multiple IDs: %s\" % (\n issue['description'],\n possibilities\n )\n )\n\n raise NotFound(\n \"No issue was found matching %s\" % issue\n )", "response": "Given a list of unique identifiers and a given issue find its local UUID."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef merge_left(field, local_task, remote_issue, hamming=False):\n\n # Ensure that empty defaults are present\n local_field = local_task.get(field, [])\n remote_field = remote_issue.get(field, [])\n\n # We need to make sure an array exists for this field because\n # we will be appending to it in a moment.\n if field not in local_task:\n local_task[field] = []\n\n # If a remote does not appear in local, add it to the local task\n new_count = 0\n for remote in remote_field:\n for local in local_field:\n if (\n # For annotations, they don't have to match *exactly*.\n (\n hamming\n and get_annotation_hamming_distance(remote, local) == 0\n )\n 
# But for everything else, they should.\n or (\n remote == local\n )\n ):\n break\n else:\n log.debug(\"%s not found in %r\" % (remote, local_field))\n local_task[field].append(remote)\n new_count += 1\n if new_count > 0:\n log.debug('Added %s new values to %s (total: %s)' % (\n new_count, field, len(local_task[field]),))", "response": "Merge array field from the remote_issue into the local task."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef build_uda_config_overrides(targets):\n\n from bugwarrior.services import get_service\n\n targets_udas = {}\n for target in targets:\n targets_udas.update(get_service(target).ISSUE_CLASS.UDAS)\n return {\n 'uda': targets_udas\n }", "response": "Returns a dict of UDA configuration overrides defined by the given targets"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _parse_sprint_string(sprint):\n entries = sprint[sprint.index('[')+1:sprint.index(']')].split('=')\n fields = sum((entry.rsplit(',', 1) for entry in entries), [])\n return dict(zip(fields[::2], fields[1::2]))", "response": "Parse the big ugly sprint string stored by JIRA."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget valid user credentials from storage.", "response": "def get_credentials(self):\n \"\"\"Gets valid user credentials from storage.\n\n If nothing has been stored, or if the stored credentials are invalid,\n the OAuth2 flow is completed to obtain the new credentials.\n\n Returns:\n Credentials, the obtained credential.\n \"\"\"\n with self.AUTHENTICATION_LOCK:\n log.info('Starting authentication for %s', self.target)\n store = oauth2client.file.Storage(self.credentials_path)\n credentials = store.get()\n if not credentials or credentials.invalid:\n log.info(\"No valid login. 
Starting OAUTH flow.\")\n flow = oauth2client.client.flow_from_clientsecrets(self.client_secret_path, self.SCOPES)\n flow.user_agent = self.APPLICATION_NAME\n flags = oauth2client.tools.argparser.parse_args([])\n credentials = oauth2client.tools.run_flow(flow, store, flags)\n log.info('Storing credentials to %r', self.credentials_path)\n return credentials"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef multi_rouge_n(sequences, scores_ids, n=2):\n ngrams = [_get_word_ngrams(n, sequence) for sequence in sequences]\n counts = [len(ngram) for ngram in ngrams]\n\n scores = []\n for hyp_id, ref_id in scores_ids:\n evaluated_ngrams = ngrams[hyp_id]\n evaluated_count = counts[hyp_id]\n\n reference_ngrams = ngrams[ref_id]\n reference_count = counts[ref_id]\n\n overlapping_ngrams = evaluated_ngrams.intersection(reference_ngrams)\n overlapping_count = len(overlapping_ngrams)\n\n scores += [f_r_p_rouge_n(evaluated_count,\n reference_count, overlapping_count)]\n return scores", "response": "Efficient way to compute rouge n for a list of sequences"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef rouge_n(evaluated_sentences, reference_sentences, n=2):\n if len(evaluated_sentences) <= 0 or len(reference_sentences) <= 0:\n raise ValueError(\"Collections must contain at least 1 sentence.\")\n\n evaluated_ngrams = _get_word_ngrams(n, evaluated_sentences)\n reference_ngrams = _get_word_ngrams(n, reference_sentences)\n reference_count = len(reference_ngrams)\n evaluated_count = len(evaluated_ngrams)\n\n # Gets the overlapping ngrams between evaluated and reference\n overlapping_ngrams = evaluated_ngrams.intersection(reference_ngrams)\n overlapping_count = len(overlapping_ngrams)\n\n return f_r_p_rouge_n(evaluated_count, reference_count, overlapping_count)", "response": "Computes ROUGE - N of two text collections of sentences."} {"SOURCE": "codesearchnet", "instruction": "Explain what the 
following Python 3 code does\ndef _union_lcs(evaluated_sentences, reference_sentence, prev_union=None):\n if prev_union is None:\n prev_union = set()\n\n if len(evaluated_sentences) <= 0:\n raise ValueError(\"Collections must contain at least 1 sentence.\")\n\n lcs_union = prev_union\n prev_count = len(prev_union)\n reference_words = _split_into_words([reference_sentence])\n\n combined_lcs_length = 0\n for eval_s in evaluated_sentences:\n evaluated_words = _split_into_words([eval_s])\n lcs = set(_recon_lcs(reference_words, evaluated_words))\n combined_lcs_length += len(lcs)\n lcs_union = lcs_union.union(lcs)\n\n new_lcs_count = len(lcs_union) - prev_count\n return new_lcs_count, lcs_union", "response": "Returns the union LCS count between the evaluated sentences and a reference sentence, along with the updated union set."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncomputing ROUGE-L summary level of two text collections of sentences.", "response": "def rouge_l_summary_level(evaluated_sentences, reference_sentences):\n \"\"\"\n Computes ROUGE-L (summary level) of two text collections of sentences.\n http://research.microsoft.com/en-us/um/people/cyl/download/papers/\n rouge-working-note-v1.3.1.pdf\n\n Calculated according to:\n R_lcs = SUM(1, u)[LCS(r_i,C)]/m\n P_lcs = SUM(1, u)[LCS(r_i,C)]/n\n F_lcs = ((1 + beta^2)*R_lcs*P_lcs) / (R_lcs + (beta^2) * P_lcs)\n\n where:\n SUM(i,u) = SUM from i through u\n u = number of sentences in reference summary\n C = Candidate summary made up of v sentences\n m = number of words in reference summary\n n = number of words in candidate summary\n\n Args:\n evaluated_sentences: The sentences that have been picked by the\n summarizer\n reference_sentence: One of the sentences in the reference summaries\n\n Returns:\n A float: F_lcs\n\n Raises:\n ValueError: raises exception if a param has len <= 0\n \"\"\"\n if len(evaluated_sentences) <= 0 or len(reference_sentences) <= 0:\n raise ValueError(\"Collections must contain at least 1 sentence.\")\n\n # total 
number of words in reference sentences\n m = len(set(_split_into_words(reference_sentences)))\n\n # total number of words in evaluated sentences\n n = len(set(_split_into_words(evaluated_sentences)))\n\n # print(\"m,n %d %d\" % (m, n))\n union_lcs_sum_across_all_references = 0\n union = set()\n for ref_s in reference_sentences:\n lcs_count, union = _union_lcs(evaluated_sentences,\n ref_s,\n prev_union=union)\n union_lcs_sum_across_all_references += lcs_count\n\n llcs = union_lcs_sum_across_all_references\n r_lcs = llcs / m\n p_lcs = llcs / n\n beta = p_lcs / (r_lcs + 1e-12)\n num = (1 + (beta**2)) * r_lcs * p_lcs\n denom = r_lcs + ((beta**2) * p_lcs)\n f_lcs = num / (denom + 1e-12)\n return {\"f\": f_lcs, \"p\": p_lcs, \"r\": r_lcs}"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncalculate ROUGE scores between each pair of cluster entries.", "response": "def get_scores(self, avg=False, ignore_empty=False):\n \"\"\"Calculate ROUGE scores between each pair of\n lines (hyp_file[i], ref_file[i]).\n Args:\n * hyp_path: hypothesis file path\n * ref_path: references file path\n * avg (False): whether to get an average scores or a list\n \"\"\"\n hyp_path, ref_path = self.hyp_path, self.ref_path\n\n with io.open(hyp_path, encoding=\"utf-8\", mode=\"r\") as hyp_file:\n hyps = [line[:-1] for line in hyp_file]\n with io.open(ref_path, encoding=\"utf-8\", mode=\"r\") as ref_file:\n refs = [line[:-1] for line in ref_file]\n\n return self.rouge.get_scores(hyps, refs, avg=avg,\n ignore_empty=ignore_empty)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef calc_pvalues(query, gene_sets, background=20000, **kwargs):\n\n # number of genes in your query data\n k = len(query) \n query = set(query)\n vals = []\n # background should be all genes in annotated database\n # such as go, kegg et.al.\n if isinstance(background, set): \n bg = len(background) # total number in your annotated database \n # 
filter genes not found in the annotated database\n query = query.intersection(background)\n elif isinstance(background, int):\n bg = background\n else:\n raise ValueError(\"background should be set or int object\")\n # pval\n subsets = sorted(gene_sets.keys())\n for s in subsets:\n category = gene_sets.get(s)\n m = len(category)\n hits = query.intersection(set(category))\n x = len(hits)\n if x < 1 : continue\n # pVal = hypergeom.sf(hitCount-1,popTotal,bgHits,queryTotal) \n # p(X >= hitCounts)\n vals.append((s, hypergeom.sf(x-1, bg, m, k), x, m, hits))\n\n return zip(*vals)", "response": "Calculate hypergeometric p-values for each gene set against the query gene list"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fdrcorrection(pvals, alpha=0.05):\n # Implement copy from GOATools.\n pvals = np.asarray(pvals)\n pvals_sortind = np.argsort(pvals)\n pvals_sorted = np.take(pvals, pvals_sortind)\n\n ecdffactor = _ecdf(pvals_sorted)\n reject = pvals_sorted <= ecdffactor*alpha\n if reject.any():\n rejectmax = max(np.nonzero(reject)[0])\n reject[:rejectmax] = True\n pvals_corrected_raw = pvals_sorted / ecdffactor\n pvals_corrected = np.minimum.accumulate(pvals_corrected_raw[::-1])[::-1]\n del pvals_corrected_raw\n pvals_corrected[pvals_corrected>1] = 1\n pvals_corrected_ = np.empty_like(pvals_corrected)\n pvals_corrected_[pvals_sortind] = pvals_corrected\n del pvals_corrected\n reject_ = np.empty_like(reject)\n reject_[pvals_sortind] = reject\n return reject_, pvals_corrected_", "response": "Benjamini-Hochberg FDR correction, inspired by statsmodels"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nstandardize the mean and variance of the DataFrame.", "response": "def zscore(data2d, axis=0):\n \"\"\"Standardize the mean and variance of the data axis Parameters.\n\n :param data2d: DataFrame to normalize.\n :param axis: int, Which axis to normalize across. 
If 0, normalize across rows,\n if 1, normalize across columns. If None, don't change data\n \n :Returns: Normalized DataFrame. Normalized data with a mean of 0 and variance of 1\n across the specified axis.\n\n \"\"\"\n if axis is None:\n # normalized to mean and std using entire matrix\n # z_scored = (data2d - data2d.values.mean()) / data2d.values.std(ddof=1)\n return data2d\n assert axis in [0,1]\n # if axis == 1:\n # z_scored = data2d\n # else:\n # z_scored = data2d.T\n\n # z_scored = (z_scored - z_scored.mean()) / z_scored.std(ddof=1)\n \n # if axis == 1:\n # return z_scored\n # else:\n # return z_scored.T\n z_scored = data2d.apply(lambda x: (x-x.mean())/x.std(ddof=1), \n axis=operator.xor(1, axis))\n return z_scored"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nvisualize the dataframe df in a heatmap.", "response": "def heatmap(df, z_score=None, title='', figsize=(5,5), cmap='RdBu_r', \n xticklabels=True, yticklabels=True, ofname=None, **kwargs):\n \"\"\"Visualize the dataframe.\n\n :param df: DataFrame from expression table.\n :param z_score: z_score axis{0, 1}. If None, don't normalize data.\n :param title: gene set name.\n :param outdir: path to save heatmap.\n :param figsize: heatmap figsize.\n :param cmap: matplotlib colormap.\n :param ofname: output file name. 
If None, don't save figure \n\n \"\"\"\n df = zscore(df, axis=z_score)\n df = df.iloc[::-1]\n # Get the positions and used label for the ticks\n ny, nx = df.shape\n xticks = np.arange(0, nx, 1) + .5\n yticks = np.arange(0, ny, 1) + .5\n\n # If working on commandline, don't show figure\n if hasattr(sys, 'ps1') and (ofname is None): \n fig = plt.figure(figsize=figsize)\n else:\n fig = Figure(figsize=figsize)\n canvas = FigureCanvas(fig)\n ax = fig.add_subplot(111)\n vmin = np.percentile(df.min(), 2)\n vmax = np.percentile(df.max(), 98)\n matrix = ax.pcolormesh(df.values, cmap=cmap, vmin=vmin, vmax=vmax)\n ax.set_ylim([0,len(df)])\n ax.set(xticks=xticks, yticks=yticks)\n ax.set_xticklabels(df.columns.values if xticklabels else '', fontsize=14, rotation=90)\n ax.set_yticklabels(df.index.values if yticklabels else '', fontsize=14)\n ax.set_title(\"%s\\nHeatmap of the Analyzed Geneset\"%title, fontsize=20)\n ax.tick_params(axis='both', which='both', bottom=False, top=False,\n right=False, left=False)\n # cax=fig.add_axes([0.93,0.25,0.05,0.20])\n # cbar = fig.colorbar(matrix, cax=cax)\n cbar = colorbar(matrix)\n cbar.ax.tick_params(axis='both', which='both', bottom=False, top=False,\n right=False, left=False)\n for side in [\"top\", \"right\", \"left\", \"bottom\"]:\n ax.spines[side].set_visible(False)\n cbar.ax.spines[side].set_visible(False)\n # cbar.ax.set_title('',loc='left')\n\n if ofname is not None: \n # canvas.print_figure(ofname, bbox_inches='tight', dpi=300)\n fig.savefig(ofname, bbox_inches='tight', dpi=300)\n return"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef gseaplot(rank_metric, term, hits_indices, nes, pval, fdr, RES,\n pheno_pos='', pheno_neg='', figsize=(6,5.5), \n cmap='seismic', ofname=None, **kwargs):\n \"\"\"This is the main function for reproducing the gsea plot.\n\n :param rank_metric: pd.Series for rankings, rank_metric.values.\n :param term: gene_set name\n :param hits_indices: hits 
indices of rank_metric.index presented in gene set S.\n :param nes: Normalized enrichment scores.\n :param pval: nominal p-value.\n :param fdr: false discovery rate.\n :param RES: running enrichment scores.\n :param pheno_pos: phenotype label, positive correlated.\n :param pheno_neg: phenotype label, negative correlated.\n :param figsize: matplotlib figsize.\n :param ofname: output file name. If None, don't save figure \n\n \"\"\"\n # plt.style.use('classic')\n # center color map at midpoint = 0\n norm = _MidpointNormalize(midpoint=0)\n\n #dataFrame of ranked matrix scores\n x = np.arange(len(rank_metric))\n rankings = rank_metric.values\n # figsize = (6,6)\n phenoP_label = pheno_pos + ' (Positively Correlated)'\n phenoN_label = pheno_neg + ' (Negatively Correlated)'\n zero_score_ind = np.abs(rankings).argmin()\n z_score_label = 'Zero score at ' + str(zero_score_ind)\n nes_label = 'NES: '+ \"{:.3f}\".format(float(nes))\n pval_label = 'Pval: '+ \"{:.3f}\".format(float(pval))\n fdr_label = 'FDR: '+ \"{:.3f}\".format(float(fdr))\n im_matrix = np.tile(rankings, (2,1))\n\n # output truetype\n plt.rcParams.update({'pdf.fonttype':42,'ps.fonttype':42})\n # in most case, we will have many plots, so do not display plots\n # It's also usefull to run this script on command line.\n\n # GSEA Plots\n gs = plt.GridSpec(16,1)\n if hasattr(sys, 'ps1') and (ofname is None):\n # working inside python console, show figure\n fig = plt.figure(figsize=figsize)\n else:\n # If working on commandline, don't show figure\n fig = Figure(figsize=figsize)\n canvas = FigureCanvas(fig)\n # Ranked Metric Scores Plot\n ax1 = fig.add_subplot(gs[11:])\n module = 'tmp' if ofname is None else ofname.split(\".\")[-2]\n if module == 'ssgsea':\n nes_label = 'ES: '+ \"{:.3f}\".format(float(nes))\n pval_label='Pval: '\n fdr_label='FDR: '\n ax1.fill_between(x, y1=np.log(rankings), y2=0, color='#C9D3DB')\n ax1.set_ylabel(\"log ranked metric\", fontsize=14)\n else:\n ax1.fill_between(x, y1=rankings, y2=0, 
color='#C9D3DB')\n ax1.set_ylabel(\"Ranked list metric\", fontsize=14)\n ax1.text(.05, .9, phenoP_label, color='red',\n horizontalalignment='left', verticalalignment='top',\n transform=ax1.transAxes)\n ax1.text(.95, .05, phenoN_label, color='Blue',\n horizontalalignment='right', verticalalignment='bottom',\n transform=ax1.transAxes)\n # the x coords of this transformation are data, and the y coord are axes\n trans1 = transforms.blended_transform_factory(ax1.transData, ax1.transAxes)\n if module != 'ssgsea':\n ax1.vlines(zero_score_ind, 0, 1, linewidth=.5, transform=trans1, linestyles='--', color='grey')\n ax1.text(zero_score_ind, 0.5, z_score_label,\n horizontalalignment='center',\n verticalalignment='center',\n transform=trans1)\n ax1.set_xlabel(\"Rank in Ordered Dataset\", fontsize=14)\n ax1.spines['top'].set_visible(False)\n ax1.tick_params(axis='both', which='both', top=False, right=False, left=False)\n ax1.locator_params(axis='y', nbins=5)\n ax1.yaxis.set_major_formatter(plt.FuncFormatter(lambda tick_loc,tick_num : '{:.1f}'.format(tick_loc) ))\n\n # use round method to control float number\n # ax1.yaxis.set_major_formatter(plt.FuncFormatter(lambda tick_loc,tick_num : round(tick_loc, 1) ))\n\n # gene hits\n ax2 = fig.add_subplot(gs[8:10], sharex=ax1)\n\n # the x coords of this transformation are data, and the y coord are axes\n trans2 = transforms.blended_transform_factory(ax2.transData, ax2.transAxes)\n ax2.vlines(hits_indices, 0, 1,linewidth=.5,transform=trans2)\n ax2.spines['bottom'].set_visible(False)\n ax2.tick_params(axis='both', which='both', bottom=False, top=False,\n labelbottom=False, right=False, left=False, labelleft=False)\n # colormap\n ax3 = fig.add_subplot(gs[10], sharex=ax1)\n ax3.imshow(im_matrix, aspect='auto', norm=norm, cmap=cmap, interpolation='none') # cm.coolwarm\n ax3.spines['bottom'].set_visible(False)\n ax3.tick_params(axis='both', which='both', bottom=False, top=False,\n labelbottom=False, right=False, left=False,labelleft=False)\n\n 
# Enrichment score plot\n ax4 = fig.add_subplot(gs[:8], sharex=ax1)\n ax4.plot(x, RES, linewidth=4, color ='#88C544')\n ax4.text(.1, .1, fdr_label, transform=ax4.transAxes)\n ax4.text(.1, .2, pval_label, transform=ax4.transAxes)\n ax4.text(.1, .3, nes_label, transform=ax4.transAxes)\n\n # the y coords of this transformation are data, and the x coord are axes\n trans4 = transforms.blended_transform_factory(ax4.transAxes, ax4.transData)\n ax4.hlines(0, 0, 1, linewidth=.5, transform=trans4, color='grey')\n ax4.set_ylabel(\"Enrichment score (ES)\", fontsize=14)\n ax4.set_xlim(min(x), max(x))\n ax4.tick_params(axis='both', which='both', bottom=False, top=False, labelbottom=False, right=False)\n ax4.locator_params(axis='y', nbins=5)\n # FuncFormatter need two argument, I don't know why. this lambda function used to format yaxis tick labels.\n ax4.yaxis.set_major_formatter(plt.FuncFormatter(lambda tick_loc,tick_num : '{:.1f}'.format(tick_loc)) )\n\n # fig adjustment\n fig.suptitle(term, fontsize=16, fontweight='bold')\n fig.subplots_adjust(hspace=0)\n # fig.tight_layout()\n if ofname is not None: \n # canvas.print_figure(ofname, bbox_inches='tight', dpi=300)\n fig.savefig(ofname, bbox_inches='tight', dpi=300)\n return", "response": "This function is used to reproduce the gsea plot for a single gene set. It is used to plot the ranked matrix scores for a single term."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nvisualizes enrichr results. :param df: GSEApy DataFrame results. :param column: which column of DataFrame to show. Default: Adjusted P-value :param title: figure title :param cutoff: p-adjust cut-off. :param top_term: number of enriched terms to show. :param ascending: bool, the order of y axis. :param sizes: tuple, (min, max) scatter size. Not functional for now :param norm: maplotlib.colors.Normalize object. :param legend: bool, whether to show legend. :param figsize: tuple, figure size. 
:param cmap: matplotlib colormap :param ofname: output file name. If None, don't save figure", "response": "def dotplot(df, column='Adjusted P-value', title='', cutoff=0.05, top_term=10, \n sizes=None, norm=None, legend=True, figsize=(6, 5.5), \n cmap='RdBu_r', ofname=None, **kwargs):\n \"\"\"Visualize enrichr results.\n\n :param df: GSEApy DataFrame results.\n :param column: which column of DataFrame to show. Default: Adjusted P-value\n :param title: figure title\n :param cutoff: p-adjust cut-off.\n :param top_term: number of enriched terms to show.\n :param ascending: bool, the order of y axis.\n :param sizes: tuple, (min, max) scatter size. Not functional for now\n :param norm: matplotlib.colors.Normalize object.\n :param legend: bool, whether to show legend.\n :param figsize: tuple, figure size. \n :param cmap: matplotlib colormap\n :param ofname: output file name. If None, don't save figure \n\n \"\"\"\n\n\n colname = column \n # sorting the dataframe for better visualization\n if colname in ['Adjusted P-value', 'P-value']:\n # check if any values in `df[colname]` can't be coerced to floats\n can_be_coerced = df[colname].map(isfloat)\n if np.sum(~can_be_coerced) > 0:\n raise ValueError('some value in %s could not be typecast to `float`'%colname)\n else:\n df.loc[:, colname] = df[colname].map(float)\n df = df[df[colname] <= cutoff]\n if len(df) < 1: \n msg = \"Warning: No enrich terms when cutoff = %s\"%cutoff\n return msg\n df = df.assign(logAP=lambda x: - x[colname].apply(np.log10))\n colname='logAP'\n df = df.sort_values(by=colname).iloc[-top_term:,:]\n # \n temp = df['Overlap'].str.split(\"/\", expand=True).astype(int)\n df = df.assign(Hits=temp.iloc[:,0], Background=temp.iloc[:,1])\n df = df.assign(Hits_ratio=lambda x:x.Hits / x.Background)\n # x axis values\n x = df.loc[:, colname].values\n combined_score = df['Combined Score'].round().astype('int')\n # y axis index and values\n y = [i for i in range(0,len(df))]\n ylabels = df['Term'].values\n # Normalise 
to [0,1]\n # b = (df['Count'] - df['Count'].min())/ np.ptp(df['Count'])\n # area = 100 * b\n \n # control the size of scatter and legend marker\n levels = numbers = np.sort(df.Hits.unique())\n if norm is None:\n norm = Normalize()\n elif isinstance(norm, tuple):\n norm = Normalize(*norm)\n elif not isinstance(norm, Normalize):\n err = (\"``size_norm`` must be None, tuple, \"\n \"or Normalize object.\")\n raise ValueError(err)\n min_width, max_width = np.r_[20, 100] * plt.rcParams[\"lines.linewidth\"]\n norm.clip = True\n if not norm.scaled():\n norm(np.asarray(numbers))\n size_limits = norm.vmin, norm.vmax\n scl = norm(numbers)\n widths = np.asarray(min_width + scl * (max_width - min_width))\n if scl.mask.any():\n widths[scl.mask] = 0\n sizes = dict(zip(levels, widths))\n df['sizes'] = df.Hits.map(sizes)\n area = df['sizes'].values\n\n # creat scatter plot\n if hasattr(sys, 'ps1') and (ofname is None):\n # working inside python console, show figure\n fig, ax = plt.subplots(figsize=figsize)\n else:\n # If working on commandline, don't show figure\n fig = Figure(figsize=figsize)\n canvas = FigureCanvas(fig)\n ax = fig.add_subplot(111)\n vmin = np.percentile(combined_score.min(), 2)\n vmax = np.percentile(combined_score.max(), 98)\n sc = ax.scatter(x=x, y=y, s=area, edgecolors='face', c=combined_score,\n cmap=cmap, vmin=vmin, vmax=vmax)\n\n if column in ['Adjusted P-value', 'P-value']:\n xlabel = \"-log$_{10}$(%s)\"%column\n else:\n xlabel = column \n ax.set_xlabel(xlabel, fontsize=14, fontweight='bold')\n ax.yaxis.set_major_locator(plt.FixedLocator(y))\n ax.yaxis.set_major_formatter(plt.FixedFormatter(ylabels))\n ax.set_yticklabels(ylabels, fontsize=16)\n \n # ax.set_ylim([-1, len(df)])\n ax.grid()\n # colorbar\n cax=fig.add_axes([0.95,0.20,0.03,0.22])\n cbar = fig.colorbar(sc, cax=cax,)\n cbar.ax.tick_params(right=True)\n cbar.ax.set_title('Combined\\nScore',loc='left', fontsize=12)\n\n # for terms less than 3\n if len(df) >= 3:\n # find the index of the closest 
value to the median\n idx = [area.argmax(), np.abs(area - area.mean()).argmin(), area.argmin()]\n idx = unique(idx)\n else:\n idx = df.index.values\n label = df.iloc[idx, df.columns.get_loc('Hits')]\n \n if legend:\n handles, _ = ax.get_legend_handles_labels()\n legend_markers = []\n for ix in idx: \n legend_markers.append(ax.scatter([],[], s=area[ix], c='b'))\n # artist = ax.scatter([], [], s=size_levels,) \n ax.legend(legend_markers, label, title='Hits')\n ax.set_title(title, fontsize=20, fontweight='bold')\n \n if ofname is not None: \n # canvas.print_figure(ofname, bbox_inches='tight', dpi=300)\n fig.savefig(ofname, bbox_inches='tight', dpi=300)\n return\n return ax"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef barplot(df, column='Adjusted P-value', title=\"\", cutoff=0.05, top_term=10,\n figsize=(6.5,6), color='salmon', ofname=None, **kwargs):\n \"\"\"Visualize enrichr results.\n\n :param df: GSEApy DataFrame results.\n :param column: which column of DataFrame to show. Default: Adjusted P-value\n :param title: figure title.\n :param cutoff: cut-off of the cloumn you've chosen.\n :param top_term: number of top enriched terms to show.\n :param figsize: tuple, matplotlib figsize.\n :param color: color for bars.\n :param ofname: output file name. 
If None, don't save figure \n \n \"\"\"\n\n colname = column \n if colname in ['Adjusted P-value', 'P-value']: \n df = df[df[colname] <= cutoff]\n if len(df) < 1: \n msg = \"Warning: No enrich terms using library %s when cutoff = %s\"%(title, cutoff)\n return msg\n df = df.assign(logAP = lambda x: - x[colname].apply(np.log10))\n colname = 'logAP' \n dd = df.sort_values(by=colname).iloc[-top_term:,:]\n # dd = d.head(top_term).sort_values('logAP')\n # create bar plot\n if hasattr(sys, 'ps1') and (ofname is None):\n # working inside python console, show (True) figure\n fig = plt.figure(figsize=figsize)\n else:\n # If working on commandline, don't show figure\n fig = Figure(figsize=figsize)\n canvas = FigureCanvas(fig)\n ax = fig.add_subplot(111)\n bar = dd.plot.barh(x='Term', y=colname, color=color, \n alpha=0.75, fontsize=16, ax=ax)\n \n if column in ['Adjusted P-value', 'P-value']:\n xlabel = \"-log$_{10}$(%s)\"%column\n else:\n xlabel = column \n bar.set_xlabel(xlabel, fontsize=16, fontweight='bold')\n bar.set_ylabel(\"\")\n bar.set_title(title, fontsize=24, fontweight='bold')\n bar.xaxis.set_major_locator(MaxNLocator(integer=True))\n bar.legend_.remove()\n adjust_spines(ax, spines=['left','bottom'])\n\n if ofname is not None: \n # canvas.print_figure(ofname, bbox_inches='tight', dpi=300)\n fig.savefig(ofname, bbox_inches='tight', dpi=300)\n return\n return ax", "response": "Visualize enrichr results.\n\n :param df: GSEApy DataFrame results.\n :param column: which column of DataFrame to show. Default: Adjusted P-value\n :param title: figure title.\n :param cutoff: cut-off of the column you've chosen.\n :param top_term: number of top enriched terms to show.\n :param figsize: tuple, matplotlib figsize.\n :param color: color for bars.\n :param ofname: output file name. 
If None, don't save figure"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nfunction for removing spines and ticks.", "response": "def adjust_spines(ax, spines):\n \"\"\"function for removing spines and ticks.\n\n :param ax: axes object\n :param spines: a list of spines names to keep. e.g [left, right, top, bottom]\n if spines = []. remove all spines and ticks.\n\n \"\"\"\n for loc, spine in ax.spines.items():\n if loc in spines:\n # spine.set_position(('outward', 10)) # outward by 10 points\n # spine.set_smart_bounds(True)\n continue\n else:\n spine.set_color('none') # don't draw spine\n\n # turn off ticks where there is no spine\n if 'left' in spines:\n ax.yaxis.set_ticks_position('left')\n else:\n # no yaxis ticks\n ax.yaxis.set_ticks([])\n\n if 'bottom' in spines:\n ax.xaxis.set_ticks_position('bottom')\n else:\n # no xaxis ticks\n ax.xaxis.set_ticks([])"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef main():\n\n # Parse options...\n argparser = prepare_argparser()\n args = argparser.parse_args()\n subcommand = args.subcommand_name\n\n if subcommand == \"replot\":\n # reproduce plots using GSEAPY\n from .gsea import Replot\n rep = Replot(indir=args.indir, outdir=args.outdir, weighted_score_type=args.weight,\n figsize=args.figsize, graph_num=args.graph,\n format=args.format, verbose=args.verbose)\n rep.run()\n\n\n elif subcommand == \"gsea\":\n # compute using GSEAPY\n from .gsea import GSEA\n\n gs = GSEA(args.data, args.gmt, args.cls, args.outdir,\n args.mins, args.maxs, args.n, args.weight,\n args.type, args.method, args.ascending, args.threads,\n args.figsize, args.format, args.graph, args.noplot, args.seed, args.verbose)\n gs.run()\n elif subcommand == \"prerank\":\n from .gsea import Prerank\n\n pre = Prerank(args.rnk, args.gmt, args.outdir, args.label[0], args.label[1],\n args.mins, args.maxs, args.n, args.weight, args.ascending, args.threads,\n args.figsize, 
args.format, args.graph, args.noplot, args.seed, args.verbose)\n pre.run()\n\n elif subcommand == \"ssgsea\":\n from .gsea import SingleSampleGSEA\n ss = SingleSampleGSEA(data=args.data, gene_sets=args.gmt, outdir=args.outdir,\n sample_norm_method=args.norm,\n min_size=args.mins, max_size=args.maxs, permutation_num=args.n,\n weighted_score_type=args.weight, scale=args.scale,\n ascending=args.ascending, processes=args.threads,\n figsize=args.figsize, format=args.format, graph_num=args.graph,\n no_plot=args.noplot, seed=args.seed, verbose=args.verbose)\n ss.run()\n\n elif subcommand == \"enrichr\":\n # calling enrichr API\n from .enrichr import Enrichr\n enr = Enrichr(gene_list=args.gene_list, descriptions=args.descrip,\n gene_sets=args.library, organism=args.organism,\n outdir=args.outdir, format=args.format, cutoff=args.thresh, \n background=args.bg, figsize=args.figsize,\n top_term=args.term, no_plot=args.noplot, verbose=args.verbose)\n enr.run()\n elif subcommand == \"biomart\":\n from .parser import Biomart\n # read input file or a argument\n name, value = args.filter\n if os.path.isfile(value):\n with open(value, 'r') as val:\n lines = val.readlines()\n value = [ l.strip() for l in lines]\n # run query\n bm = Biomart(host=args.host, verbose=args.verbose)\n bm.query(dataset=args.bg, attributes=args.attrs.split(\",\"), \n filters={name : value}, filename=args.ofile)\n else:\n argparser.print_help()\n sys.exit(0)", "response": "The main function for GSEApy."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nprepare an argument parser object.", "response": "def prepare_argparser():\n \"\"\"Prepare argparser object. 
New options will be added in this function first.\"\"\"\n description = \"%(prog)s -- Gene Set Enrichment Analysis in Python\"\n epilog = \"For command line options of each command, type: %(prog)s COMMAND -h\"\n\n # top-level parser\n argparser = ap.ArgumentParser(description=description, epilog=epilog)\n argparser.add_argument(\"--version\", action=\"version\", version=\"%(prog)s \"+ __version__)\n subparsers = argparser.add_subparsers(dest='subcommand_name') #help=\"sub-command help\")\n\n # command for 'gsea'\n add_gsea_parser(subparsers)\n # command for 'prerank'\n add_prerank_parser(subparsers)\n # command for 'ssgsea'\n add_singlesample_parser(subparsers)\n # command for 'plot'\n add_plot_parser(subparsers)\n # command for 'enrichr'\n add_enrichr_parser(subparsers)\n # command for 'biomart'\n add_biomart_parser(subparsers)\n\n return argparser"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nadd the output option to the parser.", "response": "def add_output_option(parser):\n \"\"\"output option\"\"\"\n\n parser.add_argument(\"-o\", \"--outdir\", dest=\"outdir\", type=str, default='GSEApy_reports',\n metavar='', action=\"store\",\n help=\"The GSEApy output directory. Default: the current working directory\")\n parser.add_argument(\"-f\", \"--format\", dest=\"format\", type=str, metavar='', action=\"store\",\n choices=(\"pdf\", \"png\", \"jpeg\", \"eps\", \"svg\"), default=\"pdf\",\n help=\"File extensions supported by Matplotlib active backend,\\\n choose from {'pdf', 'png', 'jpeg','ps', 'eps','svg'}. Default: 'pdf'.\")\n parser.add_argument(\"--fs\", \"--figsize\", action='store', nargs=2, dest='figsize',\n metavar=('width', 'height'),type=float, default=(6.5, 6),\n help=\"The figsize keyword argument need two parameters to define. Default: (6.5, 6)\")\n parser.add_argument(\"--graph\", dest = \"graph\", action=\"store\", type=int, default=20, metavar='int',\n help=\"Numbers of top graphs produced. 
Default: 20\")\n parser.add_argument(\"--no-plot\", action='store_true', dest='noplot', default=False,\n help=\"Speed up computing by suppressing the plot output.\"+\\\n \"This is useful only if data are interested. Default: False.\")\n parser.add_argument(\"-v\", \"--verbose\", action=\"store_true\", default=False, dest='verbose',\n help=\"Increase output verbosity, print out progress of your job\", )"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nadd output group to parser.", "response": "def add_output_group(parser, required=True):\n \"\"\"output group\"\"\"\n\n output_group = parser.add_mutually_exclusive_group(required=required)\n output_group.add_argument(\"-o\", \"--ofile\", dest=\"ofile\", type=str, default='GSEApy_reports',\n help=\"Output file name. Mutually exclusive with --o-prefix.\")\n output_group.add_argument(\"--o-prefix\", dest=\"ofile\", type=str, default='GSEApy_reports',\n help=\"Output file prefix. Mutually exclusive with -o/--ofile.\")"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nadds main function gsea argument parsers.", "response": "def add_gsea_parser(subparsers):\n \"\"\"Add main function 'gsea' argument parsers.\"\"\"\n\n argparser_gsea = subparsers.add_parser(\"gsea\", help=\"Main GSEApy Function: run GSEApy instead of GSEA.\")\n\n # group for input files\n group_input = argparser_gsea.add_argument_group(\"Input files arguments\")\n group_input.add_argument(\"-d\", \"--data\", dest=\"data\", action=\"store\", type=str, required=True,\n help=\"Input gene expression dataset file in txt format.Same with GSEA.\")\n group_input.add_argument(\"-c\", \"--cls\", dest=\"cls\", action=\"store\", type=str, required=True,\n help=\"Input class vector (phenotype) file in CLS format. Same with GSEA.\")\n group_input.add_argument(\"-g\", \"--gmt\", dest=\"gmt\", action=\"store\", type=str, required=True,\n help=\"Gene set database in GMT format. 
Same with GSEA.\")\n group_input.add_argument(\"-t\", \"--permu-type\", action=\"store\", dest=\"type\", type=str, metavar='perType',\n choices=(\"gene_set\", \"phenotype\"), default=\"gene_set\",\n help=\"Permutation type. Same with GSEA, choose from {'gene_set', 'phenotype'}\")\n\n # group for output files\n group_output = argparser_gsea.add_argument_group(\"Output arguments\")\n add_output_option(group_output)\n\n # group for General options.\n group_opt = argparser_gsea.add_argument_group(\"GSEA advanced arguments\")\n group_opt.add_argument(\"-n\", \"--permu-num\", dest = \"n\", action=\"store\", type=int, default=1000, metavar='nperm',\n help=\"Number of random permutations. For calculating esnulls. Default: 1000\")\n group_opt.add_argument(\"--min-size\", dest=\"mins\", action=\"store\", type=int, default=15, metavar='int',\n help=\"Min size of input genes presented in Gene Sets. Default: 15\")\n group_opt.add_argument(\"--max-size\", dest = \"maxs\", action=\"store\", type=int, default=500, metavar='int',\n help=\"Max size of input genes presented in Gene Sets. Default: 500\")\n group_opt.add_argument(\"-w\", \"--weight\", action='store', dest='weight', default=1.0, type=float, metavar='float',\n help='Weighted_score of rank_metrics. For weighting input genes. Choose from {0, 1, 1.5, 2}. Default: 1',)\n group_opt.add_argument(\"-m\", \"--method\", action=\"store\", dest=\"method\", type=str, metavar='',\n choices=(\"signal_to_noise\", \"t_test\", \"ratio_of_classes\", \"diff_of_classes\", \"log2_ratio_of_classes\"),\n default=\"log2_ratio_of_classes\",\n help=\"Methods to calculate correlations of ranking metrics. \\\n Choose from {'signal_to_noise', 't_test', 'ratio_of_classes', 'diff_of_classes','log2_ratio_of_classes'}.\\\n Default: 'log2_ratio_of_classes'\")\n group_opt.add_argument(\"-a\", \"--ascending\", action='store_true', dest='ascending', default=False,\n help='Rank metric sorting order. If the -a flag was chosen, then ascending equals to True. 
Default: False.')\n group_opt.add_argument(\"-s\", \"--seed\", dest = \"seed\", action=\"store\", type=int, default=None, metavar='',\n help=\"Number of random seed. Default: None\")\n group_opt.add_argument(\"-p\", \"--threads\", dest = \"threads\", action=\"store\", type=int, default=1, metavar='procs',\n help=\"Number of Processes you are going to use. Default: 1\")\n\n return"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef add_prerank_parser(subparsers):\n\n argparser_prerank = subparsers.add_parser(\"prerank\", help=\"Run GSEApy Prerank tool on preranked gene list.\")\n\n # group for input files\n prerank_input = argparser_prerank.add_argument_group(\"Input files arguments\")\n prerank_input.add_argument(\"-r\", \"--rnk\", dest=\"rnk\", action=\"store\", type=str, required=True,\n help=\"Ranking metric file in .rnk format. Same with GSEA.\")\n prerank_input.add_argument(\"-g\", \"--gmt\", dest=\"gmt\", action=\"store\", type=str, required=True,\n help=\"Gene set database in GMT format. Same with GSEA.\")\n prerank_input.add_argument(\"-l\", \"--label\", action='store', nargs=2, dest='label',\n metavar=('pos', 'neg'), type=str, default=('Pos','Neg'),\n help=\"The phenotype label argument need two parameters to define. Default: ('Pos','Neg')\")\n\n # group for output files\n prerank_output = argparser_prerank.add_argument_group(\"Output arguments\")\n add_output_option(prerank_output)\n\n # group for General options.\n prerank_opt = argparser_prerank.add_argument_group(\"GSEA advanced arguments\")\n prerank_opt.add_argument(\"-n\", \"--permu-num\", dest = \"n\", action=\"store\", type=int, default=1000, metavar='nperm',\n help=\"Number of random permutations. For calculating esnulls. Default: 1000\")\n prerank_opt.add_argument(\"--min-size\", dest=\"mins\", action=\"store\", type=int, default=15, metavar='int',\n help=\"Min size of input genes presented in Gene Sets. 
Default: 15\")\n prerank_opt.add_argument(\"--max-size\", dest = \"maxs\", action=\"store\", type=int, default=500, metavar='int',\n help=\"Max size of input genes presented in Gene Sets. Default: 500\")\n prerank_opt.add_argument(\"-w\", \"--weight\", action='store', dest='weight', default=1.0, type=float, metavar='float',\n help='Weighted_score of rank_metrics. For weighting input genes. Choose from {0, 1, 1.5, 2}. Default: 1',)\n prerank_opt.add_argument(\"-a\", \"--ascending\", action='store_true', dest='ascending', default=False,\n help='Rank metric sorting order. If the -a flag was chosen, then ascending equals to True. Default: False.')\n prerank_opt.add_argument(\"-s\", \"--seed\", dest = \"seed\", action=\"store\", type=int, default=None, metavar='',\n help=\"Number of random seed. Default: None\")\n prerank_opt.add_argument(\"-p\", \"--threads\", dest = \"threads\", action=\"store\", type=int, default=1, metavar='procs',\n help=\"Number of Processes you are going to use. Default: 1\")\n\n return", "response": "Add function prerank argument parsers."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding function plot argument parsers.", "response": "def add_plot_parser(subparsers):\n \"\"\"Add function 'plot' argument parsers.\"\"\"\n\n argparser_replot = subparsers.add_parser(\"replot\", help=\"Reproduce GSEA desktop output figures.\")\n\n group_replot = argparser_replot.add_argument_group(\"Input arguments\")\n\n group_replot.add_argument(\"-i\", \"--indir\", action=\"store\", dest=\"indir\", required=True, metavar='GSEA_dir',\n help=\"The GSEA desktop results directory that you want to reproduce the figure \")\n add_output_option(group_replot)\n #add_output_group( argparser_plot )\n group_replot.add_argument(\"-w\", \"--weight\", action='store', dest='weight', default=1.0, type=float, metavar='float',\n help='Weighted_score of rank_metrics. Please Use the same value in GSEA. 
Choose from (0, 1, 1.5, 2),default: 1',)\n\n return"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_enrichr_parser(subparsers):\n\n argparser_enrichr = subparsers.add_parser(\"enrichr\", help=\"Using Enrichr API to perform GO analysis.\")\n\n # group for required options.\n enrichr_opt = argparser_enrichr.add_argument_group(\"Input arguments\")\n enrichr_opt.add_argument(\"-i\", \"--input-list\", action=\"store\", dest=\"gene_list\", type=str, required=True, metavar='IDs',\n help=\"Enrichr uses a list of gene names as input.\")\n enrichr_opt.add_argument(\"-g\", \"--gene-sets\", action=\"store\", dest=\"library\", type=str, required=True, metavar='GMT',\n help=\"Enrichr library name(s) required. Separate each name by comma.\")\n enrichr_opt.add_argument(\"--org\", \"--organism\", action=\"store\", dest=\"organism\", type=str, default='',\n help=\"Enrichr supported organism name. Default: human. See here: https://amp.pharm.mssm.edu/modEnrichr.\")\n enrichr_opt.add_argument(\"--ds\", \"--description\", action=\"store\", dest=\"descrip\", type=str, default='enrichr', metavar='STRING',\n help=\"It is recommended to enter a short description for your list so that multiple lists \\\n can be differentiated from each other if you choose to save or share your list.\")\n enrichr_opt.add_argument(\"--cut\", \"--cut-off\", action=\"store\", dest=\"thresh\", metavar='float', type=float, default=0.05,\n help=\"Adjust-Pval cutoff, used for generating plots. Default: 0.05.\")\n enrichr_opt.add_argument(\"--bg\", \"--background\", action=\"store\", dest=\"bg\", default='hsapiens_gene_ensembl', metavar='BGNUM',\n help=\"BioMart Dataset name or Background total genes number. Default: None\")\n enrichr_opt.add_argument(\"-t\", \"--top-term\", dest=\"term\", action=\"store\", type=int, default=10, metavar='int',\n help=\"Numbers of top terms shown in the plot. 
Default: 10\")\n # enrichr_opt.add_argument(\"--scale\", dest = \"scale\", action=\"store\", type=float, default=0.5, metavar='float',\n # help=\"scatter dot scale in the dotplot. Default: 0.5\")\n # enrichr_opt.add_argument(\"--no-plot\", action='store_true', dest='no_plot', default=False,\n # help=\"Suppress the plot output.This is useful only if data are interested. Default: False.\")\n\n enrichr_output = argparser_enrichr.add_argument_group(\"Output figure arguments\")\n add_output_option(enrichr_output)\n return", "response": "Add function enrichr argument parsers."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef add_biomart_parser(subparsers):\n\n argparser_biomart = subparsers.add_parser(\"biomart\", help=\"Using BioMart API to convert gene ids.\")\n\n # group for required options.\n biomart_opt = argparser_biomart.add_argument_group(\"Input arguments\")\n biomart_opt.add_argument(\"-f\", \"--filter\", action='store', nargs=2, dest='filter',\n required=True, metavar=('NAME', 'VALUE'),\n help=\"\"\"Which filter to use. Input filter name, and value.\n If multi-value required, separate each value by comma.\n If value is a txt file, then one ID per row, exclude header.\"\"\") \n biomart_opt.add_argument(\"-a\", \"--attributes\", action=\"store\", dest=\"attrs\", type=str, required=True, metavar='ATTR',\n help=\"Which attribute(s) to retrieve. Separate each attr by comma.\")\n biomart_opt.add_argument(\"-o\", \"--ofile\", dest=\"ofile\", type=str, required=True, help=\"Output file name\") \n biomart_opt.add_argument(\"-d\", \"--dataset\", action=\"store\", dest=\"bg\", type=str, default='hsapiens_gene_ensembl', metavar='DATA',\n help=\"Which dataset to use. Default: hsapiens_gene_ensembl\")\n biomart_opt.add_argument(\"--host\", action=\"store\", dest=\"host\", type=str, default='www.ensembl.org', metavar='HOST',\n help=\"Which host to use. 
Select from {'www.ensembl.org', 'asia.ensembl.org', 'useast.ensembl.org'}.\")\n biomart_opt.add_argument(\"-m\", \"--mart\", action=\"store\", dest=\"mart\", type=str, metavar='MART',\n default=\"ENSEMBL_MART_ENSEMBL\", help=\"Which mart to use. Default: ENSEMBL_MART_ENSEMBL.\")\n biomart_opt.add_argument(\"-v\", \"--verbose\", action=\"store_true\", default=False, dest='verbose',\n help=\"Increase output verbosity, print out progress of your job\", )", "response": "Add function biomart argument parsers."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngenerates a tensor that contains the score of each location in the gene list.", "response": "def enrichment_score_tensor(gene_mat, cor_mat, gene_sets, weighted_score_type, nperm=1000,\n rs=np.random.RandomState(), single=False, scale=False):\n \"\"\"Next generation algorithm of GSEA and ssGSEA.\n\n :param gene_mat: the ordered gene list(vector) with or without gene indices matrix.\n :param cor_mat: correlation vector or matrix (e.g. signal to noise scores)\n corresponding to the genes in the gene list or matrix.\n :param dict gene_sets: gmt file dict.\n :param float weighted_score_type: weighting by the correlation.\n options: 0(classic), 1, 1.5, 2. 
default:1 for GSEA and 0.25 for ssGSEA.\n :param int nperm: permutation times.\n :param bool scale: If True, normalize the scores by number of genes_mat.\n :param bool single: If True, use ssGSEA algorithm, otherwise use GSEA.\n :param rs: Random state for initialize gene list shuffling.\n Default: np.random.RandomState(seed=None)\n :return: a tuple contains::\n\n | ES: Enrichment score (real number between -1 and +1), for ssGSEA, set scale eq to True.\n | ESNULL: Enrichment score calculated from random permutation.\n | Hits_Indices: Indices of genes if genes are included in gene_set.\n | RES: The running enrichment score for all locations in the gene list.\n\n \"\"\"\n # gene_mat -> 1d: prerank, ssSSEA or 2d: GSEA\n keys = sorted(gene_sets.keys())\n\n if weighted_score_type == 0:\n # don't bother doing calcuation, just set to 1\n cor_mat = np.ones(cor_mat.shape)\n elif weighted_score_type > 0:\n pass\n else:\n logging.error(\"Using negative values of weighted_score_type, not allowed\")\n sys.exit(0)\n\n cor_mat = np.abs(cor_mat)\n if cor_mat.ndim ==1:\n # ssGSEA or Prerank\n # genestes->M, genes->N, perm-> axis=2\n N, M = len(gene_mat), len(keys)\n # generate gene hits matrix\n # for 1d ndarray of gene_mat, set assume_unique=True,\n # means the input arrays are both assumed to be unique,\n # which can speed up the calculation.\n tag_indicator = np.vstack([np.in1d(gene_mat, gene_sets[key], assume_unique=True) for key in keys])\n tag_indicator = tag_indicator.astype(int)\n # index of hits\n hit_ind = [ np.flatnonzero(tag).tolist() for tag in tag_indicator ]\n # generate permutated hits matrix\n perm_tag_tensor = np.repeat(tag_indicator, nperm+1).reshape((M,N,nperm+1))\n # shuffle matrix, last matrix is not shuffled when nperm > 0\n if nperm: np.apply_along_axis(lambda x: np.apply_along_axis(rs.shuffle,0,x),1, perm_tag_tensor[:,:,:-1])\n # missing hits\n no_tag_tensor = 1 - perm_tag_tensor\n # calculate numerator, denominator of each gene hits\n rank_alpha = 
(perm_tag_tensor*cor_mat[np.newaxis,:,np.newaxis])** weighted_score_type\n\n elif cor_mat.ndim == 2:\n # GSEA\n # 2d ndarray, gene_mat and cor_mat are shuffled already\n # reshape matrix\n cor_mat = cor_mat.T\n # gene_mat is a tuple containing (gene_name, permuted_gene_name_indices)\n genes, genes_ind = gene_mat\n # genesets->M, genes->N, perm-> axis=2\n # don't use assume_unique=True in 2d array when using np.isin().\n # elements in gene_mat are not unique, or it will cause unwanted results\n tag_indicator = np.vstack([np.in1d(genes, gene_sets[key], assume_unique=True) for key in keys])\n tag_indicator = tag_indicator.astype(int)\n perm_tag_tensor = np.stack([tag.take(genes_ind).T for tag in tag_indicator], axis=0)\n # index of hits\n hit_ind = [ np.flatnonzero(tag).tolist() for tag in perm_tag_tensor[:,:,-1] ]\n # no hits\n no_tag_tensor = 1 - perm_tag_tensor\n # calculate numerator, denominator of each gene hit\n rank_alpha = (perm_tag_tensor*cor_mat[np.newaxis,:,:])** weighted_score_type\n else:\n logging.error(\"Program exited because of unsupported input\")\n sys.exit(0)\n\n # Nhint = tag_indicator.sum(1)\n # Nmiss = N - Nhint\n axis=1\n P_GW_denominator = np.sum(rank_alpha, axis=axis, keepdims=True)\n P_NG_denominator = np.sum(no_tag_tensor, axis=axis, keepdims=True)\n REStensor = np.cumsum(rank_alpha / P_GW_denominator - no_tag_tensor / P_NG_denominator, axis=axis)\n # ssGSEA: scale es by gene numbers ?\n # https://gist.github.com/gaoce/39e0907146c752c127728ad74e123b33\n if scale: REStensor = REStensor / len(gene_mat)\n if single:\n # ssGSEA\n esmatrix = REStensor.sum(axis=axis)\n else:\n # GSEA\n esmax, esmin = REStensor.max(axis=axis), REStensor.min(axis=axis)\n esmatrix = np.where(np.abs(esmax)>np.abs(esmin), esmax, esmin)\n\n es, esnull, RES = esmatrix[:,-1], esmatrix[:,:-1], REStensor[:,:,-1]\n\n return es, esnull, hit_ind, RES"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef 
ranking_metric_tensor(exprs, method, permutation_num, pos, neg, classes,\n ascending, rs=np.random.RandomState()):\n \"\"\"Build shuffled ranking matrix when permutation_type equals phenotype.\n\n :param exprs: gene_expression DataFrame, gene_name indexed.\n :param str method: calculate correlation or ranking. Methods include:\n 1. 'signal_to_noise'.\n 2. 't_test'.\n 3. 'ratio_of_classes' (also referred to as fold change).\n 4. 'diff_of_classes'.\n 5. 'log2_ratio_of_classes'.\n :param int permutation_num: how many times the classes are shuffled\n :param str pos: one of labels of phenotype's names.\n :param str neg: one of labels of phenotype's names.\n :param list classes: a list of phenotype labels, to specify which column of\n dataframe belongs to what class of phenotype.\n :param bool ascending: Sort ascending vs. descending.\n\n :return:\n returns two 2d ndarrays with shape (nperm, gene_num).\n\n | cor_mat_indices: the indices of the sorted and permuted (excluding the last row) ranking matrix.\n | cor_mat: sorted and permuted (excluding the last row) ranking matrix.\n\n \"\"\"\n # S: samples, G: gene number\n G, S = exprs.shape\n # genes = exprs.index.values\n expr_mat = exprs.values.T\n perm_cor_tensor = np.tile(expr_mat, (permutation_num+1,1,1))\n # random shuffle on the first dim; the last matrix is not shuffled\n for arr in perm_cor_tensor[:-1]: rs.shuffle(arr)\n classes = np.array(classes)\n pos = classes == pos\n neg = classes == neg\n pos_cor_mean = perm_cor_tensor[:,pos,:].mean(axis=1)\n neg_cor_mean = perm_cor_tensor[:,neg,:].mean(axis=1)\n pos_cor_std = perm_cor_tensor[:,pos,:].std(axis=1, ddof=1)\n neg_cor_std = perm_cor_tensor[:,neg,:].std(axis=1, ddof=1)\n\n if method == 'signal_to_noise':\n cor_mat = (pos_cor_mean - neg_cor_mean)/(pos_cor_std + neg_cor_std)\n elif method == 't_test':\n denom = 1.0/G\n cor_mat = (pos_cor_mean - neg_cor_mean)/ np.sqrt(denom*pos_cor_std**2 + denom*neg_cor_std**2)\n elif method == 'ratio_of_classes':\n cor_mat = pos_cor_mean 
/ neg_cor_mean\n elif method == 'diff_of_classes':\n cor_mat = pos_cor_mean - neg_cor_mean\n elif method == 'log2_ratio_of_classes':\n cor_mat = np.log2(pos_cor_mean / neg_cor_mean)\n else:\n logging.error(\"Please provide a correct method name!\")\n sys.exit(0)\n # return matrix[nperm+1, perm_cors]\n cor_mat_ind = cor_mat.argsort()\n # ndarray: sort in place\n cor_mat.sort()\n # genes_mat = genes.take(cor_mat_ind)\n if ascending: return cor_mat_ind, cor_mat\n # descending order of ranking and genes\n # return genes_mat[:,::-1], cor_mat[:,::-1]\n return cor_mat_ind[:, ::-1], cor_mat[:, ::-1]", "response": "Builds a ranking metric tensor for a given set of genes."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef ranking_metric(df, method, pos, neg, classes, ascending):\n\n # exclude any zero stds.\n df_mean = df.groupby(by=classes, axis=1).mean()\n df_std = df.groupby(by=classes, axis=1).std()\n\n\n if method == 'signal_to_noise':\n ser = (df_mean[pos] - df_mean[neg])/(df_std[pos] + df_std[neg])\n elif method == 't_test':\n ser = (df_mean[pos] - df_mean[neg])/ np.sqrt(df_std[pos]**2/len(df_std)+df_std[neg]**2/len(df_std) )\n elif method == 'ratio_of_classes':\n ser = df_mean[pos] / df_mean[neg]\n elif method == 'diff_of_classes':\n ser = df_mean[pos] - df_mean[neg]\n elif method == 'log2_ratio_of_classes':\n ser = np.log2(df_mean[pos] / df_mean[neg])\n else:\n logging.error(\"Please provide a correct method name!\")\n sys.exit(0)\n ser = ser.sort_values(ascending=ascending)\n\n return ser", "response": "This function calculates a ranking metric for a given gene expression table."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncomputing the ranked terms and ranked scores for each gene set in the input dataframe", "response": "def gsea_compute_tensor(data, gmt, n, weighted_score_type, permutation_type,\n method, pheno_pos, pheno_neg, classes, ascending,\n processes=1, seed=None, 
single=False, scale=False):\n \"\"\"compute enrichment scores and enrichment nulls.\n\n :param data: preprocessed expression dataframe or a pre-ranked file if prerank=True.\n :param dict gmt: all gene sets in .gmt file. need to call load_gmt() to get results.\n :param int n: permutation number. default: 1000.\n :param str method: ranking_metric method. see above.\n :param str pheno_pos: one of labels of phenotype's names.\n :param str pheno_neg: one of labels of phenotype's names.\n :param list classes: a list of phenotype labels, to specify which column of dataframe belongs to what category of phenotype.\n :param float weighted_score_type: default:1\n :param bool ascending: sorting order of rankings. Default: False.\n :param seed: random seed. Default: np.random.RandomState()\n :param bool scale: if true, scale es by gene number.\n\n :return: a tuple contains::\n\n | zipped results of es, nes, pval, fdr.\n | nested list of hit indices of input gene_list.\n | nested list of ranked enrichment score of each input gene_sets.\n | list of enriched terms\n\n \"\"\"\n w = weighted_score_type\n subsets = sorted(gmt.keys())\n rs = np.random.RandomState(seed)\n genes_mat, cor_mat = data.index.values, data.values\n base = 5 if data.shape[0] >= 5000 else 10\n block = ceil(len(subsets) / base)\n\n if permutation_type == \"phenotype\":\n # shuffling classes and generate random correlation rankings\n logging.debug(\"Start to permutate classes..............................\")\n genes_ind = []\n cor_mat = []\n temp_rnk = []\n pool_rnk = Pool(processes=processes)\n # split large array into smaller blocks to avoid memory overflow\n i=1\n while i <= block:\n rs = np.random.RandomState(seed)\n temp_rnk.append(pool_rnk.apply_async(ranking_metric_tensor,\n args=(data, method, base, pheno_pos, pheno_neg, classes,\n ascending, rs)))\n i +=1\n pool_rnk.close()\n pool_rnk.join()\n\n for k, temp in enumerate(temp_rnk):\n gi, cor = temp.get()\n if k+1 == block:\n genes_ind.append(gi)\n 
cor_mat.append(cor)\n else:\n genes_ind.append(gi[:-1])\n cor_mat.append(cor[:-1])\n\n genes_ind, cor_mat = np.vstack(genes_ind), np.vstack(cor_mat)\n # convert to tuple\n genes_mat = (data.index.values, genes_ind)\n\n logging.debug(\"Start to compute es and esnulls........................\")\n # Prerank, ssGSEA, GSEA\n es = []\n RES = []\n hit_ind = []\n esnull = []\n temp_esnu = []\n pool_esnu = Pool(processes=processes)\n # split large array into smaller blocks to avoid memory overflow\n i, m = 1, 0\n while i <= block:\n # you have to reseed, or all your processes are sharing the same seed value\n rs = np.random.RandomState(seed)\n gmtrim = {k: gmt.get(k) for k in subsets[m:base * i]}\n temp_esnu.append(pool_esnu.apply_async(enrichment_score_tensor,\n args=(genes_mat, cor_mat,\n gmtrim, w, n, rs,\n single, scale)))\n m = base * i\n i += 1\n pool_esnu.close()\n pool_esnu.join()\n # esn is a list, don't need to use append method.\n for si, temp in enumerate(temp_esnu):\n e, enu, hit, rune = temp.get()\n esnull.append(enu)\n es.append(e)\n RES.append(rune)\n hit_ind += hit\n # concate results\n es, esnull, RES = np.hstack(es), np.vstack(esnull), np.vstack(RES)\n\n return gsea_significance(es, esnull), hit_ind, RES, subsets"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef gsea_compute(data, gmt, n, weighted_score_type, permutation_type,\n method, pheno_pos, pheno_neg, classes, ascending,\n processes=1, seed=None, single=False, scale=False):\n \"\"\"compute enrichment scores and enrichment nulls.\n\n :param data: preprocessed expression dataframe or a pre-ranked file if prerank=True.\n :param dict gmt: all gene sets in .gmt file. need to call load_gmt() to get results.\n :param int n: permutation number. default: 1000.\n :param str method: ranking_metric method. 
see above.\n :param str pheno_pos: one of labels of phenotype's names.\n :param str pheno_neg: one of labels of phenotype's names.\n :param list classes: a list of phenotype labels, to specify which column of dataframe belongs to what category of phenotype.\n :param float weighted_score_type: default:1\n :param bool ascending: sorting order of rankings. Default: False.\n :param seed: random seed. Default: np.random.RandomState()\n :param bool scale: if true, scale es by gene number.\n\n :return: a tuple contains::\n\n | zipped results of es, nes, pval, fdr.\n | nested list of hit indices of input gene_list.\n | nested list of ranked enrichment score of each input gene_sets.\n | list of enriched terms\n\n \"\"\"\n \n w = weighted_score_type\n subsets = sorted(gmt.keys())\n es = []\n RES=[]\n hit_ind=[]\n esnull = [ [] for a in range(len(subsets)) ]\n\n logging.debug(\"Start to compute enrichment scores......................\")\n\n if permutation_type == \"phenotype\":\n logging.debug(\"Start to permutate classes..............................\")\n # shuffling classes and generate random correlation rankings\n rs = np.random.RandomState(seed)\n genes_mat, cor_mat = ranking_metric_tensor(exprs=data, method=method,\n permutation_num=n,\n pos=pheno_pos, neg=pheno_neg,\n classes=classes,\n ascending=ascending, rs=rs)\n\n # compute es, esnulls. 
hits, RES\n logging.debug(\"Start to compute enrichment nulls.......................\")\n es, esnull, hit_ind, RES = enrichment_score_tensor(gene_mat=genes_mat,\n cor_mat=cor_mat,\n gene_sets=gmt,\n weighted_score_type=w,\n nperm=n, rs=rs,\n single=False, scale=False,)\n\n else:\n # Prerank, ssGSEA, GSEA with gene_set permutation\n gl, cor_vec = data.index.values, data.values\n logging.debug(\"Start to compute es and esnulls........................\")\n # es, esnull, hit_ind, RES = enrichment_score_tensor(gene_mat=gl,\n # cor_mat=cor_mat,\n # gene_sets=gmt,\n # weighted_score_type=w,\n # nperm=n, rs=rs\n # single=single, scale=scale)\n\n # split large array into smaller blocks to avoid memory overflow\n temp_esnu = []\n pool_esnu = Pool(processes=processes)\n for subset in subsets:\n # you have to reseed, or all your processes are sharing the same seed value\n rs = np.random.RandomState(seed)\n temp_esnu.append(pool_esnu.apply_async(enrichment_score,\n args=(gl, cor_vec, gmt.get(subset), w,\n n, rs, single, scale)))\n\n pool_esnu.close()\n pool_esnu.join()\n # esn is a list, so no need to use the append method.\n for si, temp in enumerate(temp_esnu):\n e, enu, hit, rune = temp.get()\n esnull[si] = enu\n es.append(e)\n RES.append(rune)\n hit_ind.append(hit)\n\n return gsea_significance(es, esnull), hit_ind, RES, subsets", "response": "Compute the ranked terms and null scores for each gene set in the input expression dataframe."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncompute the nominal p-value for S given an observed ES and a null distribution esnull.", "response": "def gsea_pval(es, esnull):\n \"\"\"Compute nominal p-value.\n\n From article (PNAS):\n estimate nominal p-value for S from esnull by using the positive\n or negative portion of the distribution corresponding to the sign\n of the observed ES(S).\n \"\"\"\n\n # to speed up, using numpy function to compute pval in parallel.\n condlist = [ es < 0, es >=0]\n choicelist = [np.sum(esnull < 
es.reshape(len(es),1), axis=1)/ np.sum(esnull < 0, axis=1),\n np.sum(esnull >= es.reshape(len(es),1), axis=1)/ np.sum(esnull >= 0, axis=1)]\n pval = np.select(condlist, choicelist)\n\n return pval"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncomputes nominal pvals, normalized ES, and FDR q values.", "response": "def gsea_significance(enrichment_scores, enrichment_nulls):\n \"\"\"Compute nominal pvals, normalized ES, and FDR q value.\n\n For a given NES(S) = NES* >= 0, the FDR is the ratio of the percentage of all (S,pi) with\n NES(S,pi) >= 0, whose NES(S,pi) >= NES*, divided by the percentage of\n observed S with NES(S) >= 0, whose NES(S) >= NES*, and similarly if NES(S) = NES* <= 0.\n \"\"\"\n # For a zero by zero division (undetermined, results in a NaN),\n np.seterr(divide='ignore', invalid='ignore')\n # import warnings\n # warnings.simplefilter(\"ignore\")\n es = np.array(enrichment_scores)\n esnull = np.array(enrichment_nulls)\n logging.debug(\"Start to compute pvals..................................\")\n # compute pvals.\n enrichmentPVals = gsea_pval(es, esnull).tolist()\n\n logging.debug(\"Compute nes and nesnull.................................\")\n # nEnrichmentScores, nEnrichmentNulls = normalize(es, esnull)\n\n # new normalized enrichment score implementation.\n # this could speed up significantly.\n esnull_pos = (esnull*(esnull>=0)).mean(axis=1)\n esnull_neg = (esnull*(esnull<0)).mean(axis=1)\n nEnrichmentScores = np.where(es>=0, es/esnull_pos, -es/esnull_neg)\n nEnrichmentNulls = np.where(esnull>=0, esnull/esnull_pos[:,np.newaxis],\n -esnull/esnull_neg[:,np.newaxis])\n\n logging.debug(\"start to compute fdrs..................................\")\n\n # FDR null distribution histogram\n # create a histogram of all NES(S,pi) over all S and pi\n # Use this null distribution to compute an FDR q value,\n # vals = reduce(lambda x,y: x+y, nEnrichmentNulls, [])\n # nvals = np.array(sorted(vals))\n # or\n nvals = 
np.sort(nEnrichmentNulls.flatten())\n nnes = np.sort(nEnrichmentScores)\n fdrs = []\n # FDR computation\n for i in range(len(enrichment_scores)):\n nes = nEnrichmentScores[i]\n # use the same pval method to calculate fdr\n if nes >= 0:\n allPos = int(len(nvals) - np.searchsorted(nvals, 0, side=\"left\"))\n allHigherAndPos = int(len(nvals) - np.searchsorted(nvals, nes, side=\"left\"))\n nesPos = len(nnes) - int(np.searchsorted(nnes, 0, side=\"left\"))\n nesHigherAndPos = len(nnes) - int(np.searchsorted(nnes, nes, side=\"left\"))\n # allPos = (nvals >= 0).sum()\n # allHigherAndPos = (nvals >= nes).sum()\n # nesPos = (nnes >=0).sum()\n # nesHigherAndPos = (nnes >= nes).sum()\n else:\n allPos = int(np.searchsorted(nvals, 0, side=\"left\"))\n allHigherAndPos = int(np.searchsorted(nvals, nes, side=\"right\"))\n nesPos = int(np.searchsorted(nnes, 0, side=\"left\"))\n nesHigherAndPos = int(np.searchsorted(nnes, nes, side=\"right\"))\n # allPos = (nvals < 0).sum()\n # allHigherAndPos = (nvals < nes).sum()\n # nesPos = (nnes < 0).sum()\n # nesHigherAndPos = (nnes < nes).sum()\n \n try:\n pi_norm = allHigherAndPos/float(allPos) \n pi_obs = nesHigherAndPos/float(nesPos)\n fdr = pi_norm / pi_obs\n fdrs.append(fdr if fdr < 1 else 1.0)\n except:\n fdrs.append(1000000000.0)\n\n logging.debug(\"Statistical testing finished.............................\")\n\n return zip(enrichment_scores, nEnrichmentScores, enrichmentPVals, fdrs)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef log_init(outlog, log_level=logging.INFO):\n\n # clear old root logger handlers\n logging.getLogger(\"gseapy\").handlers = []\n # init a root logger\n logging.basicConfig(level = logging.DEBUG,\n format = 'LINE %(lineno)-4d: %(asctime)s [%(levelname)-8s] %(message)s',\n filename = outlog,\n filemode = 'w')\n\n # define a Handler which writes INFO messages or higher to the sys.stderr\n console = logging.StreamHandler()\n 
console.setLevel(log_level)\n # set a format which is simpler for console use\n formatter = logging.Formatter('%(asctime)s %(message)s')\n # tell the handler to use this format\n console.setFormatter(formatter)\n # add handlers\n logging.getLogger(\"gseapy\").addHandler(console)\n logger = logging.getLogger(\"gseapy\")\n # logger.setLevel(log_level)\n return logger", "response": "Initialize the root logger with a new log level."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nretrying a connection with a defined maximum number of tries.", "response": "def retry(num=5):\n \"\"\"Retry connection.\n \n define max tries num\n if the backoff_factor is 0.1, then sleep() will sleep for\n [0.1s, 0.2s, 0.4s, ...] between retries.\n It will also force a retry if the status code returned is 500, 502, 503 or 504. \n \n \"\"\"\n s = requests.Session()\n retries = Retry(total=num, backoff_factor=0.1,\n status_forcelist=[500, 502, 503, 504])\n s.mount('http://', HTTPAdapter(max_retries=retries))\n\n return s"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nextracts class names from a .cls file.", "response": "def gsea_cls_parser(cls):\n \"\"\"Extract class(phenotype) name from .cls file.\n\n :param cls: a class list instance or .cls file which is identical to GSEA input.\n :return: phenotype name and a list of class vector.\n \"\"\"\n\n if isinstance(cls, list) :\n classes = cls\n sample_name= unique(classes)\n elif isinstance(cls, str) :\n with open(cls) as c:\n file = c.readlines()\n classes = file[2].strip('\\n').split(\" \")\n sample_name = file[1].lstrip(\"# \").strip('\\n').split(\" \")\n else:\n raise Exception('Error parsing sample name!')\n\n return sample_name[0], sample_name[1], classes"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nparse the .results file stored under the GSEA edb folder.", "response": "def gsea_edb_parser(results_path, index=0):\n \"\"\"Parse results.edb file stored under **edb** file folder.\n\n :param results_path: the .results file located inside edb folder.\n :param index: gene_set index of gmt database, used for iterating items.\n :return: enrichment_term, hit_index, nes, pval, fdr.\n \"\"\"\n from bs4 import BeautifulSoup\n\n soup = BeautifulSoup(open(results_path), features='xml')\n tag = soup.findAll('DTG')\n term = dict(tag[index].attrs)\n # dict_keys(['RANKED_LIST', 'GENESET', 'FWER', 'ES_PROFILE',\n # 'HIT_INDICES', 'ES', 'NES', 'TEMPLATE', 'RND_ES', 'RANK_SCORE_AT_ES',\n # 'NP', 'RANK_AT_ES', 'FDR'])\n enrich_term = term.get('GENESET').split(\"#\")[1]\n es_profile = term.get('ES_PROFILE').split(\" \")\n # rank_es = term.get('RND_ES').split(\" \")\n hit_ind = term.get('HIT_INDICES').split(\" \")\n es_profile = [float(i) for i in es_profile ]\n hit_ind = [float(i) for i in hit_ind ]\n # rank_es = [float(i) for i in rank_es ]\n nes = term.get('NES')\n pval = term.get('NP')\n fdr = term.get('FDR')\n # fwer = term.get('FWER')\n # index_range = len(tag)-1\n logging.debug(\"Enriched Gene set is: \"+ enrich_term)\n\n return enrich_term, hit_ind, nes, pval, fdr"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef gsea_gmt_parser(gmt, min_size = 3, max_size = 1000, gene_list=None):\n\n if gmt.lower().endswith(\".gmt\"):\n logging.info(\"User defined gene sets are given.......continue..........\")\n with open(gmt) as genesets:\n genesets_dict = { line.strip().split(\"\\t\")[0]: line.strip().split(\"\\t\")[2:]\n for line in genesets.readlines()}\n else:\n logging.info(\"Downloading and generating Enrichr library gene sets...\")\n if gmt in DEFAULT_LIBRARY:\n names = DEFAULT_LIBRARY\n else:\n names = get_library_name()\n if gmt in names:\n \"\"\"\n define max tries num\n if the backoff_factor is 0.1, then sleep() will sleep 
for\n [0.1s, 0.2s, 0.4s, ...] between retries.\n It will also force a retry if the status code returned is 500, 502, 503 or 504.\n \"\"\"\n s = requests.Session()\n retries = Retry(total=5, backoff_factor=0.1,\n status_forcelist=[ 500, 502, 503, 504 ])\n s.mount('http://', HTTPAdapter(max_retries=retries))\n # query string\n ENRICHR_URL = 'http://amp.pharm.mssm.edu/Enrichr/geneSetLibrary'\n query_string = '?mode=text&libraryName=%s'\n # get\n response = s.get( ENRICHR_URL + query_string % gmt, timeout=None)\n else:\n raise Exception(\"gene_set files(.gmt) not found\")\n if not response.ok:\n raise Exception('Error fetching enrichment results, check internet connection first.')\n\n genesets_dict = { line.strip().split(\"\\t\")[0]:\n list(map(lambda x: x.split(\",\")[0], line.strip().split(\"\\t\")[2:]))\n for line in response.iter_lines(chunk_size=1024, decode_unicode='utf-8')}\n\n\n\n # filtering dict\n if sys.version_info[0] >= 3 :\n genesets_filter = {k: v for k, v in genesets_dict.items() if len(v) >= min_size and len(v) <= max_size}\n elif sys.version_info[0] == 2:\n genesets_filter = {k: v for k, v in genesets_dict.iteritems() if len(v) >= min_size and len(v) <= max_size}\n else:\n logging.error(\"System failure. 
Please provide correct input files\")\n sys.exit(1)\n if gene_list is not None:\n subsets = sorted(genesets_filter.keys())\n for subset in subsets:\n tag_indicator = in1d(gene_list, genesets_filter.get(subset), assume_unique=True)\n tag_len = sum(tag_indicator)\n if tag_len <= min_size or tag_len >= max_size:\n del genesets_filter[subset]\n else:\n continue\n # some_dict = {key: value for key, value in some_dict.items() if value != value_to_remove}\n # using np.intersect1d() may be faster\n filsets_num = len(genesets_dict) - len(genesets_filter)\n logging.info(\"%04d gene_sets have been filtered out when max_size=%s and min_size=%s\"%(filsets_num, max_size, min_size))\n\n if filsets_num == len(genesets_dict):\n logging.error(\"No gene sets passed through the filtering condition! Try new parameters again.\\n\" +\\\n \"Note: Gene names for gseapy are case sensitive.\" )\n sys.exit(1)\n else:\n return genesets_filter", "response": "Parse a GSEA GMT file and return a dictionary of gene sets."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget the active Enrichr library names.", "response": "def get_library_name(database='Human'):\n \"\"\"Return active Enrichr library names.\n :param str database: Select one from { 'Human', 'Mouse', 'Yeast', 'Fly', 'Fish', 'Worm' } \n \n \"\"\"\n\n # make a get request to get the gmt names and meta data from Enrichr\n\n # old code\n # response = requests.get('http://amp.pharm.mssm.edu/Enrichr/geneSetLibrary?mode=meta')\n # gmt_data = response.json()\n # # generate list of lib names\n # libs = []\n # # get library names\n # for inst_gmt in gmt_data['libraries']:\n # # only include active gmts\n # if inst_gmt['isActive'] == True:\n # libs.append(inst_gmt['libraryName'])\n\n\n if database not in ['Human', 'Mouse', 'Yeast', 'Fly', 'Fish', 'Worm']:\n sys.stderr.write(\"\"\"No supported database. 
Please input one of these:\n 'Human', 'Mouse', 'Yeast', 'Fly', 'Fish', 'Worm' \"\"\")\n return \n if database in ['Human', 'Mouse']: database=''\n lib_url='http://amp.pharm.mssm.edu/%sEnrichr/datasetStatistics'%database\n libs_json = json.loads(requests.get(lib_url).text)\n libs = [lib['libraryName'] for lib in libs_json['statistics']]\n\n return sorted(libs)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_marts(self):\n\n mart_names = pd.Series(self.names, name=\"Name\")\n mart_descriptions = pd.Series(self.displayNames, name=\"Description\")\n\n return pd.concat([mart_names, mart_descriptions], axis=1)", "response": "Get available marts and their names."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_datasets(self, mart='ENSEMBL_MART_ENSEMBL'):\n datasets = self.datasets(mart, raw=True)\n return pd.read_csv(StringIO(datasets), header=None, usecols=[1, 2],\n names = [\"Name\", \"Description\"],sep=\"\\t\")", "response": "Get available datasets from mart"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_attributes(self, dataset):\n attributes = self.attributes(dataset)\n attr_ = [ (k, v[0]) for k, v in attributes.items()]\n return pd.DataFrame(attr_, columns=[\"Attribute\",\"Description\"])", "response": "Get available attritbutes from dataset you ve selected"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets available filters from dataset you ve selected", "response": "def get_filters(self, dataset):\n \"\"\"Get available filters from dataset you've selected\"\"\"\n filters = self.filters(dataset)\n filt_ = [ (k, v[0]) for k, v in filters.items()]\n return pd.DataFrame(filt_, columns=[\"Filter\", \"Description\"])"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following 
Python 3 function\ndef query(self, dataset='hsapiens_gene_ensembl', attributes=[], \n filters={}, filename=None):\n \"\"\"mapping ids using BioMart. \n\n :param dataset: str, default: 'hsapiens_gene_ensembl'\n :param attributes: str, list, tuple\n :param filters: dict, {'filter name': list(filter value)}\n :param host: www.ensembl.org, asia.ensembl.org, useast.ensembl.org\n :return: a dataframe contains all attributes you selected.\n\n **Note**: it will take a couple of minutes to get the results.\n A xml template for querying biomart. (see https://gist.github.com/keithshep/7776579)\n \n exampleTaxonomy = \"mmusculus_gene_ensembl\"\n exampleGene = \"ENSMUSG00000086981,ENSMUSG00000086982,ENSMUSG00000086983\"\n urlTemplate = \\\n '''http://ensembl.org/biomart/martservice?query=''' \\\n '''''' \\\n '''''' \\\n '''''' \\\n '''''' \\\n '''''' \\\n '''''' \\\n '''''' \\\n '''''' \\\n '''''' \n \n exampleURL = urlTemplate % (exampleTaxonomy, exampleGene)\n req = requests.get(exampleURL, stream=True)\n \n \"\"\"\n if not attributes: \n attributes = ['ensembl_gene_id', 'external_gene_name', 'entrezgene', 'go_id'] \n # i=0\n # while (self.host is None) and (i < 3):\n # self.host = self.ghosts[i]\n # i +=1 \n self.new_query()\n # 'mmusculus_gene_ensembl'\n self.add_dataset_to_xml(dataset)\n for at in attributes:\n self.add_attribute_to_xml(at)\n # add filters\n if filters:\n for k, v in filters.items(): \n if isinstance(v, list): v = \",\".join(v)\n self.add_filter_to_xml(k, v)\n\n xml_query = self.get_xml()\n results = super(Biomart, self).query(xml_query)\n df = pd.read_csv(StringIO(results), header=None, sep=\"\\t\",\n names=attributes, index_col=None)\n # save file to cache path.\n if filename is None: \n mkdirs(DEFAULT_CACHE_PATH)\n filename = os.path.join(DEFAULT_CACHE_PATH, \"{}.background.genes.txt\".format(dataset))\n df.to_csv(filename, sep=\"\\t\", index=False)\n \n return df", "response": "Query BioMart for mapping ids using BioMart."} {"SOURCE": "codesearchnet", 
"instruction": "Can you generate the documentation for the following Python 3 function\ndef gsea(data, gene_sets, cls, outdir='GSEA_', min_size=15, max_size=500, permutation_num=1000,\n weighted_score_type=1,permutation_type='gene_set', method='log2_ratio_of_classes',\n\t ascending=False, processes=1, figsize=(6.5,6), format='pdf',\n graph_num=20, no_plot=False, seed=None, verbose=False):\n \"\"\" Run Gene Set Enrichment Analysis.\n\n :param data: Gene expression data table, Pandas DataFrame, gct file.\n :param gene_sets: Enrichr Library name or .gmt gene sets file or dict of gene sets. Same input with GSEA.\n :param cls: A list or a .cls file format required for GSEA.\n :param str outdir: Results output directory.\n :param int permutation_num: Number of permutations for significance computation. Default: 1000.\n :param str permutation_type: Permutation type, \"phenotype\" for phenotypes, \"gene_set\" for genes.\n :param int min_size: Minimum allowed number of genes from gene set also the data set. Default: 15.\n :param int max_size: Maximum allowed number of genes from gene set also the data set. Default: 500.\n :param float weighted_score_type: Refer to :func:`algorithm.enrichment_score`. Default:1.\n :param method: The method used to calculate a correlation or ranking. Default: 'log2_ratio_of_classes'.\n Others methods are:\n\n 1. 'signal_to_noise'\n\n You must have at least three samples for each phenotype to use this metric.\n The larger the signal-to-noise ratio, the larger the differences of the means (scaled by the standard deviations);\n that is, the more distinct the gene expression is in each phenotype and the more the gene acts as a \u201cclass marker.\u201d\n\n 2. 
't_test'\n\n Uses the difference of means scaled by the standard deviation and number of samples.\n Note: You must have at least three samples for each phenotype to use this metric.\n The larger the tTest ratio, the more distinct the gene expression is in each phenotype\n and the more the gene acts as a \u201cclass marker.\u201d\n\n 3. 'ratio_of_classes' (also referred to as fold change).\n\n Uses the ratio of class means to calculate fold change for natural scale data.\n\n 4. 'diff_of_classes'\n\n\n Uses the difference of class means to calculate fold change for nature scale data\n\n\n 5. 'log2_ratio_of_classes'\n\n Uses the log2 ratio of class means to calculate fold change for natural scale data.\n This is the recommended statistic for calculating fold change for log scale data.\n\n\n :param bool ascending: Sorting order of rankings. Default: False.\n :param int processes: Number of Processes you are going to use. Default: 1.\n :param list figsize: Matplotlib figsize, accept a tuple or list, e.g. [width,height]. Default: [6.5,6].\n :param str format: Matplotlib figure format. Default: 'pdf'.\n :param int graph_num: Plot graphs for top sets of each phenotype.\n :param bool no_plot: If equals to True, no figure will be drawn. Default: False.\n :param seed: Random seed. expect an integer. Default:None.\n :param bool verbose: Bool, increase output verbosity, print out progress of your job, Default: False.\n\n :return: Return a GSEA obj. 
All results store to a dictionary, obj.results,\n where contains::\n\n | {es: enrichment score,\n | nes: normalized enrichment score,\n | p: P-value,\n | fdr: FDR,\n | size: gene set size,\n | matched_size: genes matched to the data,\n | genes: gene names from the data set\n | ledge_genes: leading edge genes}\n\n\n \"\"\"\n gs = GSEA(data, gene_sets, cls, outdir, min_size, max_size, permutation_num,\n weighted_score_type, permutation_type, method, ascending, processes,\n figsize, format, graph_num, no_plot, seed, verbose)\n gs.run()\n\n return gs", "response": "Run GSEA analysis on a set of gene sets."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef ssgsea(data, gene_sets, outdir=\"ssGSEA_\", sample_norm_method='rank', min_size=15, max_size=2000,\n permutation_num=0, weighted_score_type=0.25, scale=True, ascending=False, processes=1,\n figsize=(7,6), format='pdf', graph_num=20, no_plot=False, seed=None, verbose=False):\n \"\"\"Run Gene Set Enrichment Analysis with single sample GSEA tool\n\n :param data: Expression table, pd.Series, pd.DataFrame, GCT file, or .rnk file format.\n :param gene_sets: Enrichr Library name or .gmt gene sets file or dict of gene sets. Same input with GSEA.\n :param outdir: Results output directory.\n :param str sample_norm_method: \"Sample normalization method. Choose from {'rank', 'log', 'log_rank'}. Default: rank.\n\n 1. 'rank': Rank your expression data, and transform by 10000*rank_dat/gene_numbers\n 2. 'log' : Do not rank, but transform data by log(data + exp(1)), while data = data[data<1] =1.\n 3. 'log_rank': Rank your expression data, and transform by log(10000*rank_dat/gene_numbers+ exp(1))\n 4. 'custom': Do nothing, and use your own rank value to calculate enrichment score.\n \n see here: https://github.com/GSEA-MSigDB/ssGSEAProjection-gpmodule/blob/master/src/ssGSEAProjection.Library.R, line 86\n\n :param int min_size: Minimum allowed number of genes from gene set also the data set. 
Default: 15.\n :param int max_size: Maximum allowed number of genes from gene set also the data set. Default: 2000.\n :param int permutation_num: Number of permutations for significance computation. Default: 0.\n :param str weighted_score_type: Refer to :func:`algorithm.enrichment_score`. Default:0.25.\n :param bool scale: If True, normalize the scores by number of genes in the gene sets.\n :param bool ascending: Sorting order of rankings. Default: False.\n :param int processes: Number of Processes you are going to use. Default: 1.\n :param list figsize: Matplotlib figsize, accept a tuple or list, e.g. [width,height]. Default: [7,6].\n :param str format: Matplotlib figure format. Default: 'pdf'.\n :param int graph_num: Plot graphs for top sets of each phenotype.\n :param bool no_plot: If equals to True, no figure will be drawn. Default: False.\n :param seed: Random seed. expect an integer. Default:None.\n :param bool verbose: Bool, increase output verbosity, print out progress of your job, Default: False.\n\n :return: Return a ssGSEA obj. 
\n All results store to a dictionary, access enrichment score by obj.resultsOnSamples,\n and normalized enrichment score by obj.res2d.\n if permutation_num > 0, additional results contain::\n\n | {es: enrichment score,\n | nes: normalized enrichment score,\n | p: P-value,\n | fdr: FDR,\n | size: gene set size,\n | matched_size: genes matched to the data,\n | genes: gene names from the data set\n | ledge_genes: leading edge genes, if permutation_num >0}\n\n\n \"\"\"\n\n ss = SingleSampleGSEA(data, gene_sets, outdir, sample_norm_method, min_size, max_size,\n permutation_num, weighted_score_type, scale, ascending,\n processes, figsize, format, graph_num, no_plot, seed, verbose)\n ss.run()\n return ss", "response": "Run GSEA Enrichment Analysis with single sample GSEA"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nruns GSEA Prerank analysis with pre - ranked correlation table.", "response": "def prerank(rnk, gene_sets, outdir='GSEA_Prerank', pheno_pos='Pos', pheno_neg='Neg',\n min_size=15, max_size=500, permutation_num=1000, weighted_score_type=1,\n ascending=False, processes=1, figsize=(6.5,6), format='pdf',\n graph_num=20, no_plot=False, seed=None, verbose=False):\n \"\"\" Run Gene Set Enrichment Analysis with pre-ranked correlation defined by user.\n\n :param rnk: pre-ranked correlation table or pandas DataFrame. Same input with ``GSEA`` .rnk file.\n :param gene_sets: Enrichr Library name or .gmt gene sets file or dict of gene sets. Same input with GSEA.\n :param outdir: results output directory.\n :param int permutation_num: Number of permutations for significance computation. Default: 1000.\n :param int min_size: Minimum allowed number of genes from gene set also the data set. Default: 15.\n :param int max_size: Maximum allowed number of genes from gene set also the data set. Defaults: 500.\n :param str weighted_score_type: Refer to :func:`algorithm.enrichment_score`. 
Default: 1.\n :param bool ascending: Sorting order of rankings. Default: False.\n :param int processes: Number of processes to use. Default: 1.\n :param list figsize: Matplotlib figsize, accepts a tuple or list, e.g. [width, height]. Default: [6.5,6].\n :param str format: Matplotlib figure format. Default: 'pdf'.\n :param int graph_num: Plot graphs for top sets of each phenotype.\n :param bool no_plot: If True, no figures will be drawn. Default: False.\n :param seed: Random seed; expects an integer. Default: None.\n :param bool verbose: If True, increase output verbosity and print job progress. Default: False.\n\n :return: Return a Prerank obj. All results are stored in a dictionary, obj.results,\n which contains::\n\n | {es: enrichment score,\n | nes: normalized enrichment score,\n | p: P-value,\n | fdr: FDR,\n | size: gene set size,\n | matched_size: genes matched to the data,\n | genes: gene names from the data set\n | ledge_genes: leading edge genes}\n\n\n \"\"\"\n pre = Prerank(rnk, gene_sets, outdir, pheno_pos, pheno_neg,\n min_size, max_size, permutation_num, weighted_score_type,\n ascending, processes, figsize, format, graph_num, no_plot, seed, verbose)\n pre.run()\n return pre"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncreate a temp directory and return the log filename", "response": "def prepare_outdir(self):\n \"\"\"create temp directory.\"\"\"\n self._outdir = self.outdir\n if self._outdir is None:\n self._tmpdir = TemporaryDirectory()\n self.outdir = self._tmpdir.name\n elif isinstance(self.outdir, str):\n mkdirs(self.outdir)\n else:\n raise Exception(\"Error parsing outdir: %s\"%type(self.outdir))\n\n # handle gmt type\n if isinstance(self.gene_sets, str):\n _gset = os.path.split(self.gene_sets)[-1].lower().rstrip(\".gmt\")\n elif isinstance(self.gene_sets, dict):\n _gset = \"blank_name\"\n else:\n raise Exception(\"Error parsing gene_sets parameter for gene sets\")\n\n logfile = 
os.path.join(self.outdir, \"gseapy.%s.%s.log\" % (self.module, _gset))\n return logfile"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsetting cpu numbers to be used", "response": "def _set_cores(self):\n \"\"\"set cpu numbers to be used\"\"\"\n\n cpu_num = cpu_count()-1\n if self._processes > cpu_num:\n cores = cpu_num\n elif self._processes < 1:\n cores = 1\n else:\n cores = self._processes\n # have to be int if user input is float\n self._processes = int(cores)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _load_ranking(self, rnk):\n # load data\n if isinstance(rnk, pd.DataFrame):\n rank_metric = rnk.copy()\n # handle dataframe with gene_name as index.\n if rnk.shape[1] == 1: rank_metric = rnk.reset_index()\n elif isinstance(rnk, pd.Series):\n rank_metric = rnk.reset_index()\n elif os.path.isfile(rnk):\n rank_metric = pd.read_csv(rnk, header=None, comment='#', sep=\"\\t\")\n else:\n raise Exception('Error parsing gene ranking values!')\n # sort ranking values from high to low\n rank_metric.sort_values(by=rank_metric.columns[1], ascending=self.ascending, inplace=True)\n # drop na values\n if rank_metric.isnull().any(axis=1).sum() >0:\n self._logger.warning(\"Input gene rankings contains NA values(gene name and ranking value), drop them all!\")\n # print out NAs\n NAs = rank_metric[rank_metric.isnull().any(axis=1)]\n self._logger.debug('NAs list:\\n'+NAs.to_string())\n rank_metric.dropna(how='any', inplace=True)\n # drop duplicate IDs, keep the first\n if rank_metric.duplicated(subset=rank_metric.columns[0]).sum() >0:\n self._logger.warning(\"Input gene rankings contains duplicated IDs, Only use the duplicated ID with highest value!\")\n # print out duplicated IDs.\n dups = rank_metric[rank_metric.duplicated(subset=rank_metric.columns[0])]\n self._logger.debug('Dups list:\\n'+dups.to_string())\n rank_metric.drop_duplicates(subset=rank_metric.columns[0], 
inplace=True, keep='first')\n # reset ranking index, because you have sort values and drop duplicates.\n rank_metric.reset_index(drop=True, inplace=True)\n rank_metric.columns = ['gene_name','rank']\n rankser = rank_metric.set_index('gene_name')['rank']\n self.ranking = rankser\n # return series\n return rankser", "response": "Parse the ranking file and return a Pandas Series with the gene name indexed rankings and their corresponding IDs."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nload gene set dict from list of genes", "response": "def load_gmt(self, gene_list, gmt):\n \"\"\"load gene set dict\"\"\"\n\n if isinstance(gmt, dict):\n genesets_dict = gmt\n elif isinstance(gmt, str):\n genesets_dict = self.parse_gmt(gmt)\n else:\n raise Exception(\"Error parsing gmt parameter for gene sets\")\n \n subsets = list(genesets_dict.keys())\n self.n_genesets = len(subsets)\n for subset in subsets:\n subset_list = genesets_dict.get(subset)\n if isinstance(subset_list, set):\n subset_list = list(subset_list)\n genesets_dict[subset] = subset_list\n tag_indicator = np.in1d(gene_list, subset_list, assume_unique=True)\n tag_len = tag_indicator.sum()\n if self.min_size <= tag_len <= self.max_size: continue\n del genesets_dict[subset]\n\n filsets_num = len(subsets) - len(genesets_dict)\n self._logger.info(\"%04d gene_sets have been filtered out when max_size=%s and min_size=%s\"%(filsets_num, self.max_size, self.min_size))\n\n if filsets_num == len(subsets):\n self._logger.error(\"No gene sets passed through filtering condition!!!, try new parameters again!\\n\" +\\\n \"Note: check gene name, gmt file format, or filtering size.\" )\n sys.exit(0)\n\n self._gmtdct=genesets_dict\n return genesets_dict"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef parse_gmt(self, gmt):\n\n if gmt.lower().endswith(\".gmt\"):\n with open(gmt) as genesets:\n genesets_dict = { line.strip().split(\"\\t\")[0]: 
line.strip().split(\"\\t\")[2:]\n for line in genesets.readlines()}\n return genesets_dict\n\n elif gmt in DEFAULT_LIBRARY:\n pass\n elif gmt in self.get_libraries():\n pass\n else:\n self._logger.error(\"No supported gene_sets: %s\"%gmt)\n sys.exit(0)\n\n tmpname = \"enrichr.\" + gmt + \".gmt\"\n tempath = os.path.join(DEFAULT_CACHE_PATH, tmpname)\n # if file already download\n if os.path.isfile(tempath):\n self._logger.info(\"Enrichr library gene sets already downloaded in: %s, use local file\"%DEFAULT_CACHE_PATH)\n return self.parse_gmt(tempath)\n else:\n return self._download_libraries(gmt)", "response": "parse a gmt file and return a dict of all the genesets in the cache"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns active enrichr library name. Offical API", "response": "def get_libraries(self, database=''):\n \"\"\"return active enrichr library name.Offical API \"\"\"\n\n lib_url='http://amp.pharm.mssm.edu/%sEnrichr/datasetStatistics'%database\n libs_json = json.loads(requests.get(lib_url).text)\n libs = [lib['libraryName'] for lib in libs_json['statistics']]\n return sorted(libs)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _download_libraries(self, libname):\n self._logger.info(\"Downloading and generating Enrichr library gene sets......\")\n s = retry(5)\n # queery string\n ENRICHR_URL = 'http://amp.pharm.mssm.edu/Enrichr/geneSetLibrary'\n query_string = '?mode=text&libraryName=%s'\n # get\n response = s.get( ENRICHR_URL + query_string % libname, timeout=None)\n if not response.ok:\n raise Exception('Error fetching enrichment results, check internet connection first.')\n # reformat to dict and save to disk\n mkdirs(DEFAULT_CACHE_PATH)\n genesets_dict = {}\n outname = \"enrichr.%s.gmt\"%libname\n gmtout = open(os.path.join(DEFAULT_CACHE_PATH, outname), \"w\")\n for line in response.iter_lines(chunk_size=1024, 
decode_unicode='utf-8'):\n line=line.strip()\n k = line.split(\"\\t\")[0]\n v = list(map(lambda x: x.split(\",\")[0], line.split(\"\\t\")[2:]))\n genesets_dict.update({ k: v})\n outline = \"%s\\t\\t%s\\n\"%(k, \"\\t\".join(v))\n gmtout.write(outline)\n gmtout.close()\n\n return genesets_dict", "response": "download enrichr libraries and generate a dict of gene sets"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _plotting(self, rank_metric, results, graph_num, outdir, \n format, figsize, pheno_pos='', pheno_neg=''):\n \"\"\" Plotting API.\n :param rank_metric: sorted pd.Series with rankings values.\n :param results: self.results\n :param data: preprocessed expression table\n\n \"\"\"\n \n # no values need to be returned\n if self._outdir is None: return\n #Plotting\n top_term = self.res2d.index[:graph_num]\n # multi-threading\n pool = Pool(self._processes)\n for gs in top_term:\n hit = results.get(gs)['hits_indices']\n NES = 'nes' if self.module != 'ssgsea' else 'es'\n term = gs.replace('/','_').replace(\":\",\"_\")\n outfile = '{0}/{1}.{2}.{3}'.format(self.outdir, term, self.module, self.format)\n # gseaplot(rank_metric=rank_metric, term=term, hits_indices=hit,\n # nes=results.get(gs)[NES], pval=results.get(gs)['pval'], \n # fdr=results.get(gs)['fdr'], RES=results.get(gs)['RES'],\n # pheno_pos=pheno_pos, pheno_neg=pheno_neg, figsize=figsize,\n # ofname=outfile)\n pool.apply_async(gseaplot, args=(rank_metric, term, hit, results.get(gs)[NES],\n results.get(gs)['pval'],results.get(gs)['fdr'],\n results.get(gs)['RES'],\n pheno_pos, pheno_neg, \n figsize, 'seismic', outfile))\n if self.module == 'gsea':\n outfile2 = \"{0}/{1}.heatmap.{2}\".format(self.outdir, term, self.format)\n # heatmap(df=self.heatmat.iloc[hit, :], title=term, ofname=outfile2, \n # z_score=0, figsize=(self._width, len(hit)/2))\n pool.apply_async(heatmap, args=(self.heatmat.iloc[hit, :], 0, term, \n (self._width, len(hit)/2+2), 'RdBu_r',\n True, True, 
outfile2))\n pool.close()\n pool.join()", "response": "Plots the rank_metric and data for the current object."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreformats gsea results and save to txt", "response": "def _save_results(self, zipdata, outdir, module, gmt, rank_metric, permutation_type):\n \"\"\"reformat gsea results, and save to txt\"\"\"\n\n res = OrderedDict()\n for gs, gseale, ind, RES in zipdata:\n rdict = OrderedDict()\n rdict['es'] = gseale[0]\n rdict['nes'] = gseale[1]\n rdict['pval'] = gseale[2]\n rdict['fdr'] = gseale[3]\n rdict['geneset_size'] = len(gmt[gs])\n rdict['matched_size'] = len(ind)\n #reformat gene list.\n _genes = rank_metric.index.values[ind]\n rdict['genes'] = \";\".join([ str(g).strip() for g in _genes ])\n \n if self.module != 'ssgsea':\n # extract leading edge genes\n if rdict['es'] > 0:\n # RES -> ndarray, ind -> list\n idx = RES.argmax()\n ldg_pos = list(filter(lambda x: x<= idx, ind))\n elif rdict['es'] < 0:\n idx = RES.argmin()\n ldg_pos = list(filter(lambda x: x >= idx, ind))\n else:\n ldg_pos = ind # es == 0 ?\n rdict['ledge_genes'] = ';'.join(list(map(str,rank_metric.iloc[ldg_pos].index)))\n \n rdict['RES'] = RES\n rdict['hits_indices'] = ind\n # save to one odict\n res[gs] = rdict\n # save\n self.results = res\n # save to dataframe\n res_df = pd.DataFrame.from_dict(res, orient='index')\n res_df.index.name = 'Term'\n res_df.drop(['RES','hits_indices'], axis=1, inplace=True)\n res_df.sort_values(by=['fdr','pval'], inplace=True)\n self.res2d = res_df\n\n if self._outdir is None: return\n out = os.path.join(outdir,'gseapy.{b}.{c}.report.csv'.format(b=module, c=permutation_type))\n if self.module == 'ssgsea':\n out = out.replace(\".csv\",\".txt\")\n with open(out, 'a') as f:\n f.write('# normalize enrichment scores by random permutation procedure (GSEA method)\\n')\n f.write(\"# might not proper for publication\\n\")\n res_df.to_csv(f, sep='\\t')\n else:\n res_df.to_csv(out)\n\n 
return"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nload the data from the file and return a DataFrame with the data as a DataFrame.", "response": "def load_data(self, cls_vec):\n \"\"\"pre-processed the data frame.new filtering methods will be implement here.\n \"\"\"\n # read data in\n if isinstance(self.data, pd.DataFrame) :\n exprs = self.data.copy()\n # handle index is gene_names\n if exprs.index.dtype == 'O':\n exprs = exprs.reset_index()\n elif os.path.isfile(self.data) :\n # GCT input format?\n if self.data.endswith(\"gct\"):\n exprs = pd.read_csv(self.data, skiprows=1, comment='#',sep=\"\\t\")\n else:\n exprs = pd.read_csv(self.data, comment='#',sep=\"\\t\")\n else:\n raise Exception('Error parsing gene expression DataFrame!')\n\n #drop duplicated gene names\n if exprs.iloc[:,0].duplicated().sum() > 0:\n self._logger.warning(\"Warning: dropping duplicated gene names, only keep the first values\")\n exprs.drop_duplicates(subset=exprs.columns[0], inplace=True) #drop duplicate gene_names.\n if exprs.isnull().any().sum() > 0:\n self._logger.warning(\"Warning: Input data contains NA, filled NA with 0\")\n exprs.dropna(how='all', inplace=True) #drop rows with all NAs\n exprs = exprs.fillna(0)\n # set gene name as index\n exprs.set_index(keys=exprs.columns[0], inplace=True)\n # select numberic columns\n df = exprs.select_dtypes(include=[np.number])\n # drop any genes which std ==0\n df_std = df.groupby(by=cls_vec, axis=1).std()\n df = df[~df_std.isin([0]).any(axis=1)]\n df = df + 0.00001 # we don't like zeros!!!\n return df"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run(self):\n\n assert self.permutation_type in [\"phenotype\", \"gene_set\"]\n assert self.min_size <= self.max_size\n\n # Start Analysis\n self._logger.info(\"Parsing data files for GSEA.............................\")\n # phenotype labels parsing\n phenoPos, phenoNeg, cls_vector = 
gsea_cls_parser(self.classes)\n # select correct expression genes and values.\n dat = self.load_data(cls_vector)\n # data frame must have length > 1\n assert len(dat) > 1\n # ranking metrics calculation.\n dat2 = ranking_metric(df=dat, method=self.method, pos=phenoPos, neg=phenoNeg,\n classes=cls_vector, ascending=self.ascending)\n self.ranking = dat2\n # filtering out gene sets and build gene sets dictionary\n gmt = self.load_gmt(gene_list=dat2.index.values, gmt=self.gene_sets)\n\n self._logger.info(\"%04d gene_sets used for further statistical testing.....\"% len(gmt))\n self._logger.info(\"Start to run GSEA...Might take a while..................\")\n # cpu numbers\n self._set_cores()\n # compute ES, NES, pval, FDR, RES\n dataset = dat if self.permutation_type =='phenotype' else dat2\n gsea_results,hit_ind,rank_ES, subsets = gsea_compute_tensor(data=dataset, gmt=gmt, n=self.permutation_num,\n weighted_score_type=self.weighted_score_type,\n permutation_type=self.permutation_type,\n method=self.method,\n pheno_pos=phenoPos, pheno_neg=phenoNeg,\n classes=cls_vector, ascending=self.ascending,\n processes=self._processes, seed=self.seed)\n \n self._logger.info(\"Start to generate GSEApy reports and figures............\")\n res_zip = zip(subsets, list(gsea_results), hit_ind, rank_ES)\n self._save_results(zipdata=res_zip, outdir=self.outdir, module=self.module,\n gmt=gmt, rank_metric=dat2, permutation_type=self.permutation_type)\n\n # reorder datarame for heatmap\n self._heatmat(df=dat.loc[dat2.index], classes=cls_vector, \n pheno_pos=phenoPos, pheno_neg=phenoNeg)\n # Plotting\n if not self._noplot:\n self._plotting(rank_metric=dat2, results=self.results,\n graph_num=self.graph_num, outdir=self.outdir,\n figsize=self.figsize, format=self.format,\n pheno_pos=phenoPos, pheno_neg=phenoNeg)\n\n self._logger.info(\"Congratulations. 
GSEApy ran successfully.................\\n\")\n if self._outdir is None:\n self._tmpdir.cleanup()\n\n return", "response": "This method is called by the base class to run the GSEA analysis."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef run(self):\n\n assert self.min_size <= self.max_size\n\n # parsing rankings\n dat2 = self._load_ranking(self.rnk)\n assert len(dat2) > 1\n\n # cpu numbers\n self._set_cores()\n # Start Analysis\n self._logger.info(\"Parsing data files for GSEA.............................\")\n # filtering out gene sets and build gene sets dictionary\n gmt = self.load_gmt(gene_list=dat2.index.values, gmt=self.gene_sets)\n\n self._logger.info(\"%04d gene_sets used for further statistical testing.....\"% len(gmt))\n self._logger.info(\"Start to run GSEA...Might take a while..................\")\n # compute ES, NES, pval, FDR, RES\n gsea_results, hit_ind,rank_ES, subsets = gsea_compute(data=dat2, n=self.permutation_num, gmt=gmt,\n weighted_score_type=self.weighted_score_type,\n permutation_type='gene_set', method=None,\n pheno_pos=self.pheno_pos, pheno_neg=self.pheno_neg,\n classes=None, ascending=self.ascending,\n processes=self._processes, seed=self.seed)\n self._logger.info(\"Start to generate gseapy reports, and produce figures...\")\n res_zip = zip(subsets, list(gsea_results), hit_ind, rank_ES)\n self._save_results(zipdata=res_zip, outdir=self.outdir, module=self.module,\n gmt=gmt, rank_metric=dat2, permutation_type=\"gene_sets\")\n\n # Plotting\n if not self._noplot:\n self._plotting(rank_metric=dat2, results=self.results,\n graph_num=self.graph_num, outdir=self.outdir,\n figsize=self.figsize, format=self.format,\n pheno_pos=self.pheno_pos, pheno_neg=self.pheno_neg)\n\n self._logger.info(\"Congratulations. 
GSEApy runs successfully................\\n\")\n if self._outdir is None:\n self._tmpdir.cleanup()\n\n return", "response": "Run the GSEA prerank workflow."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nnormalizing the samples according to the sample normalization method", "response": "def norm_samples(self, dat):\n \"\"\"normalization samples\n see here: http://rowley.mit.edu/caw_web/ssGSEAProjection/ssGSEAProjection.Library.R\n \"\"\"\n\n if self.sample_norm_method == 'rank':\n data = dat.rank(axis=0, method='average', na_option='bottom')\n data = 10000*data / data.shape[0]\n elif self.sample_norm_method == 'log_rank':\n data = dat.rank(axis=0, method='average', na_option='bottom')\n data = log(10000*data / data.shape[0] + exp(1))\n elif self.sample_norm_method == 'log':\n dat[dat < 1] = 1\n data = log(dat + exp(1))\n elif self.sample_norm_method == 'custom':\n self._logger.info(\"Use custom rank metric for ssGSEA\")\n data = dat\n else:\n sys.stderr.write(\"No supported method: %s\"%self.sample_norm_method)\n sys.exit(0)\n\n return data"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef run(self):\n self._logger.info(\"Parsing data files for ssGSEA...........................\")\n # load data\n data = self.load_data()\n # normalized samples, and rank\n normdat = self.norm_samples(data)\n # filtering out gene sets and build gene sets dictionary\n gmt = self.load_gmt(gene_list=normdat.index.values, gmt=self.gene_sets)\n self._logger.info(\"%04d gene_sets used for further statistical testing.....\"% len(gmt))\n # set cpu numbers\n self._set_cores()\n # start analsis\n self._logger.info(\"Start to run ssGSEA...Might take a while................\")\n if self.permutation_num == 0 :\n # ssGSEA without permutation\n self.runSamples(df=normdat, gmt=gmt)\n else:\n # run permutation procedure and calculate pvals, fdrs\n self._logger.warning(\"run ssGSEA with permutation 
procedure, don't use this part of the results for publication.\")\n self.runSamplesPermu(df=normdat, gmt=gmt)\n # clean up all outputs if _outdir is None\n if self._outdir is None:\n self._tmpdir.cleanup()", "response": "Run the ssGSEA workflow: load and normalize the data, then compute enrichment scores for each sample."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef runSamplesPermu(self, df, gmt=None):\n\n assert self.min_size <= self.max_size\n mkdirs(self.outdir)\n self.resultsOnSamples = OrderedDict()\n outdir = self.outdir\n # iterate through each sample\n for name, ser in df.iteritems():\n self.outdir = os.path.join(outdir, str(name))\n self._logger.info(\"Run Sample: %s \" % name)\n mkdirs(self.outdir)\n # sort ranking values from high to low or reverse\n dat2 = ser.sort_values(ascending=self.ascending)\n # reset integer index, or it may cause unwanted problems\n # df.reset_index(drop=True, inplace=True)\n\n # compute ES, NES, pval, FDR, RES\n gsea_results, hit_ind, rank_ES, subsets = gsea_compute(data=dat2, n=self.permutation_num, gmt=gmt,\n weighted_score_type=self.weighted_score_type,\n permutation_type='gene_set', method=None,\n pheno_pos='', pheno_neg='',\n classes=None, ascending=self.ascending,\n processes=self._processes,\n seed=self.seed, single=True, scale=self.scale)\n\n # write file\n res_zip = zip(subsets, list(gsea_results), hit_ind, rank_ES)\n self._save_results(zipdata=res_zip, outdir=self.outdir, module=self.module,\n gmt=gmt, rank_metric=dat2, permutation_type=\"gene_sets\")\n self.resultsOnSamples[name] = self.res2d.es\n # plotting\n if self._noplot: continue\n self._logger.info(\"Plotting Sample: %s \\n\" % name)\n self._plotting(rank_metric=dat2, results=self.results,\n graph_num=self.graph_num, outdir=self.outdir,\n figsize=self.figsize, format=self.format)\n\n # save es, nes to file\n self._save(outdir)\n\n return", "response": "Run single-sample GSEA with a permutation procedure, one sample at a time."} 
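The records above (prerank, ssGSEA, runSamplesPermu) all revolve around the weighted running-sum enrichment score that gseapy's `gsea_compute`/`enrichment_score_tensor` implement internally. A minimal, self-contained sketch of that statistic follows; the function name, signature, and normalization details here are illustrative assumptions, not gseapy's actual API:

```python
import numpy as np

def enrichment_score_sketch(gene_list, correl_vector, gene_set, weight=1.0):
    """Weighted KS-like running-sum enrichment score (illustrative sketch).

    gene_list: gene names sorted by the ranking metric (high to low).
    correl_vector: ranking metric values aligned with gene_list.
    gene_set: genes to test; must overlap a strict subset of gene_list.
    Returns (es, running_sum).
    """
    genes = np.asarray(gene_list)
    corr = np.abs(np.asarray(correl_vector, dtype=float)) ** weight
    tag = np.isin(genes, list(gene_set))            # hit indicator per rank position
    n_hits = int(tag.sum())
    if n_hits == 0 or n_hits == len(genes):
        raise ValueError("gene_set must overlap a strict subset of gene_list")
    # hits step up in proportion to |metric|^weight; misses step down uniformly
    p_hit = corr * tag / corr[tag].sum()
    p_miss = (~tag) / float(len(genes) - n_hits)
    running = np.cumsum(p_hit - p_miss)
    es = running[np.argmax(np.abs(running))]        # max deviation from zero
    return es, running

genes = ["g%d" % i for i in range(10)]
metric = np.linspace(3.0, -3.0, 10)                 # already sorted high to low
es, running = enrichment_score_sketch(genes, metric, {"g0", "g1", "g2"})
print(round(float(es), 3))                          # top-ranked hits give es near 1.0
```

In gseapy itself, a permutation null (shuffling gene labels or phenotype labels) around this same statistic yields the nes, pval, and fdr fields stored in obj.results.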
{"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef runSamples(self, df, gmt=None):\n\n # df.index.values are gene_names\n # Save each sample results to odict\n self.resultsOnSamples = OrderedDict()\n outdir = self.outdir\n # run ssgsea for gct expression matrix\n #multi-threading\n subsets = sorted(gmt.keys())\n tempes=[]\n names=[]\n rankings=[]\n pool = Pool(processes=self._processes)\n for name, ser in df.iteritems():\n #prepare input\n dat = ser.sort_values(ascending=self.ascending)\n rankings.append(dat)\n names.append(name)\n genes_sorted, cor_vec = dat.index.values, dat.values\n rs = np.random.RandomState(self.seed)\n # apply_async\n tempes.append(pool.apply_async(enrichment_score_tensor,\n args=(genes_sorted, cor_vec, gmt,\n self.weighted_score_type,\n self.permutation_num, rs, True,\n self.scale)))\n pool.close()\n pool.join()\n # save results and plotting\n for i, temp in enumerate(tempes):\n name, rnk = names[i], rankings[i]\n self._logger.info(\"Calculate Enrichment Score for Sample: %s \"%name)\n es, esnull, hit_ind, RES = temp.get()\n # create results subdir\n self.outdir= os.path.join(outdir, str(name))\n mkdirs(self.outdir)\n # save results\n self.resultsOnSamples[name] = pd.Series(data=es, index=subsets, name=name)\n # plotting\n if self._noplot: continue\n self._logger.info(\"Plotting Sample: %s \\n\" % name)\n for i, term in enumerate(subsets):\n term = term.replace('/','_').replace(\":\",\"_\")\n outfile = '{0}/{1}.{2}.{3}'.format(self.outdir, term, self.module, self.format)\n gseaplot(rank_metric=rnk, term=term, \n hits_indices=hit_ind[i], nes=es[i], pval=1, fdr=1, \n RES=RES[i], pheno_pos='', pheno_neg='', \n figsize=self.figsize, ofname=outfile)\n # save es, nes to file\n self._save(outdir)\n\n return", "response": "Run multi - threading GSEA for each sample."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef 
_save(self, outdir):\n # save raw ES to one csv file\n samplesRawES = pd.DataFrame(self.resultsOnSamples)\n samplesRawES.index.name = 'Term|ES'\n # normalize enrichment scores by using the entire data set, as indicated\n # by Barbie et al., 2009, online methods, pg. 2\n samplesNES = samplesRawES / (samplesRawES.values.max() - samplesRawES.values.min())\n samplesNES = samplesNES.copy()\n samplesNES.index.rename('Term|NES', inplace=True)\n self.res2d = samplesNES\n self._logger.info(\"Congratulations. GSEApy runs successfully................\\n\")\n if self._outdir is None: return\n # write es\n outESfile = os.path.join(outdir, \"gseapy.samples.raw.es.txt\")\n with open(outESfile, 'a') as f:\n if self.scale:\n f.write('# scale the enrichment scores by number of genes in the gene sets\\n')\n f.write('# this normalization has not effects on the final NES, ' + \\\n 'as indicated by Barbie et al., 2009, online methods, pg. 2\\n')\n else:\n f.write('# raw enrichment scores of all data\\n')\n f.write('# no scale es by numbers of genes in the gene sets\\n')\n samplesRawES.to_csv(f, sep='\\t')\n\n outNESfile = os.path.join(outdir, \"gseapy.samples.normalized.es.txt\")\n with open(outNESfile, 'a') as f:\n f.write('# normalize enrichment scores by using the entire data set\\n')\n f.write('# as indicated by Barbie et al., 2009, online methods, pg. 
2\\n')\n samplesNES.to_csv(f, sep='\\t')\n return", "response": "save raw es and stats"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run(self):\n assert self.min_size <= self.max_size\n assert self.fignum > 0\n import glob\n from bs4 import BeautifulSoup\n\n # parsing files.......\n try:\n results_path = glob.glob(self.indir+'*/edb/results.edb')[0]\n rank_path = glob.glob(self.indir+'*/edb/*.rnk')[0]\n gene_set_path = glob.glob(self.indir+'*/edb/gene_sets.gmt')[0]\n except IndexError as e:\n sys.stderr.write(\"Could not locate GSEA files in the given directory!\")\n sys.exit(1)\n # extract sample names from .cls file\n cls_path = glob.glob(self.indir+'*/edb/*.cls')\n if cls_path:\n pos, neg, classes = gsea_cls_parser(cls_path[0])\n else:\n # logic for prerank results\n pos, neg = '',''\n # start reploting\n self.gene_sets=gene_set_path\n # obtain gene sets\n gene_set_dict = self.parse_gmt(gmt=gene_set_path)\n # obtain rank_metrics\n rank_metric = self._load_ranking(rank_path)\n correl_vector = rank_metric.values\n gene_list = rank_metric.index.values\n # extract each enriment term in the results.edb files and plot.\n database = BeautifulSoup(open(results_path), features='xml')\n length = len(database.findAll('DTG'))\n fig_num = self.fignum if self.fignum <= length else length\n for idx in range(fig_num):\n # extract statistical resutls from results.edb file\n enrich_term, hit_ind, nes, pval, fdr= gsea_edb_parser(results_path, index=idx)\n gene_set = gene_set_dict.get(enrich_term)\n # calculate enrichment score\n RES = enrichment_score(gene_list=gene_list, \n correl_vector=correl_vector,\n gene_set=gene_set, \n weighted_score_type=self.weighted_score_type,\n nperm=0)[-1]\n # plotting\n term = enrich_term.replace('/','_').replace(\":\",\"_\")\n outfile = '{0}/{1}.{2}.{3}'.format(self.outdir, term, self.module, self.format)\n gseaplot(rank_metric=rank_metric, term=enrich_term, \n hits_indices=hit_ind, 
nes=nes, pval=pval, fdr=fdr, \n RES=RES, pheno_pos=pos, pheno_neg=neg, \n figsize=self.figsize, ofname=outfile)\n\n self._logger.info(\"Congratulations! Your plots have been reproduced successfully!\\n\")", "response": "main function that parses the results. edb files and plots the GSEA data."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef prepare_outdir(self):\n self._outdir = self.outdir\n if self._outdir is None:\n self._tmpdir = TemporaryDirectory()\n self.outdir = self._tmpdir.name\n elif isinstance(self.outdir, str):\n mkdirs(self.outdir)\n else:\n raise Exception(\"Error parsing outdir: %s\"%type(self.outdir))\n\n # handle gene_sets\n logfile = os.path.join(self.outdir, \"gseapy.%s.%s.log\" % (self.module, self.descriptions))\n return logfile", "response": "create temp directory and return logfile."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nparsing gene_sets input file type", "response": "def parse_genesets(self):\n \"\"\"parse gene_sets input file type\"\"\"\n\n enrichr_library = self.get_libraries()\n if isinstance(self.gene_sets, list):\n gss = self.gene_sets\n elif isinstance(self.gene_sets, str):\n gss = [ g.strip() for g in self.gene_sets.strip().split(\",\") ]\n elif isinstance(self.gene_sets, dict):\n gss = [self.gene_sets]\n else:\n raise Exception(\"Error parsing enrichr libraries, please provided corrected one\")\n \n # gss: a list contain .gmt, dict, enrichr_liraries.\n # now, convert .gmt to dict\n gss_exist = [] \n for g in gss:\n if isinstance(g, dict): \n gss_exist.append(g)\n continue\n\n if isinstance(g, str): \n if g in enrichr_library: \n gss_exist.append(g)\n continue\n if g.lower().endswith(\".gmt\") and os.path.exists(g):\n self._logger.info(\"User Defined gene sets is given: %s\"%g)\n with open(g) as genesets:\n g_dict = { line.strip().split(\"\\t\")[0]: line.strip().split(\"\\t\")[2:]\n for line in genesets.readlines() }\n 
gss_exist.append(g_dict)\n return gss_exist"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef send_genes(self, gene_list, url):\n payload = {\n 'list': (None, gene_list),\n 'description': (None, self.descriptions)\n }\n # response\n response = requests.post(url, files=payload)\n if not response.ok:\n raise Exception('Error analyzing gene list')\n sleep(1)\n job_id = json.loads(response.text)\n\n return job_id", "response": "send gene list to enrichr server"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncomparing the genes sent and received to get successfully recognized genes", "response": "def check_genes(self, gene_list, usr_list_id):\n '''\n Compare the genes sent and received to get successfully recognized genes\n '''\n response = requests.get('http://amp.pharm.mssm.edu/Enrichr/view?userListId=%s' % usr_list_id)\n if not response.ok:\n raise Exception('Error getting gene list back')\n returnedL = json.loads(response.text)[\"genes\"]\n returnedN = sum([1 for gene in gene_list if gene in returnedL])\n self._logger.info('{} genes successfully recognized by Enrichr'.format(returnedN))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_results(self, gene_list):\n ADDLIST_URL = 'http://amp.pharm.mssm.edu/%sEnrichr/addList'%self._organism\n job_id = self.send_genes(gene_list, ADDLIST_URL)\n user_list_id = job_id['userListId']\n\n RESULTS_URL = 'http://amp.pharm.mssm.edu/%sEnrichr/export'%self._organism\n query_string = '?userListId=%s&filename=%s&backgroundType=%s'\n # set max retries num =5\n s = retry(num=5)\n filename = \"%s.%s.reports\" % (self._gs, self.descriptions)\n url = RESULTS_URL + query_string % (user_list_id, filename, self._gs)\n response = s.get(url, stream=True, timeout=None)\n # response = requests.get(RESULTS_URL + query_string % (user_list_id, gene_set))\n sleep(1)\n res = 
pd.read_csv(StringIO(response.content.decode('utf-8')),sep=\"\\t\")\n return [job_id['shortId'], res]", "response": "Get the results from the enrichr API"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nselects Enrichr organism from below", "response": "def get_organism(self):\n \"\"\"Select Enrichr organism from below:\n\n Human & Mouse: H. sapiens & M. musculus\n Fly: D. melanogaster\n Yeast: S. cerevisiae\n Worm: C. elegans\n Fish: D. rerio\n\n \"\"\"\n\n organism = {'default': ['', 'hs', 'mm', 'human','mouse',\n 'homo sapiens', 'mus musculus',\n 'h. sapiens', 'm. musculus'],\n 'Fly': ['fly', 'd. melanogaster', 'drosophila melanogaster'],\n 'Yeast': ['yeast', 's. cerevisiae', 'saccharomyces cerevisiae'],\n 'Worm': ['worm', 'c. elegans', 'caenorhabditis elegans', 'nematode'],\n 'Fish': ['fish', 'd. rerio', 'danio rerio', 'zebrafish']\n }\n\n for k, v in organism.items():\n if self.organism.lower() in v :\n self._organism = k\n\n if self._organism is None:\n raise Exception(\"No supported organism found !!!\")\n\n if self._organism == 'default':\n self._organism = ''\n return"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nenrich the object with the current state of the object", "response": "def enrich(self, gmt):\n \"\"\"use local mode\n \n p = p-value computed using the Fisher exact test (Hypergeometric test) \n\n Not implemented here:\n\n combine score = log(p)\u00b7z\n\n see here: http://amp.pharm.mssm.edu/Enrichr/help#background&q=4\n \n columns contain:\n \n Term Overlap P-value Adjusted_P-value Genes\n\n \"\"\"\n if isscalar(self.background):\n if isinstance(self.background, int) or self.background.isdigit():\n self._bg = int(self.background)\n elif isinstance(self.background, str):\n # self.background = set(reduce(lambda x,y: x+y, gmt.values(),[]))\n self._bg = self.get_background()\n self._logger.info(\"Background: found %s genes\"%(len(self._bg)))\n else:\n raise 
Exception(\"Unsupported background data type\")\n else:\n # handle array object: nd.array, list, tuple, set, Series\n try:\n it = iter(self.background)\n self._bg = set(self.background)\n except TypeError:\n self._logger.error(\"Unsupported background data type\")\n # statistical testing\n hgtest = list(calc_pvalues(query=self._gls, gene_sets=gmt, \n background=self._bg))\n if len(hgtest) > 0:\n terms, pvals, olsz, gsetsz, genes = hgtest\n fdrs, rej = multiple_testing_correction(ps = pvals, \n alpha=self.cutoff,\n method='benjamini-hochberg')\n # save to a dataframe\n odict = OrderedDict()\n odict['Term'] = terms\n odict['Overlap'] = list(map(lambda h,g: \"%s/%s\"%(h, g), olsz, gsetsz))\n odict['P-value'] = pvals\n odict['Adjusted P-value'] = fdrs\n # odict['Reject (FDR< %s)'%self.cutoff ] = rej\n odict['Genes'] = [\";\".join(g) for g in genes]\n res = pd.DataFrame(odict)\n return res\n return"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef run(self):\n\n # set organism\n self.get_organism()\n # read input file\n genes_list = self.parse_genelists()\n gss = self.parse_genesets()\n # if gmt\n self._logger.info(\"Connecting to Enrichr Server to get latest library names\")\n if len(gss) < 1:\n sys.stderr.write(\"Not validated Enrichr library name provided\\n\")\n sys.stdout.write(\"Hint: use get_library_name() to view full list of supported names\")\n sys.exit(1)\n self.results = pd.DataFrame()\n\n for g in gss: \n if isinstance(g, dict): \n ## local mode\n res = self.enrich(g)\n shortID, self._gs = str(id(g)), \"CUSTOM%s\"%id(g)\n if res is None: \n self._logger.info(\"No hits return, for gene set: Custom%s\"%shortID)\n continue\n else:\n ## online mode\n self._gs = str(g)\n self._logger.debug(\"Start Enrichr using library: %s\" % (self._gs))\n self._logger.info('Analysis name: %s, Enrichr Library: %s' % (self.descriptions, self._gs))\n shortID, res = self.get_results(genes_list)\n # Remember gene set library used\n 
res.insert(0, \"Gene_set\", self._gs)\n            # Append to master dataframe\n            self.results = self.results.append(res, ignore_index=True, sort=True)\n            self.res2d = res\n            if self._outdir is None: continue\n            self._logger.info('Save file of enrichment results: Job Id:' + str(shortID))\n            outfile = \"%s/%s.%s.%s.reports.txt\" % (self.outdir, self._gs, self.descriptions, self.module)\n            self.res2d.to_csv(outfile, index=False, encoding='utf-8', sep=\"\\t\")\n            # plotting\n            if not self.__no_plot:\n                msg = barplot(df=res, cutoff=self.cutoff, figsize=self.figsize,\n                              top_term=self.__top_term, color='salmon',\n                              title=self._gs,\n                              ofname=outfile.replace(\"txt\", self.format))\n                if msg is not None : self._logger.warning(msg)\n            self._logger.info('Done.\\n')\n        # clean up tmpdir\n        if self._outdir is None: self._tmpdir.cleanup()\n\n        return", "response": "Run Enrichr on a single sample gene list against multiple gene-set libraries."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef cube(script, size=1.0, center=False, color=None):\n\n    \"\"\"# Convert size to list if it isn't already\n    if not isinstance(size, list):\n        size = list(size)\n    # If a single value was supplied use it for all 3 axes\n    if len(size) == 1:\n        size = [size[0], size[0], size[0]]\"\"\"\n    size = util.make_list(size, 3)\n    if script.ml_version == '1.3.4BETA':\n        filter_name = 'Box'\n    else:\n        filter_name = 'Box/Cube'\n    filter_xml = ''.join([\n        ' \\n'.format(filter_name),\n        ' \\n',\n        ' \\n'])\n    util.write_filter(script, filter_xml)\n    if isinstance(script, FilterScript):\n        script.add_layer('Cube', change_layer=True)\n    transform.scale(script, value=size)\n    # Box is centered on origin at creation\n    if not center:\n        transform.translate(script, value=[size[0]/2, size[1]/2, size[2]/2])\n    if color is not None:\n        vert_color.function(script, color=color)\n    return None", "response": "Create a cube primitive of the given size."} {"SOURCE": "codesearchnet", "instruction": "Here you have a 
function in Python 3, explain what it does\ndef cylinder(script, up='z', height=1.0, radius=None, radius1=None,\n radius2=None, diameter=None, diameter1=None, diameter2=None,\n center=False, cir_segments=32, color=None):\n \"\"\"Create a cylinder or cone primitive. Usage is based on OpenSCAD.\n # height = height of the cylinder\n # radius1 = radius of the cone on bottom end\n # radius2 = radius of the cone on top end\n # center = If true will center the height of the cone/cylinder around\n # the origin. Default is false, placing the base of the cylinder or radius1\n # radius of cone at the origin.\n #\n cir_segments Number of sides of the polygonal approximation of the cone\n\n color = specify a color name to apply vertex colors to the newly\n # created mesh\n \"\"\"\n if radius is not None and diameter is None:\n if radius1 is None and diameter1 is None:\n radius1 = radius\n if radius2 is None and diameter2 is None:\n radius2 = radius\n if diameter is not None:\n if radius1 is None and diameter1 is None:\n radius1 = diameter / 2\n if radius2 is None and diameter2 is None:\n radius2 = diameter / 2\n if diameter1 is not None:\n radius1 = diameter1 / 2\n if diameter2 is not None:\n radius2 = diameter2 / 2\n if radius1 is None:\n radius1 = 1.0\n if radius2 is None:\n radius2 = radius1\n\n # Cylinder is created centered with Y up\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Cone', change_layer=True)\n if not center:\n transform.translate(script, [0, height / 2, 0])\n if up.lower() == 'z':\n transform.rotate(script, axis='x', angle=90) # rotate to Z up\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "Create a cylinder or cone primitive."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates an icosphere mesh", "response": "def 
icosphere(script, radius=1.0, diameter=None, subdivisions=3, color=None):\n \"\"\"create an icosphere mesh\n\n radius Radius of the sphere\n # subdivisions = Subdivision level; Number of the recursive subdivision of the\n # surface. Default is 3 (a sphere approximation composed by 1280 faces).\n # Admitted values are in the range 0 (an icosahedron) to 8 (a 1.3 MegaTris\n # approximation of a sphere). Formula for number of faces: F=20*4^subdiv\n # color = specify a color name to apply vertex colors to the newly\n # created mesh\"\"\"\n if diameter is not None:\n radius = diameter / 2\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Sphere', change_layer=True)\n if color is not None:\n vert_color.function(script, color=color)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef sphere_cap(script, angle=1.0, subdivisions=3, color=None):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Sphere Cap', change_layer=True)\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "This function creates a new non - empty sphere cap."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef torus(script, major_radius=3.0, minor_radius=1.0, inner_diameter=None,\n outer_diameter=None, major_segments=48, minor_segments=12,\n color=None):\n \"\"\"Create a torus mesh\n\n Args:\n major_radius (float, (optional)): radius from the origin to the\n center of the cross sections\n minor_radius (float, (optional)): radius of the torus cross\n section\n inner_diameter (float, (optional)): inner diameter of torus. 
If\n both inner_diameter and outer_diameter are provided then\n these will override major_radius and minor_radius.,\n outer_diameter (float, (optional)): outer diameter of torus. If\n both inner_diameter and outer_diameter are provided then\n these will override major_radius and minor_radius.\n major_segments (int (optional)): number of segments for the main\n ring of the torus\n minor_segments (int (optional)): number of segments for the minor\n ring of the torus\n color (str (optional)): color name to apply vertex colors to the\n newly created mesh\n\n Returns:\n None\n\n \"\"\"\n if inner_diameter is not None and outer_diameter is not None:\n major_radius = (inner_diameter + outer_diameter) / 4\n minor_radius = major_radius - inner_diameter / 2\n # Ref: inner_diameter = 2 * (major_radius - minor_radius)\n # Ref: outer_diameter = 2 * (major_radius + minor_radius)\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Torus', change_layer=True)\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "Create a torus mesh from a script."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef grid(script, size=1.0, x_segments=1, y_segments=1, center=False,\n color=None):\n \"\"\"2D square/plane/grid created on XY plane\n\n x_segments # Number of segments in the X direction.\n y_segments # Number of segments in the Y direction.\n center=\"false\" # If true square will be centered on origin;\n otherwise it is place in the positive XY quadrant.\n\n\n \"\"\"\n size = util.make_list(size, 2)\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Grid Generator', change_layer=True)\n\n \"\"\"This is to work 
around a bug in MeshLab whereby the Grid Generator does not\n create zero values for z. Ref bug #458: https://sourceforge.net/p/meshlab/bugs/458/\"\"\"\n transform.vert_function(script, z_func='rint(z)')\n\n \"\"\"Note that the \"center\" parameter in the mlx script does not actually\n center the square, not sure what it is doing. Instead this is set to false,\n which places the plane in the -X,+Y quadrant, and it is translated to the\n appropriate position after creation.\n \"\"\"\n if center:\n transform.translate(script, value=[size[0]/2, -size[1]/2, 0])\n else:\n transform.translate(script, value=[size[0], 0, 0])\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "Generate a 2D square grid."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef annulus(script, radius=None, radius1=None, radius2=None, diameter=None,\n diameter1=None, diameter2=None, cir_segments=32, color=None):\n \"\"\"Create a 2D (surface) circle or annulus\n radius1=1 # Outer radius of the circle\n radius2=0 # Inner radius of the circle (if non-zero it creates an annulus)\n color=\"\" # specify a color name to apply vertex colors to the newly created mesh\n\n OpenSCAD: parameters: diameter overrides radius, radius1 & radius2 override radius\n \"\"\"\n if radius is not None and diameter is None:\n if radius1 is None and diameter1 is None:\n radius1 = radius\n if radius2 is None and diameter2 is None:\n radius2 = 0\n if diameter is not None:\n if radius1 is None and diameter1 is None:\n radius1 = diameter / 2\n if radius2 is None and diameter2 is None:\n radius2 = 0\n if diameter1 is not None:\n radius1 = diameter1 / 2\n if diameter2 is not None:\n radius2 = diameter2 / 2\n if radius1 is None:\n radius1 = 1\n if radius2 is None:\n radius2 = 0\n\n # Circle is created centered on the XY plane\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, 
filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Annulus', change_layer=True)\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "Create an annulus in a 2D mesh."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef cylinder_open_hires(script, height=1.0, radius=1, diameter=None,\n cir_segments=48, height_segments=1,\n invert_normals=False, center=False, color=None):\n \"\"\" Creates a round open tube, e.g. a cylinder with no top or bottom.\n\n Useful if you want to wrap it around and join the open ends together, forming a torus.\n\n invert_normals (bool (optional)): if True normals point outward; in false normals point inward.\n \"\"\"\n if diameter is not None:\n radius = diameter / 2\n\n if center:\n z_translate = -height / 2\n else:\n z_translate = 0.0\n\n grid(script,\n [2 * math.pi * radius, height],\n x_segments=cir_segments,\n y_segments=height_segments)\n transform.rotate(script, 'x', 90)\n transform.translate(script, [math.pi * radius / 2, 0, z_translate])\n if not invert_normals:\n transform.rotate(script, 'z', 180)\n transform.wrap2cylinder(script, radius)\n clean.merge_vert(script, threshold=0.00002)\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "Creates a cylinder open tube."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncreate a square open tube with the given size.", "response": "def cube_open_hires_old(script, size=1.0, x_segments=1, y_segments=1, z_segments=1,\n center=False, color=None):\n \"\"\" Creates a square open tube, e.g. 
a box with no top or bottom.\n\n Useful if you want to wrap it around and join the open ends together, forming a torus.\n \"\"\"\n \"\"\"# Convert size to list if it isn't already\n if not isinstance(size, list):\n size = list(size)\n # If a single value was supplied use it for all 3 axes\n if len(size) == 1:\n size = [size[0], size[0], size[0]]\"\"\"\n size = util.make_list(size, 3)\n\n # X sides\n grid(script, [size[0], size[2]],\n x_segments=x_segments,\n y_segments=z_segments)\n transform.rotate(script, 'x', 90)\n #transform.translate(script, [0, 0, -size[2]])\n layers.duplicate(script)\n # Rotate to correct normals\n transform.rotate(script, 'z', 180)\n transform.translate(script, [size[0], size[1], 0])\n\n # Y sides\n grid(script, [size[2], size[1]],\n x_segments=z_segments,\n y_segments=y_segments)\n transform.rotate(script, 'y', -90)\n #transform.rotate(script, 'z', 90)\n #transform.translate(script, [0, 0, -size[2]])\n layers.duplicate(script)\n # Rotate to correct normals\n transform.rotate(script, 'z', 180)\n transform.translate(script, [size[0], size[1], 0])\n\n layers.join(script)\n clean.merge_vert(script, threshold=0.00002)\n # normals.fix(script)\n if center:\n transform.translate(script, [-size[0] / 2, -size[1] / 2, -size[2] / 2])\n if color is not None:\n vert_color.function(script, color=color)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef cube_open_hires(script, size=1.0, x_segments=1, y_segments=1, z_segments=1,\n center=False, color=None):\n \"\"\" Creates a square open tube, e.g. 
a box with no top or bottom.\n\n Useful if you want to wrap it around and join the open ends together, forming a torus.\n \"\"\"\n \"\"\"# Convert size to list if it isn't already\n if not isinstance(size, list):\n size = list(size)\n # If a single value was supplied use it for all 3 axes\n if len(size) == 1:\n size = [size[0], size[0], size[0]]\"\"\"\n size = util.make_list(size, 3)\n\n # Make big grid and bend\n grid(script, [2*(x_segments + y_segments), z_segments],\n x_segments=2*(x_segments + y_segments),\n y_segments=z_segments)\n transform.rotate(script, 'x', 90)\n # Bend 3 times into a rectangular tube\n if script.ml_version == '1.3.4BETA': # muparser version: 1.3.2\n transform.vert_function(script,\n x_func='if(x>{x_size}, {x_size}, x)'.format(x_size=x_segments),\n y_func='if(x>{x_size}, (x-{x_size}), y)'.format(x_size=x_segments),\n z_func='z')\n transform.vert_function(script,\n x_func='if(y>{y_size}, ({y_size}-y+{x_size}), x)'.format(x_size=x_segments, y_size=y_segments),\n y_func='if(y>{y_size}, {y_size}, y)'.format(y_size=y_segments),\n z_func='z')\n transform.vert_function(script,\n x_func='if(x<0, 0, x)',\n y_func='if(x<0, ({y_size}+x), y)'.format(y_size=y_segments),\n z_func='z')\n else: # muparser version: 2.2.5\n transform.vert_function(script,\n x_func='(x>{x_size} ? {x_size} : x)'.format(x_size=x_segments),\n y_func='(x>{x_size} ? (x-{x_size}) : y)'.format(x_size=x_segments),\n z_func='z')\n transform.vert_function(script,\n x_func='(y>{y_size} ? ({y_size}-y+{x_size}) : x)'.format(x_size=x_segments, y_size=y_segments),\n y_func='(y>{y_size} ? {y_size} : y)'.format(y_size=y_segments),\n z_func='z')\n transform.vert_function(script,\n x_func='(x<0 ? 0 : x)',\n y_func='(x<0 ? 
({y_size}+x) : y)'.format(y_size=y_segments),\n z_func='z')\n clean.merge_vert(script, threshold=0.00002)\n transform.scale(script, [size[0]/x_segments, size[1]/y_segments, size[2]/z_segments])\n if center:\n transform.translate(script, [-size[0] / 2, -size[1] / 2, -size[2] / 2])\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "Creates a square open tube with the given size."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a plane with a specified number of vertices on the sides and no vertices on the interior.", "response": "def plane_hires_edges(script, size=1.0, x_segments=1, y_segments=1,\n center=False, color=None):\n \"\"\" Creates a plane with a specified number of vertices\n on it sides, but no vertices on the interior.\n\n Currently used to create a simpler bottom for cube_hires.\n\n \"\"\"\n size = util.make_list(size, 2)\n\n grid(script, size=[x_segments + y_segments - 1, 1],\n x_segments=(x_segments + y_segments - 1), y_segments=1)\n if ml_script1.ml_version == '1.3.4BETA':\n and_val = 'and'\n else:\n and_val = '&&'\n\n if script.ml_version == '1.3.4BETA': # muparser version: 1.3.2\n # Deform left side\n transform.vert_function(\n script,\n x_func='if((y>0) and (x<%s),0,x)' % (y_segments),\n y_func='if((y>0) and (x<%s),(x+1)*%s,y)' % (\n y_segments, size[1] / y_segments))\n # Deform top\n transform.vert_function(\n script,\n x_func='if((y>0) and (x>=%s),(x-%s+1)*%s,x)' % (\n y_segments, y_segments, size[0] / x_segments),\n y_func='if((y>0) and (x>=%s),%s,y)' % (y_segments, size[1]))\n # Deform right side\n transform.vert_function(\n script,\n x_func='if((y<.00001) and (x>%s),%s,x)' % (\n x_segments, size[0]),\n y_func='if((y<.00001) and (x>%s),(x-%s)*%s,y)' % (\n x_segments, x_segments, size[1] / y_segments))\n # Deform bottom\n transform.vert_function(\n script,\n x_func='if((y<.00001) and (x<=%s) and (x>0),(x)*%s,x)' % (\n x_segments, size[0] / 
x_segments),\n y_func='if((y<.00001) and (x<=%s) and (x>0),0,y)' % (x_segments))\n else: # muparser version: 2.2.5\n # Deform left side\n transform.vert_function(\n script,\n x_func='((y>0) && (x<{yseg}) ? 0 : x)'.format(yseg=y_segments),\n y_func='((y>0) && (x<%s) ? (x+1)*%s : y)' % (\n y_segments, size[1] / y_segments))\n # Deform top\n transform.vert_function(\n script,\n x_func='((y>0) && (x>=%s) ? (x-%s+1)*%s : x)' % (\n y_segments, y_segments, size[0] / x_segments),\n y_func='((y>0) && (x>=%s) ? %s : y)' % (y_segments, size[1]))\n # Deform right side\n transform.vert_function(\n script,\n x_func='((y<.00001) && (x>%s) ? %s : x)' % (\n x_segments, size[0]),\n y_func='((y<.00001) && (x>%s) ? (x-%s)*%s : y)' % (\n x_segments, x_segments, size[1] / y_segments))\n # Deform bottom\n transform.vert_function(\n script,\n x_func='((y<.00001) && (x<=%s) && (x>0) ? (x)*%s : x)' % (\n x_segments, size[0] / x_segments),\n y_func='((y<.00001) && (x<=%s) && (x>0) ? 0 : y)' % (x_segments))\n if center:\n transform.translate(script, [-size[0] / 2, -size[1] / 2])\n if color is not None:\n vert_color.function(script, color=color)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef cube_hires(script, size=1.0, x_segments=1, y_segments=1, z_segments=1,\n simple_bottom=True, center=False, color=None):\n \"\"\"Create a box with user defined number of segments in each direction.\n\n Grid spacing is the same as its dimensions (spacing = 1) and its\n thickness is one. Intended to be used for e.g. 
deforming using functions\n    or a height map (lithopanes) and can be resized after creation.\n\n    Warnings: function uses layers.join\n\n    top_option\n        0 open\n        1 full\n        2 simple\n    bottom_option\n        0 open\n        1 full\n        2 simple\n    \"\"\"\n    \"\"\"# Convert size to list if it isn't already\n    if not isinstance(size, list):\n        size = list(size)\n    # If a single value was supplied use it for all 3 axes\n    if len(size) == 1:\n        size = [size[0], size[0], size[0]]\"\"\"\n    size = util.make_list(size, 3)\n\n    # Top\n    grid(script,\n         size,\n         x_segments,\n         y_segments)\n    transform.translate(script, [0, 0, size[2]])\n\n    # Bottom\n    if simple_bottom:\n        plane_hires_edges(\n            script, size, x_segments, y_segments)\n    else:\n        layers.duplicate(script)\n        transform.translate(script, [0, 0, -size[2]])\n        # Rotate to correct normals\n        transform.rotate(script, 'x', 180)\n        transform.translate(script, [0, size[1], 0])\n\n    # Sides\n    cube_open_hires(\n        script=script, size=size, x_segments=x_segments,\n        y_segments=y_segments, z_segments=z_segments)\n\n    # Join everything together\n    layers.join(script)\n    # Need some tolerance on merge_vert due to rounding errors\n    clean.merge_vert(script, threshold=0.00002)\n    if center:\n        transform.translate(script, [-size[0] / 2, -size[1] / 2, -size[2] / 2])\n    if color is not None:\n        vert_color.function(script, color=color)\n    return None", "response": "Create a box with a user-defined number of segments in each direction."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef annulus_hires(script, radius=None, radius1=None, radius2=None,\n                  diameter=None, diameter1=None, diameter2=None,\n                  cir_segments=48, rad_segments=1, color=None):\n    \"\"\"Create a cylinder with user defined number of segments\n\n    \"\"\"\n    if radius is not None and diameter is None:\n        if radius1 is None and diameter1 is None:\n            radius1 = radius\n        if radius2 is None and diameter2 is None:\n            radius2 = 0\n    if diameter is not None:\n        if radius1 is None and diameter1 is None:\n            
radius1 = diameter / 2\n if radius2 is None and diameter2 is None:\n radius2 = 0\n if diameter1 is not None:\n radius1 = diameter1 / 2\n if diameter2 is not None:\n radius2 = diameter2 / 2\n if radius1 is None:\n radius1 = 1\n if radius2 is None:\n radius2 = 0\n ring = (radius1 - radius2) / rad_segments\n\n for i in range(0, rad_segments):\n annulus(script,\n radius1=radius1 - i * ring,\n radius2=radius1 - (i + 1) * ring,\n cir_segments=cir_segments)\n layers.join(script, merge_vert=True)\n if color is not None:\n vert_color.function(script, color=color)\n return None", "response": "Create a cylinder with user defined number of segments and a color."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ncreates a tube of cylinders with the given height and radius.", "response": "def tube_hires(script, height=1.0, radius=None, radius1=None, radius2=None,\n diameter=None, diameter1=None, diameter2=None, cir_segments=32,\n rad_segments=1, height_segments=1, center=False,\n simple_bottom=False, color=None):\n \"\"\"Create a cylinder with user defined number of segments\n\n \"\"\"\n\n # TODO: add option to round the top of the cylinder, i.e. deform spherically\n # TODO: add warnings if values are ignored, e.g. 
if you specify both radius\n # and diameter.\n if radius is not None and diameter is None:\n if radius1 is None and diameter1 is None:\n radius1 = radius\n if radius2 is None and diameter2 is None:\n radius2 = 0\n if diameter is not None:\n if radius1 is None and diameter1 is None:\n radius1 = diameter / 2\n if radius2 is None and diameter2 is None:\n radius2 = 0\n if diameter1 is not None:\n radius1 = diameter1 / 2\n if diameter2 is not None:\n radius2 = diameter2 / 2\n if radius1 is None:\n radius1 = 1\n if radius2 is None:\n radius2 = 0\n\n # Create top\n annulus_hires(script,\n radius1=radius1,\n radius2=radius2,\n cir_segments=cir_segments,\n rad_segments=rad_segments)\n transform.translate(script, [0, 0, height])\n\n # Create bottom\n if simple_bottom:\n annulus(script,\n radius1=radius1,\n radius2=radius2,\n cir_segments=cir_segments)\n else:\n layers.duplicate(script)\n transform.translate(script, [0, 0, -height])\n # Rotate to correct normals\n transform.rotate(script, 'x', 180)\n\n # Create outer tube\n cylinder_open_hires(script, height, radius1,\n cir_segments=cir_segments,\n height_segments=height_segments)\n\n # Create inner tube\n if radius2 != 0:\n cylinder_open_hires(script, height, radius2,\n cir_segments=cir_segments,\n height_segments=height_segments,\n invert_normals=True)\n\n # Join everything together\n layers.join(script)\n # Need some tolerance on merge_vert due to rounding errors\n clean.merge_vert(script, threshold=0.00002)\n if center:\n transform.translate(script, [0, 0, -height / 2])\n if color is not None:\n vert_color.function(script, color=color)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreads color_names. 
txt and find the red green and blue values for a named color.", "response": "def color_values(color):\n \"\"\"Read color_names.txt and find the red, green, and blue values\n for a named color.\n \"\"\"\n # Get the directory where this script file is located:\n this_dir = os.path.dirname(\n os.path.realpath(\n inspect.getsourcefile(\n lambda: 0)))\n color_name_file = os.path.join(this_dir, 'color_names.txt')\n found = False\n for line in open(color_name_file, 'r'):\n line = line.rstrip()\n if color.lower() == line.split()[0]:\n #hex_color = line.split()[1]\n red = line.split()[2]\n green = line.split()[3]\n blue = line.split()[4]\n found = True\n break\n if not found:\n print('Color name \"%s\" not found, using default (white)' % color)\n red = 255\n green = 255\n blue = 255\n return red, green, blue"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef check_list(var, num_terms):\n if not isinstance(var, list):\n if isinstance(var, tuple):\n var = list(var)\n else:\n var = [var]\n for _ in range(1, num_terms):\n var.append(var[0])\n if len(var) != num_terms:\n print(\n '\"%s\" has the wrong number of terms; it needs %s. Exiting ...' 
%\n            (var, num_terms))\n        sys.exit(1)\n    return var", "response": "Check if a variable is a list and is the correct length."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nmakes a variable a list if it is not already a list", "response": "def make_list(var, num_terms=1):\n    \"\"\" Make a variable a list if it is not already\n\n    If variable is not a list it will make it a list of the correct length with\n    all terms identical.\n    \"\"\"\n    if not isinstance(var, list):\n        if isinstance(var, tuple):\n            var = list(var)\n        else:\n            var = [var]\n    #if len(var) == 1:\n    for _ in range(1, num_terms):\n        var.append(var[0])\n    return var"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef write_filter(script, filter_xml):\n    if isinstance(script, mlx.FilterScript):\n        script.filters.append(filter_xml)\n    elif isinstance(script, str):\n        script_file = open(script, 'a')\n        script_file.write(filter_xml)\n        script_file.close()\n    else:\n        print(filter_xml)\n    return None", "response": "Writes a filter to the specified filter script or file."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef merge_vert(script, threshold=0.0):\n    filter_xml = ''.join([\n        ' \\n',\n        ' \\n',\n        ' \\n'])\n    util.write_filter(script, filter_xml)\n    return None", "response": "Merge together all the vertices that are nearer than the specified threshold."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef close_holes(script, hole_max_edge=30, selected=False,\n                sel_new_face=True, self_intersection=True):\n    \"\"\" Close holes smaller than a given threshold\n\n    Args:\n        script: the FilterScript object or script filename to write\n            the filter to.\n        hole_max_edge (int): The size is expressed as number of edges composing\n            the hole boundary.\n        selected (bool): Only the holes with at least one of the boundary 
faces\n selected are closed.\n sel_new_face (bool): After closing a hole the faces that have been\n created are left selected. Any previous selection is lost. Useful\n for example for smoothing or subdividing the newly created holes.\n self_intersection (bool): When closing an holes it tries to prevent the\n creation of faces that intersect faces adjacent to the boundary of\n the hole. It is an heuristic, non intersecting hole filling can be\n NP-complete.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function returns a filter that closes holes smaller than a given threshold."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef split_vert_on_nonmanifold_face(script, vert_displacement_ratio=0.0):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "Splits a vertex on a non - manifold face."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef snap_mismatched_borders(script, edge_dist_ratio=0.01, unify_vert=True):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "Try to snap together adjacent borders that are slightly mismatched."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef translate(script, value=(0.0, 0.0, 0.0)):\n # Convert value to list if it isn't already\n if not isinstance(value, list):\n value = list(value)\n vert_function(script,\n x_func='x+(%s)' % value[0],\n y_func='y+(%s)' % value[1],\n z_func='z+(%s)' % value[2])\n return None", "response": "An alternative translate implementation that uses a geometric function.\n This is more 
accurate than the built-in version."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef rotate(script, axis='z', angle=0.0):\n angle = math.radians(angle)\n if axis.lower() == 'x':\n vert_function(script,\n x_func='x',\n y_func='y*cos({angle})-z*sin({angle})'.format(angle=angle),\n z_func='y*sin({angle})+z*cos({angle})'.format(angle=angle))\n elif axis.lower() == 'y':\n vert_function(script,\n x_func='z*sin({angle})+x*cos({angle})'.format(angle=angle),\n y_func='y',\n z_func='z*cos({angle})-x*sin({angle})'.format(angle=angle))\n elif axis.lower() == 'z':\n vert_function(script,\n x_func='x*cos({angle})-y*sin({angle})'.format(angle=angle),\n y_func='x*sin({angle})+y*cos({angle})'.format(angle=angle),\n z_func='z')\n else:\n print('Axis name is not valid; exiting ...')\n sys.exit(1)\n return None", "response": "An alternative rotate implementation that uses a geometric function.\n This is more accurate than the built-in version."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfreezes the current transformation matrix into the coordinates of the vertices of the mesh and sets it to the identity.", "response": "def freeze_matrix(script, all_layers=False):\n \"\"\" Freeze the current transformation matrix into the coordinates of the\n vertices of the mesh (and set this matrix to the identity).\n\n In other words, it applies in a definitive way the current matrix to the\n vertex coordinates.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n all_layers (bool): If selected the filter will be applied to all\n visible mesh layers.\n\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef function(script, x_func='x', y_func='y', z_func='z'):\n filter_xml = 
''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "Generate new Coordinates for a single vertex using muparser lib"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef vert_function(script, x_func='x', y_func='y', z_func='z', selected=False):\n if script.ml_version == '1.3.4BETA':\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n else:\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "Generate a new vertex function using muparser lib"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef radial_flare(script, flare_radius=None, start_radius=None, end_radius=None,\n end_height=None):\n \"\"\"\n flare_radius must be >= z2 (height)\n r2 max = flare_radius + r\n \n r2 (num): radius of mesh at end of flare\n \n +15 r= 8.8205\n -15 r= 1.1795\n \n z=10, 5 +/-15 - +/-15*0.74535599249992989880305788957709\n \"\"\"\n # TODO: set radius limit, make it so flare continues to expand linearly after radius limit\n # if(r<=radius_limit, flare, factor*z+constant\n # TODO: add option to specify radius at height instead of radius\n effective_radius = '(flare_radius) + (start_radius) - (r)'\n \n r_func = 'if(z>0, (flare_radius) + (start_radius) - (effective_radius)*cos(z/(flare_radius)), (r))'\n z_func = 'if(z>0, (effective_radius)*sin(z/(flare_radius)), z)'\n \n r_func = r_func.replace('effective_radius', str(effective_radius)).replace('start_radius', str(start_radius)).replace('flare_radius', str(flare_radius))\n z_func = z_func.replace('effective_radius', str(effective_radius)).replace('start_radius', str(start_radius)).replace('flare_radius', str(flare_radius))\n \n function_cyl_co(script=script, r_func=r_func, z_func=z_func)\n return None", "response": "Function 
to expand a radial flare."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef curl_rim(script, curl_radius=None, start_radius=None, end_radius=None,\n end_height=None):\n \"\"\"\n flare_radius must be >= z2 (height)\n r2 max = flare_radius + r\n \n r2 (num): radius of mesh at end of flare\n \n +15 r= 8.8205\n -15 r= 1.1795\n \n z=10, 5 +/-15 - +/-15*0.74535599249992989880305788957709\n \"\"\"\n # TODO: set radius limit, make it so flare continues to expand linearly after radius limit\n # if(r<=radius_limit, flare, factor*z+constant\n # TODO: add option to specify radius at height instead of radius\n effective_radius = '(curl_radius) - z'\n \n r_func = 'if((r)>(start_radius), (start_radius) + (effective_radius)*sin(((r)-(start_radius))/(curl_radius)), (r))'\n z_func = 'if((r)>(start_radius), (curl_radius) - (effective_radius)*cos(((r)-(start_radius))/(curl_radius)), z)'\n \n r_func = r_func.replace('effective_radius', str(effective_radius)).replace('start_radius', str(start_radius)).replace('curl_radius', str(curl_radius))\n z_func = z_func.replace('effective_radius', str(effective_radius)).replace('start_radius', str(start_radius)).replace('curl_radius', str(curl_radius))\n \n function_cyl_co(script=script, r_func=r_func, z_func=z_func)\n return None", "response": "Function to curl the rim of a mesh around a cylinder of the given radius."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nwrap a mesh around a cylinder.", "response": "def wrap2cylinder(script, radius=1, pitch=0, taper=0, pitch_func=None,\n taper_func=None):\n \"\"\"Deform mesh around cylinder of radius and axis z\n\n y = 0 will be on the surface of radius \"radius\"\n pitch != 0 will create a helix, with distance \"pitch\" traveled in z for each rotation\n taper = change in r over z. E.g. 
a value of 0.5 will shrink r by 0.5 for every z length of 1\n\n \"\"\"\n \"\"\"vert_function(s=s, x='(%s+y-taper)*sin(x/(%s+y))' % (radius, radius),\n y='(%s+y)*cos(x/(%s+y))' % (radius, radius),\n z='z-%s*x/(2*%s*(%s+y))' % (pitch, pi, radius))\"\"\"\n if pitch_func is None:\n pitch_func = '-(pitch)*x/(2*pi*(radius))'\n pitch_func = pitch_func.replace(\n 'pitch', str(pitch)).replace(\n 'pi', str(math.pi)).replace(\n 'radius', str(radius))\n if taper_func is None:\n taper_func = '-(taper)*(pitch_func)'\n taper_func = taper_func.replace(\n 'taper', str(taper)).replace(\n 'pitch_func', str(pitch_func)).replace(\n 'pi', str(math.pi))\n\n x_func = '(y+(radius)+(taper_func))*sin(x/(radius))'.replace(\n 'radius', str(radius)).replace('taper_func', str(taper_func))\n y_func = '(y+(radius)+(taper_func))*cos(x/(radius))'.replace(\n 'radius', str(radius)).replace('taper_func', str(taper_func))\n z_func = 'z+(pitch_func)'.replace('pitch_func', str(pitch_func))\n\n vert_function(script, x_func, y_func, z_func)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef bend(script, radius=1, pitch=0, taper=0, angle=0, straght_start=True,\n straght_end=False, radius_limit=None, outside_limit_end=True):\n \"\"\"Bends mesh around cylinder of radius radius and axis z to a certain angle\n\n straight_ends: Only apply twist (pitch) over the area that is bent\n\n outside_limit_end (bool): should values outside of the bend radius_limit be considered part\n of the end (True) or the start (False)?\n \"\"\"\n if radius_limit is None:\n radius_limit = 2 * radius\n # TODO: add limit so bend only applies over y<2*radius; add option to set\n # larger limit\n angle = math.radians(angle)\n segment = radius * angle\n \"\"\"vert_function(s=s, x='if(x<%s and x>-%s, (%s+y)*sin(x/%s), (%s+y)*sin(%s/%s)+(x-%s)*cos(%s/%s))'\n % (segment, segment, radius, radius, radius, segment, radius, segment, segment, radius),\n 
y='if(x<%s*%s/2 and x>-%s*%s/2, (%s+y)*cos(x/%s), (%s+y)*cos(%s)-(x-%s*%s)*sin(%s))'\n % (radius, angle, radius, angle, radius, radius, radius, angle/2, radius, angle/2, angle/2),\"\"\"\n pitch_func = '-(pitch)*x/(2*pi*(radius))'.replace(\n 'pitch', str(pitch)).replace(\n 'pi', str(math.pi)).replace(\n 'radius', str(radius))\n taper_func = '(taper)*(pitch_func)'.replace(\n 'taper', str(taper)).replace(\n 'pitch_func', str(pitch_func)).replace(\n 'pi', str(math.pi))\n # y\\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function creates a filter that transfers vertex colors to texture colors."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ntransfers mesh colors to face colors", "response": "def mesh2fc(script, all_visible_layers=False):\n \"\"\"Transfer mesh colors to face colors\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n all_visible_layers (bool): If true the color mapping is applied to all the meshes\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef vert_attr_2_meshes(script, source_mesh=0, target_mesh=1,\n geometry=False, normal=False, color=True,\n quality=False, selection=False,\n quality_distance=False, max_distance=0.5):\n \"\"\"Vertex Attribute Transfer (between 2 meshes)\n\n Transfer the chosen per-vertex attributes from one mesh to another. Useful to transfer attributes to different representations of the same object. For each vertex of the target mesh the closest point (not vertex!) 
on the source mesh is computed, and the requested interpolated attributes from that source point are copied into the target vertex.\n\n The algorithm assumes that the two meshes are reasonably similar and aligned.\n\n UpperBound: absolute value (not percentage)\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n source_mesh (int): The mesh that contains the source data that we want to transfer\n target_mesh (int): The mesh whose vertexes will receive the data from the source\n geometry (bool): If enabled, the position of each vertex of the target mesh will be snapped onto the corresponding closest point on the source mesh\n normal (bool): If enabled, the normal of each vertex of the target mesh will get the (interpolated) normal of the corresponding closest point on the source mesh\n color (bool): If enabled, the color of each vertex of the target mesh will become the color of the corresponding closest point on the source mesh\n quality (bool): If enabled, the quality of each vertex of the target mesh will become the quality of the corresponding closest point on the source mesh\n selection (bool): If enabled, each vertex of the target mesh will be selected if the corresponding closest point on the source mesh falls in a selected face\n quality_distance (bool): If enabled, we store the distance of the transferred value as in the vertex quality\n max_distance (float): Sample points for which we do not find anything within this distance are rejected and not considered for recovering attributes\n\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function takes a filter script object or filename and returns a list of vertices that can be transferred to the target mesh."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ntransfers 
Vertex Attributes to Texture (between 2 meshes) Args: script: the FilterScript object or script filename to write the filter to. source_mesh (int): The mesh that contains the source data that we want to transfer target_mesh (int): The mesh whose texture will be filled according to source mesh data attribute (int): Choose what attribute has to be transferred onto the target texture. You can choose between Per vertex attributes (color, normal, quality) or to transfer color information from source mesh texture max_distance (float): Sample points for which we do not find anything within this distance are rejected and not considered for recovering data tex_name (str): The texture file to be created tex_width (int): The texture width tex_height (int): The texture height overwrite_tex (bool): If target mesh has a texture will be overwritten (with provided texture dimension) assign_tex (bool): Assign the newly created texture to target mesh fill_tex (bool): If enabled the unmapped texture space is colored using a pull push filling algorithm, if false is set to black Layer stack: No impacts MeshLab versions: 2016.12 1.3.4BETA", "response": "def vert_attr2tex_2_meshes(script, source_mesh=0, target_mesh=1, attribute=0,\n max_distance=0.5, tex_name='TEMP3D_texture.png',\n tex_width=1024, tex_height=1024,\n overwrite_tex=True, assign_tex=False,\n fill_tex=True):\n \"\"\"Transfer Vertex Attributes to Texture (between 2 meshes)\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n source_mesh (int): The mesh that contains the source data that we want to transfer\n target_mesh (int): The mesh whose texture will be filled according to source mesh data\n attribute (int): Choose what attribute has to be transferred onto the target texture. 
You can choose between Per vertex attributes (color, normal, quality) or to transfer color information from source mesh texture\n max_distance (float): Sample points for which we do not find anything within this distance are rejected and not considered for recovering data\n tex_name (str): The texture file to be created\n tex_width (int): The texture width\n tex_height (int): The texture height\n overwrite_tex (bool): If target mesh has a texture will be overwritten (with provided texture dimension)\n assign_tex (bool): Assign the newly created texture to target mesh\n fill_tex (bool): If enabled the unmapped texture space is colored using a pull push filling algorithm, if false is set to black\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n if script.ml_version == '1.3.4BETA':\n filter_name = 'Transfer Vertex Attributes to Texture (between 2 meshes)'\n else:\n filter_name = 'Transfer: Vertex Attributes to Texture (1 or 2 meshes)'\n filter_xml = ''.join([\n ' \\n'.format(filter_name),\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef tex2vc_2_meshes(script, source_mesh=0, target_mesh=1, max_distance=0.5):\n if script.ml_version == '1.3.4BETA':\n filter_name = 'Texture to Vertex Color (between 2 meshes)'\n else:\n filter_name = 'Transfer: Texture to Vertex Color (1 or 2 meshes)'\n filter_xml = ''.join([\n ' \\n'.format(filter_name),\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function writes a filter to the filter file that can be used to transfer texture colors to vertex colors between 2 meshes."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsimplify a UV - based edge - collapse strategy.", 
"response": "def simplify(script, texture=True, faces=25000, target_perc=0.0,\n quality_thr=0.3, preserve_boundary=False, boundary_weight=1.0,\n optimal_placement=True, preserve_normal=False,\n planar_quadric=False, selected=False, extra_tex_coord_weight=1.0,\n preserve_topology=True, quality_weight=False, autoclean=True):\n \"\"\" Simplify a mesh using a Quadric based Edge Collapse Strategy, better\n than clustering but slower. Optionally tries to preserve UV\n parametrization for textured meshes.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n texture (bool):\n faces (int): The desired final number of faces\n target_perc (float): If non zero, this parameter specifies the desired\n final size of the mesh as a percentage of the initial mesh size.\n quality_thr (float): Quality threshold for penalizing bad shaped faces.\n The value is in the range [0..1]0 accept any kind of face (no\n penalties), 0.5 penalize faces with quality less than 0.5,\n proportionally to their shape.\n preserve_boundary (bool): The simplification process tries not to\n affect mesh boundaries\n boundary_weight (float): The importance of the boundary during\n simplification. Default (1.0) means that the boundary has the same\n importance of the rest. Values greater than 1.0 raise boundary\n importance and has the effect of removing less vertices on the\n border. Admitted range of values (0,+inf).\n optimal_placement (bool): Each collapsed vertex is placed in the\n position minimizing the quadric error. It can fail (creating bad\n spikes) in case of very flat areas. 
If disabled edges are collapsed\n onto one of the two original vertices and the final mesh is\n composed by a subset of the original vertices.\n preserve_normal (bool): Try to avoid face flipping effects and try to\n preserve the original orientation of the surface.\n planar_quadric (bool): Add additional simplification constraints that\n improves the quality of the simplification of the planar portion of\n the mesh.\n selected (bool): The simplification is applied only to the selected set\n of faces. Take care of the target number of faces!\n extra_tex_coord_weight (float): Additional weight for each extra\n Texture Coordinates for every (selected) vertex. Ignored if texture\n is False.\n preserve_topology (bool): Avoid all the collapses that should cause a\n topology change in the mesh (like closing holes, squeezing handles,\n etc). If checked the genus of the mesh should stay unchanged.\n quality_weight (bool): Use the Per-Vertex quality as a weighting factor\n for the simplification. 
The weight is used as a error amplification\n value, so a vertex with a high quality value will not be simplified\n and a portion of the mesh with low quality values will be\n aggressively simplified.\n autoclean (bool): After the simplification an additional set of steps\n is performed to clean the mesh (unreferenced vertices, bad faces,\n etc).\n\n Layer stack:\n Unchanged; current mesh is simplified in place.\n\n MeshLab versions:\n 2016.12 (different filter name)\n 1.3.4BETA\n \"\"\"\n if texture:\n if isinstance(script, FilterScript) and (script.ml_version == '2016.12'):\n filter_xml = ' \\n'\n else:\n filter_xml = ' \\n'\n else:\n if isinstance(script, FilterScript) and (script.ml_version == '2016.12'):\n filter_xml = ' \\n'\n else:\n filter_xml = ' \\n'\n # Parameters common to both 'with' and 'without texture'\n filter_xml = ''.join([\n filter_xml,\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n if texture: # Parameters unique to 'with texture'\n filter_xml = ''.join([\n filter_xml,\n ' \\n'])\n else: # Parameters unique to 'without texture'\n filter_xml = ''.join([\n filter_xml,\n ' \\n',\n ' \\n',\n ' \\n'])\n filter_xml = ''.join([filter_xml, ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a new mesh that is a resampled version of the current one.", "response": "def uniform_resampling(script, voxel=1.0, offset=0.0, merge_vert=True,\n discretize=False, multisample=False, thicken=False):\n \"\"\" Create a new mesh that is a resampled version of the current one.\n\n The resampling is done by building a uniform volumetric representation\n where each voxel contains the signed distance from the original surface.\n The resampled surface is reconstructed using the marching cube algorithm\n over this volume.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n voxel 
(float): voxel (cell) size for resampling. Smaller cells give\n better precision at a higher computational cost. Remember that\n halving the cell size means that you build a volume 8 times larger.\n offset (float): offset amount of the created surface (i.e. distance of\n the created surface from the original one). If offset is zero, the\n created surface passes on the original mesh itself. Values greater\n than zero mean an external surface (offset), and lower than zero\n mean an internal surface (inset). In practice this value is the\n threshold passed to the Marching Cube algorithm to extract the\n isosurface from the distance field representation.\n merge_vert (bool): if True the mesh generated by MC will be cleaned by\n unifying vertices that are almost coincident.\n discretize (bool): if True the position of the intersected edge of the\n marching cube grid is not computed by linear interpolation, but it\n is placed in fixed middle position. As a consequence the resampled\n object will look severely aliased by a stairstep appearance. Useful\n only for simulating the output of 3D printing devices.\n multisample (bool): if True the distance field is more accurately\n compute by multisampling the volume (7 sample for each voxel). Much\n slower but less artifacts.\n thicken (bool): if True, you have to choose a non zero Offset and a\n double surface is built around the original surface, inside and\n outside. 
Is useful to convert thin floating surfaces into solid,\n thick meshes.\n\n Layer stack:\n Creates 1 new layer 'Offset mesh'\n Current layer is changed to new layer\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Offset mesh')\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncalculate the convex hull of a set of points.", "response": "def hull(script, reorient_normal=True):\n \"\"\" Calculate the convex hull with Qhull library\n http://www.qhull.org/html/qconvex.htm\n\n The convex hull of a set of points is the boundary of the minimal convex\n set containing the given non-empty finite set of points.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n reorient_normal (bool): Re-orient all faces coherentely after hull\n operation.\n\n Layer stack:\n Creates 1 new layer 'Convex Hull'\n Current layer is changed to new layer\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Convex Hull')\n return None"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nusing the points and normals to build a surface using the Poisson Surface reconstruction approach. Args: script: the FilterScript object or script filename to write the filter to. octree_depth (int): Set the depth of the Octree used for extracting the final surface. Suggested range 5..10. Higher numbers mean higher precision in the reconstruction but also higher processing times. Be patient. solver_divide (int): This integer argument specifies the depth at which a block Gauss-Seidel solver is used to solve the Laplacian equation. 
Using this parameter helps reduce the memory overhead at the cost of a small increase in reconstruction time. In practice, the authors have found that for reconstructions of depth 9 or higher a subdivide depth of 7 or 8 can reduce the memory usage. The default value is 8. samples_per_node (float): This floating point value specifies the minimum number of sample points that should fall within an octree node as the octree construction is adapted to sampling density. For noise-free samples, small values in the range [1.0 - 5.0] can be used. For more noisy samples, larger values in the range [15.0 - 20.0] may be needed to provide a smoother, noise-reduced, reconstruction. The default value is 1.0. offset (float): This floating point value specifies a correction value for the isosurface threshold that is chosen. Values less than 1 mean internal offsetting, greater than 1 mean external offsetting. Good values are in the range 0.5 .. 2. The default value is 1.0 (no offsetting). Layer stack: Creates 1 new layer 'Poisson mesh' Current layer is changed to new layer MeshLab versions: 1.3.4BETA", "response": "def surface_poisson(script, octree_depth=10, solver_divide=8,\n samples_per_node=1.0, offset=1.0):\n \"\"\" Use the points and normals to build a surface using the Poisson\n Surface reconstruction approach.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n octree_depth (int): Set the depth of the Octree used for extracting the\n final surface. Suggested range 5..10. Higher numbers mean higher\n precision in the reconstruction but also higher processing times.\n Be patient.\n solver_divide (int): This integer argument specifies the depth at which\n a block Gauss-Seidel solver is used to solve the Laplacian\n equation. Using this parameter helps reduce the memory overhead at\n the cost of a small increase in reconstruction time. 
In practice,\n the authors have found that for reconstructions of depth 9 or\n higher a subdivide depth of 7 or 8 can reduce the memory usage. The\n default value is 8.\n samples_per_node (float): This floating point value specifies the\n minimum number of sample points that should fall within an octree\n node as the octree construction is adapted to sampling density.\n For noise-free samples, small values in the range [1.0 - 5.0] can\n be used. For more noisy samples, larger values in the range\n [15.0 - 20.0] may be needed to provide a smoother, noise-reduced,\n reconstruction. The default value is 1.0.\n offset (float): This floating point value specifies a correction value\n for the isosurface threshold that is chosen. Values less than 1\n mean internal offsetting, greater than 1 mean external offsetting.\n Good values are in the range 0.5 .. 2. The default value is 1.0\n (no offsetting).\n\n Layer stack:\n Creates 1 new layer 'Poisson mesh'\n Current layer is changed to new layer\n\n MeshLab versions:\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Poisson mesh', change_layer=True)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef curvature_flipping(script, angle_threshold=1.0, curve_type=0,\n selected=False):\n \"\"\" Use the points and normals to build a surface using the Poisson\n Surface reconstruction approach.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n angle_threshold (float): To avoid excessive flipping/swapping we\n consider only couple of faces with a significant diedral angle\n (e.g. 
greater than the indicated threshold).\n curve_type (int): Choose a metric to compute surface curvature on vertices\n H = mean curv, K = gaussian curv, A = area per vertex\n 1: Mean curvature = H\n 2: Norm squared mean curvature = (H * H) / A\n 3: Absolute curvature:\n if(K >= 0) return 2 * H\n else return 2 * sqrt(H ^ 2 - A * K)\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function builds a new filter that can be used to flip the curvature of a set of vertices on the current surface."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nturning a model into a surface with Voronoi style holes in it.", "response": "def voronoi(script, hole_num=50, target_layer=None, sample_layer=None, thickness=0.5, backward=True):\n \"\"\" Turn a model into a surface with Voronoi style holes in it\n\n References:\n http://meshlabstuff.blogspot.com/2009/03/creating-voronoi-sphere.html\n http://meshlabstuff.blogspot.com/2009/04/creating-voronoi-sphere-2.html\n\n Requires FilterScript object\n\n Args:\n script: the FilterScript object to write the filter to. 
Does not\n work with a script filename.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n\n if target_layer is None:\n target_layer = script.current_layer()\n if sample_layer is None:\n # Current layer is currently not changed after poisson_disk is run\n sampling.poisson_disk(script, sample_num=hole_num)\n sample_layer = script.last_layer()\n\n vert_color.voronoi(script, target_layer=target_layer, source_layer=sample_layer, backward=backward)\n select.vert_quality(script, min_quality=0.0, max_quality=thickness)\n if backward:\n select.invert(script)\n delete.selected(script)\n smooth.laplacian(script, iterations=3)\n\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nselects all faces of the current mesh.", "response": "def all(script, face=True, vert=True):\n \"\"\" Select all the faces of the current mesh\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n faces (bool): If True the filter will select all the faces.\n verts (bool): If True the filter will select all the vertices.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef grow(script, iterations=1):\n filter_xml = ' \\n'\n for _ in range(iterations):\n util.write_filter(script, filter_xml)\n return None", "response": "Grow the current set of selected faces with the given number of iterations."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nshrinking the current set of selected faces to the specified number of iterations.", "response": "def shrink(script, iterations=1):\n \"\"\" Shrink (erode, reduce) the current set of selected faces\n\n Args:\n script: the FilterScript object or script filename to 
write\n the filter to.\n iterations (int): the number of times to shrink the selection.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ' \\n'\n for _ in range(iterations):\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef small_parts(script, ratio=0.2, non_closed_only=False):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function returns a filter that selects the small disconnected parts of a mesh."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef vert_quality(script, min_quality=0.0, max_quality=0.05, inclusive=True):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function writes a filter to the filter file that selects all faces and vertices within the specified quality range."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef vert_function(script, function='(q < 0)', strict_face_select=True):\n if script.ml_version == '1.3.4BETA':\n strict_select = ''.join([\n ' \\n',\n ])\n else:\n strict_select = ''\n\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n strict_select,\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "Boolean function that will be evaluated in order to select a subset of vertices."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef cylindrical_vert(script, radius=1.0, inside=True):\n if inside:\n function = 'sqrt(x^2+y^2)<={}'.format(radius)\n else:\n function = 'sqrt(x^2+y^2)>={}'.format(radius)\n vert_function(script, function=function)\n return None", "response": "Select all vertices 
within a cylindrical radius"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nselecting all vertices within a spherical radius", "response": "def spherical_vert(script, radius=1.0, center_pt=(0.0, 0.0, 0.0)):\n \"\"\"Select all vertices within a spherical radius\n\n Args:\n radius (float): radius of the sphere\n center_pt (3 coordinate tuple or list): center point of the sphere\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n function = 'sqrt((x-{})^2+(y-{})^2+(z-{})^2)<={}'.format(\n center_pt[0], center_pt[1], center_pt[2], radius)\n vert_function(script, function=function)\n return None"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef join(script, merge_visible=True, merge_vert=False, delete_layer=True,\n keep_unreferenced_vert=False):\n \"\"\" Flatten all or only the visible layers into a single new mesh.\n\n Transformations are preserved. Existing layers can be optionally\n deleted.\n\n Args:\n script: the mlx.FilterScript object or script filename to write\n the filter to.\n merge_visible (bool): merge only visible layers\n merge_vert (bool): merge the vertices that are duplicated among\n different layers. Very useful when the layers are spliced portions\n of a single big mesh.\n delete_layer (bool): delete all the merged layers. If all layers are\n visible only a single layer will remain after the invocation of\n this filter.\n keep_unreferenced_vert (bool): Do not discard unreferenced vertices\n from source layers. Necessary for point-only layers.\n\n Layer stack:\n Creates a new layer \"Merged Mesh\"\n Changes current layer to the new layer\n Optionally deletes all other layers\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n\n Bugs:\n UV textures: not currently preserved, however will be in a future\n release. 
https://github.com/cnr-isti-vclab/meshlab/issues/128\n merge_visible: it is not currently possible to change the layer\n visibility from meshlabserver, however this will be possible\n in the future https://github.com/cnr-isti-vclab/meshlab/issues/123\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, mlx.FilterScript):\n script.add_layer('Merged Mesh')\n if delete_layer:\n # As it is not yet possible to change the layer visibility, all\n # layers will be deleted. This will be updated once layer\n # visibility is tracked.\n for i in range(script.last_layer()):\n script.del_layer(0)\n return None", "response": "This function takes a filter script object or a filename to write a filter to a new mesh."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef delete(script, layer_num=None):\n filter_xml = ' \\n'\n if isinstance(script, mlx.FilterScript):\n if (layer_num is None) or (layer_num == script.current_layer()):\n util.write_filter(script, filter_xml)\n script.del_layer(script.current_layer())\n else:\n cur_layer = script.current_layer()\n change(script, layer_num)\n util.write_filter(script, filter_xml)\n if layer_num < script.current_layer():\n change(script, cur_layer - 1)\n else:\n change(script, cur_layer)\n script.del_layer(layer_num)\n else:\n util.write_filter(script, filter_xml)\n return None", "response": "Delete a layer from the stack."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nrename the current layer in the filter.", "response": "def rename(script, label='blank', layer_num=None):\n \"\"\" Rename layer label\n\n Can be useful for outputting mlp files, as the output file names use\n the labels.\n\n Args:\n script: the mlx.FilterScript object or script filename to write\n the filter to.\n label (str): new label for the mesh layer\n layer_num (int): layer 
number to rename. Default is the\n current layer. Not supported on the file base API.\n\n Layer stack:\n Renames a layer\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n if isinstance(script, mlx.FilterScript):\n if (layer_num is None) or (layer_num == script.current_layer()):\n util.write_filter(script, filter_xml)\n script.layer_stack[script.current_layer()] = label\n else:\n cur_layer = script.current_layer()\n change(script, layer_num)\n util.write_filter(script, filter_xml)\n change(script, cur_layer)\n script.layer_stack[layer_num] = label\n else:\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nchange the current layer by specifying the new layer number.", "response": "def change(script, layer_num=None):\n \"\"\" Change the current layer by specifying the new layer number.\n\n Args:\n script: the mlx.FilterScript object or script filename to write\n the filter to.\n layer_num (int): the number of the layer to change to. Default is the\n last layer if script is a mlx.FilterScript object; if script is a\n filename the default is the first layer.\n\n Layer stack:\n Modifies current layer\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n if layer_num is None:\n if isinstance(script, mlx.FilterScript):\n layer_num = script.last_layer()\n else:\n layer_num = 0\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, mlx.FilterScript):\n script.set_current_layer(layer_num)\n #script.layer_stack[len(self.layer_stack) - 1] = layer_num\n return None"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nduplicates a layer. New layer label is '*_copy'. Args: script: the mlx.FilterScript object or script filename to write the filter to. layer_num (int): layer number to duplicate. Default is the current layer. 
Not supported on the file base API. Layer stack: Creates a new layer Changes current layer to the new layer MeshLab versions: 2016.12 1.3.4BETA", "response": "def duplicate(script, layer_num=None):\n \"\"\" Duplicate a layer.\n\n New layer label is '*_copy'.\n\n Args:\n script: the mlx.FilterScript object or script filename to write\n the filter to.\n layer_num (int): layer number to duplicate. Default is the\n current layer. Not supported on the file base API.\n\n Layer stack:\n Creates a new layer\n Changes current layer to the new layer\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ' \\n'\n if isinstance(script, mlx.FilterScript):\n if (layer_num is None) or (layer_num == script.current_layer()):\n util.write_filter(script, filter_xml)\n script.add_layer('{}_copy'.format(script.layer_stack[script.current_layer()]), True)\n else:\n change(script, layer_num)\n util.write_filter(script, filter_xml)\n script.add_layer('{}_copy'.format(script.layer_stack[layer_num]), True)\n else:\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef split_parts(script, part_num=None, layer_num=None):\n filter_xml = ' \\n'\n if isinstance(script, mlx.FilterScript):\n if (layer_num is not None) and (layer_num != script.current_layer()):\n change(script, layer_num)\n util.write_filter(script, filter_xml)\n if part_num is not None:\n for i in range(part_num):\n script.add_layer('CC {}'.format(i), True)\n else:\n script.add_layer('CC 0', True)\n print('Warning: the number of parts was not provided and cannot',\n 'be determined automatically. 
The layer stack is likely',\n                          'incorrect!')\n    else:\n        util.write_filter(script, filter_xml)\n    return None", "response": "Splits the current layer into many layers one for each part of the model."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ndeleting all layers below the specified one.", "response": "def delete_lower(script, layer_num=None):\n    \"\"\" Delete all layers below the specified one.\n\n    Useful for MeshLab ver 2016.12, which will only output layer 0.\n    \"\"\"\n    if layer_num is None:\n        layer_num = script.current_layer()\n    if layer_num != 0:\n        change(script, 0)\n    for i in range(layer_num):\n        delete(script, 0)\n    return None"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef run(script='TEMP3D_default.mlx', log=None, ml_log=None,\n        mlp_in=None, mlp_out=None, overwrite=False, file_in=None,\n        file_out=None, output_mask=None, cmd=None, ml_version=ML_VERSION,\n        print_meshlabserver_output=True):\n    \"\"\"Run meshlabserver in a subprocess.\n\n    Args:\n        log (str): filename of the log file for meshlabxml. If not\n            None, all meshlabserver stdout and stderr messages\n            will be appended to this file.\n        ml_log (str): filename of the log file output directly by\n            meshlabserver.\n        mlp_in (str or list): input meshlab project file. Can be a\n            single filename or a list of filenames. Filenames will\n            be loaded in the order given. All project files will\n            be loaded before individual input files. If you want\n            to load project and input files in a different order\n            then you should use a custom cmd.\n        mlp_out (str): output meshlab project file. Specify a\n            single filename (meshlabserver accepts multiple output\n            project filenames, however they will all be identical,\n            so there is little use). 
When this option is used all\n            layers will be saved as ply files.\n        overwrite (bool): when specifying mlp_out, this determines\n            whether any existing files will be overwritten (if\n            True) or new filenames created (if False). If a new\n            project file is created meshes will have '_out' added\n            to their name.\n        file_in (str or list): input mesh filename. Can be a single\n            filename or a list of filenames. Filenames will be\n            loaded in the order given. All project files will be\n            loaded before individual input files. If you want to\n            load project and input files in a different order then\n            you should use a custom cmd.\n        file_out (str or list): output mesh filename. Can be a\n            single filename or a list of filenames. The current\n            layer will be saved to this filename or filenames.\n            Multiple filenames are useful for saving to multiple\n            formats at the same time. Currently there is no way to\n            output multiple layers except for saving a mlp project\n            file.\n        output_mask (str or list): output mask options for the\n            output file. Values must include the flag, i.e. -m or\n            -output_mask. If this is not provided for an output\n            file then function \"default_output_mask\" is used to\n            determine default values.\n        script (str): the mlx filter script filename to execute.\n        cmd (str): a full meshlabserver command line, such as\n            \"meshlabserver -input file.stl\". If not None, this\n            will override all other arguments except for log.\n        print_meshlabserver_output (bool): Pass meshlabserver's output to stdout; useful for debugging.\n            Only used if log is None.\n\n    Notes:\n        Meshlabserver can't handle spaces in paths or filenames (on Windows at least; haven't tested on other platforms). 
Enclosing the name in quotes or escaping the space has no effect.\n\n Returns:\n return code of meshlabserver process; 0 if successful\n \"\"\"\n if cmd is None:\n cmd = 'meshlabserver'\n if ml_log is not None:\n # Initialize ml_log\n ml_log_file = open(ml_log, 'w')\n ml_log_file.close()\n cmd += ' -l %s' % ml_log\n if mlp_in is not None:\n # make a list if it isn't already\n mlp_in = util.make_list(mlp_in)\n for val in mlp_in:\n cmd += ' -p \"%s\"' % val\n if mlp_out is not None:\n cmd += ' -w %s' % mlp_out\n if overwrite:\n cmd += ' -v'\n if (mlp_in is None) and (file_in is None):\n\t\t\t# If no input files are provided use the default created by begin().\n\t\t\t# This works around the fact that meshlabserver will\n\t\t\t# not run without an input file.\n file_in = ['TEMP3D.xyz']\n if file_in is not None:\n # make a list if it isn't already\n file_in = util.make_list(file_in)\n for val in file_in:\n if val == 'bunny':\n cmd += ' -i \"%s\"' % os.path.join(THIS_MODULEPATH, os.pardir,\n 'models', 'bunny_flat(1Z).ply')\n elif val == 'bunny_raw':\n cmd += ' -i \"%s\"' % os.path.join(THIS_MODULEPATH, os.pardir,\n 'models', 'bunny_raw(-1250Y).ply')\n else:\n cmd += ' -i \"%s\"' % val\n if file_out is not None:\n # make a list if it isn't already\n file_out = util.make_list(file_out)\n if output_mask is not None:\n output_mask = util.make_list(output_mask)\n else:\n output_mask = []\n for index, val in enumerate(file_out):\n cmd += ' -o \"%s\"' % val\n try:\n cmd += ' %s' % output_mask[index]\n except IndexError: # If output_mask can't be found use defaults\n cmd += ' %s' % default_output_mask(val, ml_version=ml_version)\n if script is not None:\n cmd += ' -s \"%s\"' % script\n if log is not None:\n log_file = open(log, 'a')\n log_file.write('meshlabserver cmd = %s\\n' % cmd)\n log_file.write('***START OF MESHLAB STDOUT & STDERR***\\n')\n log_file.close()\n log_file = open(log, 'a')\n else:\n if print_meshlabserver_output:\n log_file = None\n print('meshlabserver cmd = 
%s' % cmd)\n print('***START OF MESHLAB STDOUT & STDERR***')\n else:\n log_file = open(os.devnull, 'w')\n while True:\n # TODO: test if shell=True is really needed\n return_code = subprocess.call(cmd, shell=True,\n stdout=log_file, stderr=log_file,\n universal_newlines=True)\n if log is not None:\n log_file.close()\n if (return_code == 0) or handle_error(program_name='MeshLab', cmd=cmd, log=log):\n break\n if log is not None:\n log_file = open(log, 'a')\n log_file.write('***END OF MESHLAB STDOUT & STDERR***\\n')\n log_file.write('meshlabserver return code = %s\\n\\n' % return_code)\n log_file.close()\n return return_code", "response": "Run a meshlabserver script."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfinding the texture files referenced by the input file.", "response": "def find_texture_files(fbasename, log=None):\n \"\"\"Finds the filenames of the referenced texture file(s) (and material\n file for obj) for the mesh.\n\n Args:\n fbasename (str): input filename. Supported file extensions:\n obj\n ply\n dae\n x3d\n wrl\n log (str): filename to log output\n\n Returns:\n list: list of all of the texture filenames referenced by the input file.\n May contain duplicates if the texture files are referenced more\n than once. 
List is empty if no texture files are found.\n list: list of all of the unique texture filenames, also empty if no\n texture files are found.\n str: for obj files only, returns the name of the referenced material file.\n Returns None if no material file is found.\n\n \"\"\"\n fext = os.path.splitext(fbasename)[1][1:].strip().lower()\n material_file = None\n texture_files = []\n vert_colors = False\n face_colors = False\n if fext == 'obj':\n # Material Format: mtllib ./model_mesh.obj.mtl\n with open(fbasename, 'r') as fread:\n for line in fread:\n if 'mtllib' in line:\n material_file = os.path.basename(line.split()[1])\n break\n if material_file is not None:\n # Texture Format: map_Kd model_texture.jpg\n with open(material_file, 'r') as fread:\n for line in fread:\n if 'map_Kd' in line:\n texture_files.append(os.path.basename(line.split()[1]))\n elif fext == 'ply':\n # Texture Format: comment TextureFile model_texture.jpg\n # This works for MeshLab & itSeez3D, but may not work for\n # every ply file.\n face_element = False\n with open(fbasename, 'rb') as fread:\n # read ascii header; works for both ascii & binary files\n while True:\n line = fread.readline().strip().decode('ascii')\n # print(line)\n if 'element face' in line:\n face_element = True\n if 'red' in line:\n if face_element:\n face_colors = True\n else:\n vert_colors = True\n if 'TextureFile' in line:\n texture_files.append(os.path.basename(line.split()[2]))\n if 'end_header' in line:\n break\n elif fext == 'dae': # COLLADA\n # elif fext == 'mlp':\n # Texture Format: \n # model_texture.jpg\n # \n namespace = 'http://www.collada.org/2005/11/COLLADASchema'\n tree = ET.parse(fbasename)\n #root = tree.getroot()\n #print('root = ', root)\n #print('root.tag = ', root.tag, 'root.attrib = ', root.attrib)\n for elem in tree.findall(\n '{%s}library_images/{%s}image/{%s}init_from' % (namespace, namespace, namespace)):\n texture_files.append(elem.text)\n elif fext == 'x3d':\n # Texture Format: \n #ns = 
'http://www.w3.org/2001/XMLSchema-instance'\n tree = ET.parse(fbasename)\n #root = tree.getroot()\n #print('root = ', root)\n #print('root.tag = ', root.tag, 'root.attrib = ', root.attrib)\n # for elem in root:\n # for elem in tree.iter(): # iterate through tree; very useful to see possible tags\n #print('elem.tag = ', elem.tag)\n #print('elem.attrib = ', elem.attrib)\n for elem in tree.iter(tag='ImageTexture'):\n #print('elem.attrib = ', elem.attrib)\n texture_files.append(elem.attrib['url'])\n elif fext == 'wrl':\n # Texture Format: texture ImageTexture { url \"model_texture.jpg\" }\n with open(fbasename, 'r') as fread:\n for line in fread:\n if 'ImageTexture' in line:\n texture_files.append(os.path.basename(line.split('\"')[1]))\n elif fext != 'stl': # add other formats that don't support texture, e.g. xyz?\n print('File extension %s is not currently supported' % fext)\n # TODO: raise exception here\n texture_files_unique = list(set(texture_files))\n if log is not None:\n log_file = open(log, 'a')\n log_file.write('Results of find_texture_files:\\n')\n log_file.write('fbasename = %s\\n' % fbasename)\n log_file.write('texture_files = %s\\n' % texture_files)\n log_file.write('texture_files_unique = %s\\n' % texture_files_unique)\n log_file.write('Number of texture files = %s\\n' % len(texture_files))\n log_file.write(\n 'Number of unique texture files = %s\\n\\n' %\n len(texture_files_unique))\n log_file.write('vertex colors = %s\\n' % vert_colors)\n log_file.write('face colors = %s\\n' % face_colors)\n log_file.close()\n colors = {'texture':bool(texture_files), 'vert_colors':vert_colors, 'face_colors':face_colors}\n return texture_files, texture_files_unique, material_file, colors"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nset default output mask options based on file extension", "response": "def default_output_mask(file_out, texture=True, vert_normals=True, vert_colors=False,\n face_colors=False, 
ml_version=ML_VERSION):\n \"\"\"\n Set default output mask options based on file extension\n Note: v1.34BETA changed -om switch to -m\n Possible options (not all options are available for every format):\n vc -> vertex colors\n vf -> vertex flags\n vq -> vertex quality\n vn -> vertex normals\n vt -> vertex texture coords\n fc -> face colors\n ff -> face flags\n fq -> face quality\n fn -> face normals\n wc -> wedge colors\n wn -> wedge normals\n wt -> wedge texture coords\n \"\"\"\n vn = ''\n wt = ''\n vc = ''\n fc = ''\n\n if ml_version < '1.3.4':\n om = '-om'\n else:\n om = '-m'\n\n fext = os.path.splitext(file_out)[1][1:].strip().lower()\n if fext in ['stl', 'dxf', 'xyz']:\n om = ''\n texture = False\n vert_normals = False\n vert_colors = False\n face_colors = False\n# elif fext == 'ply':\n# vert_colors = True\n\n if vert_normals:\n vn = ' vn'\n if texture:\n wt = ' wt'\n if vert_colors:\n vc = ' vc'\n if face_colors:\n fc = ' fc'\n output_mask = '{}{}{}{}{}'.format(om, vn, wt, vc, fc)\n return output_mask"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef begin(script='TEMP3D_default.mlx', file_in=None, mlp_in=None):\n script_file = open(script, 'w')\n script_file.write(''.join(['\\n',\n '\\n']))\n script_file.close()\n\n current_layer = -1\n last_layer = -1\n stl = False\n\n # Process project files first\n if mlp_in is not None:\n # make a list if it isn't already\n if not isinstance(mlp_in, list):\n mlp_in = [mlp_in]\n for val in mlp_in:\n tree = ET.parse(val)\n #root = tree.getroot()\n for elem in tree.iter(tag='MLMesh'):\n filename = (elem.attrib['filename'])\n current_layer += 1\n last_layer += 1\n # If the mesh file extension is stl, change to that layer and\n # run clean.merge_vert\n if os.path.splitext(filename)[1][1:].strip().lower() == 'stl':\n layers.change(script, current_layer)\n clean.merge_vert(script)\n stl = True\n\n # Process separate input files next\n if file_in is not None:\n # make a list if it 
isn't already\n if not isinstance(file_in, list):\n file_in = [file_in]\n for val in file_in:\n current_layer += 1\n last_layer += 1\n # If the mesh file extension is stl, change to that layer and\n # run clean.merge_vert\n if os.path.splitext(val)[1][1:].strip().lower() == 'stl':\n layers.change(script, current_layer)\n clean.merge_vert(script)\n stl = True\n\n # If some input files were stl, we need to change back to the last layer\n if stl:\n layers.change(script, last_layer) # Change back to the last layer\n elif last_layer == -1:\n # If no input files are provided, create a dummy file\n # with a single vertex and delete it first in the script.\n # This works around the fact that meshlabserver will\n # not run without an input file.\n file_in = ['TEMP3D.xyz']\n file_in_descriptor = open(file_in[0], 'w')\n file_in_descriptor.write('0 0 0')\n file_in_descriptor.close()\n layers.delete(script)\n return current_layer, last_layer", "response": "Create a new mlx script and write opening tags."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_mlp(file_out, mlp_mesh=None, mlp_raster=None):\n # Opening lines\n mlp_file = open(file_out, 'w')\n mlp_file.write('\\n'.join([\n '',\n '\\n']))\n mlp_file.close()\n\n if mlp_mesh is not None:\n mlp_file = open(file_out, 'a')\n mlp_file.write(' \\n')\n for i, val in enumerate(mlp_mesh):\n if 'label' not in mlp_mesh[i]:\n mlp_mesh[i]['label'] = mlp_mesh[i]['filename']\n if 'matrix' not in mlp_mesh[i]:\n mlp_mesh[i]['matrix'] = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]\n mlp_file.write(' \\n'.format(mlp_mesh[i]['filename'], mlp_mesh[i]['label']))\n mlp_file.write('\\n'.join([\n ' ',\n '{m[0]} {m[1]} {m[2]} {m[3]} '.format(m=mlp_mesh[i]['matrix'][0]),\n '{m[0]} {m[1]} {m[2]} {m[3]} '.format(m=mlp_mesh[i]['matrix'][1]),\n '{m[0]} {m[1]} {m[2]} {m[3]} '.format(m=mlp_mesh[i]['matrix'][2]),\n '{m[0]} {m[1]} {m[2]} {m[3]} 
'.format(m=mlp_mesh[i]['matrix'][3]),\n '',\n ' \\n']))\n mlp_file.write(' \\n')\n mlp_file.close()\n # print(mlp_mesh)\n else:\n mlp_file = open(file_out, 'a')\n mlp_file.write(' \\n')\n mlp_file.close()\n if mlp_raster is not None:\n mlp_file = open(file_out, 'a')\n mlp_file.write(' \\n')\n for i, val in enumerate(mlp_raster):\n if 'label' not in mlp_raster[i]:\n mlp_raster[i]['label'] = mlp_raster[i]['filename']\n if 'semantic' not in mlp_raster[i]:\n mlp_raster[i]['semantic'] = 1\n if 'lens_distortion' not in mlp_raster[i]['camera']:\n mlp_raster[i]['camera']['lens_distortion'] = [0, 0]\n if 'center_px' not in mlp_raster[i]['camera']:\n mlp_raster[i]['camera']['center_px'] = [int(mlp_raster[i]['camera']['image_px'][0]/2), int(mlp_raster[i]['camera']['image_px'][1]/2)]\n\n mlp_file.write(' \\n'.format(mlp_raster[i]['label']))\n\n mlp_file.write(' '.join([\n ' \\n']))\n mlp_file.write(' \\n'.format(mlp_raster[i]['semantic'], mlp_raster[i]['filename']))\n mlp_file.write(' \\n')\n mlp_file.write(' \\n')\n mlp_file.close()\n # print(mlp_raster)\n else:\n mlp_file = open(file_out, 'a')\n mlp_file.write(' \\n')\n mlp_file.close()\n\n # Closing lines\n mlp_file = open(file_out, 'a')\n mlp_file.write('\\n')\n mlp_file.close()\n return", "response": "Create mlp file from mesh and raster"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nadding a new mesh layer to the end of the stack.", "response": "def add_layer(self, label, change_layer=True):\n \"\"\" Add new mesh layer to the end of the stack\n\n Args:\n label (str): new label for the mesh layer\n change_layer (bool): change to the newly created layer\n \"\"\"\n self.layer_stack.insert(self.last_layer() + 1, label)\n if change_layer:\n self.set_current_layer(self.last_layer())\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef del_layer(self, layer_num):\n del self.layer_stack[layer_num]\n # Adjust current layer if 
needed\n        if layer_num < self.current_layer():\n            self.set_current_layer(self.current_layer() - 1)\n        return None", "response": "Delete a mesh layer."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsaves the filter script to a mlx file", "response": "def save_to_file(self, script_file):\n        \"\"\" Save filter script to an mlx file \"\"\"\n        # TODO: raise exception here instead?\n        if not self.filters:\n            print('WARNING: no filters to save to file!')\n        script_file_descriptor = open(script_file, 'w')\n        script_file_descriptor.write(''.join(self.opening + self.filters + self.closing))\n        script_file_descriptor.close()"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef run_script(self, log=None, ml_log=None, mlp_out=None, overwrite=False,\n                   file_out=None, output_mask=None, script_file=None, print_meshlabserver_output=True):\n        \"\"\" Run the script\n        \"\"\"\n\n        temp_script = False\n        temp_ml_log = False\n\n        if self.__no_file_in:\n            # If no input files are provided, create a dummy file\n            # with a single vertex and delete it first in the script.\n            # This works around the fact that meshlabserver will\n            # not run without an input file.\n            temp_file_in_file = tempfile.NamedTemporaryFile(delete=False, suffix='.xyz', dir=os.getcwd())\n            temp_file_in_file.write(b'0 0 0')\n            temp_file_in_file.close()\n            self.file_in = [temp_file_in_file.name]\n\n        if not self.filters:\n            script_file = None\n        elif script_file is None:\n            # Create temporary script file\n            temp_script = True\n            temp_script_file = tempfile.NamedTemporaryFile(delete=False, suffix='.mlx')\n            temp_script_file.close()\n            self.save_to_file(temp_script_file.name)\n            script_file = temp_script_file.name\n\n        if (self.parse_geometry or self.parse_topology or self.parse_hausdorff) and (ml_log is None):\n            # create temp ml_log\n            temp_ml_log = True\n            ml_log_file = tempfile.NamedTemporaryFile(delete=False, suffix='.txt')\n            
ml_log_file.close()\n ml_log = ml_log_file.name\n if file_out is None:\n file_out = self.file_out\n\n run(script=script_file, log=log, ml_log=ml_log,\n mlp_in=self.mlp_in, mlp_out=mlp_out, overwrite=overwrite,\n file_in=self.file_in, file_out=file_out, output_mask=output_mask, ml_version=self.ml_version,\n print_meshlabserver_output=print_meshlabserver_output)\n\n # Parse output\n # TODO: record which layer this is associated with?\n if self.parse_geometry:\n self.geometry = compute.parse_geometry(ml_log, log, print_output=print_meshlabserver_output)\n if self.parse_topology:\n self.topology = compute.parse_topology(ml_log, log, print_output=print_meshlabserver_output)\n if self.parse_hausdorff:\n self.hausdorff_distance = compute.parse_hausdorff(ml_log, log, print_output=print_meshlabserver_output)\n\n # Delete temp files\n if self.__no_file_in:\n os.remove(temp_file_in_file.name)\n if temp_script:\n os.remove(temp_script_file.name)\n if temp_ml_log:\n os.remove(ml_log_file.name)", "response": "Run the script and save the output to file_out."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nselects & delete the small disconnected parts of a mesh.", "response": "def small_parts(script, ratio=0.2, non_closed_only=False):\n \"\"\" Select & delete the small disconnected parts (components) of a mesh.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n ratio (float): This ratio (between 0 and 1) defines the meaning of\n 'small' as the threshold ratio between the number of faces of the\n largest component and the other ones. 
A larger value will select\n more components.\n non_closed_only (bool): Select only non-closed components.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n select.small_parts(script, ratio, non_closed_only)\n selected(script)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef selected(script, face=True, vert=True):\n if face and vert:\n filter_xml = ' \\n'\n elif face and not vert:\n filter_xml = ' \\n'\n elif not face and vert:\n filter_xml = ' \\n'\n util.write_filter(script, filter_xml)\n return None", "response": "Delete selected vertices and faces and faces from the current mesh."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef unreferenced_vert(script):\n if script.ml_version == '1.3.4BETA':\n filter_xml = ' \\n'\n else:\n filter_xml = ' \\n'\n util.write_filter(script, filter_xml)\n return None", "response": "Check for every vertex in the mesh that is NOT referenced by a face."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck for every vertex on the mesh : there are two vertices with the same coordinates they are merged into a single one.", "response": "def duplicate_verts(script):\n \"\"\" \"Check for every vertex on the mesh: if there are two vertices with\n the same coordinates they are merged into a single one.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n if script.ml_version == '1.3.4BETA':\n filter_xml = ' \\n'\n else:\n filter_xml = ' \\n'\n\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef hausdorff_distance(script, sampled_layer=1, target_layer=0,\n save_sample=False, sample_vert=True, 
sample_edge=True,\n sample_faux_edge=False, sample_face=True,\n sample_num=1000, maxdist=10):\n \"\"\" Compute the Hausdorff Distance between two meshes, sampling one of the\n two and finding for each sample the closest point over the other mesh.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n sampled_layer (int): The mesh layer whose surface is sampled. For each\n sample we search the closest point on the target mesh layer.\n target_layer (int): The mesh that is sampled for the comparison.\n save_sample (bool): Save the position and distance of all the used\n samples on both the two surfaces, creating two new layers with two\n point clouds representing the used samples.\n sample_vert (bool): For the search of maxima it is useful to sample\n vertices and edges of the mesh with a greater care. It is quite\n probable that the farthest points falls along edges or on mesh\n vertexes, and with uniform montecarlo sampling approaches the\n probability of taking a sample over a vertex or an edge is\n theoretically null. On the other hand this kind of sampling could\n make the overall sampling distribution slightly biased and slightly\n affects the cumulative results.\n sample_edge (bool): see sample_vert\n sample_faux_edge (bool): see sample_vert\n sample_face (bool): see sample_vert\n sample_num (int): The desired number of samples. 
It can be smaller or\n larger than the mesh size, and according to the chosen sampling\n strategy it will try to adapt.\n maxdist (int): Sample points for which we do not find anything within\n this distance are rejected and not considered neither for averaging\n nor for max.\n\n Layer stack:\n If save_sample is True, two new layers are created: 'Hausdorff Closest\n Points' and 'Hausdorff Sample Point'; and the current layer is\n changed to the last newly created layer.\n If save_sample is False, no impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n # MeshLab defaults:\n # sample_num = number of vertices\n # maxdist = 0.05 * AABB['diag'] #5% of AABB[diag]\n # maxdist_max = AABB['diag']\n maxdist_max = 2*maxdist\n # TODO: parse output (min, max, mean, etc.)\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.parse_hausdorff = True\n if isinstance(script, FilterScript) and save_sample:\n script.add_layer('Hausdorff Closest Points')\n script.add_layer('Hausdorff Sample Point')\n return None", "response": "Compute the Hausdorff distance between two meshes."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncreates a new layer with a point sampling of the current mesh and a point sampling of the base mesh.", "response": "def poisson_disk(script, sample_num=1000, radius=0.0,\n montecarlo_rate=20, save_montecarlo=False,\n approx_geodesic_dist=False, subsample=False, refine=False,\n refine_layer=0, best_sample=True, best_sample_pool=10,\n exact_num=False, radius_variance=1.0):\n \"\"\" Create a new layer populated with a point sampling of the current mesh.\n\n Samples are generated according to a Poisson-disk distribution, using the\n algorithm described in:\n\n 'Efficient and Flexible Sampling with Blue Noise Properties of Triangular Meshes'\n 
Massimiliano Corsini, Paolo Cignoni, Roberto Scopigno\n    IEEE TVCG 2012\n\n    Args:\n        script: the FilterScript object or script filename to write\n            the filter to.\n        sample_num (int): The desired number of samples. The radius of the disk\n            is calculated according to the sampling density.\n        radius (float): If not zero this parameter overrides the previous\n            parameter to allow exact radius specification.\n        montecarlo_rate (int): The over-sampling rate that is used to generate\n            the initial Monte Carlo samples (e.g. a value of 'K' means\n            that 'K * sample_num' points will be used). The generated\n            Poisson-disk samples are a subset of these initial Monte Carlo\n            samples. Larger numbers slow the process but make it a bit more\n            accurate.\n        save_montecarlo (bool): If True, it will generate an additional layer\n            with the Monte Carlo sampling that was pruned to build the Poisson\n            distribution.\n        approx_geodesic_dist (bool): If True Poisson-disk distances are\n            computed using an approximate geodesic distance, e.g. a Euclidean\n            distance weighted by a function of the difference between the\n            normals of the two points.\n        subsample (bool): If True the original vertices of the base mesh are\n            used as the base set of points. In this case sample_num should\n            obviously be much smaller than the original vertex number. Note that\n            this option is very useful when you want to subsample a\n            dense point cloud.\n        refine (bool): If True the vertices of the refine_layer mesh layer are\n            used as starting vertices, and they will be further refined by\n            adding more and more points until possible.\n        refine_layer (int): Used only if refine is True.\n        best_sample (bool): If True it will use a simple heuristic for choosing\n            the samples. At a small cost (it can slow the process a bit) it\n            usually improves the maximality of the generated sampling.\n        best_sample_pool (int): Used only if best_sample is True. It controls\n            the number of attempts that it makes to get the best sample. 
It is\n            reasonable that it is smaller than the Monte Carlo oversampling\n            factor.\n        exact_num (bool): If True it will try to do a dichotomic search for the\n            best Poisson-disk radius that will generate the requested number of\n            samples with a tolerance of 0.5%. Obviously it takes much\n            longer.\n        radius_variance (float): The radius of the disk is allowed to vary\n            between r and r*var. If this parameter is 1 the sampling is the\n            same as the Poisson-disk Sampling.\n\n    Layer stack:\n        Creates new layer 'Poisson-disk Samples'. Current layer is NOT changed\n        to the new layer (see Bugs).\n        If save_montecarlo is True, creates a new layer 'Montecarlo Samples'.\n        Current layer is NOT changed to the new layer (see Bugs).\n\n    MeshLab versions:\n        2016.12\n        1.3.4BETA\n\n    Bugs:\n        Current layer is NOT changed to the new layer, which is inconsistent\n        with the majority of filters that create new layers.\n    \"\"\"\n    filter_xml = ''.join([\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n'])\n    util.write_filter(script, filter_xml)\n    if isinstance(script, FilterScript):\n        script.add_layer('Poisson-disk Samples')\n        if save_montecarlo:\n            script.add_layer('Montecarlo Samples')\n    return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef mesh_element(script, sample_num=1000, element='VERT'):\n    if element.lower() == 'vert':\n        element_num = 0\n    elif element.lower() == 'edge':\n        element_num = 1\n    elif element.lower() == 'face':\n        element_num = 2\n    filter_xml = ''.join([\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n'])\n    util.write_filter(script, filter_xml)\n    if isinstance(script, FilterScript):\n        script.add_layer('Sampled Mesh')\n    return None", "response": "Create a new MeshLab layer with a point sampling of the current mesh."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef 
clustered_vert(script, cell_size=1.0, strategy='AVERAGE', selected=False):\n if strategy.lower() == 'average':\n strategy_num = 0\n elif strategy.lower() == 'center':\n strategy_num = 1\n\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, FilterScript):\n script.add_layer('Cluster Samples')\n return None", "response": "Create a new layer populated with a subsampling of the vertexes of the current mesh the subsampling is driven by a simple one - per - gridded cell strategy."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef isometric(script, targetAbstractMinFaceNum=140, targetAbstractMaxFaceNum=180,\n stopCriteria=1, convergenceSpeed=1, DoubleStep=True):\n \"\"\"Isometric parameterization\n\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' Large numbers (greater than 400) are usually not of practical use.\"',\n '/>\\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function returns an isometric parameterization of a script."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef isometric_build_atlased_mesh(script, BorderSize=0.1):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function generates the UV mapping from the isometric parameterization build atlased mesh."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef isometric_remesh(script, SamplingRate=10):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "Isometric parameterization for remeshing."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 
function\ndef set_texture(script, textName=\"TEMP3D.png\", textDim=1024):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "Set the named texture in the current language."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nprojecting the rasters into a single text file.", "response": "def project_rasters(script, tex_file_out=\"TEMP3D.png\", tex_size=1024,\n fill_atlas_gaps=False, depth_threshold=0.5,\n selected=False, use_angle=True, use_distance=True,\n use_borders=True, use_silhouettes=True, use_alpha=False):\n \"\"\"Set texture\n\n Creates new texture file\n tex_file_out = must be png\n fill_atlas_gaps = setting this to false will leave the unprojected area transparent. This can then be easily composed with the original texture with PIL\n\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef param_texture_from_rasters(script, textName=\"TEMP3D.png\", texsize=1024,\n colorCorrection=True, colorCorrectionFilterSize=1,\n useDistanceWeight=True, useImgBorderWeight=True,\n useAlphaWeight=False, cleanIsolatedTriangles=True,\n stretchingAllowed=False, textureGutter=4):\n \"\"\"Set texture\n\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function generates a parameterization image from registered rasters."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef param_from_rasters(script, useDistanceWeight=True, useImgBorderWeight=True,\n useAlphaWeight=False, 
cleanIsolatedTriangles=True,\n stretchingAllowed=False, textureGutter=4):\n \"\"\"Set texture\n\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function generates a parameterization from registered rasters."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncompute the polyline representing a section of a given script.", "response": "def section(script, axis='z', offset=0.0, surface=False, custom_axis=None,\n planeref=2):\n \"\"\" Compute the polyline representing a planar section (a slice) of a mesh.\n\n If the resulting polyline is closed the result can be filled with a\n triangular mesh representing the section.\n\n Args:\n script: the mlx.FilterScript object or script filename to write\n the filter to.\n axis (str): The slicing plane is perpendicular to this axis. Accepted\n values are 'x', 'y', or 'z'; any other input will be interpreted\n as a custom axis (although using 'custom' is recommended\n for clarity). Upper or lowercase values are accepted.\n offset (float): Specify an offset of the cross-plane. The offset\n corresponds to the distance along 'axis' from the point specified\n in 'planeref'.\n surface (bool): If True, in addition to a layer with the section\n polyline, also a layer with a triangulated version of the section\n polyline will be created. This only works if the section polyline\n is closed.\n custom_axis (3 component list or tuple): Specify a custom axis as\n a 3 component vector (x, y, z); this is ignored unless 'axis' is\n set to 'custom'.\n planeref (int): Specify the reference from which the planes are\n shifted. 
Valid values are:\n 0 - Bounding box center\n 1 - Bounding box min\n 2 - Origin (default)\n\n Layer stack:\n Creates a new layer '{label}_sect_{axis_name}_{offset}', where\n 'axis_name' is one of [X, Y, Z, custom] and 'offest' is\n truncated 'offset'\n If surface is True, create a new layer '{label}_sect_{axis}_{offset}_mesh'\n Current layer is changed to the last (newly created) layer\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n # Convert axis name into number\n if axis.lower() == 'x':\n axis_num = 0\n axis_name = 'X'\n elif axis.lower() == 'y':\n axis_num = 1\n axis_name = 'Y'\n elif axis.lower() == 'z':\n axis_num = 2\n axis_name = 'Z'\n else: # custom axis\n axis_num = 3\n axis_name = 'custom'\n if custom_axis is None:\n print('WARNING: a custom axis was selected, however',\n '\"custom_axis\" was not provided. Using default (Z).')\n if custom_axis is None:\n custom_axis = (0.0, 0.0, 1.0)\n\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n if isinstance(script, mlx.FilterScript):\n current_layer_label = script.layer_stack[script.current_layer()]\n script.add_layer('{}_sect_{}_{}'.format(current_layer_label, axis_name,\n int(offset)))\n if surface:\n script.add_layer('{}_sect_{}_{}_mesh'.format(current_layer_label,\n axis_name, int(offset)))\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef measure_geometry(script):\n filter_xml = ' \\n'\n util.write_filter(script, filter_xml)\n if isinstance(script, mlx.FilterScript):\n script.parse_geometry = True\n return None", "response": "Compute a set of geometric measures of a mesh and pointcloud."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef measure_topology(script):\n filter_xml = ' \\n'\n util.write_filter(script, filter_xml)\n if isinstance(script, mlx.FilterScript):\n 
script.parse_topology = True\n return None", "response": "Compute a set of topological measures over a mesh."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nparses the ML log file generated by the measure_geometry function.", "response": "def parse_geometry(ml_log, log=None, ml_version='2016.12', print_output=False):\n \"\"\"Parse the ml_log file generated by the measure_geometry function.\n\n Warnings: Not all keys may exist if mesh is not watertight or manifold\n\n Args:\n ml_log (str): MeshLab log file to parse\n log (str): filename to log output\n \"\"\"\n # TODO: read more than one occurrence per file. Record in list.\n aabb = {}\n geometry = {'aabb':aabb}\n with open(ml_log) as fread:\n for line in fread:\n if 'Mesh Bounding Box min' in line: #2016.12\n geometry['aabb']['min'] = (line.split()[4:7])\n geometry['aabb']['min'] = [util.to_float(val) for val in geometry['aabb']['min']]\n if 'Mesh Bounding Box max' in line: #2016.12\n geometry['aabb']['max'] = (line.split()[4:7])\n geometry['aabb']['max'] = [util.to_float(val) for val in geometry['aabb']['max']]\n if 'Mesh Bounding Box Size' in line: #2016.12\n geometry['aabb']['size'] = (line.split()[4:7])\n geometry['aabb']['size'] = [util.to_float(val) for val in geometry['aabb']['size']]\n if 'Mesh Bounding Box Diag' in line: #2016.12\n geometry['aabb']['diagonal'] = util.to_float(line.split()[4])\n if 'Mesh Volume' in line:\n geometry['volume_mm3'] = util.to_float(line.split()[3])\n geometry['volume_cm3'] = geometry['volume_mm3'] * 0.001\n if 'Mesh Surface' in line:\n if ml_version == '1.3.4BETA':\n geometry['area_mm2'] = util.to_float(line.split()[3])\n else:\n geometry['area_mm2'] = util.to_float(line.split()[4])\n geometry['area_cm2'] = geometry['area_mm2'] * 0.01\n if 'Mesh Total Len of' in line:\n if 'including faux edges' in line:\n geometry['total_edge_length_incl_faux'] = util.to_float(\n line.split()[7])\n else:\n geometry['total_edge_length'] = util.to_float(\n 
line.split()[7])\n if 'Thin shell barycenter' in line:\n geometry['barycenter'] = (line.split()[3:6])\n geometry['barycenter'] = [util.to_float(val) for val in geometry['barycenter']]\n if 'Thin shell (faces) barycenter' in line: #2016.12\n geometry['barycenter'] = (line.split()[4:7])\n geometry['barycenter'] = [util.to_float(val) for val in geometry['barycenter']]\n if 'Vertices barycenter' in line: #2016.12\n geometry['vert_barycenter'] = (line.split()[2:5])\n geometry['vert_barycenter'] = [util.to_float(val) for val in geometry['vert_barycenter']]\n if 'Center of Mass' in line:\n geometry['center_of_mass'] = (line.split()[4:7])\n geometry['center_of_mass'] = [util.to_float(val) for val in geometry['center_of_mass']]\n if 'Inertia Tensor' in line:\n geometry['inertia_tensor'] = []\n for val in range(3):\n row = (next(fread, val).split()[1:4])\n row = [util.to_float(b) for b in row]\n geometry['inertia_tensor'].append(row)\n if 'Principal axes' in line:\n geometry['principal_axes'] = []\n for val in range(3):\n row = (next(fread, val).split()[1:4])\n row = [util.to_float(b) for b in row]\n geometry['principal_axes'].append(row)\n if 'axis momenta' in line:\n geometry['axis_momenta'] = (next(fread).split()[1:4])\n geometry['axis_momenta'] = [util.to_float(val) for val in geometry['axis_momenta']]\n break # stop after we find the first match\n for key, value in geometry.items():\n if log is not None:\n log_file = open(log, 'a')\n log_file.write('{:27} = {}\\n'.format(key, value))\n log_file.close()\n elif print_output:\n print('{:27} = {}'.format(key, value))\n return geometry"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nparse the ML log file generated by the measure_topology function.", "response": "def parse_topology(ml_log, log=None, ml_version='1.3.4BETA', print_output=False):\n \"\"\"Parse the ml_log file generated by the measure_topology function.\n\n Args:\n ml_log (str): MeshLab log file to parse\n log (str): filename to log 
output\n\n    Returns:\n        dict: dictionary with the following keys:\n            vert_num (int): number of vertices\n            edge_num (int): number of edges\n            face_num (int): number of faces\n            unref_vert_num (int): number of unreferenced vertices\n            boundry_edge_num (int): number of boundary edges\n            part_num (int): number of parts (components) in the mesh.\n            manifold (bool): True if mesh is two-manifold, otherwise False.\n            non_manifold_edge (int): number of non-manifold edges.\n            non_manifold_vert (int): number of non-manifold vertices\n            genus (int or str): genus of the mesh, either a number or\n                'undefined' if the mesh is non-manifold.\n            hole_num (int or str): number of holes in the mesh, either a number\n                or 'undefined' if the mesh is non-manifold.\n\n    \"\"\"\n    topology = {'manifold': True, 'non_manifold_E': 0, 'non_manifold_V': 0}\n    with open(ml_log) as fread:\n        for line in fread:\n            if 'V:' in line:\n                vert_edge_face = line.replace('V:', ' ').replace('E:', ' ').replace('F:', ' ').split()\n                topology['vert_num'] = int(vert_edge_face[0])\n                topology['edge_num'] = int(vert_edge_face[1])\n                topology['face_num'] = int(vert_edge_face[2])\n            if 'Unreferenced Vertices' in line:\n                topology['unref_vert_num'] = int(line.split()[2])\n            if 'Boundary Edges' in line:\n                topology['boundry_edge_num'] = int(line.split()[2])\n            if 'Mesh is composed by' in line:\n                topology['part_num'] = int(line.split()[4])\n            if 'non 2-manifold mesh' in line:\n                topology['manifold'] = False\n            if 'non two manifold edges' in line:\n                topology['non_manifold_edge'] = int(line.split()[2])\n            if 'non two manifold vertexes' in line:\n                topology['non_manifold_vert'] = int(line.split()[2])\n            if 'Genus is' in line:  # undefined or int\n                topology['genus'] = line.split()[2]\n                if topology['genus'] != 'undefined':\n                    topology['genus'] = int(topology['genus'])\n            if 'holes' in line:\n                topology['hole_num'] = line.split()[2]\n                if topology['hole_num'] == 'a':\n                    topology['hole_num'] = 'undefined'\n                else:\n                    topology['hole_num'] = int(topology['hole_num'])\n    
for key, value in topology.items():\n if log is not None:\n log_file = open(log, 'a')\n log_file.write('{:16} = {}\\n'.format(key, value))\n log_file.close()\n elif print_output:\n print('{:16} = {}'.format(key, value))\n\n return topology"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef parse_hausdorff(ml_log, log=None, print_output=False):\n hausdorff_distance = {\"min_distance\": 0.0,\n \"max_distance\": 0.0,\n \"mean_distance\": 0.0,\n \"rms_distance\": 0.0,\n \"number_points\": 0}\n with open(ml_log) as fread:\n result = fread.readlines()\n data = \"\"\n\n for idx, line in enumerate(result):\n m = re.match(r\"\\s*Sampled (\\d+) pts.*\", line)\n if m is not None:\n hausdorff_distance[\"number_points\"] = int(m.group(1))\n if 'Hausdorff Distance computed' in line:\n data = result[idx + 2]\n\n m = re.match(r\"\\D+(\\d+\\.*\\d*)\\D+(\\d+\\.*\\d*)\\D+(\\d+\\.*\\d*)\\D+(\\d+\\.*\\d*)\", data)\n hausdorff_distance[\"min_distance\"] = float(m.group(1))\n hausdorff_distance[\"max_distance\"] = float(m.group(2))\n hausdorff_distance[\"mean_distance\"] = float(m.group(3))\n hausdorff_distance[\"rms_distance\"] = float(m.group(4))\n for key, value in hausdorff_distance.items():\n if log is not None:\n log_file = open(log, 'a')\n log_file.write('{:16} = {}\\n'.format(key, value))\n log_file.close()\n elif print_output:\n print('{:16} = {}'.format(key, value))\n return hausdorff_distance", "response": "Parse the ML log file generated by the hausdorff_distance function."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef function(script, red=255, green=255, blue=255, alpha=255, color=None):\n # TODO: add options for HSV\n # https://www.cs.rit.edu/~ncs/color/t_convert.html\n if color is not None:\n red, green, blue, _ = color_name[color.lower()]\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n 
util.write_filter(script, filter_xml)\n    return None", "response": "Function to generate a color for every vertex of the mesh."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef voronoi(script, target_layer=0, source_layer=1, backward=True):\n    filter_xml = ''.join([\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n',\n        ' \\n'])\n    util.write_filter(script, filter_xml)\n    return None", "response": "Voronoi vertex coloring: color the target layer using the vertices of the source layer as seeds."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef cyclic_rainbow(script, direction='sphere', start_pt=(0, 0, 0),\n                   amplitude=255 / 2, center=255 / 2, freq=0.8,\n                   phase=(0, 120, 240, 0), alpha=False):\n    \"\"\" Color mesh vertices in a repeating sinusoidal rainbow pattern\n\n    The sine wave follows this equation for each color channel (RGBA):\n    channel = sin(freq*increment + phase)*amplitude + center\n\n    Args:\n        script: the FilterScript object or script filename to write\n            the filter to.\n        direction (str): the direction that the sine wave will travel; this\n            and the start_pt determine the 'increment' of the sine function.\n            Valid values are:\n                'sphere' - radiate sine wave outward from start_pt (default)\n                'x' - sine wave travels along the X axis\n                'y' - sine wave travels along the Y axis\n                'z' - sine wave travels along the Z axis\n            or define the increment directly using a muparser function, e.g.\n            '2x + y'. In this case start_pt will not be used; include it in\n            the function directly.\n        start_pt (3 coordinate tuple or list): start point of the sine wave. For a\n            sphere this is the center of the sphere.\n        amplitude (float [0, 255], single value or 4 term tuple or list): amplitude\n            of the sine wave, with range between 0-255. 
If a single value is\n            specified it will be used for all channels, otherwise specify each\n            channel individually.\n        center (float [0, 255], single value or 4 term tuple or list): center\n            of the sine wave, with range between 0-255. If a single value is\n            specified it will be used for all channels, otherwise specify each\n            channel individually.\n        freq (float, single value or 4 term tuple or list): frequency of the sine\n            wave. If a single value is specified it will be used for all channels,\n            otherwise specify each channel individually.\n        phase (float [0, 360], single value or 4 term tuple or list): phase\n            of the sine wave in degrees, with range between 0-360. If a single\n            value is specified it will be used for all channels, otherwise specify\n            each channel individually.\n        alpha (bool): if False the alpha channel will be set to 255 (full opacity).\n\n    Layer stack:\n        No impacts\n\n    MeshLab versions:\n        2016.12\n        1.3.4BETA\n    \"\"\"\n    start_pt = util.make_list(start_pt, 3)\n    amplitude = util.make_list(amplitude, 4)\n    center = util.make_list(center, 4)\n    freq = util.make_list(freq, 4)\n    phase = util.make_list(phase, 4)\n\n    if direction.lower() == 'sphere':\n        increment = 'sqrt((x-{})^2+(y-{})^2+(z-{})^2)'.format(\n            start_pt[0], start_pt[1], start_pt[2])\n    elif direction.lower() == 'x':\n        increment = 'x - {}'.format(start_pt[0])\n    elif direction.lower() == 'y':\n        increment = 'y - {}'.format(start_pt[1])\n    elif direction.lower() == 'z':\n        increment = 'z - {}'.format(start_pt[2])\n    else:\n        increment = direction\n\n    red_func = '{a}*sin({f}*{i} + {p}) + {c}'.format(\n        f=freq[0], i=increment, p=math.radians(phase[0]),\n        a=amplitude[0], c=center[0])\n    green_func = '{a}*sin({f}*{i} + {p}) + {c}'.format(\n        f=freq[1], i=increment, p=math.radians(phase[1]),\n        a=amplitude[1], c=center[1])\n    blue_func = '{a}*sin({f}*{i} + {p}) + {c}'.format(\n        f=freq[2], i=increment, p=math.radians(phase[2]),\n        a=amplitude[2], c=center[2])\n    if alpha:\n        alpha_func = '{a}*sin({f}*{i} + {p}) + 
{c}'.format(\n f=freq[3], i=increment, p=math.radians(phase[3]),\n a=amplitude[3], c=center[3])\n else:\n alpha_func = 255\n\n function(script, red=red_func, green=green_func, blue=blue_func,\n alpha=alpha_func)\n return None", "response": "This function creates a color mesh that is a cyclic rainbow pattern."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef v_multiply(scalar, v1):\n vector = []\n for i, x in enumerate(v1):\n vector.append('(({})*({}))'.format(scalar, v1[i]))\n return vector", "response": "Multiply vector by scalar"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nadd a new Per-Vertex scalar attribute to current mesh and fill it with the defined function. The specified name can be used in other filter functions. It's possible to use parenthesis, per-vertex variables and boolean operator: (, ), and, or, <, >, = It's possible to use the following per-vertex variables in the expression: Variables: x, y, z (coordinates) nx, ny, nz (normal) r, g, b, a (color) q (quality) rad vi (vertex index) ?vtu, vtv (texture coordinates) ?ti (texture index) ?vsel (is the vertex selected? 1 yes, 0 no) and all custom vertex attributes already defined by user. Args: script: the FilterScript object or script filename to write the filter] to. name (str): the name of new attribute. You can access attribute in other filters through this name. 
function (str): function to calculate custom attribute value for each vertex Layer stack: No impacts MeshLab versions: 2016.12 1.3.4BETA", "response": "def vert_attr(script, name='radius', function='x^2 + y^2'):\n \"\"\" Add a new Per-Vertex scalar attribute to current mesh and fill it with\n the defined function.\n\n The specified name can be used in other filter functions.\n\n It's possible to use parenthesis, per-vertex variables and boolean operator:\n (, ), and, or, <, >, =\n It's possible to use the following per-vertex variables in the expression:\n\n Variables:\n x, y, z (coordinates)\n nx, ny, nz (normal)\n r, g, b, a (color)\n q (quality)\n rad\n vi (vertex index)\n ?vtu, vtv (texture coordinates)\n ?ti (texture index)\n ?vsel (is the vertex selected? 1 yes, 0 no)\n and all custom vertex attributes already defined by user.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter] to.\n name (str): the name of new attribute. You can access attribute in\n other filters through this name.\n function (str): function to calculate custom attribute value for each\n vertex\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef quatrefoil():\n start_time = time.time()\n\n os.chdir(THIS_SCRIPTPATH)\n #ml_version = '1.3.4BETA'\n ml_version = '2016.12'\n\n # Add meshlabserver directory to OS PATH; omit this if it is already in\n # your PATH\n meshlabserver_path = 'C:\\\\Program Files\\\\VCG\\\\MeshLab'\n \"\"\"\n if ml_version is '1.3.4BETA':\n meshlabserver_path = 'C:\\\\Program Files\\\\VCG\\\\MeshLab'\n elif ml_version is '2016.12':\n meshlabserver_path = 'C:\\\\Program Files\\\\VCG\\\\MeshLab_2016_12'\n \"\"\"\n os.environ['PATH'] = meshlabserver_path + os.pathsep + 
os.environ['PATH']\n\n    # Cross section parameters\n    length = math.radians(360)\n    tube_size = [10, 10, length]\n    segments = [64, 64, 720*2]\n    inner_radius = 2.0\n\n    # Sinusoidal deformation parameters\n    amplitude = 4.2\n    freq = 4\n    phase = 270\n    center = 'r'\n    start_pt = 0\n    increment = 'z-{}'.format(start_pt)\n\n    # Cyclic rainbow color parameters\n    c_start_pt = 0\n    c_freq = 5\n    c_phase_shift = 0 #90 #300\n    c_phase = (0 + c_phase_shift, 120 + c_phase_shift, 240 + c_phase_shift, 0)\n\n    # Voronoi surface parameters\n    holes = [2, 2, 44]  # Number of holes in each axis; x are sides, y is outside\n    web_thickness = 0.5\n    solid_radius = 5.0  # If the mesh is smaller than this radius the holes will be closed\n    faces_surface = 50000\n\n    # Voronoi solid parameters\n    voxel = 0.5\n    thickness = 2.5\n    faces_solid = 200000\n\n    # Scaling parameters\n    size = 75  # desired max size of the curve\n    curve_max_size = 2*(1 + 1.5)  # the 1.5 should be inner_radius, but am keeping current scaling\n    scale = (size-2*(thickness + amplitude) - tube_size[1])/curve_max_size\n\n    # File names\n    file_color = 'quatrefoil_color.ply'\n    file_voronoi_surf = 'quatrefoil_voronoi_surf.ply'\n    file_voronoi_solid = 'quatrefoil_voronoi_solid.ply'\n    file_voronoi_color = 'quatrefoil_voronoi_final.ply'\n\n    # Create FilterScript objects for each step in the process\n    quatrefoil_color = mlx.FilterScript(\n        file_in=None, file_out=file_color, ml_version=ml_version)\n    quatrefoil_voronoi_surf = mlx.FilterScript(\n        file_in=file_color, file_out=file_voronoi_surf, ml_version=ml_version)\n    quatrefoil_voronoi_solid = mlx.FilterScript(\n        file_in=file_voronoi_surf, file_out=file_voronoi_solid,\n        ml_version=ml_version)\n    quatrefoil_voronoi_color = mlx.FilterScript(\n        file_in=[file_color, file_voronoi_solid], file_out=file_voronoi_color,\n        ml_version=ml_version)\n\n\n    print('\\n Create colored quatrefoil curve ...')\n    mlx.create.cube_open_hires(\n        quatrefoil_color, size=tube_size, x_segments=segments[0],\n        y_segments=segments[1], 
z_segments=segments[2], center=True)\n mlx.transform.translate(quatrefoil_color, [0, 0, length/2])\n\n # Sinusoidal deformation\n r_func = '({a})*sin(({f})*({i}) + ({p})) + ({c})'.format(\n f=freq, i=increment, p=math.radians(phase), a=amplitude, c=center)\n mlx.transform.function_cyl_co(\n quatrefoil_color, r_func=r_func, theta_func='theta', z_func='z')\n\n # Save max radius in quality field so that we can save it with the file\n # for use in the next step\n max_radius = math.sqrt((tube_size[0]/2)**2+(tube_size[1]/2)**2) # at corners\n q_func = '({a})*sin(({f})*({i}) + ({p})) + ({c})'.format(\n f=freq, i=increment, p=math.radians(phase), a=amplitude, c=max_radius)\n mlx.mp_func.vq_function(quatrefoil_color, function=q_func)\n\n # Apply rainbow vertex colors\n mlx.vert_color.cyclic_rainbow(\n quatrefoil_color, direction='z', start_pt=c_start_pt, amplitude=255 / 2,\n center=255 / 2, freq=c_freq, phase=c_phase)\n\n # Deform mesh to quatrefoil curve. Merge vertices after, which\n # will weld the ends together so it becomes watertight\n quatrefoil_func = mlx.transform.deform2curve(\n quatrefoil_color,\n curve=mlx.mp_func.torus_knot('t', p=3, q=4, scale=scale,\n radius=inner_radius))\n mlx.clean.merge_vert(quatrefoil_color, threshold=0.0001)\n\n # Run script\n mlx.layers.delete_lower(quatrefoil_color)\n quatrefoil_color.run_script(output_mask='-m vc vq')\n\n print('\\n Create Voronoi surface ...')\n # Move quality value into radius attribute\n mlx.mp_func.vert_attr(quatrefoil_voronoi_surf, name='radius', function='q')\n\n # Create seed vertices\n # For grid style holes, we will create a mesh similar to the original\n # but with fewer vertices.\n mlx.create.cube_open_hires(\n quatrefoil_voronoi_surf, size=tube_size, x_segments=holes[0]+1,\n y_segments=holes[1]+1, z_segments=holes[2]+1, center=True)\n mlx.select.all(quatrefoil_voronoi_surf, vert=False)\n mlx.delete.selected(quatrefoil_voronoi_surf, vert=False)\n mlx.select.cylindrical_vert(quatrefoil_voronoi_surf,\n 
radius=max_radius-0.0001, inside=False)\n mlx.transform.translate(quatrefoil_voronoi_surf, [0, 0, 20])\n mlx.delete.selected(quatrefoil_voronoi_surf, face=False)\n\n mlx.transform.function_cyl_co(quatrefoil_voronoi_surf, r_func=r_func,\n theta_func='theta', z_func='z')\n mlx.transform.vert_function(\n quatrefoil_voronoi_surf, x_func=quatrefoil_func[0],\n y_func=quatrefoil_func[1], z_func=quatrefoil_func[2])\n\n mlx.layers.change(quatrefoil_voronoi_surf, 0)\n mlx.vert_color.voronoi(quatrefoil_voronoi_surf)\n\n if quatrefoil_voronoi_surf.ml_version == '1.3.4BETA':\n sel_func = '(q <= {}) or ((radius)<={})'.format(web_thickness, solid_radius)\n else:\n sel_func = '(q <= {}) || ((radius)<={})'.format(web_thickness, solid_radius)\n mlx.select.vert_function(quatrefoil_voronoi_surf, function=sel_func)\n #mlx.select.face_function(quatrefoil_voronoi_surf, function='(vsel0 && vsel1 && vsel2)')\n mlx.select.invert(quatrefoil_voronoi_surf, face=False)\n mlx.delete.selected(quatrefoil_voronoi_surf, face=False)\n\n mlx.smooth.laplacian(quatrefoil_voronoi_surf, iterations=3)\n mlx.remesh.simplify(quatrefoil_voronoi_surf, texture=False, faces=faces_surface)\n\n mlx.layers.delete_lower(quatrefoil_voronoi_surf)\n #quatrefoil_voronoi_surf.save_to_file('temp_script.mlx')\n quatrefoil_voronoi_surf.run_script(script_file=None, output_mask='-m vc vq')\n\n print('\\n Solidify Voronoi surface ...')\n mlx.remesh.uniform_resampling(quatrefoil_voronoi_solid, voxel=voxel,\n offset=thickness/2, thicken=True)\n mlx.layers.delete_lower(quatrefoil_voronoi_solid)\n quatrefoil_voronoi_solid.run_script()\n\n print('\\n Clean up & transfer color to final model ...')\n # Clean up from uniform mesh resamplng\n mlx.delete.small_parts(quatrefoil_voronoi_color)\n mlx.delete.unreferenced_vert(quatrefoil_voronoi_color)\n mlx.delete.faces_from_nonmanifold_edges(quatrefoil_voronoi_color)\n mlx.clean.split_vert_on_nonmanifold_face(quatrefoil_voronoi_color)\n mlx.clean.close_holes(quatrefoil_voronoi_color)\n\n # 
Simplify (to improve triangulation quality), refine, & smooth\n mlx.remesh.simplify(quatrefoil_voronoi_color, texture=False, faces=faces_solid)\n mlx.subdivide.ls3loop(quatrefoil_voronoi_color, iterations=1)\n mlx.smooth.laplacian(quatrefoil_voronoi_color, iterations=3)\n\n # Transfer colors from original curve\n mlx.transfer.vert_attr_2_meshes(\n quatrefoil_voronoi_color, source_mesh=0, target_mesh=1, color=True,\n max_distance=7)\n mlx.layers.delete_lower(quatrefoil_voronoi_color)\n quatrefoil_voronoi_color.run_script(script_file=None)\n print(' done! Took %.1f sec' % (time.time() - start_time))\n\n return None", "response": "Rainbow-colored Voronoi quatrefoil"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nflip the faces of a single-component watertight polygon.", "response": "def flip(script, force_flip=False, selected=False):\n \"\"\" Invert faces orientation, flipping the normals of the mesh.\n\n If requested, it tries to guess the right orientation; mainly it decides to\n flip all the faces if the minimum/maximum vertexes do not have outward-pointing\n normals for a few directions.
Works well for single-component watertight\n objects.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n force_flip (bool): If selected, the normals will always be flipped;\n otherwise, the filter tries to set them outside.\n selected (bool): If selected, only selected faces will be affected.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef point_sets(script, neighbors=10, smooth_iteration=0, flip=False,\n viewpoint_pos=(0.0, 0.0, 0.0)):\n \"\"\" Compute the normals of the vertices of a mesh without exploiting the\n triangle connectivity, useful for datasets with no faces.\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n neighbors (int): The number of neighbors used to estimate normals.\n smooth_iteration (int): The number of smoothing iterations done on the\n p used to estimate and propagate normals.\n flip (bool): Flip normals w.r.t. viewpoint.
If the 'viewpoint' (i.e.\n scanner position) is known, it can be used to disambiguate normals\n orientation, so that all the normals will be oriented in the same\n direction.\n viewpoint_pos (single xyz point, tuple or list): Set the x, y, z\n coordinates of the viewpoint position.\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function returns a filter that computes the normals of the vertices of a point set."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef laplacian(script, iterations=1, boundary=True, cotangent_weight=True,\n selected=False):\n \"\"\" Laplacian smooth of the mesh: for each vertex it calculates the average\n position with nearest vertex\n\n Args:\n script: the FilterScript object or script filename to write\n the filter to.\n iterations (int): The number of times that the whole algorithm (normal\n smoothing + vertex fitting) is iterated.\n boundary (bool): If true the boundary edges are smoothed only by\n themselves (e.g. the polyline forming the boundary of the mesh is\n independently smoothed). Can reduce the shrinking on the border but\n can have strange effects on very small boundaries.\n cotangent_weight (bool): If True the cotangent weighting scheme is\n computed for the averaging of the position. 
Otherwise (False) the\n simpler umbrella scheme (1 if the edge is present) is used.\n selected (bool): If selected, the filter is performed only on the\n selected faces\n\n Layer stack:\n No impacts\n\n MeshLab versions:\n 2016.12\n 1.3.4BETA\n \"\"\"\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function writes a Laplacian smoothing filter to the MeshLab script."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef taubin(script, iterations=10, t_lambda=0.5, t_mu=-0.53, selected=False):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function writes a Taubin smoothing filter (Taubin, Siggraph 1995) to the MeshLab script."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef depth(script, iterations=3, viewpoint=(0, 0, 0), selected=False):\n filter_xml = ''.join([\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n',\n ' \\n'])\n util.write_filter(script, filter_xml)\n return None", "response": "This function writes a smoothing filter in which vertices are constrained to move only along the view direction."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nmeasure the axis aligned bounding box (aabb) of a mesh in multiple coordinate systems.
Args: fbasename (str): filename of input model log (str): filename of log file coord_system (enum in ['CARTESIAN', 'CYLINDRICAL'] Coordinate system to use: 'CARTESIAN': lists contain [x, y, z] 'CYLINDRICAL': lists contain [r, theta, z] Returns: dict: dictionary with the following aabb properties min (3 element list): minimum values max (3 element list): maximum values center (3 element list): the center point size (3 element list): size of the aabb in each coordinate (max-min) diagonal (float): the diagonal of the aabb", "response": "def measure_aabb(fbasename=None, log=None, coord_system='CARTESIAN'):\n \"\"\" Measure the axis aligned bounding box (aabb) of a mesh\n in multiple coordinate systems.\n\n Args:\n fbasename (str): filename of input model\n log (str): filename of log file\n coord_system (enum in ['CARTESIAN', 'CYLINDRICAL']\n Coordinate system to use:\n 'CARTESIAN': lists contain [x, y, z]\n 'CYLINDRICAL': lists contain [r, theta, z]\n Returns:\n dict: dictionary with the following aabb properties\n min (3 element list): minimum values\n max (3 element list): maximum values\n center (3 element list): the center point\n size (3 element list): size of the aabb in each coordinate (max-min)\n diagonal (float): the diagonal of the aabb\n \"\"\"\n # TODO: add center point, spherical coordinate system\n fext = os.path.splitext(fbasename)[1][1:].strip().lower()\n if fext != 'xyz':\n fin = 'TEMP3D_aabb.xyz'\n run(log=log, file_in=fbasename, file_out=fin, script=None)\n else:\n fin = fbasename\n fread = open(fin, 'r')\n aabb = {'min': [999999.0, 999999.0, 999999.0], 'max': [-999999.0, -999999.0, -999999.0]}\n for line in fread:\n x_co, y_co, z_co = line.split()\n x_co = util.to_float(x_co)\n y_co = util.to_float(y_co)\n z_co = util.to_float(z_co)\n if coord_system == 'CARTESIAN':\n if x_co < aabb['min'][0]:\n aabb['min'][0] = x_co\n if y_co < aabb['min'][1]:\n aabb['min'][1] = y_co\n if z_co < aabb['min'][2]:\n aabb['min'][2] = z_co\n if x_co > aabb['max'][0]:\n 
aabb['max'][0] = x_co\n if y_co > aabb['max'][1]:\n aabb['max'][1] = y_co\n if z_co > aabb['max'][2]:\n aabb['max'][2] = z_co\n elif coord_system == 'CYLINDRICAL':\n radius = math.sqrt(x_co**2 + y_co**2)\n theta = math.degrees(math.atan2(y_co, x_co))\n if radius < aabb['min'][0]:\n aabb['min'][0] = radius\n if theta < aabb['min'][1]:\n aabb['min'][1] = theta\n if z_co < aabb['min'][2]:\n aabb['min'][2] = z_co\n if radius > aabb['max'][0]:\n aabb['max'][0] = radius\n if theta > aabb['max'][1]:\n aabb['max'][1] = theta\n if z_co > aabb['max'][2]:\n aabb['max'][2] = z_co\n fread.close()\n try:\n aabb['center'] = [(aabb['max'][0] + aabb['min'][0]) / 2,\n (aabb['max'][1] + aabb['min'][1]) / 2,\n (aabb['max'][2] + aabb['min'][2]) / 2]\n aabb['size'] = [aabb['max'][0] - aabb['min'][0], aabb['max'][1] - aabb['min'][1],\n aabb['max'][2] - aabb['min'][2]]\n aabb['diagonal'] = math.sqrt(\n aabb['size'][0]**2 +\n aabb['size'][1]**2 +\n aabb['size'][2]**2)\n except UnboundLocalError:\n print('Error: aabb input file does not contain valid data. 
Exiting ...')\n sys.exit(1)\n for key, value in aabb.items():\n if log is None:\n print('{:10} = {}'.format(key, value))\n else:\n log_file = open(log, 'a')\n log_file.write('{:10} = {}\\n'.format(key, value))\n log_file.close()\n \"\"\"\n if log is not None:\n log_file = open(log, 'a')\n #log_file.write('***Axis Aligned Bounding Results for file \"%s\":\\n' % fbasename)\n log_file.write('min = %s\\n' % aabb['min'])\n log_file.write('max = %s\\n' % aabb['max'])\n log_file.write('center = %s\\n' % aabb['center'])\n log_file.write('size = %s\\n' % aabb['size'])\n log_file.write('diagonal = %s\\n\\n' % aabb['diagonal'])\n log_file.close()\n # print(aabb)\n \"\"\"\n return aabb"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef measure_section(fbasename=None, log=None, axis='z', offset=0.0,\n rotate_x_angle=None, ml_version=ml_version):\n \"\"\"Measure a cross section of a mesh\n \n Perform a plane cut in one of the major axes (X, Y, Z). If you want to cut on\n a different plane you will need to rotate the model in place, perform the cut,\n and rotate it back.\n \n Args:\n fbasename (str): filename of input model\n log (str): filename of log file\n axis (str): axis perpendicular to the cutting plane, e.g. specify \"z\" to cut\n parallel to the XY plane.\n offset (float): amount to offset the cutting plane from the origin\n rotate_x_angle (float): degrees to rotate about the X axis. Useful for correcting \"Up\" direction: 90 to rotate Y to Z, and -90 to rotate Z to Y. 
\n\n Returns:\n dict: dictionary with the following keys for the aabb of the section:\n min (list): list of the x, y & z minimum values\n max (list): list of the x, y & z maximum values\n center (list): the x, y & z coordinates of the center of the aabb\n size (list): list of the x, y & z sizes (max - min)\n diagonal (float): the diagonal of the aabb\n \"\"\"\n ml_script1_file = 'TEMP3D_measure_section.mlx'\n file_out = 'TEMP3D_sect_aabb.xyz'\n\n ml_script1 = mlx.FilterScript(file_in=fbasename, file_out=file_out, ml_version=ml_version)\n if rotate_x_angle is not None:\n transform.rotate(ml_script1, axis='x', angle=rotate_x_angle)\n compute.section(ml_script1, axis=axis, offset=offset)\n layers.delete_lower(ml_script1)\n ml_script1.save_to_file(ml_script1_file)\n ml_script1.run_script(log=log, script_file=ml_script1_file)\n aabb = measure_aabb(file_out, log)\n return aabb", "response": "Measure a cross section of a mesh."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef polylinesort(fbasename=None, log=None):\n fext = os.path.splitext(fbasename)[1][1:].strip().lower()\n if fext != 'obj':\n print('Input file must be obj. 
Exiting ...')\n sys.exit(1)\n fread = open(fbasename, 'r')\n first = True\n polyline_vertices = []\n line_segments = []\n for line in fread:\n element, x_co, y_co, z_co = line.split()\n if element == 'v':\n polyline_vertices.append(\n [util.to_float(x_co), util.to_float(y_co), util.to_float(z_co)])\n elif element == 'l':\n p1 = x_co\n p2 = y_co\n line_segments.append([int(p1), int(p2)])\n\n fread.close()\n if log is not None:\n log_file = open(log, 'a')\n #log_file.write('***Axis Aligned Bounding Results for file \"%s\":\\n' % fbasename)\n \"\"\"log_file.write('min = %s\\n' % aabb['min'])\n log_file.write('max = %s\\n' % aabb['max'])\n log_file.write('center = %s\\n' % aabb['center'])\n log_file.write('size = %s\\n' % aabb['size'])\n log_file.write('diagonal = %s\\n' % aabb['diagonal'])\"\"\"\n log_file.close()\n # print(aabb)\n return None", "response": "Sort a continuous polyline or polylines."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nmeasuring the topology of a single mesh.", "response": "def measure_topology(fbasename=None, log=None, ml_version=ml_version):\n \"\"\"Measures mesh topology\n\n Args:\n fbasename (str): input filename.\n log (str): filename to log output\n\n Returns:\n dict: dictionary with the following keys:\n vert_num (int): number of vertices\n edge_num (int): number of edges\n face_num (int): number of faces\n unref_vert_num (int): number or unreferenced vertices\n boundry_edge_num (int): number of boundary edges\n part_num (int): number of parts (components) in the mesh.\n manifold (bool): True if mesh is two-manifold, otherwise false.\n non_manifold_edge (int): number of non_manifold edges.\n non_manifold_vert (int): number of non-manifold verices\n genus (int or str): genus of the mesh, either a number or\n 'undefined' if the mesh is non-manifold.\n holes (int or str): number of holes in the mesh, either a number\n or 'undefined' if the mesh is non-manifold.\n\n \"\"\"\n ml_script1_file = 
'TEMP3D_measure_topology.mlx'\n ml_script1 = mlx.FilterScript(file_in=fbasename, ml_version=ml_version)\n compute.measure_topology(ml_script1)\n ml_script1.save_to_file(ml_script1_file)\n ml_script1.run_script(log=log, script_file=ml_script1_file)\n topology = ml_script1.topology\n return topology"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nmeasures mesh geometry aabb and topology.", "response": "def measure_all(fbasename=None, log=None, ml_version=ml_version):\n \"\"\"Measures mesh geometry, aabb and topology.\"\"\"\n ml_script1_file = 'TEMP3D_measure_gAndT.mlx'\n if ml_version == '1.3.4BETA':\n file_out = 'TEMP3D_aabb.xyz'\n else:\n file_out = None\n\n ml_script1 = mlx.FilterScript(file_in=fbasename, file_out=file_out, ml_version=ml_version)\n compute.measure_geometry(ml_script1)\n compute.measure_topology(ml_script1)\n ml_script1.save_to_file(ml_script1_file)\n ml_script1.run_script(log=log, script_file=ml_script1_file)\n geometry = ml_script1.geometry\n topology = ml_script1.topology\n\n if ml_version == '1.3.4BETA':\n if log is not None:\n log_file = open(log, 'a')\n log_file.write(\n '***Axis Aligned Bounding Results for file \"%s\":\\n' %\n fbasename)\n log_file.close()\n aabb = measure_aabb(file_out, log)\n else:\n aabb = geometry['aabb']\n return aabb, geometry, topology"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef measure_dimension(fbasename=None, log=None, axis1=None, offset1=0.0,\n axis2=None, offset2=0.0, ml_version=ml_version):\n \"\"\"Measure a dimension of a mesh\"\"\"\n axis1 = axis1.lower()\n axis2 = axis2.lower()\n ml_script1_file = 'TEMP3D_measure_dimension.mlx'\n file_out = 'TEMP3D_measure_dimension.xyz'\n\n ml_script1 = mlx.FilterScript(file_in=fbasename, file_out=file_out, ml_version=ml_version)\n compute.section(ml_script1, axis1, offset1, surface=True)\n compute.section(ml_script1, axis2, offset2, surface=False)\n 
layers.delete_lower(ml_script1)\n ml_script1.save_to_file(ml_script1_file)\n ml_script1.run_script(log=log, script_file=ml_script1_file)\n\n for val in ('x', 'y', 'z'):\n if val not in (axis1, axis2):\n axis = val\n # ord: Get number that represents letter in ASCII\n # Here we find the offset from 'x' to determine the list reference\n # i.e. 0 for x, 1 for y, 2 for z\n axis_num = ord(axis) - ord('x')\n aabb = measure_aabb(file_out, log)\n dimension = {'min': aabb['min'][axis_num], 'max': aabb['max'][axis_num],\n 'length': aabb['size'][axis_num], 'axis': axis}\n if log is None:\n print('\\nFor file \"%s\"' % fbasename)\n print('Dimension parallel to %s with %s=%s & %s=%s:' % (axis, axis1, offset1,\n axis2, offset2))\n print(' Min = %s, Max = %s, Total length = %s' % (dimension['min'],\n dimension['max'], dimension['length']))\n else:\n log_file = open(log, 'a')\n log_file.write('\\nFor file \"%s\"\\n' % fbasename)\n log_file.write('Dimension parallel to %s with %s=%s & %s=%s:\\n' % (axis, axis1, offset1,\n axis2, offset2))\n log_file.write('min = %s\\n' % dimension['min'])\n log_file.write('max = %s\\n' % dimension['max'])\n log_file.write('Total length = %s\\n' % dimension['length'])\n log_file.close()\n return dimension", "response": "Measure a dimension of a mesh"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef lowercase_ext(filename):\n if '.' in filename:\n main, ext = os.path.splitext(filename)\n return main + ext.lower()\n # For consistency with os.path.splitext,\n # do not treat a filename without an extension as an extension.\n # That is, do not return filename.lower().\n return filename", "response": "Lowercase the extension of a file."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\npatch the request class of the app.", "response": "def patch_request_class(app, size=64 * 1024 * 1024):\n \"\"\"\n By default, Flask will accept uploads to an arbitrary size. 
While Werkzeug\n switches uploads from memory to a temporary file when they hit 500 KiB,\n it's still possible for someone to overload your disk space with a\n gigantic file.\n\n This patches the app's request class's\n `~werkzeug.BaseRequest.max_content_length` attribute so that any upload\n larger than the given size is rejected with an HTTP error.\n\n .. note::\n\n In Flask 0.6, you can do this by setting the `MAX_CONTENT_LENGTH`\n setting, without patching the request class. To emulate this behavior,\n you can pass `None` as the size (you must pass it explicitly). That is\n the best way to call this function, as it won't break the Flask 0.6\n functionality if it exists.\n\n .. versionchanged:: 0.1.1\n\n :param app: The app to patch the request class of.\n :param size: The maximum size to accept, in bytes. The default is 64 MiB.\n If it is `None`, the app's `MAX_CONTENT_LENGTH` configuration\n setting will be used to patch.\n \"\"\"\n if size is None:\n if isinstance(app.request_class.__dict__['max_content_length'],\n property):\n return\n size = app.config.get('MAX_CONTENT_LENGTH')\n reqclass = app.request_class\n patched = type(reqclass.__name__, (reqclass,),\n {'max_content_length': size})\n app.request_class = patched"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef config_for_set(uset, app, defaults=None):\n config = app.config\n prefix = 'UPLOADED_%s_' % uset.name.upper()\n using_defaults = False\n if defaults is None:\n defaults = dict(dest=None, url=None)\n\n allow_extns = tuple(config.get(prefix + 'ALLOW', ()))\n deny_extns = tuple(config.get(prefix + 'DENY', ()))\n destination = config.get(prefix + 'DEST')\n base_url = config.get(prefix + 'URL')\n\n if destination is None:\n # the upload set's destination wasn't given\n if uset.default_dest:\n # use the \"default_dest\" callable\n destination = uset.default_dest(app)\n if destination is None: # still\n # use the default dest from 
the config\n if defaults['dest'] is not None:\n using_defaults = True\n destination = os.path.join(defaults['dest'], uset.name)\n else:\n raise RuntimeError(\"no destination for set %s\" % uset.name)\n\n if base_url is None and using_defaults and defaults['url']:\n base_url = addslash(defaults['url']) + uset.name + '/'\n\n return UploadConfiguration(destination, base_url, allow_extns, deny_extns)", "response": "This function returns the configuration for a single set."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nconfiguring the uploads module for the given app.", "response": "def configure_uploads(app, upload_sets):\n \"\"\"\n Call this after the app has been configured. It will go through all the\n upload sets, get their configuration, and store the configuration on the\n app. It will also register the uploads module if it hasn't been set. This\n can be called multiple times with different upload sets.\n\n .. versionchanged:: 0.1.3\n The uploads module/blueprint will only be registered if it is needed\n to serve the upload sets.\n\n :param app: The `~flask.Flask` instance to get the configuration from.\n :param upload_sets: The `UploadSet` instances to configure.\n \"\"\"\n if isinstance(upload_sets, UploadSet):\n upload_sets = (upload_sets,)\n\n if not hasattr(app, 'upload_set_config'):\n app.upload_set_config = {}\n set_config = app.upload_set_config\n defaults = dict(dest=app.config.get('UPLOADS_DEFAULT_DEST'),\n url=app.config.get('UPLOADS_DEFAULT_URL'))\n\n for uset in upload_sets:\n config = config_for_set(uset, app, defaults)\n set_config[uset.name] = config\n\n should_serve = any(s.base_url is None for s in set_config.values())\n if '_uploads' not in app.blueprints and should_serve:\n app.register_blueprint(uploads_mod)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting the current configuration for the specified upload set.", "response": "def config(self):\n \"\"\"\n This gets the current 
configuration. By default, it looks up the\n current application and gets the configuration from there. But if you\n don't want to go to the full effort of setting an application, or it's\n otherwise outside of a request context, set the `_config` attribute to\n an `UploadConfiguration` instance, then set it back to `None` when\n you're done.\n \"\"\"\n if self._config is not None:\n return self._config\n try:\n return current_app.upload_set_config[self.name]\n except AttributeError:\n raise RuntimeError(\"cannot access configuration outside request\")"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef extension_allowed(self, ext):\n return ((ext in self.config.allow) or\n (ext in self.extensions and ext not in self.config.deny))", "response": "Determines whether a specific extension is allowed."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef save(self, storage, folder=None, name=None):\n if not isinstance(storage, FileStorage):\n raise TypeError(\"storage must be a werkzeug.FileStorage\")\n\n if folder is None and name is not None and \"/\" in name:\n folder, name = os.path.split(name)\n\n basename = self.get_basename(storage.filename)\n if name:\n if name.endswith('.'):\n basename = name + extension(basename)\n else:\n basename = name\n\n if not self.file_allowed(storage, basename):\n raise UploadNotAllowed()\n\n if folder:\n target_folder = os.path.join(self.config.destination, folder)\n else:\n target_folder = self.config.destination\n if not os.path.exists(target_folder):\n os.makedirs(target_folder)\n if os.path.exists(os.path.join(target_folder, basename)):\n basename = self.resolve_conflict(target_folder, basename)\n\n target = os.path.join(target_folder, basename)\n storage.save(target)\n if folder:\n return posixpath.join(folder, basename)\n else:\n return basename", "response": "This method saves a file into a specific folder or name."} {"SOURCE": 
"codesearchnet", "instruction": "Create a Python 3 function to\nreturn actual version specified in filename.", "response": "def get_vprof_version(filename):\n \"\"\"Returns actual version specified in filename.\"\"\"\n with open(filename) as src_file:\n version_match = re.search(r\"^__version__ = ['\\\"]([^'\\\"]*)['\\\"]\",\n src_file.read(), re.M)\n if version_match:\n return version_match.group(1)\n raise RuntimeError('Unable to find version info.')"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _remove_duplicates(objects):\n seen, uniq = set(), []\n for obj in objects:\n obj_id = id(obj)\n if obj_id in seen:\n continue\n seen.add(obj_id)\n uniq.append(obj)\n return uniq", "response": "Removes duplicate objects.\n\n http://www.peterbe.com/plog/uniqifiers-benchmark."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_obj_count_difference(objs1, objs2):\n clean_obj_list1 = _process_in_memory_objects(objs1)\n clean_obj_list2 = _process_in_memory_objects(objs2)\n obj_count_1 = _get_object_count_by_type(clean_obj_list1)\n obj_count_2 = _get_object_count_by_type(clean_obj_list2)\n return obj_count_1 - obj_count_2", "response": "Returns count difference in two collections of Python objects."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _trace_memory_usage(self, frame, event, arg): #pylint: disable=unused-argument\n if event == 'line' and frame.f_code.co_filename in self.target_modules:\n self._events_list.append(\n (frame.f_lineno, self._process.memory_info().rss,\n frame.f_code.co_name, frame.f_code.co_filename))\n return self._trace_memory_usage", "response": "Checks memory usage when line event occur."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef code_events(self):\n if 
self._resulting_events:\n return self._resulting_events\n for i, (lineno, mem, func, fname) in enumerate(self._events_list):\n mem_in_mb = float(mem - self.mem_overhead) / _BYTES_IN_MB\n if (self._resulting_events and\n self._resulting_events[-1][0] == lineno and\n self._resulting_events[-1][2] == func and\n self._resulting_events[-1][3] == fname and\n self._resulting_events[-1][1] < mem_in_mb):\n self._resulting_events[-1][1] = mem_in_mb\n else:\n self._resulting_events.append(\n [i + 1, lineno, mem_in_mb, func, fname])\n return self._resulting_events", "response": "Returns processed memory usage."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef obj_overhead(self):\n overhead = [\n self,\n self._resulting_events,\n self._events_list,\n self._process\n ]\n overhead_count = _get_object_count_by_type(overhead)\n # One for reference to __dict__ and one for reference to\n # the current module.\n overhead_count[dict] += 2\n return overhead_count", "response": "Returns all objects that are considered a profiler overhead."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ncomputes the memory overhead of the process.", "response": "def compute_mem_overhead(self):\n \"\"\"Returns memory overhead.\"\"\"\n self.mem_overhead = (self._process.memory_info().rss -\n builtins.initial_rss_size)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef profile_package(self):\n target_modules = base_profiler.get_pkg_module_names(self._run_object)\n try:\n with _CodeEventsTracker(target_modules) as prof:\n prof.compute_mem_overhead()\n runpy.run_path(self._run_object, run_name='__main__')\n except SystemExit:\n pass\n return prof, None", "response": "Returns memory stats for a package."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning memory stats for a module.", "response": "def profile_module(self):\n 
\"\"\"Returns memory stats for a module.\"\"\"\n target_modules = {self._run_object}\n try:\n with open(self._run_object, 'rb') as srcfile,\\\n _CodeEventsTracker(target_modules) as prof:\n code = compile(srcfile.read(), self._run_object, 'exec')\n prof.compute_mem_overhead()\n exec(code, self._globs, None)\n except SystemExit:\n pass\n return prof, None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef profile_function(self):\n target_modules = {self._run_object.__code__.co_filename}\n with _CodeEventsTracker(target_modules) as prof:\n prof.compute_mem_overhead()\n result = self._run_object(*self._run_args, **self._run_kwargs)\n return prof, result", "response": "Returns memory stats for a function."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef run(self):\n existing_objects = _get_in_memory_objects()\n prof, result = self.profile()\n new_objects = _get_in_memory_objects()\n\n new_obj_count = _get_obj_count_difference(new_objects, existing_objects)\n result_obj_count = new_obj_count - prof.obj_overhead\n\n # existing_objects list is also profiler overhead\n result_obj_count[list] -= 1\n pretty_obj_count = _format_obj_count(result_obj_count)\n return {\n 'objectName': self._object_name,\n 'codeEvents': prof.code_events,\n 'totalEvents': len(prof.code_events),\n 'objectsCount': pretty_obj_count,\n 'result': result,\n 'timestamp': int(time.time())\n }", "response": "Collects memory stats for specified Python program."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_pkg_module_names(package_path):\n module_names = set()\n for fobj, modname, _ in pkgutil.iter_modules(path=[package_path]):\n filename = os.path.join(fobj.path, '%s.py' % modname)\n if os.path.exists(filename):\n module_names.add(os.path.abspath(filename))\n return module_names", "response": "Returns a set of module filenames 
from a Python package."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nrun function in separate process.", "response": "def run_in_separate_process(func, *args, **kwargs):\n \"\"\"Runs function in separate process.\n\n This function is used instead of a decorator, since Python multiprocessing\n module can't serialize decorated function on all platforms.\n \"\"\"\n manager = multiprocessing.Manager()\n manager_dict = manager.dict()\n process = ProcessWithException(\n manager_dict, target=func, args=args, kwargs=kwargs)\n process.start()\n process.join()\n exc = process.exception\n if exc:\n raise exc\n return process.output"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ndetermines the type of run object.", "response": "def get_run_object_type(run_object):\n \"\"\"Determines run object type.\"\"\"\n if isinstance(run_object, tuple):\n return 'function'\n run_object, _, _ = run_object.partition(' ')\n if os.path.isdir(run_object):\n return 'package'\n return 'module'"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef init_module(self, run_object):\n self.profile = self.profile_module\n self._run_object, _, self._run_args = run_object.partition(' ')\n self._object_name = '%s (module)' % self._run_object\n self._globs = {\n '__file__': self._run_object,\n '__name__': '__main__',\n '__package__': None,\n }\n program_path = os.path.dirname(self._run_object)\n if sys.path[0] != program_path:\n sys.path.insert(0, program_path)\n self._replace_sysargs()", "response": "Initializes profiler with a module."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ninitializing profiler with a package.", "response": "def init_package(self, run_object):\n \"\"\"Initializes profiler with a package.\"\"\"\n self.profile = self.profile_package\n self._run_object, _, self._run_args = run_object.partition(' ')\n self._object_name = '%s 
(package)' % self._run_object\n self._replace_sysargs()"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ninitializes profiler with a function.", "response": "def init_function(self, run_object):\n \"\"\"Initializes profiler with a function.\"\"\"\n self.profile = self.profile_function\n self._run_object, self._run_args, self._run_kwargs = run_object\n filename = inspect.getsourcefile(self._run_object)\n self._object_name = '%s @ %s (function)' % (\n self._run_object.__name__, filename)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _replace_sysargs(self):\n sys.argv[:] = [self._run_object]\n if self._run_args:\n sys.argv += self._run_args.split()", "response": "Replaces sys. argv with proper args to pass to script."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsamples the current stack and adds result in self. _stats.", "response": "def sample(self, signum, frame): #pylint: disable=unused-argument\n \"\"\"Samples current stack and adds result in self._stats.\n\n Args:\n signum: Signal that activates handler.\n frame: Frame on top of the stack when signal is handled.\n \"\"\"\n stack = []\n while frame and frame != self.base_frame:\n stack.append((\n frame.f_code.co_name,\n frame.f_code.co_filename,\n frame.f_code.co_firstlineno))\n frame = frame.f_back\n self._stats[tuple(stack)] += 1\n signal.setitimer(signal.ITIMER_PROF, _SAMPLE_INTERVAL)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ninsert a stack into the call tree.", "response": "def _insert_stack(stack, sample_count, call_tree):\n \"\"\"Inserts stack into the call tree.\n\n Args:\n stack: Call stack.\n sample_count: Sample count of call stack.\n call_tree: Call tree.\n \"\"\"\n curr_level = call_tree\n for func in stack:\n next_level_index = {\n node['stack']: node for node in curr_level['children']}\n if func not in 
next_level_index:\n new_node = {'stack': func, 'children': [], 'sampleCount': 0}\n curr_level['children'].append(new_node)\n curr_level = new_node\n else:\n curr_level = next_level_index[func]\n curr_level['sampleCount'] = sample_count"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncount and fill sample counts inside a call tree.", "response": "def _fill_sample_count(self, node):\n \"\"\"Counts and fills sample counts inside call tree.\"\"\"\n node['sampleCount'] += sum(\n self._fill_sample_count(child) for child in node['children'])\n return node['sampleCount']"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _format_tree(self, node, total_samples):\n funcname, filename, _ = node['stack']\n sample_percent = self._get_percentage(\n node['sampleCount'], total_samples)\n color_hash = base_profiler.hash_name('%s @ %s' % (funcname, filename))\n return {\n 'stack': node['stack'],\n 'children': [self._format_tree(child, total_samples)\n for child in node['children']],\n 'sampleCount': node['sampleCount'],\n 'samplePercentage': sample_percent,\n 'colorHash': color_hash\n }", "response": "Reformats the call tree for the UI."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns a call tree.", "response": "def call_tree(self):\n \"\"\"Returns call tree.\"\"\"\n call_tree = {'stack': 'base', 'sampleCount': 0, 'children': []}\n for stack, sample_count in self._stats.items():\n self._insert_stack(reversed(stack), sample_count, call_tree)\n self._fill_sample_count(call_tree)\n if not call_tree['children']:\n return {}\n return self._format_tree(\n call_tree['children'][0], call_tree['sampleCount'])"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nrun statistical profiler on a package.", "response": "def _profile_package(self):\n \"\"\"Runs statistical profiler on a package.\"\"\"\n with _StatProfiler() as 
prof:\n prof.base_frame = inspect.currentframe()\n try:\n runpy.run_path(self._run_object, run_name='__main__')\n except SystemExit:\n pass\n\n call_tree = prof.call_tree\n return {\n 'objectName': self._object_name,\n 'sampleInterval': _SAMPLE_INTERVAL,\n 'runTime': prof.run_time,\n 'callStats': call_tree,\n 'totalSamples': call_tree.get('sampleCount', 0),\n 'timestamp': int(time.time())\n }"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nruns statistical profiler on a module.", "response": "def _profile_module(self):\n \"\"\"Runs statistical profiler on a module.\"\"\"\n with open(self._run_object, 'rb') as srcfile, _StatProfiler() as prof:\n code = compile(srcfile.read(), self._run_object, 'exec')\n prof.base_frame = inspect.currentframe()\n try:\n exec(code, self._globs, None)\n except SystemExit:\n pass\n\n call_tree = prof.call_tree\n return {\n 'objectName': self._object_name,\n 'sampleInterval': _SAMPLE_INTERVAL,\n 'runTime': prof.run_time,\n 'callStats': call_tree,\n 'totalSamples': call_tree.get('sampleCount', 0),\n 'timestamp': int(time.time())\n }"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef profile_function(self):\n with _StatProfiler() as prof:\n result = self._run_object(*self._run_args, **self._run_kwargs)\n\n call_tree = prof.call_tree\n return {\n 'objectName': self._object_name,\n 'sampleInterval': _SAMPLE_INTERVAL,\n 'runTime': prof.run_time,\n 'callStats': call_tree,\n 'totalSamples': call_tree.get('sampleCount', 0),\n 'result': result,\n 'timestamp': int(time.time())\n }", "response": "Runs statistical profiler on a function."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _transform_stats(prof):\n records = []\n for info, params in prof.stats.items():\n filename, lineno, funcname = info\n cum_calls, num_calls, time_per_call, cum_time, _ = params\n if prof.total_tt == 
0:\n percentage = 0\n else:\n percentage = round(100 * (cum_time / prof.total_tt), 4)\n cum_time = round(cum_time, 4)\n func_name = '%s @ %s' % (funcname, filename)\n color_hash = base_profiler.hash_name(func_name)\n records.append(\n (filename, lineno, funcname, cum_time, percentage, num_calls,\n cum_calls, time_per_call, filename, color_hash))\n return sorted(records, key=operator.itemgetter(4), reverse=True)", "response": "Processes collected stats for UI."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _profile_package(self):\n prof = cProfile.Profile()\n prof.enable()\n try:\n runpy.run_path(self._run_object, run_name='__main__')\n except SystemExit:\n pass\n prof.disable()\n prof_stats = pstats.Stats(prof)\n prof_stats.calc_callees()\n return {\n 'objectName': self._object_name,\n 'callStats': self._transform_stats(prof_stats),\n 'totalTime': prof_stats.total_tt,\n 'primitiveCalls': prof_stats.prim_calls,\n 'totalCalls': prof_stats.total_calls,\n 'timestamp': int(time.time())\n }", "response": "Runs cProfile on a package."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _profile_module(self):\n prof = cProfile.Profile()\n try:\n with open(self._run_object, 'rb') as srcfile:\n code = compile(srcfile.read(), self._run_object, 'exec')\n prof.runctx(code, self._globs, None)\n except SystemExit:\n pass\n prof_stats = pstats.Stats(prof)\n prof_stats.calc_callees()\n return {\n 'objectName': self._object_name,\n 'callStats': self._transform_stats(prof_stats),\n 'totalTime': prof_stats.total_tt,\n 'primitiveCalls': prof_stats.prim_calls,\n 'totalCalls': prof_stats.total_calls,\n 'timestamp': int(time.time())\n }", "response": "Runs cProfile on a module."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nrunning cProfile on a function.", "response": "def profile_function(self):\n \"\"\"Runs cProfile on a 
function.\"\"\"\n prof = cProfile.Profile()\n prof.enable()\n result = self._run_object(*self._run_args, **self._run_kwargs)\n prof.disable()\n prof_stats = pstats.Stats(prof)\n prof_stats.calc_callees()\n return {\n 'objectName': self._object_name,\n 'callStats': self._transform_stats(prof_stats),\n 'totalTime': prof_stats.total_tt,\n 'primitiveCalls': prof_stats.prim_calls,\n 'totalCalls': prof_stats.total_calls,\n 'result': result,\n 'timestamp': int(time.time())\n }"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef show_guestbook():\n cursor = flask.g.db.execute(\n 'SELECT name, message FROM entry ORDER BY id DESC;')\n entries = [{'name': row[0], 'message': row[1]} for row in cursor.fetchall()]\n return jinja2.Template(LAYOUT).render(entries=entries)", "response": "Returns all existing guestbook records."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding a single guestbook record.", "response": "def add_entry():\n \"\"\"Adds single guestbook record.\"\"\"\n name, msg = flask.request.form['name'], flask.request.form['message']\n flask.g.db.execute(\n 'INSERT INTO entry (name, message) VALUES (?, ?)', (name, msg))\n flask.g.db.commit()\n return flask.redirect('/')"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef start(host, port, profiler_stats, dont_start_browser, debug_mode):\n stats_handler = functools.partial(StatsHandler, profiler_stats)\n if not debug_mode:\n sys.stderr = open(os.devnull, 'w')\n print('Starting HTTP server...')\n if not dont_start_browser:\n webbrowser.open('http://{}:{}/'.format(host, port))\n try:\n StatsServer((host, port), stats_handler).serve_forever()\n except KeyboardInterrupt:\n print('Stopping...')\n sys.exit(0)", "response": "Starts the HTTP server with specified parameters."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nhandle index.html requests.", "response":
"def _handle_root():\n \"\"\"Handles index.html requests.\"\"\"\n res_filename = os.path.join(\n os.path.dirname(__file__), _PROFILE_HTML)\n with io.open(res_filename, 'rb') as res_file:\n content = res_file.read()\n return content, 'text/html'"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nhandles static file requests.", "response": "def _handle_other(self):\n \"\"\"Handles static files requests.\"\"\"\n res_filename = os.path.join(\n os.path.dirname(__file__), _STATIC_DIR, self.path[1:])\n with io.open(res_filename, 'rb') as res_file:\n content = res_file.read()\n _, extension = os.path.splitext(self.path)\n return content, 'text/%s' % extension[1:]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nhandle HTTP GET requests.", "response": "def do_GET(self):\n \"\"\"Handles HTTP GET requests.\"\"\"\n handler = self.uri_map.get(self.path) or self._handle_other\n content, content_type = handler()\n compressed_content = gzip.compress(content)\n self._send_response(\n 200, headers=(('Content-type', '%s; charset=utf-8' % content_type),\n ('Content-Encoding', 'gzip'),\n ('Content-Length', len(compressed_content))))\n self.wfile.write(compressed_content)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nhandle HTTP POST requests.", "response": "def do_POST(self):\n \"\"\"Handles HTTP POST requests.\"\"\"\n post_data = self.rfile.read(int(self.headers['Content-Length']))\n json_data = gzip.decompress(post_data)\n self._profile_json.update(json.loads(json_data.decode('utf-8')))\n self._send_response(\n 200, headers=(('Content-type', '%s; charset=utf-8' % 'text/json'),\n ('Content-Encoding', 'gzip'),\n ('Content-Length', len(post_data))))"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsends HTTP response code, message and headers.", "response": "def _send_response(self, http_code, 
message=None, headers=None):\n \"\"\"Sends HTTP response code, message and headers.\"\"\"\n self.send_response(http_code, message)\n if headers:\n for header in headers:\n self.send_header(*header)\n self.end_headers()"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks whether path belongs to standard library or installed modules.", "response": "def check_standard_dir(module_path):\n \"\"\"Checks whether path belongs to standard library or installed modules.\"\"\"\n if 'site-packages' in module_path:\n return True\n for stdlib_path in _STDLIB_PATHS:\n if fnmatch.fnmatchcase(module_path, stdlib_path + '*'):\n return True\n return False"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef record_line(self, frame, event, arg): # pylint: disable=unused-argument\n if event == 'line':\n if self.prev_timestamp:\n runtime = time.time() - self.prev_timestamp\n self.lines.append([self.prev_path, self.prev_lineno, runtime])\n self.prev_lineno = frame.f_lineno\n self.prev_path = frame.f_code.co_filename\n self.prev_timestamp = time.time()\n return self.record_line", "response": "Records the execution time of the line."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfilter standard library code from self.lines.", "response":
"def lines_without_stdlib(self):\n \"\"\"Filters code from standard library from self.lines.\"\"\"\n prev_line = None\n current_module_path = inspect.getabsfile(inspect.currentframe())\n for module_path, lineno, runtime in self.lines:\n module_abspath = os.path.abspath(module_path)\n if not prev_line:\n prev_line = [module_abspath, lineno, runtime]\n else:\n if (not check_standard_dir(module_path) and\n module_abspath != current_module_path):\n yield prev_line\n prev_line = [module_abspath, lineno, runtime]\n else:\n prev_line[2] += runtime\n yield prev_line"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nfills code heatmap and execution count dictionaries.", "response": "def fill_heatmap(self):\n \"\"\"Fills code heatmap and execution count dictionaries.\"\"\"\n for module_path, lineno, runtime in self.lines_without_stdlib:\n self._execution_count[module_path][lineno] += 1\n self._heatmap[module_path][lineno] += runtime"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncalculates the skip map for large sources.", "response": "def _calc_skips(self, heatmap, num_lines):\n \"\"\"Calculates skip map for large sources.\n Skip map is a list of tuples where first element of tuple is line\n number and second is length of the skip region:\n [(1, 10), (15, 10)] means skipping 10 lines after line 1 and\n 10 lines after line 15.\n \"\"\"\n if num_lines < self.MIN_SKIP_SIZE:\n return []\n skips, prev_line = [], 0\n for line in sorted(heatmap):\n curr_skip = line - prev_line - 1\n if curr_skip > self.SKIP_LINES:\n skips.append((prev_line, curr_skip))\n prev_line = line\n if num_lines - prev_line > self.SKIP_LINES:\n skips.append((prev_line, num_lines - prev_line))\n return skips"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _skip_lines(src_code, skip_map):\n if not skip_map:\n return [['line', j + 1, 
l] for j, l in enumerate(src_code)]\n code_with_skips, i = [], 0\n for line, length in skip_map:\n code_with_skips.extend(\n ['line', i + j + 1, l] for j, l in enumerate(src_code[i:line]))\n if (code_with_skips\n and code_with_skips[-1][0] == 'skip'): # Merge skips.\n code_with_skips[-1][1] += length\n else:\n code_with_skips.append(['skip', length])\n i = line + length\n code_with_skips.extend(\n ['line', i + j + 1, l] for j, l in enumerate(src_code[i:]))\n return code_with_skips", "response": "Skips lines in src_code specified by skip_map."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncalculate the run time for the given object.", "response": "def _profile_package(self):\n \"\"\"Calculates heatmap for package.\"\"\"\n with _CodeHeatmapCalculator() as prof:\n try:\n runpy.run_path(self._run_object, run_name='__main__')\n except SystemExit:\n pass\n\n heatmaps = []\n for filename, heatmap in prof.heatmap.items():\n if os.path.isfile(filename):\n heatmaps.append(\n self._format_heatmap(\n filename, heatmap, prof.execution_count[filename]))\n\n run_time = sum(heatmap['runTime'] for heatmap in heatmaps)\n return {\n 'objectName': self._run_object,\n 'runTime': run_time,\n 'heatmaps': heatmaps\n }"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nformatting heatmap for UI.", "response": "def _format_heatmap(self, filename, heatmap, execution_count):\n \"\"\"Formats heatmap for UI.\"\"\"\n with open(filename) as src_file:\n file_source = src_file.read().split('\\n')\n skip_map = self._calc_skips(heatmap, len(file_source))\n run_time = sum(time for time in heatmap.values())\n return {\n 'name': filename,\n 'heatmap': heatmap,\n 'executionCount': execution_count,\n 'srcCode': self._skip_lines(file_source, skip_map),\n 'runTime': run_time\n }"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _profile_module(self):\n with open(self._run_object, 'r') as srcfile:\n 
src_code = srcfile.read()\n code = compile(src_code, self._run_object, 'exec')\n try:\n with _CodeHeatmapCalculator() as prof:\n exec(code, self._globs, None)\n except SystemExit:\n pass\n\n heatmaps = []\n for filename, heatmap in prof.heatmap.items():\n if os.path.isfile(filename):\n heatmaps.append(\n self._format_heatmap(\n filename, heatmap, prof.execution_count[filename]))\n\n run_time = sum(heatmap['runTime'] for heatmap in heatmaps)\n return {\n 'objectName': self._run_object,\n 'runTime': run_time,\n 'heatmaps': heatmaps\n }", "response": "Calculates the heatmap for the module."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ncalculates heatmap for function.", "response": "def profile_function(self):\n \"\"\"Calculates heatmap for function.\"\"\"\n with _CodeHeatmapCalculator() as prof:\n result = self._run_object(*self._run_args, **self._run_kwargs)\n code_lines, start_line = inspect.getsourcelines(self._run_object)\n\n source_lines = []\n for line in code_lines:\n source_lines.append(('line', start_line, line))\n start_line += 1\n\n filename = os.path.abspath(inspect.getsourcefile(self._run_object))\n heatmap = prof.heatmap[filename]\n run_time = sum(time for time in heatmap.values())\n return {\n 'objectName': self._object_name,\n 'runTime': run_time,\n 'result': result,\n 'timestamp': int(time.time()),\n 'heatmaps': [{\n 'name': self._object_name,\n 'heatmap': heatmap,\n 'executionCount': prof.execution_count[filename],\n 'srcCode': source_lines,\n 'runTime': run_time\n }]\n }"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef run_profilers(run_object, prof_config, verbose=False):\n if len(prof_config) > len(set(prof_config)):\n raise AmbiguousConfigurationError(\n 'Profiler configuration %s is ambiguous' % prof_config)\n\n available_profilers = {opt for opt, _ in _PROFILERS}\n for option in prof_config:\n if option not in available_profilers:\n 
raise BadOptionError('Unknown option: %s' % option)\n\n run_stats = OrderedDict()\n present_profilers = ((o, p) for o, p in _PROFILERS if o in prof_config)\n for option, prof in present_profilers:\n curr_profiler = prof(run_object)\n if verbose:\n print('Running %s...' % curr_profiler.__class__.__name__)\n run_stats[option] = curr_profiler.run()\n return run_stats", "response": "Runs profilers on a single object."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nrun profilers on a function.", "response": "def run(func, options, args=(), kwargs={}, host='localhost', port=8000): # pylint: disable=dangerous-default-value\n \"\"\"Runs profilers on a function.\n\n Args:\n func: A Python function.\n options: A string with profilers configuration (i.e. 'cmh').\n args: func non-keyword arguments.\n kwargs: func keyword arguments.\n host: Host name to send collected data.\n port: Port number to send collected data.\n\n Returns:\n A result of func execution.\n \"\"\"\n run_stats = run_profilers((func, args, kwargs), options)\n\n result = None\n for prof in run_stats:\n if not result:\n result = run_stats[prof]['result']\n del run_stats[prof]['result'] # Don't send result to remote host\n\n post_data = gzip.compress(\n json.dumps(run_stats).encode('utf-8'))\n urllib.request.urlopen('http://%s:%s' % (host, port), post_data)\n return result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\npredict probability estimates for the test vectors X.", "response": "def predict_proba(self, X):\n \"\"\"\n Return probability estimates for the RDD containing test vector X.\n\n Parameters\n ----------\n X : RDD containing array-like items, shape = [m_samples, n_features]\n\n Returns\n -------\n C : RDD with array-like items, shape = [n_samples, n_classes]\n Returns the probability of the samples for each class in\n the models for each RDD block. 
The columns correspond to the classes\n in sorted order, as they appear in the attribute `classes_`.\n \"\"\"\n check_rdd(X, (sp.spmatrix, np.ndarray))\n return X.map(\n lambda X: super(SparkBaseNB, self).predict_proba(X))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef predict_log_proba(self, X):\n # required, scikit call self.predict_log_proba(X) in predict_proba\n # and thus this function is call, it must have the same behavior when\n # not called by sparkit-learn\n if not isinstance(X, BlockRDD):\n return super(SparkBaseNB, self).predict_log_proba(X)\n\n check_rdd(X, (sp.spmatrix, np.ndarray))\n return X.map(\n lambda X: super(SparkBaseNB, self).predict_log_proba(X))", "response": "Predicts the log probability estimates for the test vectors X."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fit(self, Z, classes=None):\n check_rdd(Z, {'X': (sp.spmatrix, np.ndarray), 'y': (sp.spmatrix, np.ndarray)})\n models = Z[:, ['X', 'y']].map(\n lambda X_y: self.partial_fit(X_y[0], X_y[1], classes))\n avg = models.reduce(operator.add)\n self.__dict__.update(avg.__dict__)\n return self", "response": "Fit Gaussian Naive Bayes according to X y."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfitting Multinomial Naive Bayes according to X y.", "response": "def fit(self, Z, classes=None):\n \"\"\"\n TODO fulibacsi fix docstring\n Fit Multinomial Naive Bayes according to (X,y) pair\n which is zipped into TupleRDD Z.\n\n Parameters\n ----------\n Z : TupleRDD containing X [array-like, shape (m_samples, n_features)]\n and y [array-like, shape (m_samples,)] tuple\n Training vectors, where ,_samples is the number of samples in the\n block and n_features is the number of features, and y contains\n the target values.\n\n Returns\n -------\n self : object\n Returns self.\n \"\"\"\n check_rdd(Z, {'X': (sp.spmatrix, np.ndarray), 'y': (sp.spmatrix, 
np.ndarray)})\n if 'w' in Z.columns:\n models = Z[:, ['X', 'y', 'w']].map(\n lambda X_y_w: self.partial_fit(\n X_y_w[0], X_y_w[1], classes, X_y_w[2]\n )\n )\n else:\n models = Z[:, ['X', 'y']].map(\n lambda X_y: self.partial_fit(X_y[0], X_y[1], classes))\n avg = models.sum()\n self.__dict__.update(avg.__dict__)\n return self"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncreate vocabulary for the current language.", "response": "def _init_vocab(self, analyzed_docs):\n \"\"\"Create vocabulary\n \"\"\"\n class SetAccum(AccumulatorParam):\n\n def zero(self, initialValue):\n return set(initialValue)\n\n def addInPlace(self, v1, v2):\n v1 |= v2\n return v1\n\n if not self.fixed_vocabulary_:\n accum = analyzed_docs._rdd.context.accumulator(set(), SetAccum())\n analyzed_docs.foreach(\n lambda x: accum.add(set(chain.from_iterable(x))))\n vocabulary = {t: i for i, t in enumerate(accum.value)}\n else:\n vocabulary = self.vocabulary_\n\n if not vocabulary:\n raise ValueError(\"empty vocabulary; perhaps the documents only\"\n \" contain stop words\")\n return vocabulary"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _count_vocab(self, analyzed_docs):\n vocabulary = self.vocabulary_\n j_indices = _make_int_array()\n indptr = _make_int_array()\n indptr.append(0)\n for doc in analyzed_docs:\n for feature in doc:\n try:\n j_indices.append(vocabulary[feature])\n except KeyError:\n # Ignore out-of-vocabulary items for fixed_vocab=True\n continue\n indptr.append(len(j_indices))\n\n j_indices = frombuffer_empty(j_indices, dtype=np.intc)\n indptr = np.frombuffer(indptr, dtype=np.intc)\n values = np.ones(len(j_indices))\n\n X = sp.csr_matrix((values, j_indices, indptr),\n shape=(len(indptr) - 1, len(vocabulary)),\n dtype=self.dtype)\n X.sum_duplicates()\n\n if self.binary:\n X.data.fill(1)\n\n return X", "response": "Create sparse feature matrix and vocabulary where fixed_vocab = False\n is True"} {"SOURCE": 
"codesearchnet", "instruction": "Create a Python 3 function for\nsorting features by name Returns a reordered matrix and modifies the vocabulary in place", "response": "def _sort_features(self, vocabulary):\n \"\"\"Sort features by name\n\n Returns a reordered matrix and modifies the vocabulary in place\n \"\"\"\n sorted_features = sorted(six.iteritems(vocabulary))\n map_index = np.empty(len(sorted_features), dtype=np.int32)\n for new_val, (term, old_val) in enumerate(sorted_features):\n map_index[new_val] = old_val\n vocabulary[term] = new_val\n\n return map_index"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nremoves too rare or too common features.", "response": "def _limit_features(self, X, vocabulary, high=None, low=None,\n limit=None):\n \"\"\"Remove too rare or too common features.\n\n Prune features that are non zero in more samples than high or less\n documents than low, modifying the vocabulary, and restricting it to\n at most the limit most frequent.\n\n This does not prune samples with zero features.\n \"\"\"\n if high is None and low is None and limit is None:\n return X, set()\n\n # Calculate a mask based on document frequencies\n dfs = X.map(_document_frequency).sum()\n tfs = X.map(lambda x: np.asarray(x.sum(axis=0))).sum().ravel()\n mask = np.ones(len(dfs), dtype=bool)\n if high is not None:\n mask &= dfs <= high\n if low is not None:\n mask &= dfs >= low\n if limit is not None and mask.sum() > limit:\n mask_inds = (-tfs[mask]).argsort()[:limit]\n new_mask = np.zeros(len(dfs), dtype=bool)\n new_mask[np.where(mask)[0][mask_inds]] = True\n mask = new_mask\n\n new_indices = np.cumsum(mask) - 1 # maps old indices to new\n removed_terms = set()\n for term, old_index in list(six.iteritems(vocabulary)):\n if mask[old_index]:\n vocabulary[term] = new_indices[old_index]\n else:\n del vocabulary[term]\n removed_terms.add(term)\n kept_indices = np.where(mask)[0]\n\n if len(kept_indices) == 0:\n raise 
ValueError(\"After pruning, no terms remain. Try a lower\"\n \" min_df or a higher max_df.\")\n\n return kept_indices, removed_terms"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nlearns the vocabulary dictionary and return the term - document matrix.", "response": "def fit_transform(self, Z):\n \"\"\"Learn the vocabulary dictionary and return term-document matrix.\n\n This is equivalent to fit followed by transform, but more efficiently\n implemented.\n\n Parameters\n ----------\n Z : iterable or DictRDD with column 'X'\n An iterable of raw_documents which yields either str, unicode or\n file objects; or a DictRDD with column 'X' containing such\n iterables.\n\n Returns\n -------\n X : array, [n_samples, n_features] or DictRDD\n Document-term matrix.\n \"\"\"\n self._validate_vocabulary()\n\n # map analyzer and cache result\n analyze = self.build_analyzer()\n A = Z.transform(lambda X: list(map(analyze, X)), column='X').persist()\n\n # create vocabulary\n X = A[:, 'X'] if isinstance(A, DictRDD) else A\n self.vocabulary_ = self._init_vocab(X)\n\n # transform according to vocabulary\n mapper = self.broadcast(self._count_vocab, A.context)\n Z = A.transform(mapper, column='X', dtype=sp.spmatrix)\n\n\n if not self.fixed_vocabulary_:\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n\n max_df = self.max_df\n min_df = self.min_df\n max_features = self.max_features\n\n # limit features according to min_df, max_df parameters\n n_doc = X.shape[0]\n max_doc_count = (max_df\n if isinstance(max_df, numbers.Integral)\n else max_df * n_doc)\n min_doc_count = (min_df\n if isinstance(min_df, numbers.Integral)\n else min_df * n_doc)\n if max_doc_count < min_doc_count:\n raise ValueError(\n \"max_df corresponds to < documents than min_df\")\n kept_indices, self.stop_words_ = self._limit_features(\n X, self.vocabulary_, max_doc_count, min_doc_count, max_features)\n\n # sort features\n map_index = self._sort_features(self.vocabulary_)\n\n # 
combined mask\n mask = kept_indices[map_index]\n\n Z = Z.transform(lambda x: x[:, mask], column='X', dtype=sp.spmatrix)\n A.unpersist()\n return Z"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ntransforms documents to document - term matrix.", "response": "def transform(self, Z):\n \"\"\"Transform documents to document-term matrix.\n\n Extract token counts out of raw text documents using the vocabulary\n fitted with fit or the one provided to the constructor.\n\n Parameters\n ----------\n raw_documents : iterable\n An iterable which yields either str, unicode or file objects.\n\n Returns\n -------\n X : sparse matrix, [n_samples, n_features]\n Document-term matrix.\n \"\"\"\n if not hasattr(self, 'vocabulary_'):\n self._validate_vocabulary()\n\n self._check_vocabulary()\n\n analyze = self.build_analyzer()\n mapper = self.broadcast(self._count_vocab, Z.context)\n\n Z = Z.transform(lambda X: list(map(analyze, X)), column='X') \\\n .transform(mapper, column='X', dtype=sp.spmatrix)\n\n return Z"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ntransforms an ArrayRDD or DictRDD containing sequence of documents to a document - term matrix.", "response": "def transform(self, Z):\n \"\"\"Transform an ArrayRDD (or DictRDD with column 'X') containing\n sequence of documents to a document-term matrix.\n\n Parameters\n ----------\n Z : ArrayRDD or DictRDD with raw text documents\n Samples. 
Each sample must be a text document (either bytes or\n unicode strings) which will be tokenized and hashed.\n\n Returns\n -------\n Z : SparseRDD/DictRDD containg scipy.sparse matrix\n Document-term matrix.\n \"\"\"\n mapper = super(SparkHashingVectorizer, self).transform\n return Z.transform(mapper, column='X', dtype=sp.spmatrix)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fit(self, Z):\n\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n check_rdd(X, (sp.spmatrix, np.ndarray))\n\n def mapper(X, use_idf=self.use_idf):\n if not sp.issparse(X):\n X = sp.csc_matrix(X)\n if use_idf:\n return _document_frequency(X)\n\n if self.use_idf:\n n_samples, n_features = X.shape\n df = X.map(mapper).treeReduce(operator.add)\n\n # perform idf smoothing if required\n df += int(self.smooth_idf)\n n_samples += int(self.smooth_idf)\n\n # log1p instead of log makes sure terms with zero idf don't get\n # suppressed entirely\n idf = np.log(float(n_samples) / df) + 1.0\n self._idf_diag = sp.spdiags(idf,\n diags=0, m=n_features, n=n_features)\n return self", "response": "Learn the idf vector for the current term counts and the term counts in Z."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfit the log - likelihood to the given set of data.", "response": "def fit(self, Z):\n \"\"\"Compute the mean and std to be used for later scaling.\n Parameters\n ----------\n Z : DictRDD containing (X, y) pairs\n X - Training vector.\n {array-like, sparse matrix}, shape [n_samples, n_features]\n The data used to compute the mean and standard deviation\n used for later scaling along the features axis.\n y - Target labels\n Passthrough for ``Pipeline`` compatibility.\n \"\"\"\n\n # Reset internal state before fitting\n self._reset()\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n check_rdd(X, (np.ndarray, sp.spmatrix))\n\n def mapper(X):\n \"\"\"Calculate statistics for every numpy or scipy 
blocks.\"\"\"\n X = check_array(X, ('csr', 'csc'), dtype=np.float64)\n if hasattr(X, \"toarray\"): # sparse matrix\n mean, var = mean_variance_axis(X, axis=0)\n else:\n mean, var = np.mean(X, axis=0), np.var(X, axis=0)\n return X.shape[0], mean, var\n\n def reducer(a, b):\n \"\"\"Calculate the combined statistics.\"\"\"\n n_a, mean_a, var_a = a\n n_b, mean_b, var_b = b\n n_ab = n_a + n_b\n mean_ab = ((mean_a * n_a) + (mean_b * n_b)) / n_ab\n var_ab = (((n_a * var_a) + (n_b * var_b)) / n_ab) + \\\n ((n_a * n_b) * ((mean_b - mean_a) / n_ab) ** 2)\n return (n_ab, mean_ab, var_ab)\n\n if check_rdd_dtype(X, (sp.spmatrix)):\n if self.with_mean:\n raise ValueError(\n \"Cannot center sparse matrices: pass `with_mean=False` \"\n \"instead. See docstring for motivation and alternatives.\")\n self.n_samples_seen_, self.mean_, self.var_ = X.map(mapper).treeReduce(reducer)\n\n if self.with_std:\n self.scale_ = _handle_zeros_in_scale(np.sqrt(self.var_))\n else:\n self.scale_ = None\n\n return self"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef transform(self, Z):\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n check_rdd(X, (np.ndarray, sp.spmatrix))\n\n if check_rdd_dtype(X, (sp.spmatrix)):\n if self.with_mean:\n raise ValueError(\n \"Cannot center sparse matrices: pass `with_mean=False` \"\n \"instead. 
See docstring for motivation and alternatives.\")\n if self.scale_ is not None:\n def mapper(X):\n inplace_column_scale(X, 1 / self.scale_)\n return X\n else:\n if self.with_mean:\n if self.with_std:\n def mapper(X):\n X -= self.mean_\n X /= self.scale_\n return X\n else:\n def mapper(X):\n X -= self.mean_\n return X\n else:\n if self.with_std:\n def mapper(X):\n X /= self.scale_\n return X\n else:\n raise ValueError(\"Need with_std or with_mean \")\n return Z.transform(mapper, column=\"X\")", "response": "Perform standardization by centering and scaling."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef to_scikit(self):\n scaler = StandardScaler(with_mean=self.with_mean,\n with_std=self.with_std,\n copy=self.copy)\n scaler.__dict__ = self.__dict__\n return scaler", "response": "Convert to equivalent StandardScaler"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nwraps a Scikit-learn Linear model's fit method to use with RDD input.", "response": "def _spark_fit(self, cls, Z, *args, **kwargs):\n \"\"\"Wraps a Scikit-learn Linear model's fit method to use with RDD\n input.\n\n Parameters\n ----------\n cls : class object\n The sklearn linear model's class to wrap.\n Z : TupleRDD or DictRDD\n The distributed train data in a DictRDD.\n\n Returns\n -------\n self: the wrapped class\n \"\"\"\n mapper = lambda X_y: super(cls, self).fit(\n X_y[0], X_y[1], *args, **kwargs\n )\n models = Z.map(mapper)\n avg = models.reduce(operator.add) / models.count()\n self.__dict__.update(avg.__dict__)\n return self"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _spark_predict(self, cls, X, *args, **kwargs):\n return X.map(lambda X: super(cls, self).predict(X, *args, **kwargs))", "response": "Wraps a Scikit-learn Linear model's predict method to use with RDD\n input."} {"SOURCE":
"codesearchnet", "instruction": "Implement a Python 3 function for\nfitting linear model. Parameters ---------- Z : DictRDD with (X, y) values X containing numpy array or sparse matrix - The training data y containing the target values Returns ------- self : returns an instance of self.", "response": "def fit(self, Z):\n \"\"\"\n Fit linear model.\n\n Parameters\n ----------\n Z : DictRDD with (X, y) values\n X containing numpy array or sparse matrix - The training data\n y containing the target values\n\n Returns\n -------\n self : returns an instance of self.\n \"\"\"\n check_rdd(Z, {'X': (sp.spmatrix, np.ndarray)})\n return self._spark_fit(SparkLinearRegression, Z)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nfit all the transforms one after the other and transform the data using the final estimator.", "response": "def fit(self, Z, **fit_params):\n \"\"\"Fit all the transforms one after the other and transform the\n data, then fit the transformed data using the final estimator.\n\n Parameters\n ----------\n Z : ArrayRDD, TupleRDD or DictRDD\n Input data in blocked distributed format.\n\n Returns\n -------\n self : SparkPipeline\n \"\"\"\n Zt, fit_params = self._pre_transform(Z, **fit_params)\n self.steps[-1][-1].fit(Zt, **fit_params)\n Zt.unpersist()\n return self"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfitting all the transforms one after the other and transform the data. 
Use fit_transform on transformed data using the final estimator.", "response": "def fit_transform(self, Z, **fit_params):\n \"\"\"Fit all the transforms one after the other and transform the\n data, then use fit_transform on transformed data using the final\n estimator.\"\"\"\n Zt, fit_params = self._pre_transform(Z, **fit_params)\n if hasattr(self.steps[-1][-1], 'fit_transform'):\n return self.steps[-1][-1].fit_transform(Zt, **fit_params)\n else:\n return self.steps[-1][-1].fit(Zt, **fit_params).transform(Zt)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef score(self, Z):\n Zt = Z\n for name, transform in self.steps[:-1]:\n Zt = transform.transform(Zt)\n return self.steps[-1][-1].score(Zt)", "response": "Applies transforms to the data and returns the score of the final estimator. Valid only if the final estimator implements the score method."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef fit(self, Z, **fit_params):\n fit_params_steps = dict((step, {})\n for step, _ in self.transformer_list)\n\n for pname, pval in six.iteritems(fit_params):\n step, param = pname.split('__', 1)\n fit_params_steps[step][param] = pval\n\n transformers = Parallel(n_jobs=self.n_jobs, backend=\"threading\")(\n delayed(_fit_one_transformer)(trans, Z, **fit_params_steps[name])\n for name, trans in self.transformer_list)\n self._update_transformer_list(transformers)\n return self", "response": "Fits all transformers using X."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef fit_transform(self, Z, **fit_params):\n return self.fit(Z, **fit_params).transform(Z)", "response": "Fit all transformers using X, transform the data and concatenate the results."} {"SOURCE": "codesearchnet",
"instruction": "Can you tell what is the following Python 3 function doing\ndef transform(self, Z):\n if isinstance(Z, DictRDD):\n X = Z[:, 'X']\n else:\n X = Z\n\n Zs = [_transform_one(trans, name, X, self.transformer_weights)\n for name, trans in self.transformer_list]\n X_rdd = reduce(lambda x, y: x.zip(y._rdd), Zs)\n X_rdd = X_rdd.map(flatten)\n mapper = np.hstack\n for item in X_rdd.first():\n if sp.issparse(item):\n mapper = sp.hstack\n X_rdd = X_rdd.map(lambda x: mapper(x))\n\n if isinstance(Z, DictRDD):\n return DictRDD([X_rdd, Z[:, 'y']],\n columns=Z.columns,\n dtype=Z.dtype,\n bsize=Z.bsize)\n else:\n return X_rdd", "response": "Transform X by each transformer."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef fit(self, Z, classes=None):\n check_rdd(Z, {'X': (sp.spmatrix, np.ndarray)})\n mapper = lambda X_y: super(SparkRandomForestClassifier, self).fit(\n X_y[0], X_y[1]\n )\n\n models = Z.map(mapper).collect()\n\n self.__dict__ = models[0].__dict__\n self.estimators_ = []\n for m in models:\n self.estimators_ += m.estimators_\n self.n_estimators = len(self.estimators_)\n return self", "response": "Fit the model according to the given training data."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _fit(self, Z, parameter_iterable):\n self.scorer_ = check_scoring(self.estimator, scoring=self.scoring)\n\n cv = self.cv\n cv = _check_cv(cv, Z)\n\n if self.verbose > 0:\n if isinstance(parameter_iterable, Sized):\n n_candidates = len(parameter_iterable)\n print(\"Fitting {0} folds for each of {1} candidates, totalling\"\n \" {2} fits\".format(len(cv), n_candidates,\n n_candidates * len(cv)))\n\n base_estimator = clone(self.estimator)\n\n pre_dispatch = self.pre_dispatch\n\n out = Parallel(\n n_jobs=self.n_jobs, verbose=self.verbose,\n pre_dispatch=pre_dispatch, backend=\"threading\"\n )(\n 
delayed(_fit_and_score)(clone(base_estimator), Z, self.scorer_,\n train, test, self.verbose, parameters,\n self.fit_params, return_parameters=True,\n error_score=self.error_score)\n for parameters in parameter_iterable\n for train, test in cv)\n\n # Out is a list of triplet: score, estimator, n_test_samples\n n_fits = len(out)\n n_folds = len(cv)\n\n scores = list()\n grid_scores = list()\n for grid_start in range(0, n_fits, n_folds):\n n_test_samples = 0\n score = 0\n all_scores = []\n for this_score, this_n_test_samples, _, parameters in \\\n out[grid_start:grid_start + n_folds]:\n all_scores.append(this_score)\n if self.iid:\n this_score *= this_n_test_samples\n n_test_samples += this_n_test_samples\n score += this_score\n if self.iid:\n score /= float(n_test_samples)\n else:\n score /= float(n_folds)\n scores.append((score, parameters))\n # TODO: shall we also store the test_fold_sizes?\n grid_scores.append(_CVScoreTuple(\n parameters,\n score,\n np.array(all_scores)))\n # Store the computed scores\n self.grid_scores_ = grid_scores\n\n # Find the best parameters by comparing on the mean validation score:\n # note that `sorted` is deterministic in the way it breaks ties\n best = sorted(grid_scores, key=lambda x: x.mean_validation_score,\n reverse=True)[0]\n self.best_params_ = best.parameters\n self.best_score_ = best.mean_validation_score\n\n if self.refit:\n # fit the best estimator using the entire dataset\n # clone first to work around broken estimators\n best_estimator = clone(base_estimator).set_params(\n **best.parameters)\n best_estimator.fit(Z, **self.fit_params)\n self.best_estimator_ = best_estimator\n return self", "response": "Perform the actual fitting of the parameters."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef fit(self, y):\n\n def mapper(y):\n y = column_or_1d(y, warn=True)\n _check_numpy_unicode_bug(y)\n return np.unique(y)\n\n def reducer(a, b):\n return 
np.unique(np.concatenate((a, b)))\n\n self.classes_ = y.map(mapper).reduce(reducer)\n\n return self", "response": "Fit label encoder to target values."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ntransform labels to normalized encoding.", "response": "def transform(self, y):\n \"\"\"Transform labels to normalized encoding.\n Parameters\n ----------\n y : ArrayRDD [n_samples]\n Target values.\n Returns\n -------\n y : ArrayRDD [n_samples]\n \"\"\"\n mapper = super(SparkLabelEncoder, self).transform\n mapper = self.broadcast(mapper, y.context)\n return y.transform(mapper)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _score(estimator, Z_test, scorer):\n score = scorer(estimator, Z_test)\n if not isinstance(score, numbers.Number):\n raise ValueError(\"scoring must return a number, got %s (%s) instead.\"\n % (str(score), type(score)))\n return score", "response": "Compute the score of an estimator on a given test set."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef fit(self, Z):\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n check_rdd(X, (np.ndarray, sp.spmatrix))\n if self.init == 'k-means||':\n self._mllib_model = MLlibKMeans.train(\n X.unblock(),\n self.n_clusters,\n maxIterations=self.max_iter,\n initializationMode=\"k-means||\")\n self.cluster_centers_ = self._mllib_model.centers\n else:\n models = X.map(lambda X: super(SparkKMeans, self).fit(X))\n models = models.map(lambda model: model.cluster_centers_).collect()\n return super(SparkKMeans, self).fit(np.concatenate(models))", "response": "Compute k - means clustering."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\npredict the closest cluster each sample in X belongs to.", "response": "def predict(self, X):\n \"\"\"Predict the closest cluster each sample in X belongs to.\n\n In the vector quantization literature, 
`cluster_centers_` is called\n the code book and each value returned by `predict` is the index of\n the closest code in the code book.\n\n Parameters\n ----------\n X : ArrayRDD containing array-like, sparse matrix\n New data to predict.\n\n Returns\n -------\n labels : ArrayRDD with predictions\n Index of the cluster each sample belongs to.\n\n \"\"\"\n check_rdd(X, (np.ndarray, sp.spmatrix))\n if hasattr(self, '_mllib_model'):\n if isinstance(X, ArrayRDD):\n X = X.unblock()\n return X.map(lambda x: self._mllib_model.predict(x))\n else:\n rdd = X.map(lambda X: super(SparkKMeans, self).predict(X))\n return ArrayRDD(rdd)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef fit(self, Z, classes=None):\n check_rdd(Z, {'X': (sp.spmatrix, np.ndarray)})\n self._classes_ = np.unique(classes)\n return self._spark_fit(SparkSGDClassifier, Z)", "response": "Fit the model according to the given training data."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef predict(self, X):\n check_rdd(X, (sp.spmatrix, np.ndarray))\n return self._spark_predict(SparkSGDClassifier, X)", "response": "Distributed method to predict class labels for samples in X."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks if the types of blocks in the RDD are in the expected types.", "response": "def check_rdd_dtype(rdd, expected_dtype):\n \"\"\"Checks if the blocks in the RDD matches the expected types.\n\n Parameters:\n -----------\n rdd: splearn.BlockRDD\n The RDD to check\n expected_dtype: {type, list of types, tuple of types, dict of types}\n Expected type(s). 
If the RDD is a DictRDD the parameter type is\n restricted to dict.\n\n Returns:\n --------\n accept: bool\n Returns if the types are matched.\n \"\"\"\n if not isinstance(rdd, BlockRDD):\n raise TypeError(\"Expected {0} for parameter rdd, got {1}.\"\n .format(BlockRDD, type(rdd)))\n if isinstance(rdd, DictRDD):\n if not isinstance(expected_dtype, dict):\n raise TypeError('Expected {0} for parameter '\n 'expected_dtype, got {1}.'\n .format(dict, type(expected_dtype)))\n accept = True\n types = dict(list(zip(rdd.columns, rdd.dtype)))\n for key, values in expected_dtype.items():\n if not isinstance(values, (tuple, list)):\n values = [values]\n accept = accept and types[key] in values\n return accept\n\n if not isinstance(expected_dtype, (tuple, list)):\n expected_dtype = [expected_dtype]\n\n return rdd.dtype in expected_dtype"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nlearning a list of feature name - > indices mappings.", "response": "def fit(self, Z):\n \"\"\"Learn a list of feature name -> indices mappings.\n\n Parameters\n ----------\n Z : DictRDD with column 'X'\n Dict(s) or Mapping(s) from feature names (arbitrary Python\n objects) to feature values (strings or convertible to dtype).\n\n Returns\n -------\n self\n \"\"\"\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n\n \"\"\"Create vocabulary\n \"\"\"\n class SetAccum(AccumulatorParam):\n\n def zero(self, initialValue):\n return set(initialValue)\n\n def addInPlace(self, v1, v2):\n v1 |= v2\n return v1\n\n accum = X.context.accumulator(set(), SetAccum())\n\n def mapper(X, separator=self.separator):\n feature_names = []\n for x in X:\n for f, v in six.iteritems(x):\n if isinstance(v, six.string_types):\n f = \"%s%s%s\" % (f, self.separator, v)\n feature_names.append(f)\n accum.add(set(feature_names))\n\n X.foreach(mapper) # init vocabulary\n feature_names = list(accum.value)\n\n if self.sort:\n feature_names.sort()\n\n vocab = dict((f, i) for i, f in 
enumerate(feature_names))\n\n self.feature_names_ = feature_names\n self.vocabulary_ = vocab\n\n return self"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ntransform an ArrayRDD's (or DictRDD's 'X' column's) feature->value dicts to an array or sparse matrix.", "response": "def transform(self, Z):\n \"\"\"Transform ArrayRDD's (or DictRDD's 'X' column's) feature->value dicts\n to array or sparse matrix.\n Named features not encountered during fit or fit_transform will be\n silently ignored.\n\n Parameters\n ----------\n Z : ArrayRDD or DictRDD with column 'X' containing Mapping or\n iterable over Mappings, length = n_samples\n Dict(s) or Mapping(s) from feature names (arbitrary Python\n objects) to feature values (strings or convertible to dtype).\n\n Returns\n -------\n Z : transformed, containing {array, sparse matrix}\n Feature vectors; always 2-d.\n \"\"\"\n mapper = self.broadcast(super(SparkDictVectorizer, self).transform,\n Z.context)\n dtype = sp.spmatrix if self.sparse else np.ndarray\n return Z.transform(mapper, column='X', dtype=dtype)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nlearn empirical variances from X.", "response": "def fit(self, Z):\n \"\"\"Learn empirical variances from X.\n\n Parameters\n ----------\n X : {array-like, sparse matrix}, shape (n_samples, n_features)\n Sample vectors from which to compute variances.\n\n y : any\n Ignored.
This parameter exists only for compatibility with\n sklearn.pipeline.Pipeline.\n\n Returns\n -------\n self\n \"\"\"\n\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n check_rdd(X, (np.ndarray, sp.spmatrix))\n\n def mapper(X):\n \"\"\"Calculate statistics for every numpy or scipy blocks.\"\"\"\n X = check_array(X, ('csr', 'csc'), dtype=np.float64)\n if hasattr(X, \"toarray\"): # sparse matrix\n mean, var = mean_variance_axis(X, axis=0)\n else:\n mean, var = np.mean(X, axis=0), np.var(X, axis=0)\n return X.shape[0], mean, var\n\n def reducer(a, b):\n \"\"\"Calculate the combined statistics.\"\"\"\n n_a, mean_a, var_a = a\n n_b, mean_b, var_b = b\n n_ab = n_a + n_b\n mean_ab = ((mean_a * n_a) + (mean_b * n_b)) / n_ab\n var_ab = (((n_a * var_a) + (n_b * var_b)) / n_ab) + \\\n ((n_a * n_b) * ((mean_b - mean_a) / n_ab) ** 2)\n return (n_ab, mean_ab, var_ab)\n\n _, _, self.variances_ = X.map(mapper).treeReduce(reducer)\n\n if np.all(self.variances_ <= self.threshold):\n msg = \"No feature in X meets the variance threshold {0:.5f}\"\n if X.shape[0] == 1:\n msg += \" (X contains only one sample)\"\n raise ValueError(msg.format(self.threshold))\n\n return self"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef svd(blocked_rdd, k):\n\n # compute the covariance matrix (without mean subtraction)\n # TODO use one func for this (with mean subtraction as an option?)\n c = blocked_rdd.map(lambda x: (x.T.dot(x), x.shape[0]))\n prod, n = c.reduce(lambda x, y: (x[0] + y[0], x[1] + y[1]))\n\n # do local eigendecomposition\n w, v = ln.eig(prod / n)\n w = np.real(w)\n v = np.real(v)\n inds = np.argsort(w)[::-1]\n s = np.sqrt(w[inds[0:k]]) * np.sqrt(n)\n v = v[:, inds[0:k]].T\n\n # project back into data, normalize by singular values\n u = blocked_rdd.map(lambda x: np.inner(x, v) / s)\n\n return u, s, v", "response": "Calculate the SVD of a blocked RDD directly returning only the leading k\n singular vectors."} {"SOURCE": 
"codesearchnet", "instruction": "Write a Python 3 script to\ncalculate the SVD of a blocked RDD using an expectation maximization algorithm that avoids explicitly computing the covariance matrix, returning only the leading k singular vectors.", "response": "def svd_em(blocked_rdd, k, maxiter=20, tol=1e-6, compute_u=True, seed=None):\n \"\"\"\n Calculate the SVD of a blocked RDD using an expectation maximization\n algorithm (from Roweis, NIPS, 1997) that avoids explicitly\n computing the covariance matrix, returning only the leading k\n singular vectors. Assumes n rows and d columns, does not require\n d^2 to fit into memory on a single machine.\n Parameters\n ----------\n blocked_rdd : ArrayRDD\n ArrayRDD with data points in numpy array blocks\n k : Int\n Number of singular vectors to return\n maxiter : Int, optional, default = 20\n Number of iterations to perform\n tol : Double, optional, default = 1e-5\n Tolerance for stopping iterative updates\n seed : Int, optional, default = None\n Seed for random number generator for initializing subspace\n Returns\n ----------\n u : RDD of blocks\n Left eigenvectors\n s : numpy array\n Singular values\n v : numpy array\n Right eigenvectors\n \"\"\"\n\n n, m = blocked_rdd.shape[:2]\n sc = blocked_rdd._rdd.context\n\n def outerprod(x):\n return x.T.dot(x)\n\n # global run_sum\n\n # def accumsum(x):\n # global run_sum\n # run_sum += x\n\n # class MatrixAccum(AccumulatorParam):\n\n # def zero(self, value):\n # return np.zeros(np.shape(value))\n\n # def addInPlace(self, val1, val2):\n # val1 += val2\n # return val1\n\n if seed is not None:\n rng = np.random.RandomState(seed)\n c = rng.randn(k, m)\n else:\n c = np.random.randn(k, m)\n\n iter = 0\n error = 100\n\n # iteratively update subspace using expectation maximization\n # e-step: x = (cc')^-1 c y\n # m-step: c = y x' (xx')^-1\n while (iter < maxiter) & (error > tol):\n c_old = c\n\n # pre compute (cc')^-1 c\n c_inv = np.dot(c.T, ln.inv(np.dot(c, c.T)))\n premult1 = sc.broadcast(c_inv)\n\n # compute (xx')^-1
through a map reduce\n xx = blocked_rdd.map(lambda x: outerprod(safe_sparse_dot(x, premult1.value))) \\\n .treeReduce(add)\n\n # compute (xx')^-1 using an accumulator\n # run_sum = sc.accumulator(np.zeros((k, k)), MatrixAccum())\n # blocked_rdd.map(lambda x: outerprod(safe_sparse_dot(x, premult1.value))) \\\n # .foreachPartition(lambda l: accumsum(sum(l)))\n # xx = run_sum.value\n xx_inv = ln.inv(xx)\n\n # pre compute (cc')^-1 c (xx')^-1\n premult2 = blocked_rdd.context.broadcast(np.dot(c_inv, xx_inv))\n\n # compute the new c through a map reduce\n c = blocked_rdd.map(lambda x: safe_sparse_dot(x.T, safe_sparse_dot(x, premult2.value))) \\\n .treeReduce(add)\n\n # compute the new c using an accumulator\n # run_sum = sc.accumulator(np.zeros((m, k)), MatrixAccum())\n # blocked_rdd.map(lambda x: safe_sparse_dot(x.T, safe_sparse_dot(x, premult2.value))) \\\n # .foreachPartition(lambda l: accumsum(sum(l)))\n # c = run_sum.value\n c = c.T\n\n error = np.sum((c - c_old) ** 2)\n iter += 1\n\n # project data into subspace spanned by columns of c\n # use standard eigendecomposition to recover an orthonormal basis\n c = ln.orth(c.T).T\n cov = blocked_rdd.map(lambda x: safe_sparse_dot(x, c.T)) \\\n .map(lambda x: outerprod(x)) \\\n .treeReduce(add)\n w, v = ln.eig(cov / n)\n w = np.real(w)\n v = np.real(v)\n inds = np.argsort(w)[::-1]\n s = np.sqrt(w[inds[0:k]]) * np.sqrt(n)\n v = np.dot(v[:, inds[0:k]].T, c)\n if compute_u:\n v_broadcasted = blocked_rdd.context.broadcast(v)\n u = blocked_rdd.map(\n lambda x: safe_sparse_dot(x, v_broadcasted.value.T) / s)\n return u, s, v\n else:\n return s, v"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nfitting LSI model to X and perform dimensionality reduction on X.", "response": "def fit_transform(self, Z):\n \"\"\"Fit LSI model to X and perform dimensionality reduction on X.\n\n Parameters\n ----------\n X : {array-like, sparse matrix}, shape (n_samples, n_features)\n Training data.\n\n Returns\n -------\n 
X_new : array, shape (n_samples, n_components)\n Reduced version of X. This will always be a dense array.\n \"\"\"\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n check_rdd(X, (sp.spmatrix, np.ndarray))\n if self.algorithm == \"em\":\n X = X.persist() # boosting iterative svm\n Sigma, V = svd_em(X, k=self.n_components, maxiter=self.n_iter,\n tol=self.tol, compute_u=False,\n seed=self.random_state)\n self.components_ = V\n X.unpersist()\n return self.transform(Z)\n else:\n # TODO: raise warning non distributed\n return super(SparkTruncatedSVD, self).fit_transform(X.tosparse())"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nperforming dimensionality reduction on X.", "response": "def transform(self, Z):\n \"\"\"Perform dimensionality reduction on X.\n\n Parameters\n ----------\n X : {array-like, sparse matrix}, shape (n_samples, n_features)\n New data.\n\n Returns\n -------\n X_new : array, shape (n_samples, n_components)\n Reduced version of X. This will always be a dense array.\n \"\"\"\n X = Z[:, 'X'] if isinstance(Z, DictRDD) else Z\n check_rdd(X, (sp.spmatrix, np.ndarray))\n\n mapper = self.broadcast(\n super(SparkTruncatedSVD, self).transform, Z.context)\n return Z.transform(mapper, column='X', dtype=np.ndarray)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\npack rdd with a specific collection constructor.", "response": "def _block_collection(iterator, dtype, bsize=-1):\n \"\"\"Pack rdd with a specific collection constructor.\"\"\"\n i = 0\n accumulated = []\n for a in iterator:\n if (bsize > 0) and (i >= bsize):\n yield _pack_accumulated(accumulated, dtype)\n accumulated = []\n i = 0\n accumulated.append(a)\n i += 1\n if i > 0:\n yield _pack_accumulated(accumulated, dtype)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\npack rdd of tuples as tuples of arrays or scipy. 
sparse matrices.", "response": "def _block_tuple(iterator, dtypes, bsize=-1):\n \"\"\"Pack rdd of tuples as tuples of arrays or scipy.sparse matrices.\"\"\"\n i = 0\n blocked_tuple = None\n for tuple_i in iterator:\n if blocked_tuple is None:\n blocked_tuple = tuple([] for _ in range(len(tuple_i)))\n\n if (bsize > 0) and (i >= bsize):\n yield tuple(_pack_accumulated(x, dtype)\n for x, dtype in zip(blocked_tuple, dtypes))\n blocked_tuple = tuple([] for _ in range(len(tuple_i)))\n i = 0\n\n for x_j, x in zip(tuple_i, blocked_tuple):\n x.append(x_j)\n i += 1\n if i > 0:\n yield tuple(_pack_accumulated(x, dtype)\n for x, dtype in zip(blocked_tuple, dtypes))"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nblocks an RDD of data points into a single block.", "response": "def block(rdd, bsize=-1, dtype=None):\n \"\"\"Block an RDD\n\n Parameters\n ----------\n\n rdd : RDD\n RDD of data points to block into either numpy arrays,\n scipy sparse matrices, or pandas data frames.\n Type of data point will be automatically inferred\n and blocked accordingly.\n\n bsize : int, optional, default None\n Size of each block (number of elements), if None all data points\n from each partition will be combined in a block.\n\n Returns\n -------\n\n rdd : ArrayRDD or TupleRDD or DictRDD\n The transformed rdd with added functionality\n \"\"\"\n try:\n entry = rdd.first()\n except IndexError:\n # empty RDD: do not block\n return rdd\n\n # do different kinds of block depending on the type\n if isinstance(entry, dict):\n rdd = rdd.map(lambda x: list(x.values()))\n return DictRDD(rdd, list(entry.keys()), bsize, dtype)\n elif isinstance(entry, tuple):\n return DictRDD(rdd, bsize=bsize, dtype=dtype)\n elif sp.issparse(entry):\n return SparseRDD(rdd, bsize)\n elif isinstance(entry, np.ndarray):\n return ArrayRDD(rdd, bsize)\n else:\n return BlockRDD(rdd, bsize, dtype)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nexecuting 
the blocking process on the given rdd.", "response": "def _block(self, rdd, bsize, dtype):\n \"\"\"Execute the blocking process on the given rdd.\n\n Parameters\n ----------\n rdd : pyspark.rdd.RDD\n Distributed data to block\n bsize : int or None\n The desired size of the blocks\n\n Returns\n -------\n rdd : pyspark.rdd.RDD\n Blocked rdd.\n \"\"\"\n return rdd.mapPartitions(\n lambda x: _block_collection(x, dtype, bsize))"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nreturn the shape of the data.", "response": "def shape(self):\n \"\"\"Returns the shape of the data.\"\"\"\n # TODO cache\n first = self.first().shape\n shape = self._rdd.map(lambda x: x.shape[0]).sum()\n return (shape,) + first[1:]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef toarray(self):\n rdd = self._rdd.map(lambda x: x.toarray())\n return np.concatenate(rdd.collect())", "response": "Returns the data as numpy. 
array from each partition."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _block(self, rdd, bsize, dtype):\n return rdd.mapPartitions(lambda x: _block_tuple(x, dtype, bsize))", "response": "Execute the blocking process on the given rdd."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef transform(self, fn, column=None, dtype=None):\n dtypes = self.dtype\n if column is None:\n indices = list(range(len(self.columns)))\n else:\n if not type(column) in (list, tuple):\n column = [column]\n indices = [self.columns.index(c) for c in column]\n\n if dtype is not None:\n if not type(dtype) in (list, tuple):\n dtype = [dtype]\n dtypes = [dtype[indices.index(i)] if i in indices else t\n for i, t in enumerate(self.dtype)]\n\n def mapper(values):\n result = fn(*[values[i] for i in indices])\n\n if len(indices) == 1:\n result = (result,)\n elif not isinstance(result, (tuple, list)):\n raise ValueError(\"Transformer function must return an\"\n \" iterable!\")\n elif len(result) != len(indices):\n raise ValueError(\"Transformer result's length must be\"\n \" equal to the given columns length!\")\n\n return tuple(result[indices.index(i)] if i in indices else v\n for i, v in enumerate(values))\n\n return DictRDD(self._rdd.map(mapper),\n columns=self.columns, dtype=dtypes,\n bsize=self.bsize, noblock=True)", "response": "Execute a transformation on a column or columns. Returns the modified DictRDD with the transformed columns."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns zero if there are no permissions for a bit of the perm. of a file. Otherwise it returns a positive value.", "response": "def bitperm(s, perm, pos):\n \"\"\"Returns zero if there are no permissions for a bit of the perm. of a file.
Otherwise it returns a positive value\n\n :param os.stat_result s: os.stat(file) object\n :param str perm: R (Read) or W (Write) or X (eXecute)\n :param str pos: USR (USeR) or GRP (GRouP) or OTH (OTHer)\n :return: mask value\n :rtype: int\n \"\"\"\n perm = perm.upper()\n pos = pos.upper()\n assert perm in ['R', 'W', 'X']\n assert pos in ['USR', 'GRP', 'OTH']\n return s.st_mode & getattr(stat, 'S_I{}{}'.format(perm, pos))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck if file is only writable by root", "response": "def only_root_write(path):\n \"\"\"File is only writable by root\n\n :param str path: Path to file\n :return: True if only root can write\n :rtype: bool\n \"\"\"\n s = os.stat(path)\n for ug, bp in [(s.st_uid, bitperm(s, 'w', 'usr')), (s.st_gid, bitperm(s, 'w', 'grp'))]:\n # User id (is not root) and bit permission\n if ug and bp:\n return False\n if bitperm(s, 'w', 'oth'):\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef check_config(file, printfn=print):\n Config(file).read()\n printfn('The configuration file \"{}\" is correct'.format(file))", "response": "Checks the configuration file. Raises InvalidConfig on error"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nparses and validates the config file.", "response": "def read(self):\n \"\"\"Parse and validate the config file.
The read data is accessible as a dictionary in this instance\n\n :return: None\n \"\"\"\n try:\n data = load(open(self.file), Loader)\n except (UnicodeDecodeError, YAMLError) as e:\n raise InvalidConfig(self.file, '{}'.format(e))\n try:\n validate(data, SCHEMA)\n except ValidationError as e:\n raise InvalidConfig(self.file, e)\n self.update(data)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef run_as_cmd(cmd, user, shell='bash'):\n to_execute = get_shell(shell) + [EXECUTE_SHELL_PARAM, cmd]\n if user == 'root':\n return to_execute\n return ['sudo', '-s', '--set-home', '-u', user] + to_execute", "response": "Get the arguments to execute a command as a user\n "} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nexecute a command on a thread.", "response": "def execute_cmd(cmd, cwd=None, timeout=5):\n \"\"\"Execute command on thread\n\n :param cmd: Command to execute\n :param cwd: current working directory\n :return: None\n \"\"\"\n p = subprocess.Popen(cmd, cwd=cwd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)\n try:\n p.wait(timeout=timeout)\n except subprocess.TimeoutExpired:\n return None\n else:\n stdout, stderr = p.stdout.read(), p.stderr.read()\n if sys.version_info >= (3,):\n stdout, stderr = stdout.decode('utf-8', errors='ignore'), stderr.decode('utf-8', errors='ignore')\n if p.returncode:\n raise ExecuteError('Error running command {}: The error code {} has returned. Stderr: {}'.format(\n ' '.join(cmd), p.returncode, stderr\n ))\n else:\n return stdout, stderr"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nexecuting command on remote machine using SSH", "response": "def execute_over_ssh(cmd, ssh, cwd=None, shell='bash'):\n \"\"\"Execute command on remote machine using SSH\n\n :param cmd: Command to execute\n :param ssh: Server to connect.
Port is optional\n :param cwd: current working directory\n :return: None\n \"\"\"\n port = None\n parts = ssh.split(':', 1)\n if len(parts) > 1 and not parts[1].isdigit():\n raise InvalidConfig(extra_body='Invalid port number on ssh config: {}'.format(parts[1]))\n elif len(parts) > 1:\n port = parts[1]\n quoted_cmd = ' '.join([x.replace(\"'\", \"\"\"'\"'\"'\"\"\") for x in cmd.split(' ')])\n remote_cmd = ' '.join([\n ' '.join(get_shell(shell)), # /usr/bin/env bash\n ' '.join([EXECUTE_SHELL_PARAM, \"'\", ' '.join((['cd', cwd, ';'] if cwd else []) + [quoted_cmd]), \"'\"])],\n )\n return ['ssh', parts[0]] + (['-p', port] if port else []) + ['-C'] + [remote_cmd]"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef execute(self, root_allowed=False):\n if self.user == ROOT_USER and not root_allowed and not self.data.get('ssh'):\n raise SecurityException('For security, execute commands as root is not allowed. '\n 'Use --root-allowed to allow executing commands as root. '\n ' It is however recommended to add a user to the configuration '\n 'of the device (device: {})'.format(self.name))\n if self.data.get('user') and self.data.get('ssh'):\n raise InvalidConfig('User option is unsupported in ssh mode. The ssh user must be defined in '\n 'the ssh option. For example: user@machine')\n if self.data.get('ssh'):\n cmd = execute_over_ssh(self.data['cmd'], self.data['ssh'], self.data.get('cwd'))\n output = execute_cmd(cmd)\n else:\n cmd = run_as_cmd(self.data['cmd'], self.user)\n output = execute_cmd(cmd, self.data.get('cwd'))\n if output:\n return output[0]", "response": "Execute command and return the result."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncheck self. data. Raise InvalidConfig on error", "response": "def validate(self):\n \"\"\"Check self.data. 
Raise InvalidConfig on error\n\n :return: None\n \"\"\"\n if (self.data.get('content-type') or self.data.get('body')) and \\\n self.data.get('method', '').lower() not in CONTENT_TYPE_METHODS:\n raise InvalidConfig(\n extra_body='The body/content-type option only can be used with the {} methods. The device is {}. '\n 'Check the configuration file.'.format(', '.join(CONTENT_TYPE_METHODS), self.name)\n )\n self.data['content-type'] = CONTENT_TYPE_ALIASES.get(self.data.get('content-type'),\n self.data.get('content-type'))\n form_type = CONTENT_TYPE_ALIASES['form']\n if self.data.get('body') and (self.data.get('content-type') or form_type) == form_type:\n try:\n self.data['body'] = json.loads(self.data['body'])\n except JSONDecodeError:\n raise InvalidConfig(\n extra_body='Invalid JSON body on {} device.'.format(self.name)\n )"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef execute(self, root_allowed=False):\n kwargs = {'stream': True, 'timeout': 15,\n 'headers': self.data.get('headers', {})}\n if self.data.get('content-type'):\n kwargs['headers']['content-type'] = self.data['content-type']\n if self.data.get('body'):\n kwargs['data'] = self.data['body']\n if self.data.get('auth'):\n kwargs['auth'] = tuple(self.data['auth'].split(':', 1))\n try:\n resp = request(self.data.get('method', 'get').lower(), self.data['url'],\n verify=self.data.get('verify', True),\n **kwargs)\n except RequestException as e:\n raise ExecuteError('Exception on request to {}: {}'.format(self.data['url'], e))\n if resp.status_code >= 400:\n raise ExecuteError('\"{}\" return code {}.'.format(self.data['url'], resp.status_code))\n data = resp.raw.read(1000, decode_content=True)\n if sys.version_info >= (3,):\n data = data.decode('utf-8', errors='ignore')\n return data", "response": "Execute the command using self. 
data."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_headers(self):\n headers = copy.copy(self.default_headers or {})\n headers.update(self.data.get('headers') or {})\n return headers", "response": "Get HTTP Headers to send. By default, default_headers."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_body(self):\n if self.default_body:\n return self.default_body\n data = self.data.get('data')\n if isinstance(data, dict):\n return json.dumps(data)\n return data", "response": "Return the data value on self.data."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_url(self):\n url = super(ExecuteHomeAssistant, self).get_url()\n if not self.data.get('event'):\n raise InvalidConfig(extra_body='Event option is required for HomeAsistant on {} device.'.format(self.name))\n url += '/api/events/{}'.format(self.data['event'])\n return url", "response": "Home Assistant url\n\n :return: url\n :rtype: str"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_url(self):\n if not self.data[self.execute_name]:\n raise InvalidConfig(extra_body='Value for IFTTT is required on {} device. Get your key here: '\n 'https://ifttt.com/services/maker_webhooks/settings'.format(self.name))\n if not self.data.get('event'):\n raise InvalidConfig(extra_body='Event option is required for IFTTT on {} device. 
'\n 'You define the event name when creating a Webhook '\n 'applet'.format(self.name))\n url = self.url_pattern.format(event=self.data['event'], key=self.data[self.execute_name])\n return url", "response": "Get the url for the IFTTT Webhook."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef pkt_text(pkt):\n if pkt.src.upper() in BANNED_DEVICES:\n body = ''\n elif pkt.src.upper()[:8] in AMAZON_DEVICES:\n body = '{} (Amazon Device)'.format(pkt.src)\n else:\n body = pkt.src\n return body", "response": "Return the display text for a Scapy packet."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nprinting help and scanning devices on screen.", "response": "def discover(interface=None):\n \"\"\"Print help and scan devices on screen.\n\n :return: None\n \"\"\"\n click.secho(HELP, fg='yellow')\n scan_devices(discovery_print, lfilter=lambda d: d.src not in mac_id_list, iface=interface)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nexecuting this device and returning the result.", "response": "def execute(self, root_allowed=False):\n \"\"\"Execute this device\n\n :param bool root_allowed: Only used for ExecuteCmd\n :return: None\n \"\"\"\n logger.debug('%s device executed (mac %s)', self.name, self.src)\n if not self.execute_instance:\n msg = '%s: There is not execution method in device conf.'\n logger.warning(msg, self.name)\n self.send_confirmation(msg % self.name, False)\n return\n try:\n result = self.execute_instance.execute(root_allowed)\n except Exception as e:\n self.send_confirmation('Error executing the device {}: {}'.format(self.name, e), False)\n raise\n else:\n result = 'The {} device has been started and is running right now'.format(self.name) \\\n if result is None else result\n result = result or 'The {} device has been executed successfully'.format(self.name)\n self.send_confirmation(result)\n return result"} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef send_confirmation(self, message, success=True):\n message = message.strip()\n if not self.confirmation:\n return\n try:\n self.confirmation.send(message, success)\n except Exception as e:\n logger.warning('Error sending confirmation on device {}: {}'.format(self.name, e))", "response": "Send a confirmation message to the device"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\npressing button. Check DEFAULT_DELAY. :param scapy.packet.Packet device: Scapy packet :return: None", "response": "def on_push(self, device):\n \"\"\"Press button. Check DEFAULT_DELAY.\n\n :param scapy.packet.Packet device: Scapy packet\n :return: None\n \"\"\"\n src = device.src.lower()\n if last_execution[src] + self.settings.get('delay', DEFAULT_DELAY) > time.time():\n return\n last_execution[src] = time.time()\n self.execute(device)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef execute(self, device):\n src = device.src.lower()\n device = self.devices[src]\n threading.Thread(target=device.execute, kwargs={\n 'root_allowed': self.root_allowed\n }).start()", "response": "Execute a device. 
Runs the device command in a new thread."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nstarting daemon mode :param bool root_allowed: Only used for ExecuteCmd :return: loop", "response": "def run(self, root_allowed=False):\n \"\"\"Start daemon mode\n\n :param bool root_allowed: Only used for ExecuteCmd\n :return: loop\n \"\"\"\n self.root_allowed = root_allowed\n scan_devices(self.on_push, lambda d: d.src.lower() in self.devices, self.settings.get('interface'))"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsniffs packets in a loop.", "response": "def scan_devices(fn, lfilter, iface=None):\n \"\"\"Sniff packets\n\n :param fn: callback on packet\n :param lfilter: filter packets\n :return: loop\n \"\"\"\n try:\n sniff(prn=fn, store=0,\n # filter=\"udp\",\n filter=\"arp or (udp and src port 68 and dst port 67 and src host 0.0.0.0)\",\n lfilter=lfilter, iface=iface)\n except PermissionError:\n raise SocketPermissionError"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nopen a web page in the current browser session.", "response": "def open_url(absolute_or_relative_url):\n \"\"\"\n Loads a web page in the current browser session.\n :param absolute_or_relative_url:\n an absolute url to web page in case of config.base_url is not specified,\n otherwise - relative url correspondingly\n\n :Usage:\n open_url('http://mydomain.com/subpage1')\n open_url('http://mydomain.com/subpage2')\n # OR\n config.base_url = 'http://mydomain.com'\n open_url('/subpage1')\n open_url('/subpage2')\n \"\"\"\n # todo: refactor next line when app_host is removed\n base_url = selene.config.app_host if selene.config.app_host else selene.config.base_url\n driver().get(base_url + absolute_or_relative_url)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef convert(self, txn):\n\n ofxid = self.mk_ofxid(txn.id)\n metadata = 
{}\n posting_metadata = {\"ofxid\": ofxid}\n\n if isinstance(txn, OfxTransaction):\n posting = Posting(self.name,\n Amount(txn.amount, self.currency),\n metadata=posting_metadata)\n return Transaction(\n date=txn.date,\n payee=self.format_payee(txn),\n postings=[\n posting,\n posting.clone_inverted(\n self.mk_dynamic_account(self.format_payee(txn),\n exclude=self.name))])\n elif isinstance(txn, InvestmentTransaction):\n acct1 = self.name\n acct2 = self.name\n\n posting1 = None\n posting2 = None\n\n security = self.maybe_get_ticker(txn.security)\n\n if isinstance(txn.type, str):\n # recent versions of ofxparse\n if re.match('^(buy|sell)', txn.type):\n acct2 = self.unknownaccount or 'Assets:Unknown'\n elif txn.type == 'transfer':\n acct2 = 'Transfer'\n elif txn.type == 'reinvest':\n # reinvestment of income\n # TODO: make this configurable\n acct2 = 'Income:Interest'\n elif txn.type == 'income' and txn.income_type == 'DIV':\n # Fidelity lists non-reinvested dividend income as\n # type: income, income_type: DIV\n # TODO: determine how dividend income is listed from other institutions\n # income/DIV transactions do not involve buying or selling a security\n # so their postings need special handling compared to\n # others\n metadata['dividend_from'] = security\n acct2 = 'Income:Dividends'\n posting1 = Posting(acct1,\n Amount(txn.total, self.currency),\n metadata=posting_metadata)\n posting2 = posting1.clone_inverted(acct2)\n else:\n # ???\n pass\n else:\n # Old version of ofxparse\n if (txn.type in [0, 1, 3, 4]):\n # buymf, sellmf, buystock, sellstock\n acct2 = self.unknownaccount or 'Assets:Unknown'\n elif (txn.type == 2):\n # reinvest\n acct2 = 'Income:Interest'\n else:\n # ???\n pass\n\n aux_date = None\n if txn.settleDate is not None and \\\n txn.settleDate != txn.tradeDate:\n aux_date = txn.settleDate\n\n # income/DIV already defined above;\n # this block defines all other posting types\n if posting1 is None and posting2 is None:\n posting1 = Posting(\n acct1,\n 
Amount(\n txn.units,\n security,\n unlimited=True),\n unit_price=Amount(\n txn.unit_price,\n self.currency,\n unlimited=True),\n metadata=posting_metadata)\n posting2 = Posting(\n acct2,\n Amount(\n txn.units *\n txn.unit_price,\n self.currency,\n reverse=True))\n else:\n # Previously defined if type:income income_type/DIV\n pass\n\n return Transaction(\n date=txn.tradeDate,\n aux_date=aux_date,\n payee=self.format_payee(txn),\n metadata=metadata,\n postings=[posting1, posting2]\n )", "response": "Convert an OFX Transaction to a posting-related Transaction object"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef find_ledger_file(ledgerrcpath=None):\n if ledgerrcpath is None:\n ledgerrcpath = os.path.abspath(os.path.expanduser(\"~/.ledgerrc\"))\n if \"LEDGER_FILE\" in os.environ:\n return os.path.abspath(os.path.expanduser(os.environ[\"LEDGER_FILE\"]))\n elif os.path.exists(ledgerrcpath):\n # hacky\n ledgerrc = open(ledgerrcpath)\n for line in ledgerrc.readlines():\n md = re.match(r\"--file\\s+([^\\s]+).*\", line)\n if md is not None:\n return os.path.abspath(os.path.expanduser(md.group(1)))\n else:\n return None", "response": "Returns the main ledger file path, or None if it cannot be found."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nprint the results of the main program.", "response": "def print_results(converter, ofx, ledger, txns, args):\n \"\"\"\n This function is the final common pathway of program:\n\n Print initial balance if requested;\n Print transactions surviving de-duplication filter;\n Print balance assertions if requested;\n Print commodity prices obtained from position statements\n \"\"\"\n\n if args.initial:\n if (not(ledger.check_transaction_by_id\n (\"ofxid\", converter.mk_ofxid(AUTOSYNC_INITIAL))) and\n not(ledger.check_transaction_by_id(\"ofxid\", ALL_AUTOSYNC_INITIAL))):\n 
print(converter.format_initial_balance(ofx.account.statement))\n for txn in txns:\n print(converter.convert(txn).format(args.indent))\n if args.assertions:\n print(converter.format_balance(ofx.account.statement))\n\n # if OFX has positions use these to obtain commodity prices\n # and print \"P\" records to provide dated/timed valuations\n # Note that this outputs only the commodity price,\n # not your position (e.g. # shares), even though this is in the OFX record\n if hasattr(ofx.account.statement, 'positions'):\n for pos in ofx.account.statement.positions:\n print(converter.format_position(pos))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef compatibility(session, install):\n\n session.install('-e', '.[dev]')\n session.install(install)\n _run_tests(session)", "response": "Run the unit test suite with each support library and Python version."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns the width of a string in pixels.", "response": "def text_width(self, text: str) -> float:\n \"\"\"Returns the width, in pixels, of a string in DejaVu Sans 110pt.\"\"\"\n width, _ = self._font.getsize(text)\n return width"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ntransforms README. 
md into a usable long description.", "response": "def get_long_description():\n \"\"\"Transform README.md into a usable long description.\n\n Replaces relative references to svg images to absolute https references.\n \"\"\"\n\n with open('README.md') as f:\n read_me = f.read()\n\n def replace_relative_with_absolute(match):\n svg_path = match.group(0)[1:-1]\n return ('(https://github.com/google/pybadges/raw/master/'\n '%s?sanitize=true)' % svg_path)\n\n return re.sub(r'\\(tests/golden-images/.*?\\.svg\\)',\n replace_relative_with_absolute,\n read_me)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns the width in pixels of a string in DejaVu Sans 110pt.", "response": "def text_width(self, text: str) -> float:\n \"\"\"Returns the width, in pixels, of a string in DejaVu Sans 110pt.\"\"\"\n width = 0\n for index, c in enumerate(text):\n width += self._char_to_width.get(c, self._default_character_width)\n width -= self._pair_to_kern.get(text[index:index + 2], 0)\n\n return width"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a PrecalculatedTextMeasurer given a JSON stream.", "response": "def from_json(f: TextIO) -> 'PrecalculatedTextMeasurer':\n \"\"\"Return a PrecalculatedTextMeasurer given a JSON stream.\n\n See precalculate_text.py for details on the required format.\n \"\"\"\n o = json.load(f)\n return PrecalculatedTextMeasurer(o['mean-character-length'],\n o['character-lengths'],\n o['kerning-pairs'])"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef default(cls) -> 'PrecalculatedTextMeasurer':\n if cls._default_cache is not None:\n return cls._default_cache\n\n if pkg_resources.resource_exists(__name__, 'default-widths.json.xz'):\n import lzma\n with pkg_resources.resource_stream(__name__,\n 'default-widths.json.xz') as f:\n with lzma.open(f, \"rt\") as g:\n cls._default_cache = 
PrecalculatedTextMeasurer.from_json(\n cast(TextIO, g))\n return cls._default_cache\n elif pkg_resources.resource_exists(__name__, 'default-widths.json'):\n with pkg_resources.resource_stream(__name__,\n 'default-widths.json') as f:\n cls._default_cache = PrecalculatedTextMeasurer.from_json(\n io.TextIOWrapper(f, encoding='utf-8'))\n return cls._default_cache\n else:\n raise ValueError('could not load default-widths.json')", "response": "Returns a reasonable default PrecalculatedTextMeasurer."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef badge(left_text: str, right_text: str, left_link: Optional[str] = None,\n right_link: Optional[str] = None,\n whole_link: Optional[str] = None, logo: Optional[str] = None,\n left_color: str = '#555', right_color: str = '#007ec6',\n measurer: Optional[text_measurer.TextMeasurer] = None,\n embed_logo: bool = False) -> str:\n \"\"\"Creates a github-style badge as an SVG image.\n\n >>> badge(left_text='coverage', right_text='23%', right_color='red')\n ''\n >>> badge(left_text='build', right_text='green', right_color='green',\n ... whole_link=\"http://www.example.com/\")\n ''\n\n Args:\n left_text: The text that should appear on the left-hand-side of the\n badge e.g. \"coverage\".\n right_text: The text that should appear on the right-hand-side of the\n badge e.g. \"23%\".\n left_link: The URL that should be redirected to when the left-hand text\n is selected.\n right_link: The URL that should be redirected to when the right-hand\n text is selected.\n whole_link: The link that should be redirected to when the badge is\n selected. If set then left_link and right_right may not be set.\n logo: A url representing a logo that will be displayed inside the\n badge. Can be a data URL e.g. 
\"data:image/svg+xml;utf8, Iterable[str]:\n font = ttLib.TTFont(deja_vu_sans_path)\n for cmap in font['cmap'].tables:\n if cmap.isUnicode():\n for code in cmap.cmap:\n yield chr(code)", "response": "Generate the supported characters by the font at the given path."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngenerating the subset of characters that can be encoded by encodings.", "response": "def generate_encodeable_characters(characters: Iterable[str],\n encodings: Iterable[str]) -> Iterable[str]:\n \"\"\"Generates the subset of 'characters' that can be encoded by 'encodings'.\n\n Args:\n characters: The characters to check for encodeability e.g. 'abcd'.\n encodings: The encodings to check against e.g. ['cp1252', 'iso-8859-5'].\n\n Returns:\n The subset of 'characters' that can be encoded using one of the provided\n encodings.\n \"\"\"\n for c in characters:\n for encoding in encodings:\n try:\n c.encode(encoding)\n yield c\n except UnicodeEncodeError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef calculate_character_to_length_mapping(\n measurer: text_measurer.TextMeasurer,\n characters: Iterable[str]) -> Mapping[str, float]:\n \"\"\"Return a mapping between each given character and its length.\n\n Args:\n measurer: The TextMeasurer used to measure the width of the text in\n pixels.\n characters: The characters to measure e.g. \"ml\".\n\n Returns:\n A mapping from the given characters to their length in pixels, as\n determined by 'measurer' e.g. 
{'m': 5.2, 'l': 1.2}.\n \"\"\"\n char_to_length = {}\n\n for c in characters:\n char_to_length[c] = measurer.text_width(c)\n return char_to_length", "response": "Calculates a mapping from each given character to its length in pixels."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef calculate_pair_to_kern_mapping(\n measurer: text_measurer.TextMeasurer,\n char_to_length: Mapping[str, float],\n characters: Iterable[str]) -> Mapping[str, float]:\n \"\"\"Returns a mapping between each *pair* of characters and their kerning.\n\n Args:\n measurer: The TextMeasurer used to measure the width of each pair of\n characters.\n char_to_length: A mapping between characters and their length in pixels.\n Must contain every character in 'characters' e.g.\n {'h': 5.2, 'e': 4.0, 'l': 1.2, 'o': 5.0}.\n characters: The characters to generate the kerning mapping for e.g.\n 'hel'.\n\n Returns:\n A mapping between each pair of given characters\n (e.g. 'hh', 'he', 'hl', 'eh', 'ee', 'el', 'lh', 'le', 'll') and the kerning\n adjustment for that pair of characters i.e. the difference between the\n length of the two characters calculated using 'char_to_length' vs.\n the length calculated by `measurer`. Positive values indicate that the\n length is less than using the sum of 'char_to_length'. Zero values are\n excluded from the map e.g. 
{'hl': 3.1, 'ee': -0.5}.\n \"\"\"\n pair_to_kerning = {}\n for a, b in itertools.permutations(characters, 2):\n kerned_width = measurer.text_width(a + b)\n unkerned_width = char_to_length[a] + char_to_length[b]\n kerning = unkerned_width - kerned_width\n if abs(kerning) > 0.05:\n pair_to_kerning[a + b] = round(kerning, 3)\n return pair_to_kerning", "response": "Calculates the kerning adjustment for each pair of the given characters."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nwrites the data required by PrecalculatedTextMeasurer to a stream.", "response": "def write_json(f: TextIO, deja_vu_sans_path: str,\n measurer: text_measurer.TextMeasurer,\n encodings: Iterable[str]) -> None:\n \"\"\"Write the data required by PrecalculatedTextMeasurer to a stream.\"\"\"\n supported_characters = list(\n generate_supported_characters(deja_vu_sans_path))\n kerning_characters = ''.join(\n generate_encodeable_characters(supported_characters, encodings))\n char_to_length = calculate_character_to_length_mapping(measurer,\n supported_characters)\n pair_to_kerning = calculate_pair_to_kern_mapping(measurer, char_to_length,\n kerning_characters)\n json.dump(\n {'mean-character-length': statistics.mean(char_to_length.values()),\n 'character-lengths': char_to_length,\n 'kerning-characters': kerning_characters,\n 'kerning-pairs': pair_to_kerning},\n f, sort_keys=True, indent=1)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngenerating a gaussian kernel.", "response": "def get_gaussian_kernel(gaussian_kernel_width=11, gaussian_kernel_sigma=1.5):\n \"\"\"Generate a gaussian kernel.\"\"\"\n # 1D Gaussian kernel definition\n gaussian_kernel_1d = 
numpy.ndarray((gaussian_kernel_width))\n norm_mu = int(gaussian_kernel_width / 2)\n\n # Fill Gaussian kernel\n for i in range(gaussian_kernel_width):\n gaussian_kernel_1d[i] = (exp(-(((i - norm_mu) ** 2)) /\n (2 * (gaussian_kernel_sigma ** 2))))\n return gaussian_kernel_1d / numpy.sum(gaussian_kernel_1d)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nconvert PIL image to numpy grayscale array and numpy alpha array.", "response": "def to_grayscale(img):\n \"\"\"Convert PIL image to numpy grayscale array and numpy alpha array.\n\n Args:\n img (PIL.Image): PIL Image object.\n\n Returns:\n (gray, alpha): both numpy arrays.\n \"\"\"\n gray = numpy.asarray(ImageOps.grayscale(img)).astype(numpy.float)\n\n imbands = img.getbands()\n alpha = None\n if 'A' in imbands:\n alpha = numpy.asarray(img.split()[-1]).astype(numpy.float)\n\n return gray, alpha"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef main():\n\n description = '\\n'.join([\n 'Compares an image with a list of images using the SSIM metric.',\n ' Example:',\n ' pyssim test-images/test1-1.png \"test-images/*\"'\n ])\n\n parser = argparse.ArgumentParser(\n prog='pyssim', formatter_class=argparse.RawTextHelpFormatter,\n description=description)\n parser.add_argument('--cw', help='compute the complex wavelet SSIM',\n action='store_true')\n parser.add_argument(\n 'base_image', metavar='image1.png', type=argparse.FileType('r'))\n parser.add_argument(\n 'comparison_images', metavar='image path with* or image2.png')\n parser.add_argument('--width', type=int, default=None,\n help='scales the image before computing SSIM')\n parser.add_argument('--height', type=int, default=None,\n help='scales the image before computing SSIM')\n\n args = parser.parse_args()\n\n if args.width and args.height:\n size = (args.width, args.height)\n else:\n size = None\n\n if not args.cw:\n gaussian_kernel_sigma = 1.5\n gaussian_kernel_width = 
11\n gaussian_kernel_1d = get_gaussian_kernel(\n gaussian_kernel_width, gaussian_kernel_sigma)\n\n comparison_images = glob.glob(args.comparison_images)\n is_a_single_image = len(comparison_images) == 1\n\n for comparison_image in comparison_images:\n\n if args.cw:\n ssim = SSIM(args.base_image.name, size=size)\n ssim_value = ssim.cw_ssim_value(comparison_image)\n else:\n ssim = SSIM(args.base_image.name, gaussian_kernel_1d, size=size)\n ssim_value = ssim.ssim_value(comparison_image)\n\n if is_a_single_image:\n sys.stdout.write('%.7g' % ssim_value)\n else:\n sys.stdout.write('%s - %s: %.7g' % (\n args.base_image.name, comparison_image, ssim_value))\n sys.stdout.write('\\n')", "response": "Main function for pyssim."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncomputing the SSIM value from the reference image to the target image.", "response": "def ssim_value(self, target):\n \"\"\"Compute the SSIM value from the reference image to the target image.\n\n Args:\n target (str or PIL.Image): Input image to compare the reference image\n to. This may be a PIL Image object or, to save time, an SSIMImage\n object (e.g. 
the img member of another SSIM object).\n\n Returns:\n Computed SSIM float value.\n \"\"\"\n # Performance boost if handed a compatible SSIMImage object.\n if not isinstance(target, SSIMImage) \\\n or not np.array_equal(self.gaussian_kernel_1d,\n target.gaussian_kernel_1d):\n target = SSIMImage(target, self.gaussian_kernel_1d, self.img.size)\n\n img_mat_12 = self.img.img_gray * target.img_gray\n img_mat_sigma_12 = convolve_gaussian_2d(\n img_mat_12, self.gaussian_kernel_1d)\n img_mat_mu_12 = self.img.img_gray_mu * target.img_gray_mu\n img_mat_sigma_12 = img_mat_sigma_12 - img_mat_mu_12\n\n # Numerator of SSIM\n num_ssim = ((2 * img_mat_mu_12 + self.c_1) *\n (2 * img_mat_sigma_12 + self.c_2))\n\n # Denominator of SSIM\n den_ssim = (\n (self.img.img_gray_mu_squared + target.img_gray_mu_squared +\n self.c_1) *\n (self.img.img_gray_sigma_squared +\n target.img_gray_sigma_squared + self.c_2))\n\n ssim_map = num_ssim / den_ssim\n index = np.average(ssim_map)\n return index"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncomputing the complex wavelet SSIM value from the reference image to the target image.", "response": "def cw_ssim_value(self, target, width=30):\n \"\"\"Compute the complex wavelet SSIM (CW-SSIM) value from the reference\n image to the target image.\n\n Args:\n target (str or PIL.Image): Input image to compare the reference image\n to. This may be a PIL Image object or, to save time, an SSIMImage\n object (e.g. 
the img member of another SSIM object).\n width: width for the wavelet convolution (default: 30)\n\n Returns:\n Computed CW-SSIM float value.\n \"\"\"\n if not isinstance(target, SSIMImage):\n target = SSIMImage(target, size=self.img.size)\n\n # Define a width for the wavelet convolution\n widths = np.arange(1, width+1)\n\n # Use the image data as arrays\n sig1 = np.asarray(self.img.img_gray.getdata())\n sig2 = np.asarray(target.img_gray.getdata())\n\n # Convolution\n cwtmatr1 = signal.cwt(sig1, signal.ricker, widths)\n cwtmatr2 = signal.cwt(sig2, signal.ricker, widths)\n\n # Compute the first term\n c1c2 = np.multiply(abs(cwtmatr1), abs(cwtmatr2))\n c1_2 = np.square(abs(cwtmatr1))\n c2_2 = np.square(abs(cwtmatr2))\n num_ssim_1 = 2 * np.sum(c1c2, axis=0) + self.k\n den_ssim_1 = np.sum(c1_2, axis=0) + np.sum(c2_2, axis=0) + self.k\n\n # Compute the second term\n c1c2_conj = np.multiply(cwtmatr1, np.conjugate(cwtmatr2))\n num_ssim_2 = 2 * np.abs(np.sum(c1c2_conj, axis=0)) + self.k\n den_ssim_2 = 2 * np.sum(np.abs(c1c2_conj), axis=0) + self.k\n\n # Construct the result\n ssim_map = (num_ssim_1 / den_ssim_1) * (num_ssim_2 / den_ssim_2)\n\n # Average the per pixel results\n index = np.average(ssim_map)\n return index"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncompute the SSIM value between two images.", "response": "def compute_ssim(image1, image2, gaussian_kernel_sigma=1.5,\n gaussian_kernel_width=11):\n \"\"\"Computes SSIM.\n\n Args:\n im1: First PIL Image object to compare.\n im2: Second PIL Image object to compare.\n\n Returns:\n SSIM float value.\n \"\"\"\n gaussian_kernel_1d = get_gaussian_kernel(\n gaussian_kernel_width, gaussian_kernel_sigma)\n return SSIM(image1, gaussian_kernel_1d).ssim_value(image2)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef replicated(*decArgs, **decKwargs):\n def replicatedImpl(func):\n def newFunc(self, *args, 
**kwargs):\n\n if kwargs.pop('_doApply', False):\n return func(self, *args, **kwargs)\n else:\n if isinstance(self, SyncObj):\n applier = self._applyCommand\n funcName = self._getFuncName(func.__name__)\n funcID = self._methodToID[funcName]\n elif isinstance(self, SyncObjConsumer):\n consumerId = id(self)\n funcName = self._syncObj._getFuncName((consumerId, func.__name__))\n funcID = self._syncObj._methodToID[(consumerId, funcName)]\n applier = self._syncObj._applyCommand\n else:\n raise SyncObjException(\"Class should be inherited from SyncObj or SyncObjConsumer\")\n\n callback = kwargs.pop('callback', None)\n if kwargs:\n cmd = (funcID, args, kwargs)\n elif args and not kwargs:\n cmd = (funcID, args)\n else:\n cmd = funcID\n sync = kwargs.pop('sync', False)\n if callback is not None:\n sync = False\n\n if sync:\n asyncResult = AsyncResult()\n callback = asyncResult.onResult\n\n timeout = kwargs.pop('timeout', None)\n applier(pickle.dumps(cmd), callback, _COMMAND_TYPE.REGULAR)\n\n if sync:\n res = asyncResult.event.wait(timeout)\n if not res:\n raise SyncObjException('Timeout')\n if not asyncResult.error == 0:\n raise SyncObjException(asyncResult.error)\n return asyncResult.result\n\n func_dict = newFunc.__dict__ if is_py3 else newFunc.func_dict\n func_dict['replicated'] = True\n func_dict['ver'] = int(decKwargs.get('ver', 0))\n func_dict['origName'] = func.__name__\n\n callframe = sys._getframe(1 if decKwargs else 2)\n namespace = callframe.f_locals\n newFuncName = func.__name__ + '_v' + str(func_dict['ver'])\n namespace[newFuncName] = __copy_func(newFunc, newFuncName)\n functools.update_wrapper(newFunc, func)\n return newFunc\n\n if len(decArgs) == 1 and len(decKwargs) == 0 and callable(decArgs[0]):\n return replicatedImpl(decArgs[0])\n\n return replicatedImpl", "response": "Decorator that marks the class members that modify a class state."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef destroy(self):\n if 
self.__conf.autoTick:\n self.__destroying = True\n else:\n self._doDestroy()", "response": "Correctly destroy SyncObj. Stop the autoTickThread, close connections, etc."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nwaits until the bound port is ready.", "response": "def waitBinded(self):\n \"\"\"\n Waits until initialized (port bound).\n If success - just returns.\n If it failed to initialize after conf.maxBindRetries - raise SyncObjException.\n \"\"\"\n try:\n self.__transport.waitReady()\n except TransportNotReadyError:\n raise SyncObjException('BindError')\n if not self.__transport.ready:\n raise SyncObjException('BindError')"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef setCodeVersion(self, newVersion, callback = None):\n assert isinstance(newVersion, int)\n if newVersion > self.__selfCodeVersion:\n raise Exception('wrong version, current version is %d, requested version is %d' % (self.__selfCodeVersion, newVersion))\n if newVersion < self.__enabledCodeVersion:\n raise Exception('wrong version, enabled version is %d, requested version is %d' % (self.__enabledCodeVersion, newVersion))\n self._applyCommand(pickle.dumps(newVersion), callback, _COMMAND_TYPE.VERSION)", "response": "Switch to a new code version on all cluster nodes."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nremoves a single node from the cluster. Async. You should wait until the node is successfully removed before removing the next node.", "response": "def removeNodeFromCluster(self, node, callback = None):\n \"\"\"Remove single node from cluster (dynamic membership changes). 
Async.\n You should wait until node successfully added before adding\n next node.\n\n :param node: node object or 'nodeHost:nodePort'\n :type node: Node | str\n :param callback: will be called on success or fail\n :type callback: function(`FAIL_REASON <#pysyncobj.FAIL_REASON>`_, None)\n \"\"\"\n if not self.__conf.dynamicMembershipChange:\n raise Exception('dynamicMembershipChange is disabled')\n if not isinstance(node, Node):\n node = self.__nodeClass(node)\n self._applyCommand(pickle.dumps(['rem', node.id, node]), callback, _COMMAND_TYPE.MEMBERSHIP)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndumping different debug info about cluster to dict and return it", "response": "def getStatus(self):\n \"\"\"Dumps different debug info about cluster to dict and return it\"\"\"\n\n status = {}\n status['version'] = VERSION\n status['revision'] = REVISION\n status['self'] = self.__selfNode\n status['state'] = self.__raftState\n status['leader'] = self.__raftLeader\n status['partner_nodes_count'] = len(self.__otherNodes)\n for node in self.__otherNodes:\n status['partner_node_status_server_' + node.id] = 2 if node in self.__connectedNodes else 0\n status['readonly_nodes_count'] = len(self.__readonlyNodes)\n for node in self.__readonlyNodes:\n status['readonly_node_status_server_' + node.id] = 2 if node in self.__connectedNodes else 0\n status['log_len'] = len(self.__raftLog)\n status['last_applied'] = self.__raftLastApplied\n status['commit_idx'] = self.__raftCommitIndex\n status['raft_term'] = self.__raftCurrentTerm\n status['next_node_idx_count'] = len(self.__raftNextIndex)\n for node, idx in iteritems(self.__raftNextIndex):\n status['next_node_idx_server_' + node.id] = idx\n status['match_idx_count'] = len(self.__raftMatchIndex)\n for node, idx in iteritems(self.__raftMatchIndex):\n status['match_idx_server_' + node.id] = idx\n status['leader_commit_idx'] = self.__leaderCommitIndex\n status['uptime'] = int(time.time() - self.__startTime)\n 
status['self_code_version'] = self.__selfCodeVersion\n status['enabled_code_version'] = self.__enabledCodeVersion\n return status"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef printStatus(self):\n status = self.getStatus()\n for k, v in iteritems(status):\n logging.info('%s: %s' % (str(k), str(v)))", "response": "Dumps different debug info about cluster to default logger"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreturns the node that corresponds to the given connection object.", "response": "def _connToNode(self, conn):\n \"\"\"\n Find the node to which a connection belongs.\n\n :param conn: connection object\n :type conn: TcpConnection\n :returns corresponding node or None if the node cannot be found\n :rtype Node or None\n \"\"\"\n\n for node in self._connections:\n if self._connections[node] is conn:\n return node\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating the TCP server and initialize self. 
_server", "response": "def _createServer(self):\n \"\"\"\n Create the TCP server (but don't bind yet)\n \"\"\"\n\n conf = self._syncObj.conf\n bindAddr = conf.bindAddress or getattr(self._selfNode, 'address')\n if not bindAddr:\n raise RuntimeError('Unable to determine bind address')\n host, port = bindAddr.rsplit(':', 1)\n host = globalDnsResolver().resolve(host)\n self._server = TcpServer(self._syncObj._poller, host, port, onNewConnection = self._onNewIncomingConnection,\n sendBufferSize = conf.sendBufferSize,\n recvBufferSize = conf.recvBufferSize,\n connectionTimeout = conf.connectionTimeout)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _maybeBind(self):\n\n if self._ready or self._selfIsReadonlyNode or time.time() < self._lastBindAttemptTime + self._syncObj.conf.bindRetryTime:\n return\n self._lastBindAttemptTime = time.time()\n try:\n self._server.bind()\n except Exception as e:\n self._bindAttempts += 1\n if self._syncObj.conf.maxBindRetries and self._bindAttempts >= self._syncObj.conf.maxBindRetries:\n self._bindOverEvent.set()\n raise TransportNotReadyError\n else:\n self._ready = True\n self._bindOverEvent.set()", "response": "Bind the server unless it is already bound."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _onNewIncomingConnection(self, conn):\n\n self._unknownConnections.add(conn)\n encryptor = self._syncObj.encryptor\n if encryptor:\n conn.encryptor = encryptor\n conn.setOnMessageReceivedCallback(functools.partial(self._onIncomingMessageReceived, conn))\n conn.setOnDisconnectedCallback(functools.partial(self._onDisconnected, conn))", "response": "Callback for incoming connections initiated by the other side."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _utilityCallback(self, res, err, conn, cmd, arg):\n\n cmdResult = 'FAIL'\n if err == 
FAIL_REASON.SUCCESS:\n cmdResult = 'SUCCESS'\n conn.send(cmdResult + ' ' + cmd + ' ' + arg)", "response": "Callback invoked when a utility command completes; sends the command result back over the connection."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _shouldConnect(self, node):\n\n return isinstance(node, TCPNode) and node not in self._preventConnectNodes and (self._selfIsReadonlyNode or self._selfNode.address > node.address)", "response": "Check whether this node should initiate a connection to another node"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _connectIfNecessarySingle(self, node):\n\n if node in self._connections and self._connections[node].state != CONNECTION_STATE.DISCONNECTED:\n return True\n if not self._shouldConnect(node):\n return False\n assert node in self._connections # Since we \"should connect\" to this node, there should always be a connection object already in place.\n if node in self._lastConnectAttempt and time.time() - self._lastConnectAttempt[node] < self._syncObj.conf.connectionRetryTime:\n return False\n self._lastConnectAttempt[node] = time.time()\n return self._connections[node].connect(node.ip, node.port)", "response": "Connect to a node if necessary."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _onOutgoingMessageReceived(self, conn, message):\n\n if not conn.sendRandKey:\n conn.sendRandKey = message\n conn.send(self._selfNode.address)\n\n node = self._connToNode(conn)\n conn.setOnMessageReceivedCallback(functools.partial(self._onMessageReceived, node))\n self._onNodeConnected(node)", "response": "Callback for receiving a new outgoing message on a new connection."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef addNode(self, node):\n\n self._nodes.add(node)\n 
self._nodeAddrToNode[node.address] = node\n if self._shouldConnect(node):\n conn = TcpConnection(poller = self._syncObj._poller,\n timeout = self._syncObj.conf.connectionTimeout,\n sendBufferSize = self._syncObj.conf.sendBufferSize,\n recvBufferSize = self._syncObj.conf.recvBufferSize)\n conn.encryptor = self._syncObj.encryptor\n conn.setOnConnectedCallback(functools.partial(self._onOutgoingConnected, conn))\n conn.setOnMessageReceivedCallback(functools.partial(self._onMessageReceived, node))\n conn.setOnDisconnectedCallback(functools.partial(self._onDisconnected, conn))\n self._connections[node] = conn", "response": "Adds a node to the network."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef dropNode(self, node):\n\n conn = self._connections.pop(node, None)\n if conn is not None:\n # Calling conn.disconnect() immediately triggers the onDisconnected callback if the connection isn't already disconnected, so this is necessary to prevent the automatic reconnect.\n self._preventConnectNodes.add(node)\n conn.disconnect()\n self._preventConnectNodes.remove(node)\n if isinstance(node, TCPNode):\n self._nodes.discard(node)\n self._nodeAddrToNode.pop(node.address, None)\n else:\n self._readonlyNodes.discard(node)\n self._lastConnectAttempt.pop(node, None)", "response": "Drop a node from the network."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsending a message to a node. Returns False if the connection is dead, and True otherwise.", "response": "def send(self, node, message):\n \"\"\"\n Send a message to a node. 
Returns False if the connection appears to be dead either before or after actually trying to send the message.\n\n :param node: target node\n :type node: Node\n :param message: message\n :type message: any\n :returns: success\n :rtype: bool\n \"\"\"\n\n if node not in self._connections or self._connections[node].state != CONNECTION_STATE.CONNECTED:\n return False\n self._connections[node].send(message)\n if self._connections[node].state != CONNECTION_STATE.CONNECTED:\n return False\n return True"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef put(self, item):\n if self.__maxsize and len(self.__data) >= self.__maxsize:\n return False\n self.__data.append(item)\n return True", "response": "Put an item into the queue. Returns True if the item was added, or False if the queue is full."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nputs an item into the queue.", "response": "def put(self, item):\n \"\"\"Put an item into the queue. Items should be comparable, e.g. tuples.\n True - if item placed in queue.\n False - if queue is full and item can not be placed.\"\"\"\n if self.__maxsize and len(self.__data) >= self.__maxsize:\n return False\n heapq.heappush(self.__data, item)\n return True"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get(self, default=None):\n if not self.__data:\n return default\n return heapq.heappop(self.__data)", "response": "Extract the smallest item from the queue. 
Return default if queue is empty."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef tryAcquire(self, lockID, callback=None, sync=False, timeout=None):\n return self.__lockImpl.acquire(lockID, self.__selfID, time.time(), callback=callback, sync=sync, timeout=timeout)", "response": "Attempt to acquire a lock."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking if a lock is acquired by ourselves.", "response": "def isAcquired(self, lockID):\n \"\"\"Check if lock is acquired by ourselves.\n\n :param lockID: unique lock identifier.\n :type lockID: str\n :return True if lock is acquired by ourselves.\n \"\"\"\n return self.__lockImpl.isAcquired(lockID, self.__selfID, time.time())"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef release(self, lockID, callback=None, sync=False, timeout=None):\n self.__lockImpl.release(lockID, self.__selfID, callback=callback, sync=sync, timeout=timeout)", "response": "Release a previously acquired lock."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef check(func):\n def wrapped(*args, **kwargs):\n check_name = func.__name__\n arg_name = None\n if args:\n arg_name = args[0]\n try:\n if arg_name:\n logger.debug(\"Checking '%s' for '%s'\", check_name, arg_name)\n else:\n logger.debug(\"Checking '%s'\", check_name)\n response = func(*args, **kwargs)\n except Exception as e:\n message = str(e)\n response = {\n \"ok\": False,\n \"error\": message,\n \"stacktrace\": traceback.format_exc(),\n }\n # The check contains several individual checks (e.g., one per\n # database). 
Preface the results by name.\n if arg_name:\n response = {arg_name: response}\n logger.exception(\n \"Error calling '%s' for '%s': %s\",\n check_name,\n arg_name,\n message\n )\n else:\n logger.exception(\n \"Error calling '%s': %s\",\n check_name,\n message\n )\n\n return response\n return wrapped", "response": "Decorator that wraps checks and returns an error response on failure."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nset the Elasticsearch hosts to use", "response": "def set_hosts(hosts, use_ssl=False, ssl_cert_path=None):\n \"\"\"\n Sets the Elasticsearch hosts to use\n\n Args:\n hosts (str): A single hostname or URL, or list of hostnames or URLs\n use_ssl (bool): Use a HTTPS connection to the server\n ssl_cert_path (str): Path to the certificate chain\n \"\"\"\n if type(hosts) != list:\n hosts = [hosts]\n conn_params = {\n \"hosts\": hosts,\n \"timeout\": 20\n }\n if use_ssl:\n conn_params['use_ssl'] = True\n if ssl_cert_path:\n conn_params['verify_certs'] = True\n conn_params['ca_certs'] = ssl_cert_path\n else:\n conn_params['verify_certs'] = False\n connections.create_connection(**conn_params)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncreating Elasticsearch indexes for the given list of index names.", "response": "def create_indexes(names, settings=None):\n \"\"\"\n Create Elasticsearch indexes\n\n Args:\n names (list): A list of index names\n settings (dict): Index settings\n\n \"\"\"\n for name in names:\n index = Index(name)\n try:\n if not index.exists():\n logger.debug(\"Creating Elasticsearch index: {0}\".format(name))\n if settings is None:\n index.settings(number_of_shards=1,\n number_of_replicas=1)\n else:\n index.settings(**settings)\n index.create()\n except Exception as e:\n raise ElasticsearchError(\n \"Elasticsearch error: {0}\".format(e.__str__()))"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef 
migrate_indexes(aggregate_indexes=None, forensic_indexes=None):\n version = 2\n if aggregate_indexes is None:\n aggregate_indexes = []\n if forensic_indexes is None:\n forensic_indexes = []\n for aggregate_index_name in aggregate_indexes:\n if not Index(aggregate_index_name).exists():\n continue\n aggregate_index = Index(aggregate_index_name)\n doc = \"doc\"\n fo_field = \"published_policy.fo\"\n fo = \"fo\"\n fo_mapping = aggregate_index.get_field_mapping(fields=[fo_field])\n fo_mapping = fo_mapping[list(fo_mapping.keys())[0]][\"mappings\"]\n if doc not in fo_mapping:\n continue\n\n fo_mapping = fo_mapping[doc][fo_field][\"mapping\"][fo]\n fo_type = fo_mapping[\"type\"]\n if fo_type == \"long\":\n new_index_name = \"{0}-v{1}\".format(aggregate_index_name, version)\n body = {\"properties\": {\"published_policy.fo\": {\n \"type\": \"text\",\n \"fields\": {\n \"keyword\": {\n \"type\": \"keyword\",\n \"ignore_above\": 256\n }\n }\n }\n }\n }\n Index(new_index_name).create()\n Index(new_index_name).put_mapping(doc_type=doc, body=body)\n reindex(connections.get_connection(), aggregate_index_name,\n new_index_name)\n Index(aggregate_index_name).delete()\n\n for forensic_index in forensic_indexes:\n pass", "response": "Migrate the indexes in the aggregate_indexes list to the new ones."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef save_aggregate_report_to_elasticsearch(aggregate_report,\n index_suffix=None,\n monthly_indexes=False):\n \"\"\"\n Saves a parsed DMARC aggregate report to ElasticSearch\n\n Args:\n aggregate_report (OrderedDict): A parsed forensic report\n index_suffix (str): The suffix of the name of the index to save to\n monthly_indexes (bool): Use monthly indexes instead of daily indexes\n\n Raises:\n AlreadySaved\n \"\"\"\n logger.debug(\"Saving aggregate report to Elasticsearch\")\n aggregate_report = aggregate_report.copy()\n metadata = aggregate_report[\"report_metadata\"]\n 
org_name = metadata[\"org_name\"]\n report_id = metadata[\"report_id\"]\n domain = aggregate_report[\"policy_published\"][\"domain\"]\n begin_date = human_timestamp_to_datetime(metadata[\"begin_date\"])\n end_date = human_timestamp_to_datetime(metadata[\"end_date\"])\n begin_date_human = begin_date.strftime(\"%Y-%m-%d %H:%M:%S\")\n end_date_human = end_date.strftime(\"%Y-%m-%d %H:%M:%S\")\n if monthly_indexes:\n index_date = begin_date.strftime(\"%Y-%m\")\n else:\n index_date = begin_date.strftime(\"%Y-%m-%d\")\n aggregate_report[\"begin_date\"] = begin_date\n aggregate_report[\"end_date\"] = end_date\n date_range = [aggregate_report[\"begin_date\"],\n aggregate_report[\"end_date\"]]\n\n org_name_query = Q(dict(match=dict(org_name=org_name)))\n report_id_query = Q(dict(match=dict(report_id=report_id)))\n domain_query = Q(dict(match={\"published_policy.domain\": domain}))\n begin_date_query = Q(dict(match=dict(date_range=begin_date)))\n end_date_query = Q(dict(match=dict(date_range=end_date)))\n\n search = Search(index=\"dmarc_aggregate*\")\n query = org_name_query & report_id_query & domain_query\n query = query & begin_date_query & end_date_query\n search.query = query\n\n existing = search.execute()\n if len(existing) > 0:\n raise AlreadySaved(\"An aggregate report ID {0} from {1} about {2} \"\n \"with a date range of {3} UTC to {4} UTC already \"\n \"exists in \"\n \"Elasticsearch\".format(report_id,\n org_name,\n domain,\n begin_date_human,\n end_date_human))\n published_policy = _PublishedPolicy(\n domain=aggregate_report[\"policy_published\"][\"domain\"],\n adkim=aggregate_report[\"policy_published\"][\"adkim\"],\n aspf=aggregate_report[\"policy_published\"][\"aspf\"],\n p=aggregate_report[\"policy_published\"][\"p\"],\n sp=aggregate_report[\"policy_published\"][\"sp\"],\n pct=aggregate_report[\"policy_published\"][\"pct\"],\n fo=aggregate_report[\"policy_published\"][\"fo\"]\n )\n\n for record in aggregate_report[\"records\"]:\n agg_doc = 
_AggregateReportDoc(\n xml_schemea=aggregate_report[\"xml_schema\"],\n org_name=metadata[\"org_name\"],\n org_email=metadata[\"org_email\"],\n org_extra_contact_info=metadata[\"org_extra_contact_info\"],\n report_id=metadata[\"report_id\"],\n date_range=date_range,\n errors=metadata[\"errors\"],\n published_policy=published_policy,\n source_ip_address=record[\"source\"][\"ip_address\"],\n source_country=record[\"source\"][\"country\"],\n source_reverse_dns=record[\"source\"][\"reverse_dns\"],\n source_base_domain=record[\"source\"][\"base_domain\"],\n message_count=record[\"count\"],\n disposition=record[\"policy_evaluated\"][\"disposition\"],\n dkim_aligned=record[\"policy_evaluated\"][\"dkim\"] == \"pass\",\n spf_aligned=record[\"policy_evaluated\"][\"spf\"] == \"pass\",\n header_from=record[\"identifiers\"][\"header_from\"],\n envelope_from=record[\"identifiers\"][\"envelope_from\"],\n envelope_to=record[\"identifiers\"][\"envelope_to\"]\n )\n\n for override in record[\"policy_evaluated\"][\"policy_override_reasons\"]:\n agg_doc.add_policy_override(type_=override[\"type\"],\n comment=override[\"comment\"])\n\n for dkim_result in record[\"auth_results\"][\"dkim\"]:\n agg_doc.add_dkim_result(domain=dkim_result[\"domain\"],\n selector=dkim_result[\"selector\"],\n result=dkim_result[\"result\"])\n\n for spf_result in record[\"auth_results\"][\"spf\"]:\n agg_doc.add_spf_result(domain=spf_result[\"domain\"],\n scope=spf_result[\"scope\"],\n result=spf_result[\"result\"])\n\n index = \"dmarc_aggregate\"\n if index_suffix:\n index = \"{0}_{1}\".format(index, index_suffix)\n index = \"{0}-{1}\".format(index, index_date)\n create_indexes([index])\n agg_doc.meta.index = index\n\n try:\n agg_doc.save()\n except Exception as e:\n raise ElasticsearchError(\n \"Elasticsearch error: {0}\".format(e.__str__()))", "response": "Saves an aggregate report to Elasticsearch"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsaving a parsed DMARC forensic report 
to ElasticSearch", "response": "def save_forensic_report_to_elasticsearch(forensic_report,\n index_suffix=None,\n monthly_indexes=False):\n \"\"\"\n Saves a parsed DMARC forensic report to ElasticSearch\n\n Args:\n forensic_report (OrderedDict): A parsed forensic report\n index_suffix (str): The suffix of the name of the index to save to\n monthly_indexes (bool): Use monthly indexes instead of daily\n indexes\n\n Raises:\n AlreadySaved\n\n \"\"\"\n logger.debug(\"Saving forensic report to Elasticsearch\")\n forensic_report = forensic_report.copy()\n sample_date = None\n if forensic_report[\"parsed_sample\"][\"date\"] is not None:\n sample_date = forensic_report[\"parsed_sample\"][\"date\"]\n sample_date = human_timestamp_to_datetime(sample_date)\n original_headers = forensic_report[\"parsed_sample\"][\"headers\"]\n headers = OrderedDict()\n for original_header in original_headers:\n headers[original_header.lower()] = original_headers[original_header]\n\n arrival_date_human = forensic_report[\"arrival_date_utc\"]\n arrival_date = human_timestamp_to_datetime(arrival_date_human)\n\n search = Search(index=\"dmarc_forensic*\")\n arrival_query = {\"match\": {\"arrival_date\": arrival_date}}\n q = Q(arrival_query)\n\n from_ = None\n to_ = None\n subject = None\n if \"from\" in headers:\n from_ = headers[\"from\"]\n from_query = {\"match\": {\"sample.headers.from\": from_}}\n q = q & Q(from_query)\n if \"to\" in headers:\n to_ = headers[\"to\"]\n to_query = {\"match\": {\"sample.headers.to\": to_}}\n q = q & Q(to_query)\n if \"subject\" in headers:\n subject = headers[\"subject\"]\n subject_query = {\"match\": {\"sample.headers.subject\": subject}}\n q = q & Q(subject_query)\n\n search.query = q\n existing = search.execute()\n\n if len(existing) > 0:\n raise AlreadySaved(\"A forensic sample to {0} from {1} \"\n \"with a subject of {2} and arrival date of {3} \"\n \"already exists in \"\n \"Elasticsearch\".format(to_,\n from_,\n subject,\n arrival_date_human\n ))\n\n 
parsed_sample = forensic_report[\"parsed_sample\"]\n sample = _ForensicSampleDoc(\n raw=forensic_report[\"sample\"],\n headers=headers,\n headers_only=forensic_report[\"sample_headers_only\"],\n date=sample_date,\n subject=forensic_report[\"parsed_sample\"][\"subject\"],\n filename_safe_subject=parsed_sample[\"filename_safe_subject\"],\n body=forensic_report[\"parsed_sample\"][\"body\"]\n )\n\n for address in forensic_report[\"parsed_sample\"][\"to\"]:\n sample.add_to(display_name=address[\"display_name\"],\n address=address[\"address\"])\n for address in forensic_report[\"parsed_sample\"][\"reply_to\"]:\n sample.add_reply_to(display_name=address[\"display_name\"],\n address=address[\"address\"])\n for address in forensic_report[\"parsed_sample\"][\"cc\"]:\n sample.add_cc(display_name=address[\"display_name\"],\n address=address[\"address\"])\n for address in forensic_report[\"parsed_sample\"][\"bcc\"]:\n sample.add_bcc(display_name=address[\"display_name\"],\n address=address[\"address\"])\n for attachment in forensic_report[\"parsed_sample\"][\"attachments\"]:\n sample.add_attachment(filename=attachment[\"filename\"],\n content_type=attachment[\"mail_content_type\"],\n sha256=attachment[\"sha256\"])\n try:\n forensic_doc = _ForensicReportDoc(\n feedback_type=forensic_report[\"feedback_type\"],\n user_agent=forensic_report[\"user_agent\"],\n version=forensic_report[\"version\"],\n original_mail_from=forensic_report[\"original_mail_from\"],\n arrival_date=arrival_date,\n domain=forensic_report[\"reported_domain\"],\n original_envelope_id=forensic_report[\"original_envelope_id\"],\n authentication_results=forensic_report[\"authentication_results\"],\n delivery_results=forensic_report[\"delivery_result\"],\n source_ip_address=forensic_report[\"source\"][\"ip_address\"],\n source_country=forensic_report[\"source\"][\"country\"],\n source_reverse_dns=forensic_report[\"source\"][\"reverse_dns\"],\n source_base_domain=forensic_report[\"source\"][\"base_domain\"],\n 
authentication_mechanisms=forensic_report[\n \"authentication_mechanisms\"],\n auth_failure=forensic_report[\"auth_failure\"],\n dkim_domain=forensic_report[\"dkim_domain\"],\n original_rcpt_to=forensic_report[\"original_rcpt_to\"],\n sample=sample\n )\n\n index = \"dmarc_forensic\"\n if index_suffix:\n index = \"{0}_{1}\".format(index, index_suffix)\n if monthly_indexes:\n index_date = arrival_date.strftime(\"%Y-%m\")\n else:\n index_date = arrival_date.strftime(\"%Y-%m-%d\")\n index = \"{0}-{1}\".format(index, index_date)\n create_indexes([index])\n forensic_doc.meta.index = index\n try:\n forensic_doc.save()\n except Exception as e:\n raise ElasticsearchError(\n \"Elasticsearch error: {0}\".format(e.__str__()))\n except KeyError as e:\n raise InvalidForensicReport(\n \"Forensic report missing required field: {0}\".format(e.__str__()))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef strip_metadata(report):\n report['org_name'] = report['report_metadata']['org_name']\n report['org_email'] = report['report_metadata']['org_email']\n report['report_id'] = report['report_metadata']['report_id']\n report.pop('report_metadata')\n\n return report", "response": "Removes report_metadata key from report dict."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef generate_daterange(report):\n\n metadata = report[\"report_metadata\"]\n begin_date = human_timestamp_to_datetime(metadata[\"begin_date\"])\n end_date = human_timestamp_to_datetime(metadata[\"end_date\"])\n begin_date_human = begin_date.strftime(\"%Y-%m-%dT%H:%M:%S\")\n end_date_human = end_date.strftime(\"%Y-%m-%dT%H:%M:%S\")\n date_range = [begin_date_human,\n end_date_human]\n logger.debug(\"date_range is {}\".format(date_range))\n return date_range", "response": "Generates a date_range timestamp based on begin and end dates for Kibana."} {"SOURCE": "codesearchnet", "instruction": "Make a 
summary of the following Python 3 code\ndef save_aggregate_reports_to_kafka(self, aggregate_reports,\n aggregate_topic):\n \"\"\"\n Saves aggregate DMARC reports to Kafka\n\n Args:\n aggregate_reports (list): A list of aggregate report dictionaries\n to save to Kafka\n aggregate_topic (str): The name of the Kafka topic\n\n \"\"\"\n if (type(aggregate_reports) == dict or\n type(aggregate_reports) == OrderedDict):\n aggregate_reports = [aggregate_reports]\n\n if len(aggregate_reports) < 1:\n return\n\n for report in aggregate_reports:\n report['date_range'] = self.generate_daterange(report)\n report = self.strip_metadata(report)\n\n for slice in report['records']:\n slice['date_range'] = report['date_range']\n slice['org_name'] = report['org_name']\n slice['org_email'] = report['org_email']\n slice['policy_published'] = report['policy_published']\n slice['report_id'] = report['report_id']\n logger.debug(\"Sending slice.\")\n try:\n logger.debug(\"Saving aggregate report to Kafka\")\n self.producer.send(aggregate_topic, slice)\n except UnknownTopicOrPartitionError:\n raise KafkaError(\n \"Kafka error: Unknown topic or partition on broker\")\n except Exception as e:\n raise KafkaError(\n \"Kafka error: {0}\".format(e.__str__()))\n try:\n self.producer.flush()\n except Exception as e:\n raise KafkaError(\n \"Kafka error: {0}\".format(e.__str__()))", "response": "Saves aggregate report to Kafka"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nsave forensic reports to Kafka", "response": "def save_forensic_reports_to_kafka(self, forensic_reports, forensic_topic):\n \"\"\"\n Saves forensic DMARC reports to Kafka, sends individual\n records (slices) since Kafka requires messages to be <= 1MB\n by default.\n\n Args:\n forensic_reports (list): A list of forensic report dicts\n to save to Kafka\n forensic_topic (str): The name of the Kafka topic\n\n \"\"\"\n if type(forensic_reports) == dict:\n forensic_reports = [forensic_reports]\n\n if 
len(forensic_reports) < 1:\n return\n\n try:\n logger.debug(\"Saving forensic reports to Kafka\")\n self.producer.send(forensic_topic, forensic_reports)\n except UnknownTopicOrPartitionError:\n raise KafkaError(\n \"Kafka error: Unknown topic or partition on broker\")\n except Exception as e:\n raise KafkaError(\n \"Kafka error: {0}\".format(e.__str__()))\n try:\n self.producer.flush()\n except Exception as e:\n raise KafkaError(\n \"Kafka error: {0}\".format(e.__str__()))"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nconvert a DMARC aggregate report record into a more consistent record.", "response": "def _parse_report_record(record, nameservers=None, dns_timeout=2.0,\n parallel=False):\n \"\"\"\n Converts a record from a DMARC aggregate report into a more consistent\n format\n\n Args:\n record (OrderedDict): The record to convert\n nameservers (list): A list of one or more nameservers to use\n (Cloudflare's public DNS resolvers by default)\n dns_timeout (float): Sets the DNS timeout in seconds\n\n Returns:\n OrderedDict: The converted record\n \"\"\"\n if nameservers is None:\n nameservers = [\"1.1.1.1\", \"1.0.0.1\",\n \"2606:4700:4700::1111\", \"2606:4700:4700::1001\",\n ]\n record = record.copy()\n new_record = OrderedDict()\n new_record_source = get_ip_address_info(record[\"row\"][\"source_ip\"],\n cache=IP_ADDRESS_CACHE,\n nameservers=nameservers,\n timeout=dns_timeout,\n parallel=parallel)\n new_record[\"source\"] = new_record_source\n new_record[\"count\"] = int(record[\"row\"][\"count\"])\n policy_evaluated = record[\"row\"][\"policy_evaluated\"].copy()\n new_policy_evaluated = OrderedDict([(\"disposition\", \"none\"),\n (\"dkim\", \"fail\"),\n (\"spf\", \"fail\"),\n (\"policy_override_reasons\", [])\n ])\n if \"disposition\" in policy_evaluated:\n new_policy_evaluated[\"disposition\"] = policy_evaluated[\"disposition\"]\n if new_policy_evaluated[\"disposition\"].strip().lower() == \"pass\":\n 
new_policy_evaluated[\"disposition\"] = \"none\"\n if \"dkim\" in policy_evaluated:\n new_policy_evaluated[\"dkim\"] = policy_evaluated[\"dkim\"]\n if \"spf\" in policy_evaluated:\n new_policy_evaluated[\"spf\"] = policy_evaluated[\"spf\"]\n reasons = []\n spf_aligned = policy_evaluated[\"spf\"] == \"pass\"\n dkim_aligned = policy_evaluated[\"dkim\"] == \"pass\"\n dmarc_aligned = spf_aligned or dkim_aligned\n new_record[\"alignment\"] = dict()\n new_record[\"alignment\"][\"spf\"] = spf_aligned\n new_record[\"alignment\"][\"dkim\"] = dkim_aligned\n new_record[\"alignment\"][\"dmarc\"] = dmarc_aligned\n if \"reason\" in policy_evaluated:\n if type(policy_evaluated[\"reason\"]) == list:\n reasons = policy_evaluated[\"reason\"]\n else:\n reasons = [policy_evaluated[\"reason\"]]\n for reason in reasons:\n if \"comment\" not in reason:\n reason[\"comment\"] = None\n new_policy_evaluated[\"policy_override_reasons\"] = reasons\n new_record[\"policy_evaluated\"] = new_policy_evaluated\n new_record[\"identifiers\"] = record[\"identifiers\"].copy()\n new_record[\"auth_results\"] = OrderedDict([(\"dkim\", []), (\"spf\", [])])\n if record[\"auth_results\"] is not None:\n auth_results = record[\"auth_results\"].copy()\n if \"spf\" not in auth_results:\n auth_results[\"spf\"] = []\n if \"dkim\" not in auth_results:\n auth_results[\"dkim\"] = []\n else:\n auth_results = new_record[\"auth_results\"].copy()\n\n if type(auth_results[\"dkim\"]) != list:\n auth_results[\"dkim\"] = [auth_results[\"dkim\"]]\n for result in auth_results[\"dkim\"]:\n if \"domain\" in result and result[\"domain\"] is not None:\n new_result = OrderedDict([(\"domain\", result[\"domain\"])])\n if \"selector\" in result and result[\"selector\"] is not None:\n new_result[\"selector\"] = result[\"selector\"]\n else:\n new_result[\"selector\"] = \"none\"\n if \"result\" in result and result[\"result\"] is not None:\n new_result[\"result\"] = result[\"result\"]\n else:\n new_result[\"result\"] = \"none\"\n 
new_record[\"auth_results\"][\"dkim\"].append(new_result)\n\n if type(auth_results[\"spf\"]) != list:\n auth_results[\"spf\"] = [auth_results[\"spf\"]]\n for result in auth_results[\"spf\"]:\n new_result = OrderedDict([(\"domain\", result[\"domain\"])])\n if \"scope\" in result and result[\"scope\"] is not None:\n new_result[\"scope\"] = result[\"scope\"]\n else:\n new_result[\"scope\"] = \"mfrom\"\n if \"result\" in result and result[\"result\"] is not None:\n new_result[\"result\"] = result[\"result\"]\n else:\n new_result[\"result\"] = \"none\"\n new_record[\"auth_results\"][\"spf\"].append(new_result)\n\n if \"envelope_from\" not in new_record[\"identifiers\"]:\n envelope_from = None\n if len(auth_results[\"spf\"]) > 0:\n envelope_from = new_record[\"auth_results\"][\"spf\"][-1][\"domain\"]\n if envelope_from is not None:\n envelope_from = str(envelope_from).lower()\n new_record[\"identifiers\"][\"envelope_from\"] = envelope_from\n\n elif new_record[\"identifiers\"][\"envelope_from\"] is None:\n if len(auth_results[\"spf\"]) > 0:\n envelope_from = new_record[\"auth_results\"][\"spf\"][-1][\"domain\"]\n if envelope_from is not None:\n envelope_from = str(envelope_from).lower()\n new_record[\"identifiers\"][\"envelope_from\"] = envelope_from\n\n envelope_to = None\n if \"envelope_to\" in new_record[\"identifiers\"]:\n envelope_to = new_record[\"identifiers\"][\"envelope_to\"]\n del new_record[\"identifiers\"][\"envelope_to\"]\n\n new_record[\"identifiers\"][\"envelope_to\"] = envelope_to\n\n return new_record"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nparsing a DMARC XML report string and returns a consistent OrderedDict of the aggregate report.", "response": "def parse_aggregate_report_xml(xml, nameservers=None, timeout=2.0,\n parallel=False):\n \"\"\"Parses a DMARC XML report string and returns a consistent OrderedDict\n\n Args:\n xml (str): A string of DMARC aggregate report XML\n nameservers (list): A list of one or more 
nameservers to use\n (Cloudflare's public DNS resolvers by default)\n timeout (float): Sets the DNS timeout in seconds\n parallel (bool): Parallel processing\n\n Returns:\n OrderedDict: The parsed aggregate DMARC report\n \"\"\"\n errors = []\n\n try:\n xmltodict.parse(xml)[\"feedback\"]\n except Exception as e:\n errors.append(e.__str__())\n\n try:\n # Replace XML header (sometimes they are invalid)\n xml = xml_header_regex.sub(\"\", xml)\n\n # Remove invalid schema tags\n xml = xml_schema_regex.sub('', xml)\n\n report = xmltodict.parse(xml)[\"feedback\"]\n report_metadata = report[\"report_metadata\"]\n schema = \"draft\"\n if \"version\" in report:\n schema = report[\"version\"]\n new_report = OrderedDict([(\"xml_schema\", schema)])\n new_report_metadata = OrderedDict()\n if report_metadata[\"org_name\"] is None:\n if report_metadata[\"email\"] is not None:\n report_metadata[\"org_name\"] = report_metadata[\n \"email\"].split(\"@\")[-1]\n org_name = report_metadata[\"org_name\"]\n if org_name is not None:\n org_name = get_base_domain(org_name)\n new_report_metadata[\"org_name\"] = org_name\n new_report_metadata[\"org_email\"] = report_metadata[\"email\"]\n extra = None\n if \"extra_contact_info\" in report_metadata:\n extra = report_metadata[\"extra_contact_info\"]\n new_report_metadata[\"org_extra_contact_info\"] = extra\n new_report_metadata[\"report_id\"] = report_metadata[\"report_id\"]\n report_id = new_report_metadata[\"report_id\"]\n report_id = report_id.replace(\"<\",\n \"\").replace(\">\", \"\").split(\"@\")[0]\n new_report_metadata[\"report_id\"] = report_id\n date_range = report[\"report_metadata\"][\"date_range\"]\n date_range[\"begin\"] = timestamp_to_human(date_range[\"begin\"])\n date_range[\"end\"] = timestamp_to_human(date_range[\"end\"])\n new_report_metadata[\"begin_date\"] = date_range[\"begin\"]\n new_report_metadata[\"end_date\"] = date_range[\"end\"]\n if \"error\" in report[\"report_metadata\"]:\n if 
type(report[\"report_metadata\"][\"error\"]) != list:\n errors = [report[\"report_metadata\"][\"error\"]]\n else:\n errors = report[\"report_metadata\"][\"error\"]\n new_report_metadata[\"errors\"] = errors\n new_report[\"report_metadata\"] = new_report_metadata\n records = []\n policy_published = report[\"policy_published\"]\n new_policy_published = OrderedDict()\n new_policy_published[\"domain\"] = policy_published[\"domain\"]\n adkim = \"r\"\n if \"adkim\" in policy_published:\n if policy_published[\"adkim\"] is not None:\n adkim = policy_published[\"adkim\"]\n new_policy_published[\"adkim\"] = adkim\n aspf = \"r\"\n if \"aspf\" in policy_published:\n if policy_published[\"aspf\"] is not None:\n aspf = policy_published[\"aspf\"]\n new_policy_published[\"aspf\"] = aspf\n new_policy_published[\"p\"] = policy_published[\"p\"]\n sp = new_policy_published[\"p\"]\n if \"sp\" in policy_published:\n if policy_published[\"sp\"] is not None:\n sp = report[\"policy_published\"][\"sp\"]\n new_policy_published[\"sp\"] = sp\n pct = \"100\"\n if \"pct\" in policy_published:\n if policy_published[\"pct\"] is not None:\n pct = report[\"policy_published\"][\"pct\"]\n new_policy_published[\"pct\"] = pct\n fo = \"0\"\n if \"fo\" in policy_published:\n if policy_published[\"fo\"] is not None:\n fo = report[\"policy_published\"][\"fo\"]\n new_policy_published[\"fo\"] = fo\n new_report[\"policy_published\"] = new_policy_published\n\n if type(report[\"record\"]) == list:\n for record in report[\"record\"]:\n report_record = _parse_report_record(record,\n nameservers=nameservers,\n dns_timeout=timeout,\n parallel=parallel)\n records.append(report_record)\n\n else:\n report_record = _parse_report_record(report[\"record\"],\n nameservers=nameservers,\n dns_timeout=timeout,\n parallel=parallel)\n records.append(report_record)\n\n new_report[\"records\"] = records\n\n return new_report\n\n except expat.ExpatError as error:\n raise InvalidAggregateReport(\n \"Invalid XML: 
{0}\".format(error.__str__()))\n\n except KeyError as error:\n raise InvalidAggregateReport(\n \"Missing field: {0}\".format(error.__str__()))\n except AttributeError:\n raise InvalidAggregateReport(\"Report missing required section\")\n\n except Exception as error:\n raise InvalidAggregateReport(\n \"Unexpected error: {0}\".format(error.__str__()))"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef extract_xml(input_):\n if type(input_) == str:\n file_object = open(input_, \"rb\")\n elif type(input_) == bytes:\n file_object = BytesIO(input_)\n else:\n file_object = input_\n try:\n header = file_object.read(6)\n file_object.seek(0)\n if header.startswith(MAGIC_ZIP):\n _zip = zipfile.ZipFile(file_object)\n xml = _zip.open(_zip.namelist()[0]).read().decode()\n elif header.startswith(MAGIC_GZIP):\n xml = GzipFile(fileobj=file_object).read().decode()\n elif header.startswith(MAGIC_XML):\n xml = file_object.read().decode()\n else:\n file_object.close()\n raise InvalidAggregateReport(\"Not a valid zip, gzip, or xml file\")\n\n file_object.close()\n\n except UnicodeDecodeError:\n raise InvalidAggregateReport(\"File objects must be opened in binary \"\n \"(rb) mode\")\n except Exception as error:\n raise InvalidAggregateReport(\n \"Invalid archive file: {0}\".format(error.__str__()))\n\n return xml", "response": "Extracts XML from a zip or gzip file at the given path."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef parse_aggregate_report_file(_input, nameservers=None, dns_timeout=2.0,\n parallel=False):\n \"\"\"Parses a file at the given path, a file-like object. 
or bytes as an\n aggregate DMARC report\n\n Args:\n _input: A path to a file, a file-like object, or bytes\n nameservers (list): A list of one or more nameservers to use\n (Cloudflare's public DNS resolvers by default)\n dns_timeout (float): Sets the DNS timeout in seconds\n parallel (bool): Parallel processing\n\n Returns:\n OrderedDict: The parsed DMARC aggregate report\n \"\"\"\n xml = extract_xml(_input)\n\n return parse_aggregate_report_xml(xml,\n nameservers=nameservers,\n timeout=dns_timeout,\n parallel=parallel)", "response": "Parses a file at the given path or bytes as an\n aggregate report."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef parsed_aggregate_reports_to_csv(reports):\n\n def to_str(obj):\n return str(obj).lower()\n\n fields = [\"xml_schema\", \"org_name\", \"org_email\",\n \"org_extra_contact_info\", \"report_id\", \"begin_date\", \"end_date\",\n \"errors\", \"domain\", \"adkim\", \"aspf\", \"p\", \"sp\", \"pct\", \"fo\",\n \"source_ip_address\", \"source_country\", \"source_reverse_dns\",\n \"source_base_domain\", \"count\", \"disposition\", \"dkim_alignment\",\n \"spf_alignment\", \"policy_override_reasons\",\n \"policy_override_comments\", \"envelope_from\", \"header_from\",\n \"envelope_to\", \"dkim_domains\", \"dkim_selectors\", \"dkim_results\",\n \"spf_domains\", \"spf_scopes\", \"spf_results\"]\n\n csv_file_object = StringIO(newline=\"\\n\")\n writer = DictWriter(csv_file_object, fields)\n writer.writeheader()\n\n if type(reports) == OrderedDict:\n reports = [reports]\n\n for report in reports:\n xml_schema = report[\"xml_schema\"]\n org_name = report[\"report_metadata\"][\"org_name\"]\n org_email = report[\"report_metadata\"][\"org_email\"]\n org_extra_contact = report[\"report_metadata\"][\"org_extra_contact_info\"]\n report_id = report[\"report_metadata\"][\"report_id\"]\n begin_date = report[\"report_metadata\"][\"begin_date\"]\n end_date = 
report[\"report_metadata\"][\"end_date\"]\n errors = \"|\".join(report[\"report_metadata\"][\"errors\"])\n domain = report[\"policy_published\"][\"domain\"]\n adkim = report[\"policy_published\"][\"adkim\"]\n aspf = report[\"policy_published\"][\"aspf\"]\n p = report[\"policy_published\"][\"p\"]\n sp = report[\"policy_published\"][\"sp\"]\n pct = report[\"policy_published\"][\"pct\"]\n fo = report[\"policy_published\"][\"fo\"]\n\n report_dict = dict(xml_schema=xml_schema, org_name=org_name,\n org_email=org_email,\n org_extra_contact_info=org_extra_contact,\n report_id=report_id, begin_date=begin_date,\n end_date=end_date, errors=errors, domain=domain,\n adkim=adkim, aspf=aspf, p=p, sp=sp, pct=pct, fo=fo)\n\n for record in report[\"records\"]:\n row = report_dict\n row[\"source_ip_address\"] = record[\"source\"][\"ip_address\"]\n row[\"source_country\"] = record[\"source\"][\"country\"]\n row[\"source_reverse_dns\"] = record[\"source\"][\"reverse_dns\"]\n row[\"source_base_domain\"] = record[\"source\"][\"base_domain\"]\n row[\"count\"] = record[\"count\"]\n row[\"disposition\"] = record[\"policy_evaluated\"][\"disposition\"]\n row[\"spf_alignment\"] = record[\"policy_evaluated\"][\"spf\"]\n row[\"dkim_alignment\"] = record[\"policy_evaluated\"][\"dkim\"]\n policy_override_reasons = list(map(\n lambda r: r[\"type\"],\n record[\"policy_evaluated\"]\n [\"policy_override_reasons\"]))\n policy_override_comments = list(map(\n lambda r: r[\"comment\"] or \"none\",\n record[\"policy_evaluated\"]\n [\"policy_override_reasons\"]))\n row[\"policy_override_reasons\"] = \",\".join(\n policy_override_reasons)\n row[\"policy_override_comments\"] = \"|\".join(\n policy_override_comments)\n row[\"envelope_from\"] = record[\"identifiers\"][\"envelope_from\"]\n row[\"header_from\"] = record[\"identifiers\"][\"header_from\"]\n envelope_to = record[\"identifiers\"][\"envelope_to\"]\n row[\"envelope_to\"] = envelope_to\n dkim_domains = []\n dkim_selectors = []\n dkim_results = []\n for 
dkim_result in record[\"auth_results\"][\"dkim\"]:\n dkim_domains.append(dkim_result[\"domain\"])\n if \"selector\" in dkim_result:\n dkim_selectors.append(dkim_result[\"selector\"])\n dkim_results.append(dkim_result[\"result\"])\n row[\"dkim_domains\"] = \",\".join(map(to_str, dkim_domains))\n row[\"dkim_selectors\"] = \",\".join(map(to_str, dkim_selectors))\n row[\"dkim_results\"] = \",\".join(map(to_str, dkim_results))\n spf_domains = []\n spf_scopes = []\n spf_results = []\n for spf_result in record[\"auth_results\"][\"spf\"]:\n spf_domains.append(spf_result[\"domain\"])\n spf_scopes.append(spf_result[\"scope\"])\n spf_results.append(spf_result[\"result\"])\n row[\"spf_domains\"] = \",\".join(map(to_str, spf_domains))\n row[\"spf_scopes\"] = \",\".join(map(to_str, spf_scopes))\n row[\"spf_results\"] = \",\".join(map(to_str, spf_results))\n\n writer.writerow(row)\n csv_file_object.flush()\n\n return csv_file_object.getvalue()", "response": "Converts one or more parsed aggregate reports to flat CSV format, including headers\n "} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef parse_forensic_report(feedback_report, sample, msg_date,\n nameservers=None, dns_timeout=2.0,\n strip_attachment_payloads=False,\n parallel=False):\n \"\"\"\n Converts a DMARC forensic report and sample to an ``OrderedDict``\n\n Args:\n feedback_report (str): A message's feedback report as a string\n sample (str): The RFC 822 headers or RFC 822 message sample\n msg_date (str): The message's date header\n nameservers (list): A list of one or more nameservers to use\n (Cloudflare's public DNS resolvers by default)\n dns_timeout (float): Sets the DNS timeout in seconds\n strip_attachment_payloads (bool): Remove attachment payloads from\n forensic report results\n parallel (bool): Parallel processing\n\n Returns:\n OrderedDict: A parsed report and sample\n \"\"\"\n delivery_results = [\"delivered\", \"spam\", \"policy\", 
\"reject\", \"other\"]\n\n try:\n parsed_report = OrderedDict()\n report_values = feedback_report_regex.findall(feedback_report)\n for report_value in report_values:\n key = report_value[0].lower().replace(\"-\", \"_\")\n parsed_report[key] = report_value[1]\n\n if \"arrival_date\" not in parsed_report:\n if msg_date is None:\n raise InvalidForensicReport(\n \"Forensic sample is not a valid email\")\n parsed_report[\"arrival_date\"] = msg_date.isoformat()\n\n if \"version\" not in parsed_report:\n parsed_report[\"version\"] = 1\n\n if \"user_agent\" not in parsed_report:\n parsed_report[\"user_agent\"] = None\n\n if \"delivery_result\" not in parsed_report:\n parsed_report[\"delivery_result\"] = None\n else:\n for delivery_result in delivery_results:\n if delivery_result in parsed_report[\"delivery_result\"].lower():\n parsed_report[\"delivery_result\"] = delivery_result\n break\n if parsed_report[\"delivery_result\"] not in delivery_results:\n parsed_report[\"delivery_result\"] = \"other\"\n\n arrival_utc = human_timestamp_to_datetime(\n parsed_report[\"arrival_date\"], to_utc=True)\n arrival_utc = arrival_utc.strftime(\"%Y-%m-%d %H:%M:%S\")\n parsed_report[\"arrival_date_utc\"] = arrival_utc\n\n ip_address = parsed_report[\"source_ip\"]\n parsed_report_source = get_ip_address_info(ip_address,\n nameservers=nameservers,\n timeout=dns_timeout,\n parallel=parallel)\n parsed_report[\"source\"] = parsed_report_source\n del parsed_report[\"source_ip\"]\n\n if \"identity_alignment\" not in parsed_report:\n parsed_report[\"authentication_mechanisms\"] = []\n elif parsed_report[\"identity_alignment\"] == \"none\":\n parsed_report[\"authentication_mechanisms\"] = []\n del parsed_report[\"identity_alignment\"]\n else:\n auth_mechanisms = parsed_report[\"identity_alignment\"]\n auth_mechanisms = auth_mechanisms.split(\",\")\n parsed_report[\"authentication_mechanisms\"] = auth_mechanisms\n del parsed_report[\"identity_alignment\"]\n\n if \"auth_failure\" not in 
parsed_report:\n parsed_report[\"auth_failure\"] = \"dmarc\"\n auth_failure = parsed_report[\"auth_failure\"].split(\",\")\n parsed_report[\"auth_failure\"] = auth_failure\n\n optional_fields = [\"original_envelope_id\", \"dkim_domain\",\n \"original_mail_from\", \"original_rcpt_to\"]\n for optional_field in optional_fields:\n if optional_field not in parsed_report:\n parsed_report[optional_field] = None\n\n parsed_sample = parse_email(\n sample,\n strip_attachment_payloads=strip_attachment_payloads)\n\n if \"reported_domain\" not in parsed_report:\n parsed_report[\"reported_domain\"] = parsed_sample[\"from\"][\"domain\"]\n\n sample_headers_only = False\n number_of_attachments = len(parsed_sample[\"attachments\"])\n if number_of_attachments < 1 and parsed_sample[\"body\"] is None:\n sample_headers_only = True\n if sample_headers_only and parsed_sample[\"has_defects\"]:\n del parsed_sample[\"defects\"]\n del parsed_sample[\"defects_categories\"]\n del parsed_sample[\"has_defects\"]\n parsed_report[\"sample_headers_only\"] = sample_headers_only\n parsed_report[\"sample\"] = sample\n parsed_report[\"parsed_sample\"] = parsed_sample\n\n return parsed_report\n\n except KeyError as error:\n raise InvalidForensicReport(\"Missing value: {0}\".format(\n error.__str__()))\n\n except Exception as error:\n raise InvalidForensicReport(\n \"Unexpected error: {0}\".format(error.__str__()))", "response": "Parses a DMARC forensic report and returns a dictionary of the relevant information."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef parsed_forensic_reports_to_csv(reports):\n fields = [\"feedback_type\", \"user_agent\", \"version\", \"original_envelope_id\",\n \"original_mail_from\", \"original_rcpt_to\", \"arrival_date\",\n \"arrival_date_utc\", \"subject\", \"message_id\",\n \"authentication_results\", \"dkim_domain\", \"source_ip_address\",\n \"source_country\", \"source_reverse_dns\", \"source_base_domain\",\n 
\"delivery_result\", \"auth_failure\", \"reported_domain\",\n \"authentication_mechanisms\", \"sample_headers_only\"]\n\n if type(reports) == OrderedDict:\n reports = [reports]\n csv_file = StringIO()\n csv_writer = DictWriter(csv_file, fieldnames=fields)\n csv_writer.writeheader()\n for report in reports:\n row = report.copy()\n row[\"source_ip_address\"] = report[\"source\"][\"ip_address\"]\n row[\"source_reverse_dns\"] = report[\"source\"][\"reverse_dns\"]\n row[\"source_base_domain\"] = report[\"source\"][\"base_domain\"]\n row[\"source_country\"] = report[\"source\"][\"country\"]\n del row[\"source\"]\n row[\"subject\"] = report[\"parsed_sample\"][\"subject\"]\n row[\"auth_failure\"] = \",\".join(report[\"auth_failure\"])\n authentication_mechanisms = report[\"authentication_mechanisms\"]\n row[\"authentication_mechanisms\"] = \",\".join(\n authentication_mechanisms)\n del row[\"sample\"]\n del row[\"parsed_sample\"]\n csv_writer.writerow(row)\n\n return csv_file.getvalue()", "response": "Converts one or more parsed forensic reports to a flat CSV format including the headers of the forensic reports"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef parse_report_email(input_, nameservers=None, dns_timeout=2.0,\n strip_attachment_payloads=False, parallel=False):\n \"\"\"\n Parses a DMARC report from an email\n\n Args:\n input_: An emailed DMARC report in RFC 822 format, as bytes or a string\n nameservers (list): A list of one or more nameservers to use\n dns_timeout (float): Sets the DNS timeout in seconds\n strip_attachment_payloads (bool): Remove attachment payloads from\n forensic report results\n parallel (bool): Parallel processing\n\n Returns:\n OrderedDict:\n * ``report_type``: ``aggregate`` or ``forensic``\n * ``report``: The parsed report\n \"\"\"\n result = None\n\n try:\n if is_outlook_msg(input_):\n input_ = convert_outlook_msg(input_)\n if type(input_) == bytes:\n input_ = 
input_.decode(encoding=\"utf8\")\n msg = mailparser.parse_from_string(input_)\n msg_headers = json.loads(msg.headers_json)\n date = email.utils.format_datetime(datetime.utcnow())\n if \"Date\" in msg_headers:\n date = human_timestamp_to_datetime(\n msg_headers[\"Date\"])\n msg = email.message_from_string(input_)\n\n except Exception as e:\n raise InvalidDMARCReport(e.__str__())\n subject = None\n feedback_report = None\n sample = None\n if \"Subject\" in msg_headers:\n subject = msg_headers[\"Subject\"]\n for part in msg.walk():\n content_type = part.get_content_type()\n payload = part.get_payload()\n if type(payload) != list:\n payload = [payload]\n payload = payload[0].__str__()\n if content_type == \"message/feedback-report\":\n try:\n if \"Feedback-Type\" in payload:\n feedback_report = payload\n else:\n feedback_report = b64decode(payload).__str__()\n feedback_report = feedback_report.lstrip(\n \"b'\").rstrip(\"'\")\n feedback_report = feedback_report.replace(\"\\\\r\", \"\")\n feedback_report = feedback_report.replace(\"\\\\n\", \"\\n\")\n except (ValueError, TypeError, binascii.Error):\n feedback_report = payload\n\n elif content_type == \"text/rfc822-headers\":\n sample = payload\n elif content_type == \"message/rfc822\":\n sample = payload\n else:\n try:\n payload = b64decode(payload)\n if payload.startswith(MAGIC_ZIP) or \\\n payload.startswith(MAGIC_GZIP) or \\\n payload.startswith(MAGIC_XML):\n ns = nameservers\n aggregate_report = parse_aggregate_report_file(\n payload,\n nameservers=ns,\n dns_timeout=dns_timeout,\n parallel=parallel)\n result = OrderedDict([(\"report_type\", \"aggregate\"),\n (\"report\", aggregate_report)])\n return result\n\n except (TypeError, ValueError, binascii.Error):\n pass\n\n except InvalidAggregateReport as e:\n error = 'Message with subject \"{0}\" ' \\\n 'is not a valid ' \\\n 'aggregate DMARC report: {1}'.format(subject, e)\n raise InvalidAggregateReport(error)\n\n except FileNotFoundError as e:\n error = 'Unable to 
parse message with ' \\\n 'subject \"{0}\": {1}'.format(subject, e)\n raise InvalidDMARCReport(error)\n\n if feedback_report and sample:\n try:\n forensic_report = parse_forensic_report(\n feedback_report,\n sample,\n date,\n nameservers=nameservers,\n dns_timeout=dns_timeout,\n strip_attachment_payloads=strip_attachment_payloads,\n parallel=parallel)\n except InvalidForensicReport as e:\n error = 'Message with subject \"{0}\" ' \\\n 'is not a valid ' \\\n 'forensic DMARC report: {1}'.format(subject, e)\n raise InvalidForensicReport(error)\n except Exception as e:\n raise InvalidForensicReport(e.__str__())\n\n result = OrderedDict([(\"report_type\", \"forensic\"),\n (\"report\", forensic_report)])\n return result\n\n if result is None:\n error = 'Message with subject \"{0}\" is ' \\\n 'not a valid DMARC report'.format(subject)\n raise InvalidDMARCReport(error)", "response": "Parses an email report from an emailed DMARC report in RFC 822 format."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nparses a DMARC aggregate or forensic report file at the given path.", "response": "def parse_report_file(input_, nameservers=None, dns_timeout=2.0,\n strip_attachment_payloads=False, parallel=False):\n \"\"\"Parses a DMARC aggregate or forensic file at the given path, a\n file-like object. 
or bytes\n\n Args:\n input_: A path to a file, a file like object, or bytes\n nameservers (list): A list of one or more nameservers to use\n (Cloudflare's public DNS resolvers by default)\n dns_timeout (float): Sets the DNS timeout in seconds\n strip_attachment_payloads (bool): Remove attachment payloads from\n forensic report results\n parallel (bool): Parallel processing\n\n Returns:\n OrderedDict: The parsed DMARC report\n \"\"\"\n if type(input_) == str:\n file_object = open(input_, \"rb\")\n elif type(input_) == bytes:\n file_object = BytesIO(input_)\n else:\n file_object = input_\n\n content = file_object.read()\n try:\n report = parse_aggregate_report_file(content, nameservers=nameservers,\n dns_timeout=dns_timeout,\n parallel=parallel)\n results = OrderedDict([(\"report_type\", \"aggregate\"),\n (\"report\", report)])\n except InvalidAggregateReport:\n try:\n sa = strip_attachment_payloads\n results = parse_report_email(content,\n nameservers=nameservers,\n dns_timeout=dns_timeout,\n strip_attachment_payloads=sa,\n parallel=parallel)\n except InvalidDMARCReport:\n raise InvalidDMARCReport(\"Not a valid aggregate or forensic \"\n \"report\")\n return results"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn a list of IMAP server s capabilities", "response": "def get_imap_capabilities(server):\n \"\"\"\n Returns a list of an IMAP server's capabilities\n\n Args:\n server (imapclient.IMAPClient): An instance of imapclient.IMAPClient\n\n Returns (list): A list of capabilities\n \"\"\"\n\n capabilities = list(map(str, list(server.capabilities())))\n for i in range(len(capabilities)):\n capabilities[i] = str(capabilities[i]).replace(\"b'\",\n \"\").replace(\"'\",\n \"\")\n logger.debug(\"IMAP server supports: {0}\".format(capabilities))\n\n return capabilities"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nfetches and parses DMARC reports from the INBOX file and returns a dictionary 
of the results.", "response": "def get_dmarc_reports_from_inbox(host=None,\n user=None,\n password=None,\n connection=None,\n port=None,\n ssl=True,\n ssl_context=None,\n move_supported=None,\n reports_folder=\"INBOX\",\n archive_folder=\"Archive\",\n delete=False,\n test=False,\n nameservers=None,\n dns_timeout=6.0,\n strip_attachment_payloads=False,\n results=None):\n \"\"\"\n Fetches and parses DMARC reports from an inbox\n\n Args:\n host: The mail server hostname or IP address\n user: The mail server user\n password: The mail server password\n connection: An IMAPClient connection to reuse\n port: The mail server port\n ssl (bool): Use SSL/TLS\n ssl_context (SSLContext): An SSL context\n move_supported: Indicate if the IMAP server supports the MOVE command\n (autodetect if None)\n reports_folder: The IMAP folder where reports can be found\n archive_folder: The folder to move processed mail to\n delete (bool): Delete messages after processing them\n test (bool): Do not move or delete messages after processing them\n nameservers (list): A list of DNS nameservers to query\n dns_timeout (float): Set the DNS query timeout\n strip_attachment_payloads (bool): Remove attachment payloads from\n forensic report results\n results (dict): Results from the previous run\n\n Returns:\n OrderedDict: Lists of ``aggregate_reports`` and ``forensic_reports``\n \"\"\"\n\n def chunks(l, n):\n \"\"\"Yield successive n-sized chunks from l.\"\"\"\n for i in range(0, len(l), n):\n yield l[i:i + n]\n\n if delete and test:\n raise ValueError(\"delete and test options are mutually exclusive\")\n\n if connection is None and (user is None or password is None):\n raise ValueError(\"Must supply a connection, or a username and \"\n \"password\")\n\n aggregate_reports = []\n forensic_reports = []\n aggregate_report_msg_uids = []\n forensic_report_msg_uids = []\n aggregate_reports_folder = \"{0}/Aggregate\".format(archive_folder)\n forensic_reports_folder = \"{0}/Forensic\".format(archive_folder)\n 
invalid_reports_folder = \"{0}/Invalid\".format(archive_folder)\n\n if results:\n aggregate_reports = results[\"aggregate_reports\"].copy()\n forensic_reports = results[\"forensic_reports\"].copy()\n\n try:\n if connection:\n server = connection\n else:\n if not ssl:\n logger.debug(\"Connecting to IMAP over plain text\")\n if ssl_context is None:\n ssl_context = create_default_context()\n server = imapclient.IMAPClient(host,\n port=port,\n ssl=ssl,\n ssl_context=ssl_context,\n use_uid=True)\n server.login(user, password)\n\n if move_supported is None:\n server_capabilities = get_imap_capabilities(server)\n move_supported = \"MOVE\" in server_capabilities\n\n def delete_messages(msg_uids):\n logger.debug(\"Deleting message UID(s) {0}\".format(\",\".join(\n str(uid) for uid in msg_uids)))\n if type(msg_uids) == str or type(msg_uids) == int:\n msg_uids = [int(msg_uids)]\n\n server.delete_messages(msg_uids, silent=True)\n server.expunge(msg_uids)\n\n def move_messages(msg_uids, folder):\n if type(msg_uids) == str or type(msg_uids) == int:\n msg_uids = [int(msg_uids)]\n for chunk in chunks(msg_uids, 100):\n if move_supported:\n logger.debug(\"Moving message UID(s) {0} to {1}\".format(\n \",\".join(str(uid) for uid in chunk), folder\n ))\n server.move(chunk, folder)\n else:\n logger.debug(\"Copying message UID(s) {0} to {1}\".format(\n \",\".join(str(uid) for uid in chunk), folder\n ))\n server.copy(msg_uids, folder)\n delete_messages(msg_uids)\n\n if not server.folder_exists(archive_folder):\n logger.debug(\"Creating IMAP folder: {0}\".format(archive_folder))\n server.create_folder(archive_folder)\n try:\n # Test subfolder creation\n if not server.folder_exists(aggregate_reports_folder):\n server.create_folder(aggregate_reports_folder)\n logger.debug(\n \"Creating IMAP folder: {0}\".format(\n aggregate_reports_folder))\n except imapclient.exceptions.IMAPClientError:\n # Only replace / with . when . 
doesn't work\n # This usually indicates a dovecot IMAP server\n aggregate_reports_folder = aggregate_reports_folder.replace(\"/\",\n \".\")\n forensic_reports_folder = forensic_reports_folder.replace(\"/\",\n \".\")\n invalid_reports_folder = invalid_reports_folder.replace(\"/\",\n \".\")\n subfolders = [aggregate_reports_folder,\n forensic_reports_folder,\n invalid_reports_folder]\n\n for subfolder in subfolders:\n if not server.folder_exists(subfolder):\n logger.debug(\n \"Creating IMAP folder: {0}\".format(subfolder))\n server.create_folder(subfolder)\n server.select_folder(reports_folder)\n messages = server.search()\n total_messages = len(messages)\n logger.debug(\"Found {0} messages in IMAP folder {1}\".format(\n len(messages), reports_folder))\n for i in range(len(messages)):\n msg_uid = messages[i]\n logger.debug(\"Processing message {0} of {1}: UID {2}\".format(\n i+1,\n total_messages,\n msg_uid\n ))\n try:\n try:\n raw_msg = server.fetch(msg_uid,\n [\"RFC822\"])[msg_uid]\n msg_keys = [b'RFC822', b'BODY[NULL]', b'BODY[]']\n msg_key = ''\n for key in msg_keys:\n if key in raw_msg.keys():\n msg_key = key\n break\n raw_msg = raw_msg[msg_key]\n\n except (ConnectionResetError, socket.error,\n TimeoutError,\n imapclient.exceptions.IMAPClientError) as error:\n error = error.__str__().lstrip(\"b'\").rstrip(\"'\").rstrip(\n \".\")\n logger.debug(\"IMAP error: {0}\".format(error.__str__()))\n logger.debug(\"Reconnecting to IMAP\")\n try:\n server.shutdown()\n except Exception as e:\n logger.debug(\n \"Failed to log out: {0}\".format(e.__str__()))\n if not ssl:\n logger.debug(\"Connecting to IMAP over plain text\")\n server = imapclient.IMAPClient(host,\n port=port,\n ssl=ssl,\n ssl_context=ssl_context,\n use_uid=True)\n server.login(user, password)\n server.select_folder(reports_folder)\n raw_msg = server.fetch(msg_uid,\n [\"RFC822\"])[msg_uid][b\"RFC822\"]\n\n msg_content = raw_msg.decode(\"utf-8\", errors=\"replace\")\n sa = strip_attachment_payloads\n 
parsed_email = parse_report_email(msg_content,\n nameservers=nameservers,\n dns_timeout=dns_timeout,\n strip_attachment_payloads=sa)\n if parsed_email[\"report_type\"] == \"aggregate\":\n aggregate_reports.append(parsed_email[\"report\"])\n aggregate_report_msg_uids.append(msg_uid)\n elif parsed_email[\"report_type\"] == \"forensic\":\n forensic_reports.append(parsed_email[\"report\"])\n forensic_report_msg_uids.append(msg_uid)\n except InvalidDMARCReport as error:\n logger.warning(error.__str__())\n if not test:\n if delete:\n logger.debug(\n \"Deleting message UID {0}\".format(msg_uid))\n delete_messages([msg_uid])\n else:\n logger.debug(\n \"Moving message UID {0} to {1}\".format(\n msg_uid, invalid_reports_folder))\n move_messages([msg_uid], invalid_reports_folder)\n\n if not test:\n if delete:\n processed_messages = aggregate_report_msg_uids + \\\n forensic_report_msg_uids\n\n number_of_processed_msgs = len(processed_messages)\n for i in range(number_of_processed_msgs):\n msg_uid = processed_messages[i]\n logger.debug(\n \"Deleting message {0} of {1}: UID {2}\".format(\n i + 1, number_of_processed_msgs, msg_uid))\n try:\n delete_messages([msg_uid])\n\n except imapclient.exceptions.IMAPClientError as e:\n e = e.__str__().lstrip(\"b'\").rstrip(\n \"'\").rstrip(\".\")\n message = \"Error deleting message UID\"\n e = \"{0} {1}: \" \"{2}\".format(message, msg_uid, e)\n logger.error(\"IMAP error: {0}\".format(e))\n except (ConnectionResetError, socket.error,\n TimeoutError) as e:\n logger.debug(\"IMAP error: {0}\".format(e.__str__()))\n logger.debug(\"Reconnecting to IMAP\")\n try:\n server.shutdown()\n except Exception as e:\n logger.debug(\n \"Failed to log out: {0}\".format(e.__str__()))\n if not ssl:\n logger.debug(\"Connecting to IMAP over plain text\")\n server = imapclient.IMAPClient(host,\n port=port,\n ssl=ssl,\n ssl_context=ssl_context,\n use_uid=True)\n server.login(user, password)\n server.select_folder(reports_folder)\n delete_messages([msg_uid])\n 
else:\n            if len(aggregate_report_msg_uids) > 0:\n                log_message = \"Moving aggregate report messages from\"\n                logger.debug(\n                    \"{0} {1} to {2}\".format(\n                        log_message, reports_folder,\n                        aggregate_reports_folder))\n                number_of_agg_report_msgs = len(aggregate_report_msg_uids)\n                for i in range(number_of_agg_report_msgs):\n                    msg_uid = aggregate_report_msg_uids[i]\n                    logger.debug(\n                        \"Moving message {0} of {1}: UID {2}\".format(\n                            i+1, number_of_agg_report_msgs, msg_uid))\n                    try:\n                        move_messages([msg_uid],\n                                      aggregate_reports_folder)\n                    except imapclient.exceptions.IMAPClientError as e:\n                        e = e.__str__().lstrip(\"b'\").rstrip(\n                            \"'\").rstrip(\".\")\n                        message = \"Error moving message UID\"\n                        e = \"{0} {1}: {2}\".format(message, msg_uid, e)\n                        logger.error(\"IMAP error: {0}\".format(e))\n                    except (ConnectionResetError, socket.error,\n                            TimeoutError) as error:\n                        logger.debug(\"IMAP error: {0}\".format(\n                            error.__str__()))\n                        logger.debug(\"Reconnecting to IMAP\")\n                        try:\n                            server.shutdown()\n                        except Exception as e:\n                            logger.debug(\"Failed to log out: {0}\".format(\n                                e.__str__()))\n                        if not ssl:\n                            logger.debug(\n                                \"Connecting to IMAP over plain text\")\n                        server = imapclient.IMAPClient(\n                            host,\n                            port=port,\n                            ssl=ssl,\n                            ssl_context=ssl_context,\n                            use_uid=True\n                        )\n                        server.login(user, password)\n                        server.select_folder(reports_folder)\n                        move_messages([msg_uid],\n                                      aggregate_reports_folder)\n\n            if len(forensic_report_msg_uids) > 0:\n                message = \"Moving forensic report messages from\"\n                logger.debug(\n                    \"{0} {1} to {2}\".format(message,\n                                            reports_folder,\n                                            forensic_reports_folder))\n                number_of_forensic_msgs = len(forensic_report_msg_uids)\n                for i in range(number_of_forensic_msgs):\n                    msg_uid = forensic_report_msg_uids[i]\n                    message = \"Moving message\"\n                    logger.debug(\"{0} {1} of {2}: UID {3}\".format(\n                        message,\n                        i + 1, number_of_forensic_msgs, msg_uid))\n                    try:\n                        move_messages([msg_uid],\n                                      forensic_reports_folder)\n                    except imapclient.exceptions.IMAPClientError as e:\n                        e = 
e.__str__().lstrip(\"b'\").rstrip(\n \"'\").rstrip(\".\")\n e = \"Error moving message UID {0}: {1}\".format(\n msg_uid, e)\n logger.error(\"IMAP error: {0}\".format(e))\n except (ConnectionResetError, TimeoutError) as error:\n logger.debug(\"IMAP error: {0}\".format(\n error.__str__()))\n logger.debug(\"Reconnecting to IMAP\")\n try:\n server.shutdown()\n except Exception as e:\n logger.debug(\"Failed to \"\n \"disconnect: {0}\".format(\n e.__str__()))\n if not ssl:\n logger.debug(\n \"Connecting to IMAP over plain text\")\n server = imapclient.IMAPClient(\n host,\n port=port,\n ssl=ssl,\n ssl_context=ssl_context,\n use_uid=True)\n server.login(user, password)\n server.select_folder(reports_folder)\n move_messages([msg_uid],\n forensic_reports_folder)\n\n results = OrderedDict([(\"aggregate_reports\", aggregate_reports),\n (\"forensic_reports\", forensic_reports)])\n\n if not test and total_messages > 0:\n # Process emails that came in during the last run\n results = get_dmarc_reports_from_inbox(\n host=host,\n user=user,\n password=password,\n connection=connection,\n port=port,\n ssl=ssl,\n ssl_context=ssl_context,\n move_supported=move_supported,\n reports_folder=reports_folder,\n archive_folder=archive_folder,\n delete=delete,\n test=test,\n nameservers=nameservers,\n dns_timeout=dns_timeout,\n strip_attachment_payloads=strip_attachment_payloads,\n results=results\n )\n\n return results\n except imapclient.exceptions.IMAPClientError as error:\n error = error.__str__().lstrip(\"b'\").rstrip(\"'\").rstrip(\".\")\n # Workaround for random Exchange/Office365 IMAP errors\n if \"unexpected response\" in error or \"BAD\" in error:\n sleep_minutes = 5\n logger.debug(\n \"{0}. 
\"\n \"Waiting {1} minutes before trying again\".format(\n error,\n sleep_minutes))\n time.sleep(sleep_minutes * 60)\n results = get_dmarc_reports_from_inbox(\n host=host,\n user=user,\n password=password,\n connection=connection,\n port=port,\n ssl=ssl,\n ssl_context=ssl_context,\n move_supported=move_supported,\n reports_folder=reports_folder,\n archive_folder=archive_folder,\n delete=delete,\n test=test,\n nameservers=nameservers,\n dns_timeout=dns_timeout,\n strip_attachment_payloads=strip_attachment_payloads,\n results=results\n )\n\n return results\n\n raise IMAPError(error)\n except socket.gaierror:\n raise IMAPError(\"DNS resolution failed\")\n except ConnectionRefusedError:\n raise IMAPError(\"Connection refused\")\n except ConnectionResetError:\n sleep_minutes = 5\n logger.debug(\n \"Connection reset. \"\n \"Waiting {0} minutes before trying again\".format(sleep_minutes))\n time.sleep(sleep_minutes * 60)\n results = get_dmarc_reports_from_inbox(\n host=host,\n user=user,\n password=password,\n connection=connection,\n port=port,\n ssl=ssl,\n ssl_context=ssl_context,\n move_supported=move_supported,\n reports_folder=reports_folder,\n archive_folder=archive_folder,\n delete=delete,\n test=test,\n nameservers=nameservers,\n dns_timeout=dns_timeout,\n strip_attachment_payloads=strip_attachment_payloads,\n results=results\n )\n\n return results\n except ConnectionAbortedError:\n raise IMAPError(\"Connection aborted\")\n except TimeoutError:\n raise IMAPError(\"Connection timed out\")\n except SSLError as error:\n raise IMAPError(\"SSL error: {0}\".format(error.__str__()))\n except CertificateError as error:\n raise IMAPError(\"Certificate error: {0}\".format(error.__str__()))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsaving the results to the given directory.", "response": "def save_output(results, output_directory=\"output\"):\n \"\"\"\n Save report data in the given directory\n\n Args:\n results (OrderedDict): Parsing 
results\n        output_directory: The path to the directory to save in\n    \"\"\"\n\n    aggregate_reports = results[\"aggregate_reports\"]\n    forensic_reports = results[\"forensic_reports\"]\n\n    if os.path.exists(output_directory):\n        if not os.path.isdir(output_directory):\n            raise ValueError(\"{0} is not a directory\".format(output_directory))\n    else:\n        os.makedirs(output_directory)\n\n    with open(\"{0}\".format(os.path.join(output_directory, \"aggregate.json\")),\n              \"w\", newline=\"\\n\", encoding=\"utf-8\") as agg_json:\n        agg_json.write(json.dumps(aggregate_reports, ensure_ascii=False,\n                                  indent=2))\n\n    with open(\"{0}\".format(os.path.join(output_directory, \"aggregate.csv\")),\n              \"w\", newline=\"\\n\", encoding=\"utf-8\") as agg_csv:\n        csv = parsed_aggregate_reports_to_csv(aggregate_reports)\n        agg_csv.write(csv)\n\n    with open(\"{0}\".format(os.path.join(output_directory, \"forensic.json\")),\n              \"w\", newline=\"\\n\", encoding=\"utf-8\") as for_json:\n        for_json.write(json.dumps(forensic_reports, ensure_ascii=False,\n                                  indent=2))\n\n    with open(\"{0}\".format(os.path.join(output_directory, \"forensic.csv\")),\n              \"w\", newline=\"\\n\", encoding=\"utf-8\") as for_csv:\n        csv = parsed_forensic_reports_to_csv(forensic_reports)\n        for_csv.write(csv)\n\n    samples_directory = os.path.join(output_directory, \"samples\")\n    if not os.path.exists(samples_directory):\n        os.makedirs(samples_directory)\n\n    sample_filenames = []\n    for forensic_report in forensic_reports:\n        sample = forensic_report[\"sample\"]\n        message_count = 0\n        parsed_sample = forensic_report[\"parsed_sample\"]\n        subject = parsed_sample[\"filename_safe_subject\"]\n        filename = subject\n\n        while filename in sample_filenames:\n            message_count += 1\n            filename = \"{0} ({1})\".format(subject, message_count)\n\n        sample_filenames.append(filename)\n\n        filename = \"{0}.eml\".format(filename)\n        path = os.path.join(samples_directory, filename)\n        with open(path, \"w\", newline=\"\\n\", encoding=\"utf-8\") as sample_file:\n            
sample_file.write(sample)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_report_zip(results):\n def add_subdir(root_path, subdir):\n subdir_path = os.path.join(root_path, subdir)\n for subdir_root, subdir_dirs, subdir_files in os.walk(subdir_path):\n for subdir_file in subdir_files:\n subdir_file_path = os.path.join(root_path, subdir, subdir_file)\n if os.path.isfile(subdir_file_path):\n rel_path = os.path.relpath(subdir_root, subdir_file_path)\n subdir_arc_name = os.path.join(rel_path, subdir_file)\n zip_file.write(subdir_file_path, subdir_arc_name)\n for subdir in subdir_dirs:\n add_subdir(subdir_path, subdir)\n\n storage = BytesIO()\n tmp_dir = tempfile.mkdtemp()\n try:\n save_output(results, tmp_dir)\n with zipfile.ZipFile(storage, 'w', zipfile.ZIP_DEFLATED) as zip_file:\n for root, dirs, files in os.walk(tmp_dir):\n for file in files:\n file_path = os.path.join(root, file)\n if os.path.isfile(file_path):\n arcname = os.path.join(os.path.relpath(root, tmp_dir),\n file)\n zip_file.write(file_path, arcname)\n for directory in dirs:\n dir_path = os.path.join(root, directory)\n if os.path.isdir(dir_path):\n zip_file.write(dir_path, directory)\n add_subdir(root, directory)\n finally:\n shutil.rmtree(tmp_dir)\n\n return storage.getvalue()", "response": "Creates a zip file of parsed report results"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nemail parsing results as a zip file", "response": "def email_results(results, host, mail_from, mail_to, port=0,\n ssl=False, user=None, password=None, subject=None,\n attachment_filename=None, message=None, ssl_context=None):\n \"\"\"\n Emails parsing results as a zip file\n\n Args:\n results (OrderedDict): Parsing results\n host: Mail server hostname or IP address\n mail_from: The value of the message from header\n mail_to : A list of addresses to mail to\n port (int): Port to use\n ssl (bool): Require a SSL 
connection from the start\n user: An optional username\n password: An optional password\n subject: Overrides the default message subject\n attachment_filename: Override the default attachment filename\n message: Override the default plain text body\n ssl_context: SSL context options\n \"\"\"\n logging.debug(\"Emailing report to: {0}\".format(\",\".join(mail_to)))\n date_string = datetime.now().strftime(\"%Y-%m-%d\")\n if attachment_filename:\n if not attachment_filename.lower().endswith(\".zip\"):\n attachment_filename += \".zip\"\n filename = attachment_filename\n else:\n filename = \"DMARC-{0}.zip\".format(date_string)\n\n assert isinstance(mail_to, list)\n\n msg = MIMEMultipart()\n msg['From'] = mail_from\n msg['To'] = \", \".join(mail_to)\n msg['Date'] = email.utils.formatdate(localtime=True)\n msg['Subject'] = subject or \"DMARC results for {0}\".format(date_string)\n text = message or \"Please see the attached zip file\\n\"\n\n msg.attach(MIMEText(text))\n\n zip_bytes = get_report_zip(results)\n part = MIMEApplication(zip_bytes, Name=filename)\n\n part['Content-Disposition'] = 'attachment; filename=\"{0}\"'.format(filename)\n msg.attach(part)\n\n try:\n if ssl_context is None:\n ssl_context = create_default_context()\n if ssl:\n server = smtplib.SMTP_SSL(host, port=port, context=ssl_context)\n server.connect(host, port)\n server.ehlo_or_helo_if_needed()\n else:\n server = smtplib.SMTP(host, port=port)\n server.connect(host, port)\n server.ehlo_or_helo_if_needed()\n if server.has_extn(\"starttls\"):\n server.starttls(context=ssl_context)\n server.ehlo()\n else:\n logger.warning(\"SMTP server does not support STARTTLS. 
\"\n \"Proceeding in plain text!\")\n if user and password:\n server.login(user, password)\n server.sendmail(mail_from, mail_to, msg.as_string())\n except smtplib.SMTPException as error:\n error = error.__str__().lstrip(\"b'\").rstrip(\"'\").rstrip(\".\")\n raise SMTPError(error)\n except socket.gaierror:\n raise SMTPError(\"DNS resolution failed\")\n except ConnectionRefusedError:\n raise SMTPError(\"Connection refused\")\n except ConnectionResetError:\n raise SMTPError(\"Connection reset\")\n except ConnectionAbortedError:\n raise SMTPError(\"Connection aborted\")\n except TimeoutError:\n raise SMTPError(\"Connection timed out\")\n except SSLError as error:\n raise SMTPError(\"SSL error: {0}\".format(error.__str__()))\n except CertificateError as error:\n raise SMTPError(\"Certificate error: {0}\".format(error.__str__()))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nwatch an IMAP inbox and return the parsed results.", "response": "def watch_inbox(host, username, password, callback, port=None, ssl=True,\n ssl_context=None, reports_folder=\"INBOX\",\n archive_folder=\"Archive\", delete=False, test=False, wait=30,\n nameservers=None, dns_timeout=6.0,\n strip_attachment_payloads=False):\n \"\"\"\n Use an IDLE IMAP connection to parse incoming emails, and pass the results\n to a callback function\n\n Args:\n host: The mail server hostname or IP address\n username: The mail server username\n password: The mail server password\n callback: The callback function to receive the parsing results\n port: The mail server port\n ssl (bool): Use SSL/TLS\n ssl_context (SSLContext): A SSL context\n reports_folder: The IMAP folder where reports can be found\n archive_folder: The folder to move processed mail to\n delete (bool): Delete messages after processing them\n test (bool): Do not move or delete messages after processing them\n wait (int): Number of seconds to wait for a IMAP IDLE response\n nameservers (list): A list of one or more 
nameservers to use\n (Cloudflare's public DNS resolvers by default)\n dns_timeout (float): Set the DNS query timeout\n strip_attachment_payloads (bool): Replace attachment payloads in\n forensic report samples with None\n \"\"\"\n rf = reports_folder\n af = archive_folder\n ns = nameservers\n dt = dns_timeout\n if ssl_context is None:\n ssl_context = create_default_context()\n server = imapclient.IMAPClient(host, port=port, ssl=ssl,\n ssl_context=ssl_context,\n use_uid=True)\n\n try:\n server.login(username, password)\n imap_capabilities = get_imap_capabilities(server)\n if \"IDLE\" not in imap_capabilities:\n raise IMAPError(\"Cannot watch inbox: IMAP server does not support \"\n \"the IDLE command\")\n\n ms = \"MOVE\" in imap_capabilities\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n server.idle()\n\n except imapclient.exceptions.IMAPClientError as error:\n error = error.__str__().replace(\"b'\", \"\").replace(\"'\", \"\")\n # Workaround for random Exchange/Office365 IMAP errors\n if \"unexpected response\" in error or \"BAD\" in error:\n sleep_minutes = 5\n logger.debug(\n \"{0}. 
\"\n \"Waiting {1} minutes before trying again\".format(\n error,\n sleep_minutes))\n logger.debug(\"Reconnecting watcher\")\n try:\n server.logout()\n except Exception as e:\n logger.debug(\"Failed to log out: {0}\".format(e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n ms = \"MOVE\" in get_imap_capabilities(server)\n sa = strip_attachment_payloads\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt,\n strip_attachment_payloads=sa)\n callback(res)\n server.idle()\n else:\n raise IMAPError(error)\n except socket.gaierror:\n raise IMAPError(\"DNS resolution failed\")\n except ConnectionRefusedError:\n raise IMAPError(\"Connection refused\")\n except ConnectionResetError:\n logger.debug(\"IMAP error: Connection reset\")\n logger.debug(\"Reconnecting watcher\")\n try:\n server.shutdown()\n except Exception as e:\n logger.debug(\"Failed to disconnect: {0}\".format(e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n ms = \"MOVE\" in get_imap_capabilities(server)\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n except KeyError:\n logger.debug(\"IMAP error: Server returned unexpected result\")\n logger.debug(\"Reconnecting watcher\")\n try:\n server.logout()\n except Exception as e:\n logger.debug(\"Failed to log out: {0}\".format(e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n ms = \"MOVE\" in get_imap_capabilities(server)\n res = 
get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n except ConnectionAbortedError:\n raise IMAPError(\"Connection aborted\")\n except TimeoutError:\n raise IMAPError(\"Connection timed out\")\n except SSLError as error:\n raise IMAPError(\"SSL error: {0}\".format(error.__str__()))\n except CertificateError as error:\n raise IMAPError(\"Certificate error: {0}\".format(error.__str__()))\n except BrokenPipeError:\n logger.debug(\"IMAP error: Broken pipe\")\n logger.debug(\"Reconnecting watcher\")\n try:\n server.shutdown()\n except Exception as e:\n logger.debug(\"Failed to disconnect: {0}\".format(e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n ms = \"MOVE\" in get_imap_capabilities(server)\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n\n while True:\n try:\n # Refresh the IDLE session every 5 minutes to stay connected\n if time.monotonic() - idle_start_time > 5 * 60:\n logger.debug(\"IMAP: Refreshing IDLE session\")\n server.idle_done()\n server.idle()\n idle_start_time = time.monotonic()\n responses = server.idle_check(timeout=wait)\n if responses is not None:\n if len(responses) == 0:\n # Gmail/G-Suite does not generate anything in the responses\n server.idle_done()\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n idle_start_time = time.monotonic()\n for response in responses:\n logging.debug(\"Received response: {0}\".format(response))\n if response[0] > 0 and 
response[1] == b'RECENT':\n server.idle_done()\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n idle_start_time = time.monotonic()\n break\n except imapclient.exceptions.IMAPClientError as error:\n error = error.__str__().replace(\"b'\", \"\").replace(\"'\", \"\")\n # Workaround for random Exchange/Office365 IMAP errors\n if \"unexpected response\" in error or \"BAD\" in error:\n sleep_minutes = 5\n logger.debug(\n \"{0}. \"\n \"Waiting {1} minutes before trying again\".format(\n error,\n sleep_minutes))\n logger.debug(\"Reconnecting watcher\")\n try:\n server.logout()\n except Exception as e:\n logger.debug(\"Failed to disconnect: {0}\".format(\n e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n ms = \"MOVE\" in get_imap_capabilities(server)\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n else:\n raise IMAPError(error)\n except socket.gaierror:\n raise IMAPError(\"DNS resolution failed\")\n except ConnectionRefusedError:\n raise IMAPError(\"Connection refused\")\n except (KeyError, socket.error, BrokenPipeError, ConnectionResetError):\n logger.debug(\"IMAP error: Connection reset\")\n logger.debug(\"Reconnecting watcher\")\n try:\n server.logout()\n except Exception as e:\n logger.debug(\"Failed to disconnect: {0}\".format(e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n ms = \"MOVE\" in get_imap_capabilities(server)\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n 
archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n except KeyError:\n logger.debug(\"IMAP error: Server returned unexpected result\")\n logger.debug(\"Reconnecting watcher\")\n try:\n server.logout()\n except Exception as e:\n logger.debug(\"Failed to log out: {0}\".format(e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n ms = \"MOVE\" in get_imap_capabilities(server)\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n except ConnectionAbortedError:\n raise IMAPError(\"Connection aborted\")\n except TimeoutError:\n raise IMAPError(\"Connection timed out\")\n except SSLError as error:\n raise IMAPError(\"SSL error: {0}\".format(error.__str__()))\n except CertificateError as error:\n raise IMAPError(\"Certificate error: {0}\".format(error.__str__()))\n except BrokenPipeError:\n logger.debug(\"IMAP error: Broken pipe\")\n logger.debug(\"Reconnecting watcher\")\n try:\n server.shutdown()\n except Exception as e:\n logger.debug(\"Failed to disconnect: {0}\".format(e.__str__()))\n server = imapclient.IMAPClient(host)\n server.login(username, password)\n server.select_folder(rf)\n idle_start_time = time.monotonic()\n res = get_dmarc_reports_from_inbox(connection=server,\n move_supported=ms,\n reports_folder=rf,\n archive_folder=af,\n delete=delete,\n test=test,\n nameservers=ns,\n dns_timeout=dt)\n callback(res)\n server.idle()\n except KeyboardInterrupt:\n break\n\n try:\n server.idle_done()\n except BrokenPipeError:\n pass"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nsave aggregate reports to Splunk", "response": "def save_aggregate_reports_to_splunk(self, aggregate_reports):\n \"\"\"\n 
Saves aggregate DMARC reports to Splunk\n\n Args:\n aggregate_reports: A list of aggregate report dictionaries\n to save in Splunk\n\n \"\"\"\n logger.debug(\"Saving aggregate reports to Splunk\")\n if type(aggregate_reports) == dict:\n aggregate_reports = [aggregate_reports]\n\n if len(aggregate_reports) < 1:\n return\n\n data = self._common_data.copy()\n json_str = \"\"\n for report in aggregate_reports:\n for record in report[\"records\"]:\n new_report = dict()\n for metadata in report[\"report_metadata\"]:\n new_report[metadata] = report[\"report_metadata\"][metadata]\n new_report[\"published_policy\"] = report[\"policy_published\"]\n new_report[\"source_ip_address\"] = record[\"source\"][\n \"ip_address\"]\n new_report[\"source_country\"] = record[\"source\"][\"country\"]\n new_report[\"source_reverse_dns\"] = record[\"source\"][\n \"reverse_dns\"]\n new_report[\"source_base_domain\"] = record[\"source\"][\n \"base_domain\"]\n new_report[\"message_count\"] = record[\"count\"]\n new_report[\"disposition\"] = record[\"policy_evaluated\"][\n \"disposition\"\n ]\n new_report[\"spf_aligned\"] = record[\"alignment\"][\"spf\"]\n new_report[\"dkim_aligned\"] = record[\"alignment\"][\"dkim\"]\n new_report[\"passed_dmarc\"] = record[\"alignment\"][\"dmarc\"]\n new_report[\"header_from\"] = record[\"identifiers\"][\n \"header_from\"]\n new_report[\"envelope_from\"] = record[\"identifiers\"][\n \"envelope_from\"]\n if \"dkim\" in record[\"auth_results\"]:\n new_report[\"dkim_results\"] = record[\"auth_results\"][\n \"dkim\"]\n if \"spf\" in record[\"auth_results\"]:\n new_report[\"spf_results\"] = record[\"auth_results\"][\n \"spf\"]\n\n data[\"sourcetype\"] = \"dmarc:aggregate\"\n timestamp = human_timestamp_to_timestamp(\n new_report[\"begin_date\"])\n data[\"time\"] = timestamp\n data[\"event\"] = new_report.copy()\n json_str += \"{0}\\n\".format(json.dumps(data))\n\n if not self.session.verify:\n logger.debug(\"Skipping certificate verification for Splunk HEC\")\n 
try:\n            response = self.session.post(self.url, data=json_str,\n                                         timeout=self.timeout)\n            response = response.json()\n        except Exception as e:\n            raise SplunkError(e.__str__())\n        if response[\"code\"] != 0:\n            raise SplunkError(response[\"text\"])"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef save_forensic_reports_to_splunk(self, forensic_reports):\n        logger.debug(\"Saving forensic reports to Splunk\")\n        if type(forensic_reports) == dict:\n            forensic_reports = [forensic_reports]\n\n        if len(forensic_reports) < 1:\n            return\n\n        json_str = \"\"\n        for report in forensic_reports:\n            data = self._common_data.copy()\n            data[\"sourcetype\"] = \"dmarc:forensic\"\n            timestamp = human_timestamp_to_timestamp(\n                report[\"arrival_date_utc\"])\n            data[\"time\"] = timestamp\n            data[\"event\"] = report.copy()\n            json_str += \"{0}\\n\".format(json.dumps(data))\n\n        if not self.session.verify:\n            logger.debug(\"Skipping certificate verification for Splunk HEC\")\n        try:\n            response = self.session.post(self.url, data=json_str,\n                                         timeout=self.timeout)\n            response = response.json()\n        except Exception as e:\n            raise SplunkError(e.__str__())\n        if response[\"code\"] != 0:\n            raise SplunkError(response[\"text\"])", "response": "Saves a list of forensic report dictionaries to Splunk"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef decode_base64(data):\n    data = bytes(data, encoding=\"ascii\")\n    missing_padding = len(data) % 4\n    if missing_padding != 0:\n        data += b'=' * (4 - missing_padding)\n    return base64.b64decode(data)", "response": "Decodes a base64 string, restoring any missing padding, and returns the decoded bytes"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the base domain name for the given domain.", "response": "def get_base_domain(domain, use_fresh_psl=False):\n    \"\"\"\n    Gets the base domain name for the given domain\n\n    .. 
note::\n Results are based on a list of public domain suffixes at\n https://publicsuffix.org/list/public_suffix_list.dat.\n\n Args:\n domain (str): A domain or subdomain\n use_fresh_psl (bool): Download a fresh Public Suffix List\n\n Returns:\n str: The base domain of the given domain\n\n \"\"\"\n psl_path = os.path.join(tempdir, \"public_suffix_list.dat\")\n\n def download_psl():\n url = \"https://publicsuffix.org/list/public_suffix_list.dat\"\n # Use a browser-like user agent string to bypass some proxy blocks\n headers = {\"User-Agent\": USER_AGENT}\n fresh_psl = requests.get(url, headers=headers).text\n with open(psl_path, \"w\", encoding=\"utf-8\") as fresh_psl_file:\n fresh_psl_file.write(fresh_psl)\n\n if use_fresh_psl:\n if not os.path.exists(psl_path):\n download_psl()\n else:\n psl_age = datetime.now() - datetime.fromtimestamp(\n os.stat(psl_path).st_mtime)\n if psl_age > timedelta(hours=24):\n try:\n download_psl()\n except Exception as error:\n logger.warning(\n \"Failed to download an updated PSL {0}\".format(error))\n with open(psl_path, encoding=\"utf-8\") as psl_file:\n psl = publicsuffix2.PublicSuffixList(psl_file)\n\n return psl.get_public_suffix(domain)\n else:\n return publicsuffix2.get_public_suffix(domain)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nquery DNS for a domain or subdomain.", "response": "def query_dns(domain, record_type, cache=None, nameservers=None, timeout=2.0):\n \"\"\"\n Queries DNS\n\n Args:\n domain (str): The domain or subdomain to query about\n record_type (str): The record type to query for\n cache (ExpiringDict): Cache storage\n nameservers (list): A list of one or more nameservers to use\n (Cloudflare's public DNS resolvers by default)\n timeout (float): Sets the DNS timeout in seconds\n\n Returns:\n list: A list of answers\n \"\"\"\n domain = str(domain).lower()\n record_type = record_type.upper()\n cache_key = \"{0}_{1}\".format(domain, record_type)\n if cache:\n records = 
cache.get(cache_key, None)\n if records:\n return records\n\n resolver = dns.resolver.Resolver()\n timeout = float(timeout)\n if nameservers is None:\n nameservers = [\"1.1.1.1\", \"1.0.0.1\",\n \"2606:4700:4700::1111\", \"2606:4700:4700::1001\",\n ]\n resolver.nameservers = nameservers\n resolver.timeout = timeout\n resolver.lifetime = timeout\n if record_type == \"TXT\":\n resource_records = list(map(\n lambda r: r.strings,\n resolver.query(domain, record_type, tcp=True)))\n _resource_record = [\n resource_record[0][:0].join(resource_record)\n for resource_record in resource_records if resource_record]\n records = [r.decode() for r in _resource_record]\n else:\n records = list(map(\n lambda r: r.to_text().replace('\"', '').rstrip(\".\"),\n resolver.query(domain, record_type, tcp=True)))\n if cache:\n cache[cache_key] = records\n\n return records"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nresolve an IP address to a hostname using a reverse DNS query", "response": "def get_reverse_dns(ip_address, cache=None, nameservers=None, timeout=2.0):\n \"\"\"\n Resolves an IP address to a hostname using a reverse DNS query\n\n Args:\n ip_address (str): The IP address to resolve\n cache (ExpiringDict): Cache storage\n nameservers (list): A list of one or more nameservers to use\n (Cloudflare's public DNS resolvers by default)\n timeout (float): Sets the DNS query timeout in seconds\n\n Returns:\n str: The reverse DNS hostname (if any)\n \"\"\"\n hostname = None\n try:\n address = dns.reversename.from_address(ip_address)\n hostname = query_dns(address, \"PTR\", cache=cache,\n nameservers=nameservers,\n timeout=timeout)[0]\n\n except dns.exception.DNSException:\n pass\n\n return hostname"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef human_timestamp_to_datetime(human_timestamp, to_utc=False):\n\n settings = {}\n\n if to_utc:\n settings = {\"TO_TIMEZONE\": \"UTC\"}\n\n return 
dateparser.parse(human_timestamp, settings=settings)", "response": "Converts a human-readable timestamp into a Python datetime object"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_ip_address_country(ip_address, parallel=False):\n    def download_country_database(location=\"GeoLite2-Country.mmdb\"):\n        \"\"\"Downloads the MaxMind Geolite2 Country database\n\n        Args:\n            location (str): Local location for the database file\n        \"\"\"\n        if parallel:\n            logging.warning(\"Cannot download GeoIP database in parallel mode\")\n            return\n        url = \"https://geolite.maxmind.com/download/geoip/database/\" \\\n              \"GeoLite2-Country.tar.gz\"\n        # Use a browser-like user agent string to bypass some proxy blocks\n        headers = {\"User-Agent\": USER_AGENT}\n        original_filename = \"GeoLite2-Country.mmdb\"\n        try:\n            response = requests.get(url, headers=headers)\n            response.raise_for_status()\n            tar_bytes = response.content\n            tar_file = tarfile.open(fileobj=BytesIO(tar_bytes), mode=\"r:gz\")\n            tar_dir = tar_file.getnames()[0]\n            tar_path = \"{0}/{1}\".format(tar_dir, original_filename)\n            tar_file.extract(tar_path)\n            shutil.move(tar_path, location)\n            shutil.rmtree(tar_dir)\n        except Exception as e:\n            logger.warning(\"Error downloading {0}: {1}\".format(url,\n                                                               e.__str__()))\n\n    system_paths = [\n        \"GeoLite2-Country.mmdb\",\n        \"/usr/local/share/GeoIP/GeoLite2-Country.mmdb\",\n        \"/usr/share/GeoIP/GeoLite2-Country.mmdb\",\n        \"/var/lib/GeoIP/GeoLite2-Country.mmdb\",\n        \"/var/local/lib/GeoIP/GeoLite2-Country.mmdb\",\n        \"C:\\\\GeoIP\\\\GeoLite2-Country.mmdb\"\n    ]\n\n    db_path = None\n\n    for system_path in system_paths:\n        if os.path.exists(system_path):\n            db_path = system_path\n            break\n\n    if db_path is None:\n        db_path = os.path.join(tempdir, \"GeoLite2-Country.mmdb\")\n        if not os.path.exists(db_path):\n            download_country_database(db_path)\n            if not os.path.exists(db_path):\n                return None\n        else:\n            db_age = datetime.now() - datetime.fromtimestamp(\n                
os.stat(db_path).st_mtime)\n        if db_age > timedelta(days=7):\n            download_country_database()\n\n    db_reader = geoip2.database.Reader(db_path)\n\n    country = None\n\n    try:\n        country = db_reader.country(ip_address).country.iso_code\n    except geoip2.errors.AddressNotFoundError:\n        pass\n\n    return country", "response": "Returns the ISO country code associated with the given IP address, downloading the MaxMind GeoLite2 Country database if a local copy is missing or outdated."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the reverse DNS and country information for the given IP address.", "response": "def get_ip_address_info(ip_address, cache=None, nameservers=None,\n                        timeout=2.0, parallel=False):\n    \"\"\"\n    Returns reverse DNS and country information for the given IP address\n\n    Args:\n        ip_address (str): The IP address to check\n        cache (ExpiringDict): Cache storage\n        nameservers (list): A list of one or more nameservers to use\n        (Cloudflare's public DNS resolvers by default)\n        timeout (float): Sets the DNS timeout in seconds\n        parallel (bool): parallel processing\n\n    Returns:\n        OrderedDict: ``ip_address``, ``country``, ``reverse_dns``, ``base_domain``\n\n    \"\"\"\n    ip_address = ip_address.lower()\n    if cache:\n        info = cache.get(ip_address, None)\n        if info:\n            return info\n    info = OrderedDict()\n    info[\"ip_address\"] = ip_address\n    reverse_dns = get_reverse_dns(ip_address,\n                                  nameservers=nameservers,\n                                  timeout=timeout)\n    country = get_ip_address_country(ip_address, parallel=parallel)\n    info[\"country\"] = country\n    info[\"reverse_dns\"] = reverse_dns\n    info[\"base_domain\"] = None\n    if reverse_dns is not None:\n        base_domain = get_base_domain(reverse_dns)\n        info[\"base_domain\"] = base_domain\n\n    return info"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nconverting a string to a string that is safe for a filename", "response": "def get_filename_safe_string(string):\n    \"\"\"\n    Converts a string to a string that is safe for a filename\n    Args:\n        string (str): 
A string to make safe for a filename\n\n    Returns:\n        str: A string safe for a filename\n    \"\"\"\n    invalid_filename_chars = ['\\\\', '/', ':', '\"', '*', '?', '|', '\\n',\n                              '\\r']\n    if string is None:\n        string = \"None\"\n    for char in invalid_filename_chars:\n        string = string.replace(char, \"\")\n    string = string.rstrip(\".\")\n\n    return string"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef convert_outlook_msg(msg_bytes):\n    if not is_outlook_msg(msg_bytes):\n        raise ValueError(\"The supplied bytes are not an Outlook MSG file\")\n    orig_dir = os.getcwd()\n    tmp_dir = tempfile.mkdtemp()\n    os.chdir(tmp_dir)\n    with open(\"sample.msg\", \"wb\") as msg_file:\n        msg_file.write(msg_bytes)\n    try:\n        subprocess.check_call([\"msgconvert\", \"sample.msg\"],\n                              stdout=null_file, stderr=null_file)\n        eml_path = \"sample.eml\"\n        with open(eml_path, \"rb\") as eml_file:\n            rfc822 = eml_file.read()\n    except FileNotFoundError:\n        raise EmailParserError(\n            \"Failed to convert Outlook MSG: msgconvert utility not found\")\n    finally:\n        os.chdir(orig_dir)\n        shutil.rmtree(tmp_dir)\n\n    return rfc822", "response": "Converts an Outlook MSG file to an RFC 822 string."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef parse_email(data, strip_attachment_payloads=False):\n\n    if type(data) == bytes:\n        if is_outlook_msg(data):\n            data = convert_outlook_msg(data)\n        data = data.decode(\"utf-8\", errors=\"replace\")\n    parsed_email = mailparser.parse_from_string(data)\n    headers = json.loads(parsed_email.headers_json).copy()\n    parsed_email = json.loads(parsed_email.mail_json).copy()\n    parsed_email[\"headers\"] = headers\n\n    if \"received\" in parsed_email:\n        for received in parsed_email[\"received\"]:\n            if \"date_utc\" in received:\n                if received[\"date_utc\"] is None:\n                    del received[\"date_utc\"]\n                else:\n                    received[\"date_utc\"] = received[\"date_utc\"].replace(\"T\",\n                                                                        \" \")\n\n    if 
\"from\" not in parsed_email:\n if \"From\" in parsed_email[\"headers\"]:\n parsed_email[\"from\"] = parsed_email[\"Headers\"][\"From\"]\n else:\n parsed_email[\"from\"] = None\n\n if parsed_email[\"from\"] is not None:\n parsed_email[\"from\"] = parse_email_address(parsed_email[\"from\"][0])\n\n if \"date\" in parsed_email:\n parsed_email[\"date\"] = parsed_email[\"date\"].replace(\"T\", \" \")\n else:\n parsed_email[\"date\"] = None\n if \"reply_to\" in parsed_email:\n parsed_email[\"reply_to\"] = list(map(lambda x: parse_email_address(x),\n parsed_email[\"reply_to\"]))\n else:\n parsed_email[\"reply_to\"] = []\n\n if \"to\" in parsed_email:\n parsed_email[\"to\"] = list(map(lambda x: parse_email_address(x),\n parsed_email[\"to\"]))\n else:\n parsed_email[\"to\"] = []\n\n if \"cc\" in parsed_email:\n parsed_email[\"cc\"] = list(map(lambda x: parse_email_address(x),\n parsed_email[\"cc\"]))\n else:\n parsed_email[\"cc\"] = []\n\n if \"bcc\" in parsed_email:\n parsed_email[\"bcc\"] = list(map(lambda x: parse_email_address(x),\n parsed_email[\"bcc\"]))\n else:\n parsed_email[\"bcc\"] = []\n\n if \"delivered_to\" in parsed_email:\n parsed_email[\"delivered_to\"] = list(\n map(lambda x: parse_email_address(x),\n parsed_email[\"delivered_to\"])\n )\n\n if \"attachments\" not in parsed_email:\n parsed_email[\"attachments\"] = []\n else:\n for attachment in parsed_email[\"attachments\"]:\n if \"payload\" in attachment:\n payload = attachment[\"payload\"]\n try:\n if \"content_transfer_encoding\" in attachment:\n if attachment[\"content_transfer_encoding\"] == \"base64\":\n payload = decode_base64(payload)\n else:\n payload = str.encode(payload)\n attachment[\"sha256\"] = hashlib.sha256(payload).hexdigest()\n except Exception as e:\n logger.debug(\"Unable to decode attachment: {0}\".format(\n e.__str__()\n ))\n if strip_attachment_payloads:\n for attachment in parsed_email[\"attachments\"]:\n if \"payload\" in attachment:\n del attachment[\"payload\"]\n\n if \"subject\" 
not in parsed_email:\n        parsed_email[\"subject\"] = None\n\n    parsed_email[\"filename_safe_subject\"] = get_filename_safe_string(\n        parsed_email[\"subject\"])\n\n    if \"body\" not in parsed_email:\n        parsed_email[\"body\"] = None\n\n    return parsed_email", "response": "Parses an email string or MSG into a simplified email object."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _str_to_list(s):\n    _list = s.split(\",\")\n    return list(map(lambda i: i.lstrip(), _list))", "response": "Converts a comma-separated string to a list"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef cli_parse(file_path, sa, nameservers, dns_timeout, parallel=False):\n    try:\n        file_results = parse_report_file(file_path,\n                                         nameservers=nameservers,\n                                         dns_timeout=dns_timeout,\n                                         strip_attachment_payloads=sa,\n                                         parallel=parallel)\n    except ParserError as error:\n        return error, file_path\n    finally:\n        global counter\n        with counter.get_lock():\n            counter.value += 1\n    return file_results, file_path", "response": "Parses a single report file. This is separated into its own function to support multiprocessing."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _main():\n    def process_reports(reports_):\n        output_str = \"{0}\\n\".format(json.dumps(reports_,\n                                               ensure_ascii=False,\n                                               indent=2))\n        if not opts.silent:\n            print(output_str)\n        if opts.kafka_hosts:\n            try:\n                ssl_context = None\n                if opts.kafka_skip_certificate_verification:\n                    logger.debug(\"Skipping Kafka certificate verification\")\n                    ssl_context = create_default_context()\n                    ssl_context.check_hostname = False\n                    ssl_context.verify_mode = CERT_NONE\n                kafka_client = kafkaclient.KafkaClient(\n                    opts.kafka_hosts,\n                    username=opts.kafka_username,\n                    password=opts.kafka_password,\n                    ssl_context=ssl_context\n                )\n            except Exception as error_:\n                logger.error(\"Kafka Error: {0}\".format(error_.__str__()))\n        if 
opts.save_aggregate:\n for report in reports_[\"aggregate_reports\"]:\n try:\n if opts.elasticsearch_hosts:\n elastic.save_aggregate_report_to_elasticsearch(\n report,\n index_suffix=opts.elasticsearch_index_suffix,\n monthly_indexes=opts.elasticsearch_monthly_indexes)\n except elastic.AlreadySaved as warning:\n logger.warning(warning.__str__())\n except elastic.ElasticsearchError as error_:\n logger.error(\"Elasticsearch Error: {0}\".format(\n error_.__str__()))\n try:\n if opts.kafka_hosts:\n kafka_client.save_aggregate_reports_to_kafka(\n report, kafka_aggregate_topic)\n except Exception as error_:\n logger.error(\"Kafka Error: {0}\".format(\n error_.__str__()))\n if opts.hec:\n try:\n aggregate_reports_ = reports_[\"aggregate_reports\"]\n if len(aggregate_reports_) > 0:\n hec_client.save_aggregate_reports_to_splunk(\n aggregate_reports_)\n except splunk.SplunkError as e:\n logger.error(\"Splunk HEC error: {0}\".format(e.__str__()))\n if opts.save_forensic:\n for report in reports_[\"forensic_reports\"]:\n try:\n if opts.elasticsearch_hosts:\n elastic.save_forensic_report_to_elasticsearch(\n report,\n index_suffix=opts.elasticsearch_index_suffix,\n monthly_indexes=opts.elasticsearch_monthly_indexes)\n except elastic.AlreadySaved as warning:\n logger.warning(warning.__str__())\n except elastic.ElasticsearchError as error_:\n logger.error(\"Elasticsearch Error: {0}\".format(\n error_.__str__()))\n except InvalidDMARCReport as error_:\n logger.error(error_.__str__())\n try:\n if opts.kafka_hosts:\n kafka_client.save_forensic_reports_to_kafka(\n report, kafka_forensic_topic)\n except Exception as error_:\n logger.error(\"Kafka Error: {0}\".format(\n error_.__str__()))\n if opts.hec:\n try:\n forensic_reports_ = reports_[\"forensic_reports\"]\n if len(forensic_reports_) > 0:\n hec_client.save_forensic_reports_to_splunk(\n forensic_reports_)\n except splunk.SplunkError as e:\n logger.error(\"Splunk HEC error: {0}\".format(e.__str__()))\n\n arg_parser = 
ArgumentParser(description=\"Parses DMARC reports\")\n arg_parser.add_argument(\"-c\", \"--config-file\",\n help=\"A path to a configuration file \"\n \"(--silent implied)\")\n arg_parser.add_argument(\"file_path\", nargs=\"*\",\n help=\"one or more paths to aggregate or forensic \"\n \"report files or emails\")\n strip_attachment_help = \"remove attachment payloads from forensic \" \\\n \"report output\"\n arg_parser.add_argument(\"--strip-attachment-payloads\",\n help=strip_attachment_help, action=\"store_true\")\n arg_parser.add_argument(\"-o\", \"--output\",\n help=\"write output files to the given directory\")\n arg_parser.add_argument(\"-n\", \"--nameservers\", nargs=\"+\",\n help=\"nameservers to query \"\n \"(default is Cloudflare's nameservers)\")\n arg_parser.add_argument(\"-t\", \"--dns_timeout\",\n help=\"number of seconds to wait for an answer \"\n \"from DNS (default: 6.0)\",\n type=float,\n default=6.0)\n arg_parser.add_argument(\"-s\", \"--silent\", action=\"store_true\",\n help=\"only print errors and warnings\")\n arg_parser.add_argument(\"--debug\", action=\"store_true\",\n help=\"print debugging information\")\n arg_parser.add_argument(\"--log-file\", default=None,\n help=\"output logging to a file\")\n arg_parser.add_argument(\"-v\", \"--version\", action=\"version\",\n version=__version__)\n\n aggregate_reports = []\n forensic_reports = []\n\n args = arg_parser.parse_args()\n opts = Namespace(file_path=args.file_path,\n config_file=args.config_file,\n strip_attachment_payloads=args.strip_attachment_payloads,\n output=args.output,\n nameservers=args.nameservers,\n silent=args.silent,\n dns_timeout=args.dns_timeout,\n debug=args.debug,\n save_aggregate=False,\n save_forensic=False,\n imap_host=None,\n imap_skip_certificate_verification=False,\n imap_ssl=True,\n imap_port=993,\n imap_user=None,\n imap_password=None,\n imap_reports_folder=\"INBOX\",\n imap_archive_folder=\"Archive\",\n imap_watch=False,\n imap_delete=False,\n imap_test=False,\n 
hec=None,\n                     hec_token=None,\n                     hec_index=None,\n                     hec_skip_certificate_verification=False,\n                     elasticsearch_hosts=None,\n                     elasticsearch_index_suffix=None,\n                     elasticsearch_ssl=True,\n                     elasticsearch_ssl_cert_path=None,\n                     elasticsearch_monthly_indexes=False,\n                     kafka_hosts=None,\n                     kafka_username=None,\n                     kafka_password=None,\n                     kafka_aggregate_topic=None,\n                     kafka_forensic_topic=None,\n                     kafka_ssl=False,\n                     kafka_skip_certificate_verification=False,\n                     smtp_host=None,\n                     smtp_port=25,\n                     smtp_ssl=False,\n                     smtp_skip_certificate_verification=False,\n                     smtp_user=None,\n                     smtp_password=None,\n                     smtp_from=None,\n                     smtp_to=[],\n                     smtp_subject=\"parsedmarc report\",\n                     smtp_message=\"Please see the attached DMARC results.\",\n                     log_file=args.log_file,\n                     n_procs=1,\n                     chunk_size=1\n                     )\n\n    if args.config_file:\n        abs_path = os.path.abspath(args.config_file)\n        if not os.path.exists(abs_path):\n            logger.error(\"A file does not exist at {0}\".format(abs_path))\n            exit(-1)\n        opts.silent = True\n        config = ConfigParser()\n        config.read(args.config_file)\n        if \"general\" in config.sections():\n            general_config = config[\"general\"]\n            if \"strip_attachment_payloads\" in general_config:\n                opts.strip_attachment_payloads = general_config.getboolean(\n                    \"strip_attachment_payloads\")\n            if \"output\" in general_config:\n                opts.output = general_config[\"output\"]\n            if \"nameservers\" in general_config:\n                opts.nameservers = _str_to_list(general_config[\"nameservers\"])\n            if \"dns_timeout\" in general_config:\n                opts.dns_timeout = general_config.getfloat(\"dns_timeout\")\n            if \"save_aggregate\" in general_config:\n                opts.save_aggregate = general_config.getboolean(\n                    \"save_aggregate\")\n            if \"save_forensic\" in general_config:\n                opts.save_forensic = general_config.getboolean(\n                    \"save_forensic\")\n            if \"debug\" in general_config:\n                opts.debug = general_config.getboolean(\"debug\")\n            if \"silent\" in general_config:\n                opts.silent = general_config.getboolean(\"silent\")\n            if \"log_file\" in general_config:\n                opts.log_file = 
general_config[\"log_file\"]\n if \"n_procs\" in general_config:\n opts.n_procs = general_config.getint(\"n_procs\")\n if \"chunk_size\" in general_config:\n opts.chunk_size = general_config.getint(\"chunk_size\")\n if \"imap\" in config.sections():\n imap_config = config[\"imap\"]\n if \"host\" in imap_config:\n opts.imap_host = imap_config[\"host\"]\n else:\n logger.error(\"host setting missing from the \"\n \"imap config section\")\n exit(-1)\n if \"port\" in imap_config:\n opts.imap_port = imap_config[\"port\"]\n if \"ssl\" in imap_config:\n opts.imap_ssl = imap_config.getboolean(\"ssl\")\n if \"skip_certificate_verification\" in imap_config:\n imap_verify = imap_config.getboolean(\n \"skip_certificate_verification\")\n opts.imap_skip_certificate_verification = imap_verify\n if \"user\" in imap_config:\n opts.imap_user = imap_config[\"user\"]\n else:\n logger.critical(\"user setting missing from the \"\n \"imap config section\")\n exit(-1)\n if \"password\" in imap_config:\n opts.imap_password = imap_config[\"password\"]\n else:\n logger.critical(\"password setting missing from the \"\n \"imap config section\")\n exit(-1)\n\n if \"reports_folder\" in imap_config:\n opts.imap_reports_folder = imap_config[\"reports_folder\"]\n if \"archive_folder\" in imap_config:\n opts.imap_archive_folder = imap_config[\"archive_folder\"]\n if \"watch\" in imap_config:\n opts.imap_watch = imap_config.getboolean(\"watch\")\n if \"delete\" in imap_config:\n opts.imap_delete = imap_config.getboolean(\"delete\")\n if \"test\" in imap_config:\n opts.imap_test = imap_config.getboolean(\"test\")\n if \"elasticsearch\" in config:\n elasticsearch_config = config[\"elasticsearch\"]\n if \"hosts\" in elasticsearch_config:\n opts.elasticsearch_hosts = _str_to_list(elasticsearch_config[\n \"hosts\"])\n else:\n logger.critical(\"hosts setting missing from the \"\n \"elasticsearch config section\")\n exit(-1)\n if \"index_suffix\" in elasticsearch_config:\n opts.elasticsearch_index_suffix = 
elasticsearch_config[\n                    \"index_suffix\"]\n            if \"monthly_indexes\" in elasticsearch_config:\n                monthly = elasticsearch_config.getboolean(\"monthly_indexes\")\n                opts.elasticsearch_monthly_indexes = monthly\n            if \"ssl\" in elasticsearch_config:\n                opts.elasticsearch_ssl = elasticsearch_config.getboolean(\n                    \"ssl\")\n            if \"cert_path\" in elasticsearch_config:\n                opts.elasticsearch_ssl_cert_path = elasticsearch_config[\n                    \"cert_path\"]\n        if \"splunk_hec\" in config.sections():\n            hec_config = config[\"splunk_hec\"]\n            if \"url\" in hec_config:\n                opts.hec = hec_config[\"url\"]\n            else:\n                logger.critical(\"url setting missing from the \"\n                                \"splunk_hec config section\")\n                exit(-1)\n            if \"token\" in hec_config:\n                opts.hec_token = hec_config[\"token\"]\n            else:\n                logger.critical(\"token setting missing from the \"\n                                \"splunk_hec config section\")\n                exit(-1)\n            if \"index\" in hec_config:\n                opts.hec_index = hec_config[\"index\"]\n            else:\n                logger.critical(\"index setting missing from the \"\n                                \"splunk_hec config section\")\n                exit(-1)\n            if \"skip_certificate_verification\" in hec_config:\n                opts.hec_skip_certificate_verification = hec_config.getboolean(\n                    \"skip_certificate_verification\")\n        if \"kafka\" in config.sections():\n            kafka_config = config[\"kafka\"]\n            if \"hosts\" in kafka_config:\n                opts.kafka_hosts = _str_to_list(kafka_config[\"hosts\"])\n            else:\n                logger.critical(\"hosts setting missing from the \"\n                                \"kafka config section\")\n                exit(-1)\n            if \"user\" in kafka_config:\n                opts.kafka_username = kafka_config[\"user\"]\n            else:\n                logger.critical(\"user setting missing from the \"\n                                \"kafka config section\")\n                exit(-1)\n            if \"password\" in kafka_config:\n                opts.kafka_password = kafka_config[\"password\"]\n            else:\n                logger.critical(\"password setting missing from the \"\n                                \"kafka config section\")\n                exit(-1)\n            if \"ssl\" in kafka_config:\n                opts.kafka_ssl = kafka_config.getboolean(\"ssl\")\n            if \"skip_certificate_verification\" in kafka_config:\n                kafka_verify = kafka_config.getboolean(\n                    
\"skip_certificate_verification\")\n opts.kafka_skip_certificate_verification = kafka_verify\n if \"aggregate_topic\" in kafka_config:\n opts.kafka_aggregate = kafka_config[\"aggregate_topic\"]\n else:\n logger.critical(\"aggregate_topic setting missing from the \"\n \"kafka config section\")\n exit(-1)\n if \"forensic_topic\" in kafka_config:\n opts.kafka_username = kafka_config[\"forensic_topic\"]\n else:\n logger.critical(\"forensic_topic setting missing from the \"\n \"splunk_hec config section\")\n if \"smtp\" in config.sections():\n smtp_config = config[\"smtp\"]\n if \"host\" in smtp_config:\n opts.smtp_host = smtp_config[\"host\"]\n else:\n logger.critical(\"host setting missing from the \"\n \"smtp config section\")\n exit(-1)\n if \"port\" in smtp_config:\n opts.smtp_port = smtp_config[\"port\"]\n if \"ssl\" in smtp_config:\n opts.smtp_ssl = smtp_config.getboolean(\"ssl\")\n if \"skip_certificate_verification\" in smtp_config:\n smtp_verify = smtp_config.getboolean(\n \"skip_certificate_verification\")\n opts.smtp_skip_certificate_verification = smtp_verify\n if \"user\" in smtp_config:\n opts.smtp_user = smtp_config[\"user\"]\n else:\n logger.critical(\"user setting missing from the \"\n \"smtp config section\")\n exit(-1)\n if \"password\" in smtp_config:\n opts.smtp_password = smtp_config[\"password\"]\n else:\n logger.critical(\"password setting missing from the \"\n \"smtp config section\")\n exit(-1)\n if \"from\" in smtp_config:\n opts.smtp_from = smtp_config[\"from\"]\n else:\n logger.critical(\"from setting missing from the \"\n \"smtp config section\")\n if \"to\" in smtp_config:\n opts.smtp_to = _str_to_list(smtp_config[\"to\"])\n else:\n logger.critical(\"to setting missing from the \"\n \"smtp config section\")\n if \"subject\" in smtp_config:\n opts.smtp_subject = smtp_config[\"subject\"]\n if \"attachment\" in smtp_config:\n opts.smtp_attachment = smtp_config[\"attachment\"]\n if \"message\" in smtp_config:\n opts.smtp_message = 
smtp_config[\"message\"]\n\n logging.basicConfig(level=logging.WARNING)\n logger.setLevel(logging.WARNING)\n\n if opts.debug:\n logging.basicConfig(level=logging.DEBUG)\n logger.setLevel(logging.DEBUG)\n if opts.log_file:\n fh = logging.FileHandler(opts.log_file)\n formatter = logging.Formatter(\n '%(asctime)s - '\n '%(levelname)s - [%(filename)s:%(lineno)d] - %(message)s')\n fh.setFormatter(formatter)\n logger.addHandler(fh)\n if opts.imap_host is None and len(opts.file_path) == 0:\n logger.error(\"You must supply input files, or an IMAP configuration\")\n exit(1)\n\n if opts.save_aggregate or opts.save_forensic:\n try:\n if opts.elasticsearch_hosts:\n es_aggregate_index = \"dmarc_aggregate\"\n es_forensic_index = \"dmarc_forensic\"\n if opts.elasticsearch_index_suffix:\n suffix = opts.elasticsearch_index_suffix\n es_aggregate_index = \"{0}_{1}\".format(\n es_aggregate_index, suffix)\n es_forensic_index = \"{0}_{1}\".format(\n es_forensic_index, suffix)\n elastic.set_hosts(opts.elasticsearch_hosts,\n opts.elasticsearch_ssl,\n opts.elasticsearch_ssl_cert_path)\n elastic.migrate_indexes(aggregate_indexes=[es_aggregate_index],\n forensic_indexes=[es_forensic_index])\n except elastic.ElasticsearchError as error:\n logger.error(\"Elasticsearch Error: {0}\".format(error.__str__()))\n exit(1)\n\n if opts.hec:\n if opts.hec_token is None or opts.hec_index is None:\n logger.error(\"HEC token and HEC index are required when \"\n \"using HEC URL\")\n exit(1)\n\n verify = True\n if opts.hec_skip_certificate_verification:\n verify = False\n hec_client = splunk.HECClient(opts.hec, opts.hec_token,\n opts.hec_index,\n verify=verify)\n\n kafka_aggregate_topic = opts.kafka_aggregate_topic\n kafka_forensic_topic = opts.kafka_forensic_topic\n\n file_paths = []\n for file_path in args.file_path:\n file_paths += glob(file_path)\n file_paths = list(set(file_paths))\n\n counter = Value('i', 0)\n pool = Pool(opts.n_procs, initializer=init, initargs=(counter,))\n results = 
pool.starmap_async(cli_parse,\n                                 zip(file_paths,\n                                     repeat(opts.strip_attachment_payloads),\n                                     repeat(opts.nameservers),\n                                     repeat(opts.dns_timeout),\n                                     repeat(opts.n_procs > 1)),\n                                 opts.chunk_size)\n    pbar = tqdm(total=len(file_paths))\n    while not results.ready():\n        pbar.update(counter.value - pbar.n)\n        time.sleep(0.1)\n    pbar.close()\n    results = results.get()\n    pool.close()\n    pool.join()\n\n    for result in results:\n        if type(result[0]) is InvalidDMARCReport:\n            logger.error(\"Failed to parse {0} - {1}\".format(result[1],\n                                                            result[0]))\n        else:\n            if result[0][\"report_type\"] == \"aggregate\":\n                aggregate_reports.append(result[0][\"report\"])\n            elif result[0][\"report_type\"] == \"forensic\":\n                forensic_reports.append(result[0][\"report\"])\n\n    if opts.imap_host:\n        try:\n            if opts.imap_user is None or opts.imap_password is None:\n                logger.error(\"IMAP user and password must be specified if \"\n                             \"host is specified\")\n\n            rf = opts.imap_reports_folder\n            af = opts.imap_archive_folder\n            ns = opts.nameservers\n            sa = opts.strip_attachment_payloads\n            ssl = True\n            ssl_context = None\n            if opts.imap_skip_certificate_verification:\n                logger.debug(\"Skipping IMAP certificate verification\")\n                ssl_context = create_default_context()\n                ssl_context.check_hostname = False\n                ssl_context.verify_mode = CERT_NONE\n            if opts.imap_ssl is False:\n                ssl = False\n            reports = get_dmarc_reports_from_inbox(host=opts.imap_host,\n                                                   port=opts.imap_port,\n                                                   ssl=ssl,\n                                                   ssl_context=ssl_context,\n                                                   user=opts.imap_user,\n                                                   password=opts.imap_password,\n                                                   reports_folder=rf,\n                                                   archive_folder=af,\n                                                   delete=opts.imap_delete,\n                                                   nameservers=ns,\n                                                   test=opts.imap_test,\n                                                   strip_attachment_payloads=sa\n                                                   )\n\n            aggregate_reports += reports[\"aggregate_reports\"]\n            forensic_reports += reports[\"forensic_reports\"]\n\n        except IMAPError as error:\n            logger.error(\"IMAP Error: {0}\".format(error.__str__()))\n            exit(1)\n\n    results = OrderedDict([(\"aggregate_reports\", aggregate_reports),\n                           (\"forensic_reports\", forensic_reports)])\n\n    if 
opts.output:\n        save_output(results, output_directory=opts.output)\n\n    process_reports(results)\n\n    if opts.smtp_host:\n        try:\n            ssl_context = None\n            if opts.smtp_skip_certificate_verification:\n                logger.debug(\"Skipping SMTP certificate verification\")\n                ssl_context = create_default_context()\n                ssl_context.check_hostname = False\n                ssl_context.verify_mode = CERT_NONE\n            email_results(results, opts.smtp_host, opts.smtp_from,\n                          opts.smtp_to, ssl=opts.smtp_ssl,\n                          user=opts.smtp_user,\n                          password=opts.smtp_password,\n                          subject=opts.smtp_subject,\n                          ssl_context=ssl_context)\n        except SMTPError as error:\n            logger.error(\"SMTP Error: {0}\".format(error.__str__()))\n            exit(1)\n\n    if opts.imap_host and opts.imap_watch:\n        logger.info(\"Watching for email - Quit with ctrl-c\")\n        ssl = True\n        ssl_context = None\n        if opts.imap_skip_certificate_verification:\n            logger.debug(\"Skipping IMAP certificate verification\")\n            ssl_context = create_default_context()\n            ssl_context.check_hostname = False\n            ssl_context.verify_mode = CERT_NONE\n        if opts.imap_ssl is False:\n            ssl = False\n        try:\n            sa = opts.strip_attachment_payloads\n            watch_inbox(opts.imap_host, opts.imap_user, opts.imap_password,\n                        process_reports, port=opts.imap_port, ssl=ssl,\n                        ssl_context=ssl_context,\n                        reports_folder=opts.imap_reports_folder,\n                        archive_folder=opts.imap_archive_folder,\n                        delete=opts.imap_delete,\n                        test=opts.imap_test, nameservers=opts.nameservers,\n                        dns_timeout=opts.dns_timeout,\n                        strip_attachment_payloads=sa)\n        except IMAPError as error:\n            logger.error(\"IMAP error: {0}\".format(error.__str__()))\n            exit(1)", "response": "The main function of the parsedmarc command-line interface."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ndrain a single entry from the broker.", "response": "def drain(self, sid=None):\n        \"\"\"\n        Drain will put a connection into a drain state. All subscriptions will\n        immediately be put into a drain state. 
Upon completion, the publishers\n will be drained and can not publish any additional messages. Upon draining\n of the publishers, the connection will be closed. Use the `closed_cb'\n option to know when the connection has moved from draining to closed.\n\n If a sid is passed, just the subscription with that sid will be drained\n without closing the connection.\n \"\"\"\n if self.is_draining:\n return\n if self.is_closed:\n raise ErrConnectionClosed\n if self.is_connecting or self.is_reconnecting:\n raise ErrConnectionReconnecting\n\n if sid is not None:\n return self._drain_sub(sid)\n\n # Start draining the subscriptions\n self._status = Client.DRAINING_SUBS\n\n drain_tasks = []\n for ssid, sub in self._subs.items():\n task = self._drain_sub(ssid)\n drain_tasks.append(task)\n\n drain_is_done = asyncio.gather(*drain_tasks)\n try:\n yield from asyncio.wait_for(drain_is_done, self.options[\"drain_timeout\"])\n except asyncio.TimeoutError:\n drain_is_done.exception()\n drain_is_done.cancel()\n if self._error_cb is not None:\n yield from self._error_cb(ErrDrainTimeout)\n except asyncio.CancelledError:\n pass\n finally:\n self._status = Client.DRAINING_PUBS\n yield from self.flush()\n yield from self._close(Client.CLOSED)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef publish(self, subject, payload):\n if self.is_closed:\n raise ErrConnectionClosed\n if self.is_draining_pubs:\n raise ErrConnectionDraining\n\n payload_size = len(payload)\n if payload_size > self._max_payload:\n raise ErrMaxPayload\n yield from self._publish(subject, _EMPTY_, payload, payload_size)", "response": "Sends a PUB command to the server on the specified subject and payload."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef publish_request(self, subject, reply, payload):\n if self.is_closed:\n raise ErrConnectionClosed\n if self.is_draining_pubs:\n raise ErrConnectionDraining\n\n 
payload_size = len(payload)\n        if payload_size > self._max_payload:\n            raise ErrMaxPayload\n        yield from self._publish(subject, reply.encode(), payload, payload_size)", "response": "Publishes a message with a reply subject."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _publish(self, subject, reply, payload, payload_size):\n        if subject == \"\":\n            # Avoid sending messages with an empty subject.\n            raise ErrBadSubject\n\n        payload_size_bytes = (\"%d\" % payload_size).encode()\n        pub_cmd = b''.join([PUB_OP, _SPC_, subject.encode(\n        ), _SPC_, reply, _SPC_, payload_size_bytes, _CRLF_, payload, _CRLF_])\n        self.stats['out_msgs'] += 1\n        self.stats['out_bytes'] += payload_size\n        yield from self._send_command(pub_cmd)\n        if self._flush_queue.empty():\n            yield from self._flush_pending()", "response": "Sends a PUB command to the NATS server."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsubscribing to a specific subject and queues.", "response": "def subscribe(self, subject,\n                  queue=\"\",\n                  cb=None,\n                  future=None,\n                  max_msgs=0,\n                  is_async=False,\n                  pending_msgs_limit=DEFAULT_SUB_PENDING_MSGS_LIMIT,\n                  pending_bytes_limit=DEFAULT_SUB_PENDING_BYTES_LIMIT,\n                  ):\n        \"\"\"\n        Takes a subject string and optional queue string to send a SUB cmd,\n        and a callback to which messages (Msg) will be dispatched to\n        be processed sequentially by default.\n        \"\"\"\n        if subject == \"\":\n            raise ErrBadSubject\n\n        if self.is_closed:\n            raise ErrConnectionClosed\n\n        if self.is_draining:\n            raise ErrConnectionDraining\n\n        sub = Subscription(subject=subject,\n                           queue=queue,\n                           max_msgs=max_msgs,\n                           is_async=is_async,\n                           )\n        if cb is not None:\n            if asyncio.iscoroutinefunction(cb):\n                sub.coro = cb\n            elif sub.is_async:\n                raise NatsError(\n                    \"nats: must use coroutine for async subscriptions\")\n            else:\n                # NOTE: Consider to deprecate this eventually, it should always\n                # be coroutines otherwise they could affect the single thread,\n                
# for now still allow to be flexible.\n sub.cb = cb\n\n sub.pending_msgs_limit = pending_msgs_limit\n sub.pending_bytes_limit = pending_bytes_limit\n sub.pending_queue = asyncio.Queue(\n maxsize=pending_msgs_limit,\n loop=self._loop,\n )\n\n # Close the delivery coroutine over the sub and error handler\n # instead of having subscription type hold over state of the conn.\n err_cb = self._error_cb\n\n @asyncio.coroutine\n def wait_for_msgs():\n nonlocal sub\n nonlocal err_cb\n\n while True:\n try:\n msg = yield from sub.pending_queue.get()\n sub.pending_size -= len(msg.data)\n\n try:\n # Invoke depending of type of handler.\n if sub.coro is not None:\n if sub.is_async:\n # NOTE: Deprecate this usage in a next release,\n # the handler implementation ought to decide\n # the concurrency level at which the messages\n # should be processed.\n self._loop.create_task(sub.coro(msg))\n else:\n yield from sub.coro(msg)\n elif sub.cb is not None:\n if sub.is_async:\n raise NatsError(\n \"nats: must use coroutine for async subscriptions\")\n else:\n # Schedule regular callbacks to be processed sequentially.\n self._loop.call_soon(sub.cb, msg)\n except asyncio.CancelledError:\n # In case the coroutine handler gets cancelled\n # then stop task loop and return.\n break\n except Exception as e:\n # All errors from calling a handler\n # are async errors.\n if err_cb is not None:\n yield from err_cb(e)\n\n except asyncio.CancelledError:\n break\n\n # Start task for each subscription, it should be cancelled\n # on both unsubscribe and closing as well.\n sub.wait_for_msgs_task = self._loop.create_task(\n wait_for_msgs())\n\n elif future is not None:\n # Used to handle the single response from a request.\n sub.future = future\n else:\n raise NatsError(\"nats: invalid subscription type\")\n\n self._ssid += 1\n ssid = self._ssid\n self._subs[ssid] = sub\n yield from self._subscribe(sub, ssid)\n return ssid"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 
where it\nsubscribes to a subject asynchronously.", "response": "def subscribe_async(self, subject, **kwargs):\n        \"\"\"\n        Sets the subscription to use a task per message to be processed.\n\n        ..deprecated:: 7.0\n        Will be removed 9.0.\n        \"\"\"\n        kwargs[\"is_async\"] = True\n        sid = yield from self.subscribe(subject, **kwargs)\n        return sid"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef unsubscribe(self, ssid, max_msgs=0):\n        if self.is_closed:\n            raise ErrConnectionClosed\n        if self.is_draining:\n            raise ErrConnectionDraining\n\n        self._remove_sub(ssid, max_msgs)\n\n        # We will send these for all subs when we reconnect anyway,\n        # so that we can suppress here.\n        if not self.is_reconnecting:\n            yield from self.auto_unsubscribe(ssid, max_msgs)", "response": "Unsubscribe from a specific subscription sequence id."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef request(self, subject, payload, timeout=0.5, expected=1, cb=None):\n        if self.is_draining_pubs:\n            raise ErrConnectionDraining\n\n        # If callback given then continue to use old style.\n        if cb is not None:\n            next_inbox = INBOX_PREFIX[:]\n            next_inbox.extend(self._nuid.next())\n            inbox = next_inbox.decode()\n\n            sid = yield from self.subscribe(inbox, cb=cb)\n            yield from self.auto_unsubscribe(sid, expected)\n            yield from self.publish_request(subject, inbox, payload)\n            return sid\n\n        if self._resp_sub_prefix is None:\n            self._resp_map = {}\n\n            # Create a prefix and single wildcard subscription once.\n            self._resp_sub_prefix = INBOX_PREFIX[:]\n            self._resp_sub_prefix.extend(self._nuid.next())\n            self._resp_sub_prefix.extend(b'.')\n            resp_mux_subject = self._resp_sub_prefix[:]\n            resp_mux_subject.extend(b'*')\n            sub = Subscription(subject=resp_mux_subject.decode())\n\n            # FIXME: Allow setting pending limits for responses mux subscription.\n            sub.pending_msgs_limit = DEFAULT_SUB_PENDING_MSGS_LIMIT\n            sub.pending_bytes_limit = 
DEFAULT_SUB_PENDING_BYTES_LIMIT\n sub.pending_queue = asyncio.Queue(\n maxsize=sub.pending_msgs_limit,\n loop=self._loop,\n )\n\n # Single task for handling the requests\n @asyncio.coroutine\n def wait_for_msgs():\n nonlocal sub\n while True:\n try:\n msg = yield from sub.pending_queue.get()\n token = msg.subject[INBOX_PREFIX_LEN:]\n\n try:\n fut = self._resp_map[token]\n fut.set_result(msg)\n del self._resp_map[token]\n except (asyncio.CancelledError, asyncio.InvalidStateError):\n # Request may have timed out already so remove entry.\n del self._resp_map[token]\n continue\n except KeyError:\n # Future already handled so drop any extra\n # responses which may have made it.\n continue\n\n except asyncio.CancelledError:\n break\n\n sub.wait_for_msgs_task = self._loop.create_task(\n wait_for_msgs())\n\n # Store the subscription in the subscriptions map,\n # then send the protocol commands to the server.\n self._ssid += 1\n ssid = self._ssid\n self._subs[ssid] = sub\n yield from self._subscribe(sub, ssid)\n\n # Use a new NUID for the token inbox and then use the future.\n token = self._nuid.next()\n inbox = self._resp_sub_prefix[:]\n inbox.extend(token)\n future = asyncio.Future(loop=self._loop)\n self._resp_map[token.decode()] = future\n yield from self.publish_request(subject, inbox.decode(), payload)\n\n # Wait for the response or give up on timeout.\n try:\n msg = yield from asyncio.wait_for(future, timeout, loop=self._loop)\n return msg\n except asyncio.TimeoutError:\n future.cancel()\n raise ErrTimeout", "response": "Sends a request to the broker."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nimplementing the request and response pattern via pub / sub .", "response": "def timed_request(self, subject, payload, timeout=0.5):\n \"\"\"\n Implements the request/response pattern via pub/sub\n using an ephemeral subscription which will be published\n with a limited interest of 1 reply returning the response\n or raising a Timeout error.\n\n 
->> SUB _INBOX.2007314fe0fcb2cdc2a2914c1 90\n ->> UNSUB 90 1\n ->> PUB hello _INBOX.2007314fe0fcb2cdc2a2914c1 5\n ->> MSG_PAYLOAD: world\n <<- MSG hello 2 _INBOX.2007314fe0fcb2cdc2a2914c1 5\n\n \"\"\"\n next_inbox = INBOX_PREFIX[:]\n next_inbox.extend(self._nuid.next())\n inbox = next_inbox.decode()\n\n future = asyncio.Future(loop=self._loop)\n sid = yield from self.subscribe(inbox, future=future, max_msgs=1)\n yield from self.auto_unsubscribe(sid, 1)\n yield from self.publish_request(subject, inbox, payload)\n\n try:\n msg = yield from asyncio.wait_for(future, timeout, loop=self._loop)\n return msg\n except asyncio.TimeoutError:\n future.cancel()\n raise ErrTimeout"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef flush(self, timeout=60):\n if timeout <= 0:\n raise ErrBadTimeout\n\n if self.is_closed:\n raise ErrConnectionClosed\n\n future = asyncio.Future(loop=self._loop)\n try:\n yield from self._send_ping(future)\n yield from asyncio.wait_for(future, timeout, loop=self._loop)\n except asyncio.TimeoutError:\n future.cancel()\n raise ErrTimeout", "response": "Send a ping to the server expecting a pong to send a ping and waits for a pong to be sent."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _select_next_server(self):\n\n while True:\n if len(self._server_pool) == 0:\n self._current_server = None\n raise ErrNoServers\n\n now = time.monotonic()\n s = self._server_pool.pop(0)\n if self.options[\"max_reconnect_attempts\"] > 0:\n if s.reconnects > self.options[\"max_reconnect_attempts\"]:\n # Discard server since already tried to reconnect too many times\n continue\n\n # Not yet exceeded max_reconnect_attempts so can still use\n # this server in the future.\n self._server_pool.append(s)\n if s.last_attempt is not None and now < s.last_attempt + self.options[\"reconnect_time_wait\"]:\n # Backoff connecting to server if we attempted recently.\n yield from 
asyncio.sleep(self.options[\"reconnect_time_wait\"], loop=self._loop)\n try:\n s.last_attempt = time.monotonic()\n r, w = yield from asyncio.open_connection(\n s.uri.hostname,\n s.uri.port,\n loop=self._loop,\n limit=DEFAULT_BUFFER_SIZE)\n self._current_server = s\n\n # We keep a reference to the initial transport we used when\n # establishing the connection in case we later upgrade to TLS\n # after getting the first INFO message. This is in order to\n # prevent the GC closing the socket after we send CONNECT\n # and replace the transport.\n #\n # See https://github.com/nats-io/asyncio-nats/issues/43\n self._bare_io_reader = self._io_reader = r\n self._bare_io_writer = self._io_writer = w\n break\n except Exception as e:\n s.last_attempt = time.monotonic()\n s.reconnects += 1\n\n self._err = e\n if self._error_cb is not None:\n yield from self._error_cb(e)\n continue", "response": "Tries to connect to the next server in the server pool."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _process_err(self, err_msg):\n if STALE_CONNECTION in err_msg:\n yield from self._process_op_err(ErrStaleConnection)\n return\n\n if AUTHORIZATION_VIOLATION in err_msg:\n self._err = ErrAuthorization\n else:\n m = b'nats: ' + err_msg[0]\n self._err = NatsError(m.decode())\n\n do_cbs = False\n if not self.is_connecting:\n do_cbs = True\n\n # FIXME: Some errors such as 'Invalid Subscription'\n # do not cause the server to close the connection.\n # For now we handle it similarly to other clients and close.\n self._loop.create_task(self._close(Client.CLOSED, do_cbs))", "response": "Processes the raw error message sent by the server and closes the connection with the current server."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nprocess errors which occurred while reading or parsing the protocol.", "response": "def _process_op_err(self, e):\n \"\"\"\n Process errors which occurred while reading or 
parsing\n the protocol. If allow_reconnect is enabled it will\n try to switch the server to which it is currently connected\n otherwise it will disconnect.\n \"\"\"\n if self.is_connecting or self.is_closed or self.is_reconnecting:\n return\n\n if self.options[\"allow_reconnect\"] and self.is_connected:\n self._status = Client.RECONNECTING\n self._ps.reset()\n\n if self._reconnection_task is not None and not self._reconnection_task.cancelled():\n # Cancel the previous task in case it may still be running.\n self._reconnection_task.cancel()\n\n self._reconnection_task = self._loop.create_task(self._attempt_reconnect())\n else:\n self._process_disconnect()\n self._err = e\n yield from self._close(Client.CLOSED, True)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _connect_command(self):\n '''\n Generates a JSON string with the params to be used\n when sending CONNECT to the server.\n\n ->> CONNECT {\"lang\": \"python3\"}\n\n '''\n options = {\n \"verbose\": self.options[\"verbose\"],\n \"pedantic\": self.options[\"pedantic\"],\n \"lang\": __lang__,\n \"version\": __version__,\n \"protocol\": PROTOCOL\n }\n if \"auth_required\" in self._server_info:\n if self._server_info[\"auth_required\"]:\n # In case there is no password, then consider handle\n # sending a token instead.\n if self.options[\"user\"] is not None and self.options[\"password\"] is not None:\n options[\"user\"] = self.options[\"user\"]\n options[\"pass\"] = self.options[\"password\"]\n elif self.options[\"token\"] is not None:\n options[\"auth_token\"] = self.options[\"token\"]\n elif self._current_server.uri.password is None:\n options[\"auth_token\"] = self._current_server.uri.username\n else:\n options[\"user\"] = self._current_server.uri.username\n options[\"pass\"] = self._current_server.uri.password\n if self.options[\"name\"] is not None:\n options[\"name\"] = self.options[\"name\"]\n if self.options[\"no_echo\"] is not None:\n 
options[\"echo\"] = not self.options[\"no_echo\"]\n\n connect_opts = json.dumps(options, sort_keys=True)\n return b''.join([CONNECT_OP + _SPC_ + connect_opts.encode() + _CRLF_])", "response": "Generates a JSON string with the params to be used when sending a CONNECT to the server."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _process_pong(self):\n if len(self._pongs) > 0:\n future = self._pongs.pop(0)\n future.set_result(True)\n self._pongs_received += 1\n self._pings_outstanding -= 1", "response": "Process PONG sent by server."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _process_msg(self, sid, subject, reply, data):\n payload_size = len(data)\n self.stats['in_msgs'] += 1\n self.stats['in_bytes'] += payload_size\n\n sub = self._subs.get(sid)\n if sub is None:\n # Skip in case no subscription present.\n return\n\n sub.received += 1\n if sub.max_msgs > 0 and sub.received >= sub.max_msgs:\n # Enough messages so can throw away the subscription now.\n self._subs.pop(sid, None)\n msg = self._build_message(subject, reply, data)\n\n # Check if it is an old style request.\n if sub.future is not None:\n if sub.future.cancelled():\n # Already gave up, nothing to do.\n return\n sub.future.set_result(msg)\n return\n\n # Let subscription wait_for_msgs coroutine process the messages,\n # but in case sending to the subscription task would block,\n # then consider it to be a slow consumer and drop the message.\n try:\n sub.pending_size += payload_size\n if sub.pending_size >= sub.pending_bytes_limit:\n # Subtract the bytes again since we are throwing away\n # the message, so it does not count as pending data.\n sub.pending_size -= payload_size\n\n if self._error_cb is not None:\n yield from self._error_cb(\n ErrSlowConsumer(subject=subject, sid=sid))\n return\n sub.pending_queue.put_nowait(msg)\n except asyncio.QueueFull:\n if self._error_cb is not None:\n yield from 
self._error_cb(\n ErrSlowConsumer(subject=subject, sid=sid))", "response": "Process a single message sent by server."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nprocessing INFO lines sent by the server to reconfigure client with latest updates from cluster.", "response": "def _process_info(self, info):\n \"\"\"\n Process INFO lines sent by the server to reconfigure client\n with latest updates from cluster to enable server discovery.\n \"\"\"\n if 'connect_urls' in info:\n if info['connect_urls']:\n connect_urls = []\n for connect_url in info['connect_urls']:\n uri = urlparse(\"nats://%s\" % connect_url)\n srv = Srv(uri)\n srv.discovered = True\n\n # Filter for any similar server in the server pool already.\n should_add = True\n for s in self._server_pool:\n if uri.netloc == s.uri.netloc:\n should_add = False\n if should_add:\n connect_urls.append(srv)\n\n if self.options[\"dont_randomize\"] is not True:\n shuffle(connect_urls)\n for srv in connect_urls:\n self._server_pool.append(srv)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nprocesses INFO received from the server and CONNECT to the server.", "response": "def _process_connect_init(self):\n \"\"\"\n Process INFO received from the server and CONNECT to the server\n with authentication. 
It is also responsible for setting up the\n reading and ping interval tasks for the client.\n \"\"\"\n self._status = Client.CONNECTING\n\n connection_completed = self._io_reader.readline()\n info_line = yield from asyncio.wait_for(connection_completed, self.options[\"connect_timeout\"])\n if INFO_OP not in info_line:\n raise NatsError(\"nats: empty response from server when expecting INFO message\")\n\n _, info = info_line.split(INFO_OP + _SPC_, 1)\n\n try:\n srv_info = json.loads(info.decode())\n except:\n raise NatsError(\"nats: info message, json parse error\")\n\n self._process_info(srv_info)\n self._server_info = srv_info\n\n if 'max_payload' in self._server_info:\n self._max_payload = self._server_info[\"max_payload\"]\n\n if 'tls_required' in self._server_info and self._server_info['tls_required']:\n ssl_context = None\n if \"tls\" in self.options:\n ssl_context = self.options.get('tls')\n elif self._current_server.uri.scheme == 'tls':\n ssl_context = ssl.create_default_context()\n else:\n raise NatsError('nats: no ssl context provided')\n\n transport = self._io_writer.transport\n sock = transport.get_extra_info('socket')\n if not sock:\n # This shouldn't happen\n raise NatsError('nats: unable to get socket')\n\n yield from self._io_writer.drain() # just in case something is left\n\n self._io_reader, self._io_writer = \\\n yield from asyncio.open_connection(\n loop=self._loop,\n limit=DEFAULT_BUFFER_SIZE,\n sock=sock,\n ssl=ssl_context,\n server_hostname=self._current_server.uri.hostname,\n )\n\n # Refresh state of parser upon reconnect.\n if self.is_reconnecting:\n self._ps.reset()\n\n connect_cmd = self._connect_command()\n self._io_writer.write(connect_cmd)\n self._io_writer.write(PING_PROTO)\n yield from self._io_writer.drain()\n\n # FIXME: Add readline timeout\n next_op = yield from self._io_reader.readline()\n if self.options[\"verbose\"] and OK_OP in next_op:\n next_op = yield from self._io_reader.readline()\n\n if ERR_OP in next_op:\n err_line = 
next_op.decode()\n _, err_msg = err_line.split(\" \", 1)\n # FIXME: Maybe handling could be more special here,\n # checking for ErrAuthorization for example.\n # yield from self._process_err(err_msg)\n raise NatsError(\"nats: \" + err_msg.rstrip('\\r\\n'))\n\n if PONG_PROTO in next_op:\n self._status = Client.CONNECTED\n\n self._reading_task = self._loop.create_task(self._read_loop())\n self._pongs = []\n self._pings_outstanding = 0\n self._ping_interval_task = self._loop.create_task(\n self._ping_interval())\n\n # Task for kicking the flusher queue\n self._flusher_task = self._loop.create_task(self._flusher())"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ncomputing and save a coactivation map given a seed image.", "response": "def coactivation(dataset, seed, threshold=0.0, output_dir='.', prefix='', r=6):\n \"\"\" Compute and save coactivation map given input image as seed.\n\n This is essentially just a wrapper for a meta-analysis defined\n by the contrast between those studies that activate within the seed\n and those that don't.\n\n Args:\n dataset: a Dataset instance containing study and activation data.\n seed: either a Nifti or Analyze image defining the boundaries of the\n seed, or a list of triples (x/y/z) defining the seed(s). Note that\n voxels do not need to be contiguous to define a seed--all supra-\n threshold voxels will be lumped together.\n threshold: optional float indicating the threshold above which voxels\n are considered to be part of the seed ROI (default = 0)\n r: optional integer indicating radius (in mm) of spheres to grow\n (only used if seed is a list of coordinates).\n output_dir: output directory to write to. 
Defaults to current.\n If none, defaults to using the first part of the seed filename.\n prefix: optional string to prepend to all coactivation images.\n\n Output:\n A set of meta-analysis images identical to that generated by\n meta.MetaAnalysis.\n \"\"\"\n\n if isinstance(seed, string_types):\n ids = dataset.get_studies(mask=seed, activation_threshold=threshold)\n else:\n ids = dataset.get_studies(peaks=seed, r=r,\n activation_threshold=threshold)\n\n ma = meta.MetaAnalysis(dataset, ids)\n ma.save_results(output_dir, prefix)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef decode(self, images, save=None, round=4, names=None, **kwargs):\n\n if isinstance(images, string_types):\n images = [images]\n\n if isinstance(images, list):\n imgs_to_decode = imageutils.load_imgs(images, self.masker)\n else:\n imgs_to_decode = images\n\n methods = {\n 'pearson': self._pearson_correlation,\n 'dot': self._dot_product,\n 'roi': self._roi_association\n }\n\n result = np.around(\n methods[self.method](imgs_to_decode, **kwargs), round)\n\n # if save is not None:\n\n if names is None:\n if type(images).__module__ == np.__name__:\n names = ['image_%d' % i for i in range(images.shape[1])]\n elif self.method == 'roi':\n names = ['cluster_%d' % i for i in range(result.shape[1])]\n else:\n names = images\n\n result = pd.DataFrame(result, columns=names, index=self.feature_names)\n\n if save is not None:\n result.to_csv(save, index_label='Feature')\n return result", "response": "Decodes a set of images into a numpy array."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef load_features(self, features, image_type=None, from_array=False,\n threshold=0.001):\n \"\"\" Load features from current Dataset instance or a list of files.\n Args:\n features: List containing paths to, or names of, features to\n extract. 
Each element in the list must be a string containing\n either a path to an image, or the name of a feature (as named\n in the current Dataset). Mixing of paths and feature names\n within the list is not allowed.\n image_type: Optional suffix indicating which kind of image to use\n for analysis. Only used if features are taken from the Dataset;\n if features is a list of filenames, image_type is ignored.\n from_array: If True, the features argument is interpreted as a\n string pointing to the location of a 2D ndarray on disk\n containing feature data, where rows are voxels and columns are\n individual features.\n threshold: If features are taken from the dataset, this is the\n threshold passed to the meta-analysis module to generate fresh\n images.\n\n \"\"\"\n if from_array:\n if isinstance(features, list):\n features = features[0]\n self._load_features_from_array(features)\n elif path.exists(features[0]):\n self._load_features_from_images(features)\n else:\n self._load_features_from_dataset(\n features, image_type=image_type, threshold=threshold)", "response": "Loads features from the current Dataset instance or a list of files."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _load_features_from_array(self, features):\n self.feature_images = np.load(features)\n self.feature_names = range(self.feature_images.shape[1])", "response": "Load feature data from a 2D ndarray on disk."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nload feature image data from the current Dataset instance.", "response": "def _load_features_from_dataset(self, features=None, image_type=None,\n threshold=0.001):\n \"\"\" Load feature image data from the current Dataset instance. 
See\n load_features() for documentation.\n \"\"\"\n self.feature_names = self.dataset.feature_table.feature_names\n if features is not None:\n self.feature_names = [f for f in features\n if f in self.feature_names]\n from neurosynth.analysis import meta\n self.feature_images = meta.analyze_features(\n self.dataset, self.feature_names, image_type=image_type,\n threshold=threshold)\n # Apply a mask if one was originally passed\n if self.masker.layers:\n in_mask = self.masker.get_mask(in_global_mask=True)\n self.feature_images = self.feature_images[in_mask, :]"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nload feature images from image files.", "response": "def _load_features_from_images(self, images, names=None):\n \"\"\" Load feature image data from image files.\n\n Args:\n images: A list of image filenames.\n names: An optional list of strings to use as the feature names. Must\n be in the same order as the images.\n \"\"\"\n if names is not None and len(names) != len(images):\n raise Exception(\n \"Lists of feature names and images must be of same length!\")\n self.feature_names = names if names is not None else images\n self.feature_images = imageutils.load_imgs(images, self.masker)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ndecode images using Pearson s.", "response": "def _pearson_correlation(self, imgs_to_decode):\n \"\"\" Decode images using Pearson's r.\n\n Computes the correlation between each input image and each feature\n image across voxels.\n\n Args:\n imgs_to_decode: An ndarray of images to decode, with voxels in rows\n and images in columns.\n\n Returns:\n An n_features x n_images 2D array, with each cell representing the\n pearson correlation between the i'th feature and the j'th image\n across all voxels.\n \"\"\"\n x, y = imgs_to_decode.astype(float), self.feature_images.astype(float)\n return self._xy_corr(x, y)"} {"SOURCE": "codesearchnet", "instruction": "How would 
you explain what the following Python 3 function does\ndef _dot_product(self, imgs_to_decode):\n return np.dot(imgs_to_decode.T, self.feature_images).T", "response": "Decoding using the dot product."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _roi_association(self, imgs_to_decode, value='z', binarize=None):\n imgs_to_decode = imgs_to_decode.squeeze()\n x = average_within_regions(self.dataset, imgs_to_decode).astype(float)\n y = self.dataset.feature_table.data[self.feature_names].values\n if binarize is not None:\n y[y > binarize] = 1.\n y[y < 1.] = 0.\n r = self._xy_corr(x.T, y)\n if value == 'r':\n return r\n elif value == 'z':\n f_r = np.arctanh(r)\n return f_r * np.sqrt(y.shape[0] - 3)", "response": "Computes the strength of association between activation in a mask and the presence of a semantic feature."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nimplementing various kinds of feature selection", "response": "def feature_selection(feat_select, X, y):\n \"\"\" Implements various kinds of feature selection \"\"\"\n # K-best\n if re.match('.*-best', feat_select) is not None:\n n = int(feat_select.split('-')[0])\n\n selector = SelectKBest(k=n)\n\n import warnings\n with warnings.catch_warnings():\n warnings.simplefilter('ignore', category=UserWarning)\n features_selected = np.where(\n selector.fit(X, y).get_support())[0]\n\n elif re.match('.*-randombest', feat_select) is not None:\n n = int(feat_select.split('-')[0])\n\n from random import shuffle\n features = list(range(0, X.shape[1]))\n shuffle(features)\n\n features_selected = features[:n]\n\n return features_selected"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsetting up data for a classification task given a set of masks Given a set of masks, this function retrieves studies associated with each mask at the specified threshold, optionally removes overlap and 
filters by studies and features, and returns studies by feature matrix (X) and class labels (y) Args: dataset: a Neurosynth dataset masks: a list of paths to Nifti masks threshold: percentage of voxels active within the mask for study to be included remove_overlap: A boolean indicating if studies that appear in more than one mask should be excluded studies: An optional list of study names used to constrain the set used in classification. If None, will use all studies in the dataset. features: An optional list of feature names used to constrain the set used in classification. If None, will use all features in the dataset. regularize: Optional boolean indicating if X should be regularized Returns: A tuple (X, y) of np arrays. X is a feature by studies matrix and y is a vector of class labels", "response": "def get_studies_by_regions(dataset, masks, threshold=0.08, remove_overlap=True,\n studies=None, features=None,\n regularization=\"scale\"):\n \"\"\" Set up data for a classification task given a set of masks\n\n Given a set of masks, this function retrieves studies associated with\n each mask at the specified threshold, optionally removes overlap and\n filters by studies and features, and returns studies by feature matrix\n (X) and class labels (y)\n\n Args:\n dataset: a Neurosynth dataset\n masks: a list of paths to Nifti masks\n threshold: percentage of voxels active within the mask for study\n to be included\n remove_overlap: A boolean indicating if studies that\n appear in more than one mask should be excluded\n studies: An optional list of study names used to constrain the set\n used in classification. If None, will use all studies in the\n dataset.\n features: An optional list of feature names used to constrain the\n set used in classification. 
If None, will use all features in\n the dataset.\n regularize: Optional boolean indicating if X should be regularized\n\n Returns:\n A tuple (X, y) of np arrays.\n X is a feature by studies matrix and y is a vector of class labels\n \"\"\"\n\n import nibabel as nib\n import os\n\n # Load masks using NiBabel\n\n try:\n loaded_masks = [nib.load(os.path.relpath(m)) for m in masks]\n except OSError:\n print('Error loading masks. Check the path')\n\n # Get a list of studies that activate for each mask file--i.e., a list of\n # lists\n\n grouped_ids = [dataset.get_studies(mask=m, activation_threshold=threshold)\n for m in loaded_masks]\n\n # Flattened ids\n\n flat_ids = reduce(lambda a, b: a + b, grouped_ids)\n\n # Remove duplicates\n\n if remove_overlap:\n import collections\n flat_ids = [id for (id, count) in\n collections.Counter(flat_ids).items() if count == 1]\n grouped_ids = [[x for x in m if x in flat_ids] for m in\n grouped_ids] # Remove\n\n # Create class label(y)\n y = [[idx] * len(ids) for (idx, ids) in enumerate(grouped_ids)]\n y = reduce(lambda a, b: a + b, y) # Flatten\n y = np.array(y)\n\n # Extract feature set for each class separately\n X = [dataset.get_feature_data(ids=group_ids, features=features)\n for group_ids in grouped_ids]\n\n X = np.vstack(tuple(X))\n\n if regularization:\n X = regularize(X, method=regularization)\n\n return (X, y)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_feature_order(dataset, features):\n all_features = dataset.get_feature_names()\n\n i = [all_features.index(f) for f in features]\n\n return i", "response": "Returns a list with the order that features requested appear in the\n dataset"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef classify_regions(dataset, masks, method='ERF', threshold=0.08,\n remove_overlap=True, regularization='scale',\n output='summary', studies=None, 
features=None,\n class_weight='auto', classifier=None,\n cross_val='4-Fold', param_grid=None, scoring='accuracy'):\n \"\"\" Perform classification on specified regions\n\n Given a set of masks, this function retrieves studies associated with\n each mask at the specified threshold, optionally removes overlap and\n filters by studies and features. Then it trains an algorithm to\n classify studies based on features and tests performance.\n\n Args:\n dataset: a Neurosynth dataset\n masks: a list of paths to Nifti masks\n method: a string indicating which method to use.\n 'SVM': Support Vector Classifier with rbf kernel\n 'ERF': Extremely Randomized Forest classifier\n 'Dummy': A dummy classifier using stratified classes as\n predictor\n threshold: percentage of voxels active within the mask for study\n to be included\n remove_overlap: A boolean indicating if studies that\n appear in more than one mask should be excluded\n regularization: A string indicating type of regularization to use.\n If None, performs no regularization.\n 'scale': Unit scale without demeaning\n output: A string indicating output type\n 'summary': Dictionary with summary statistics including score\n and n\n 'summary_clf': Same as above but also includes classifier\n 'clf': Only returns classifier\n Warning: using cv without grid will return an untrained\n classifier\n studies: An optional list of study names used to constrain the set\n used in classification. If None, will use all studies in the\n dataset.\n features: An optional list of feature names used to constrain the\n set used in classification. 
If None, will use all features in\n the dataset.\n class_weight: Parameter to pass to classifier determining how to\n weight classes\n classifier: An optional scikit-learn classifier to use instead of\n the pre-configured classifiers set up using 'method'\n cross_val: A string indicating type of cross validation to use.\n Can also pass a scikit_classifier\n param_grid: A dictionary indicating which parameters to optimize\n using GridSearchCV. If None, no GridSearch will be used\n\n Returns:\n A tuple (X, y) of np arrays.\n X is a feature by studies matrix and y is a vector of class labels\n \"\"\"\n\n (X, y) = get_studies_by_regions(dataset, masks, threshold, remove_overlap,\n studies, features,\n regularization=regularization)\n\n return classify(X, y, method, classifier, output, cross_val,\n class_weight, scoring=scoring, param_grid=param_grid)", "response": "Performs classification on the specified regions, training a classifier to distinguish studies by region and testing its performance."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef classify(X, y, clf_method='ERF', classifier=None, output='summary_clf',\n cross_val=None, class_weight=None, regularization=None,\n param_grid=None, scoring='accuracy', refit_all=True,\n feat_select=None):\n \"\"\" Wrapper for scikit-learn classification functions\n Implements various types of classification and cross validation \"\"\"\n\n # Build classifier\n clf = Classifier(clf_method, classifier, param_grid)\n\n # Fit & test model with or without cross-validation\n if cross_val is not None:\n score = clf.cross_val_fit(X, y, cross_val, scoring=scoring,\n feat_select=feat_select,\n class_weight=class_weight)\n else:\n # Does not support scoring function\n score = clf.fit(X, y, class_weight=class_weight).score(X, y)\n\n # Return some stuff...\n from collections import Counter\n\n if output == 'clf':\n return clf\n else:\n if output == 'summary':\n output = {'score': score, 'n': dict(Counter(y))}\n 
elif output == 'summary_clf':\n output = {\n 'score': score,\n 'n': dict(Counter(y)),\n 'clf': clf,\n 'features_selected': clf.features_selected,\n 'predictions': clf.predictions\n }\n\n return output", "response": "Classify the data X, y and return the result as a dict."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fit(self, X, y, cv=None, class_weight='auto'):\n\n # Incorporate error checking such as:\n # if isinstance(self.classifier, ScikitClassifier):\n # do one thing...\n # otherwise...\n\n self.X = X\n self.y = y\n\n self.set_class_weight(class_weight=class_weight, y=y)\n\n self.clf = self.clf.fit(X, y)\n\n return self.clf", "response": "Fits X to outcomes y using clf"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsetting the class_weight of the classifier to match y", "response": "def set_class_weight(self, class_weight='auto', y=None):\n \"\"\" Sets the class_weight of the classifier to match y \"\"\"\n\n if class_weight is None:\n cw = None\n\n try:\n self.clf.set_params(class_weight=cw)\n except ValueError:\n pass\n\n elif class_weight == 'auto':\n c = np.bincount(y)\n ii = np.nonzero(c)[0]\n c = c / float(c.sum())\n cw = dict(zip(ii[::-1], c[ii]))\n\n try:\n self.clf.set_params(class_weight=cw)\n except ValueError:\n import warnings\n warnings.warn(\n \"Tried to set class_weight, but failed. 
The classifier \"\n \"probably doesn't support it\")"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nfitting X to outcomes y using clf and cv_method", "response": "def cross_val_fit(self, X, y, cross_val='4-Fold', scoring='accuracy',\n feat_select=None, class_weight='auto'):\n \"\"\" Fits X to outcomes y, using clf and cv_method \"\"\"\n\n from sklearn import cross_validation\n\n self.X = X\n self.y = y\n\n self.set_class_weight(class_weight=class_weight, y=y)\n\n # Set cross validator\n if isinstance(cross_val, string_types):\n if re.match('.*-Fold', cross_val) is not None:\n n = int(cross_val.split('-')[0])\n self.cver = cross_validation.StratifiedKFold(self.y, n)\n else:\n raise Exception('Unrecognized cross validation method')\n else:\n self.cver = cross_val\n\n if feat_select is not None:\n self.features_selected = []\n\n # Perform cross-validated classification\n from sklearn.grid_search import GridSearchCV\n if isinstance(self.clf, GridSearchCV):\n import warnings\n\n if feat_select is not None:\n warnings.warn(\n \"Cross-validated feature selection not supported with \"\n \"GridSearchCV\")\n self.clf.set_params(cv=self.cver, scoring=scoring)\n\n with warnings.catch_warnings():\n warnings.simplefilter('ignore', category=UserWarning)\n self.clf = self.clf.fit(X, y)\n\n self.cvs = self.clf.best_score_\n else:\n self.cvs = self.feat_select_cvs(\n feat_select=feat_select, scoring=scoring)\n\n if feat_select is not None:\n fs = feature_selection(\n feat_select, X, y)\n self.features_selected.append(fs)\n\n X = X[:, fs]\n\n self.clf.fit(X, y)\n\n return self.cvs.mean()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning cross-validated scores, with feature selection performed inside each cross-validation fold.", "response": "def feat_select_cvs(self, scoring=None, feat_select=None):\n \"\"\" Returns cross validated scores (just like cross_val_score),\n but includes 
feature selection as part of the cross validation loop \"\"\"\n\n scores = []\n self.predictions = []\n\n for train, test in self.cver:\n X_train, X_test, y_train, y_test = self.X[\n train], self.X[test], self.y[train], self.y[test]\n\n if feat_select is not None:\n # Get which features are kept\n fs = feature_selection(\n feat_select, X_train, y_train)\n\n self.features_selected.append(fs)\n\n # Filter X to only keep selected features\n X_train, X_test = X_train[\n :, fs], X_test[:, fs]\n\n # Set scoring (not implemented, as accuracy is the default)\n\n # Train classifier\n self.clf.fit(X_train, y_train)\n\n # Test classifier\n prediction, s = get_score(\n X_test, y_test, self.clf, scoring=scoring)\n\n scores.append(s)\n self.predictions.append((y_test, prediction))\n\n return np.array(scores)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can,\ngiven a dataset, fit either features or voxels to y", "response": "def fit_dataset(self, dataset, y, features=None,\n feature_type='features'):\n \"\"\" Given a dataset, fits either features or voxels to y \"\"\"\n\n # Get data from dataset\n\n if feature_type == 'features':\n X = np.rot90(dataset.feature_table.data.toarray())\n elif feature_type == 'voxels':\n X = np.rot90(dataset.image_table.data.toarray())\n\n self.sk_classifier.fit(X, y)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef p_list_andnot(self, p):\n 'list : list ANDNOT list'\n p[0] = p[1].loc[set(p[1].index) - set(p[3].index)]", "response": "list : list ANDNOT list"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nlist : list AND list", "response": "def p_list_and(self, p):\n 'list : list AND list'\n p[0] = pd.concat(\n [p[1], p[3]], axis=1).dropna().apply(self.func, axis=1)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nlist : list OR list", "response": "def p_list_or(self, p):\n 'list : list OR list'\n p[0] = 
pd.concat(\n [p[1], p[3]], axis=1).fillna(0.0).apply(self.func, axis=1)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef p_list_feature(self, p):\n '''list : feature\n | WORD '''\n p[0] = self.dataset.get_studies(\n features=p[1], frequency_threshold=self.threshold, func=self.func,\n return_type='weights')", "response": "list feature | WORD"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\naggregate over all voxels within each ROI in the input image and a list of regions.", "response": "def average_within_regions(dataset, regions, masker=None, threshold=None,\n remove_zero=True):\n \"\"\" Aggregates over all voxels within each ROI in the input image.\n\n Takes a Dataset and a Nifti image that defines distinct regions, and\n returns a numpy matrix of ROIs x mappables, where the value at each\n ROI is the proportion of active voxels in that ROI. Each distinct ROI\n must have a unique value in the image; non-contiguous voxels with the\n same value will be assigned to the same ROI.\n\n Args:\n dataset: Either a Dataset instance from which image data are\n extracted, or a Numpy array containing image data to use. If\n the latter, the array contains voxels in rows and\n features/studies in columns. The number of voxels must be equal\n to the length of the vectorized image mask in the regions\n image.\n regions: An image defining the boundaries of the regions to use.\n Can be one of:\n 1) A string name of the NIFTI or Analyze-format image\n 2) A NiBabel SpatialImage\n 3) A list of NiBabel images\n 4) A 1D numpy array of the same length as the mask vector in\n the Dataset's current Masker.\n masker: Optional masker used to load image if regions is not a\n numpy array. Must be passed if dataset is a numpy array.\n threshold: An optional float in the range of 0 - 1 or integer. 
If\n passed, the array will be binarized, with ROI values above the\n threshold assigned to True and values below the threshold\n assigned to False. (E.g., if threshold = 0.05, only ROIs in\n which more than 5% of voxels are active will be considered\n active.) If threshold is integer, studies will only be\n considered active if they activate more than that number of\n voxels in the ROI.\n remove_zero: An optional boolean; when True, assume that voxels\n with value of 0 should not be considered as a separate ROI, and\n will be ignored.\n\n Returns:\n A 2D numpy array with ROIs in rows and mappables in columns.\n \"\"\"\n\n if masker is not None:\n masker = masker\n else:\n if isinstance(dataset, Dataset):\n masker = dataset.masker\n else:\n if not type(regions).__module__.startswith('numpy'):\n raise ValueError(\n \"If dataset is a numpy array and regions is not a numpy \"\n \"array, a masker must be provided.\")\n\n if not type(regions).__module__.startswith('numpy'):\n regions = masker.mask(regions)\n\n if isinstance(dataset, Dataset):\n dataset = dataset.get_image_data(dense=False)\n\n # If multiple images are passed, give each one a unique value\n if regions.ndim == 2:\n m = regions\n for i in range(regions.shape[1]):\n _nz = np.nonzero(m[:, i])[0]\n if isinstance(threshold, int):\n m[_nz, i] = 1.0\n else:\n m[_nz, i] = 1.0 / np.count_nonzero(m[:, i])\n\n # Otherwise create an ROI-coding matrix\n else:\n labels = np.unique(regions)\n\n if remove_zero:\n labels = labels[np.nonzero(labels)]\n\n n_regions = labels.size\n\n m = np.zeros((regions.size, n_regions))\n for i in range(n_regions):\n if isinstance(threshold, int):\n m[regions == labels[i], i] = 1.0\n else:\n m[regions == labels[i], i] = 1.0 / \\\n np.sum(regions == labels[i])\n\n # Call dot() on the array itself as this will use sparse matrix\n # multiplication if possible.\n result = dataset.T.dot(m).T\n\n if threshold is not None:\n result[result < threshold] = 0.0\n result = result.astype(bool)\n\n 
return result"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating a 3D grid on the brain volume and averages across all voxels in each cell.", "response": "def apply_grid(dataset, masker=None, scale=5, threshold=None):\n \"\"\" Imposes a 3D grid on the brain volume and averages across all voxels\n that fall within each cell.\n Args:\n dataset: Data to apply grid to. Either a Dataset instance, or a numpy\n array with voxels in rows and features in columns.\n masker: Optional Masker instance used to map between the created grid\n and the dataset. This is only needed if dataset is a numpy array;\n if dataset is a Dataset instance, the Masker in the dataset will\n be used.\n scale: int; scaling factor (in mm) to pass onto create_grid().\n threshold: Optional float to pass to reduce.average_within_regions().\n Returns:\n A tuple of length 2, where the first element is a numpy array of\n dimensions n_cubes x n_studies, and the second element is a numpy\n array, with the same dimensions as the Masker instance in the current\n Dataset, that maps voxel identities onto cell IDs in the grid.\n \"\"\"\n if masker is None:\n if isinstance(dataset, Dataset):\n masker = dataset.masker\n else:\n raise ValueError(\n \"If dataset is a numpy array, a masker must be provided.\")\n\n grid = imageutils.create_grid(masker.volume, scale)\n cm = masker.mask(grid, in_global_mask=True)\n data = average_within_regions(dataset, cm, threshold)\n return (data, grid)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a 2D numpy array with random voxels selected from the dataset.", "response": "def get_random_voxels(dataset, n_voxels):\n \"\"\" Returns mappable data for a random subset of voxels.\n\n May be useful as a baseline in predictive analyses--e.g., to compare\n performance of a more principled feature selection method with simple\n random selection.\n\n Args:\n dataset: A Dataset instance\n n_voxels: An integer 
specifying the number of random voxels to select.\n\n Returns:\n A 2D numpy array with (randomly-selected) voxels in rows and mappables\n in columns.\n \"\"\"\n voxels = np.arange(dataset.masker.n_vox_in_vol)\n np.random.shuffle(voxels)\n selected = voxels[0:n_voxels]\n return dataset.get_image_data(voxels=selected)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning top forty words from each topic in trained topic model.", "response": "def _get_top_words(model, feature_names, n_top_words=40):\n \"\"\" Return top forty words from each topic in trained topic model.\n \"\"\"\n topic_words = []\n for topic in model.components_:\n top_words = [feature_names[i] for i in topic.argsort()[:-n_top_words-1:-1]]\n topic_words += [top_words]\n return topic_words"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef run_lda(abstracts, n_topics=50, n_words=31, n_iters=1000, alpha=None,\n beta=0.001):\n \"\"\" Perform topic modeling using Latent Dirichlet Allocation with the\n Java toolbox MALLET.\n\n Args:\n abstracts: A pandas DataFrame with two columns ('pmid' and 'abstract')\n containing article abstracts.\n n_topics: Number of topics to generate. Default=50.\n n_words: Number of top words to return for each topic. Default=31,\n based on Poldrack et al. (2012).\n n_iters: Number of iterations to run in training topic model.\n Default=1000.\n alpha: The Dirichlet prior on the per-document topic\n distributions.\n Default: 50 / n_topics, based on Poldrack et al. (2012).\n beta: The Dirichlet prior on the per-topic word distribution.\n Default: 0.001, based on Poldrack et al. (2012).\n\n Returns:\n weights_df: A pandas DataFrame derived from the MALLET\n output-doc-topics output file. Contains the weight assigned\n to each article for each topic, which can be used to select\n articles for topic-based meta-analyses (accepted threshold\n from Poldrack article is 0.001). 
[n_topics]+1 columns:\n 'pmid' is the first column and the following columns are\n the topic names. The names of the topics match the names\n in df (e.g., topic_000).\n keys_df: A pandas DataFrame derived from the MALLET\n output-topic-keys output file. Contains the top [n_words]\n words for each topic, which can act as a summary of the\n topic's content. Two columns: 'topic' and 'terms'. The\n names of the topics match the names in weights (e.g.,\n topic_000).\n \"\"\"\n if abstracts.index.name != 'pmid':\n abstracts.index = abstracts['pmid']\n\n resdir = os.path.abspath(get_resource_path())\n tempdir = os.path.join(resdir, 'topic_models')\n absdir = os.path.join(tempdir, 'abstracts')\n if not os.path.isdir(tempdir):\n os.mkdir(tempdir)\n\n if alpha is None:\n alpha = 50. / n_topics\n\n # Check for presence of abstract files and convert if necessary\n if not os.path.isdir(absdir):\n print('Abstracts folder not found. Creating abstract files...')\n os.mkdir(absdir)\n for pmid in abstracts.index.values:\n abstract = abstracts.loc[pmid]['abstract']\n with open(os.path.join(absdir, str(pmid) + '.txt'), 'w') as fo:\n fo.write(abstract)\n\n # Run MALLET topic modeling\n print('Generating topics...')\n mallet_bin = join(dirname(dirname(__file__)),\n 'resources/mallet/bin/mallet')\n import_str = ('{mallet} import-dir '\n '--input {absdir} '\n '--output {outdir}/topic-input.mallet '\n '--keep-sequence '\n '--remove-stopwords').format(mallet=mallet_bin,\n absdir=absdir,\n outdir=tempdir)\n\n train_str = ('{mallet} train-topics '\n '--input {out}/topic-input.mallet '\n '--num-topics {n_topics} '\n '--num-top-words {n_words} '\n '--output-topic-keys {out}/topic_keys.txt '\n '--output-doc-topics {out}/doc_topics.txt '\n '--num-iterations {n_iters} '\n '--output-model {out}/saved_model.mallet '\n '--random-seed 1 '\n '--alpha {alpha} '\n '--beta {beta}').format(mallet=mallet_bin, out=tempdir,\n n_topics=n_topics, n_words=n_words,\n n_iters=n_iters,\n alpha=alpha, beta=beta)\n\n 
subprocess.call(import_str, shell=True)\n subprocess.call(train_str, shell=True)\n\n # Read in and convert doc_topics and topic_keys.\n def clean_str(string):\n return os.path.basename(os.path.splitext(string)[0])\n\n def get_sort(lst):\n return [i[0] for i in sorted(enumerate(lst), key=lambda x:x[1])]\n\n topic_names = ['topic_{0:03d}'.format(i) for i in range(n_topics)]\n\n # doc_topics: Topic weights for each paper.\n # The conversion here is pretty ugly at the moment.\n # First row should be dropped. First column is row number and can be used\n # as the index.\n # Second column is 'file: /full/path/to/pmid.txt' <-- Parse to get pmid.\n # After that, odd columns are topic numbers and even columns are the\n # weights for the topics in the preceding column. These columns are sorted\n # on an individual pmid basis by the weights.\n n_cols = (2 * n_topics) + 1\n dt_df = pd.read_csv(os.path.join(tempdir, 'doc_topics.txt'),\n delimiter='\\t', skiprows=1, header=None, index_col=0)\n dt_df = dt_df[dt_df.columns[:n_cols]]\n\n # Get pmids from filenames\n dt_df[1] = dt_df[1].apply(clean_str)\n\n # Put weights (even cols) and topics (odd cols) into separate dfs.\n weights_df = dt_df[dt_df.columns[2::2]]\n weights_df.index = dt_df[1]\n weights_df.columns = range(n_topics)\n\n topics_df = dt_df[dt_df.columns[1::2]]\n topics_df.index = dt_df[1]\n topics_df.columns = range(n_topics)\n\n # Sort columns in weights_df separately for each row using topics_df.\n sorters_df = topics_df.apply(get_sort, axis=1)\n weights = weights_df.as_matrix()\n sorters = sorters_df.as_matrix()\n # there has to be a better way to do this.\n for i in range(sorters.shape[0]):\n weights[i, :] = weights[i, sorters[i, :]]\n\n # Define topic names (e.g., topic_000)\n index = dt_df[1]\n weights_df = pd.DataFrame(columns=topic_names, data=weights, index=index)\n weights_df.index.name = 'pmid'\n\n # topic_keys: Top [n_words] words for each topic.\n keys_df = pd.read_csv(os.path.join(tempdir, 
'topic_keys.txt'),\n delimiter='\\t', header=None, index_col=0)\n\n # Second column is a list of the terms.\n keys_df = keys_df[[2]]\n keys_df.rename(columns={2: 'terms'}, inplace=True)\n keys_df.index = topic_names\n keys_df.index.name = 'topic'\n\n # Remove all temporary files (abstract files, model, and outputs).\n shutil.rmtree(tempdir)\n\n # Return article topic weights and topic keys.\n return weights_df, keys_df", "response": "This function runs the LDA modeling using the Latent Dirichlet Allocation with the base MALLET."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef pearson(x, y):\n data = np.vstack((x, y))\n ms = data.mean(axis=1)[(slice(None, None, None), None)]\n datam = data - ms\n datass = np.sqrt(np.sum(datam**2, axis=1))\n temp = np.dot(datam[1:], datam[0].T)\n rs = temp / (datass[1:] * datass[0])\n return rs", "response": "Correlates row vector x with each row vector in 2D array y."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef two_way(cells):\n # Mute divide-by-zero warning for bad voxels since we account for that\n # later\n warnings.simplefilter(\"ignore\", RuntimeWarning)\n\n cells = cells.astype('float64') # Make sure we don't overflow\n total = np.apply_over_axes(np.sum, cells, [1, 2]).ravel()\n chi_sq = np.zeros(cells.shape, dtype='float64')\n for i in range(2):\n for j in range(2):\n exp = np.sum(cells[:, i, :], 1).ravel() * \\\n np.sum(cells[:, :, j], 1).ravel() / total\n bad_vox = np.where(exp == 0)[0]\n chi_sq[:, i, j] = (cells[:, i, j] - exp) ** 2 / exp\n chi_sq[bad_vox, i, j] = 1.0 # Set p-value for invalid voxels to 1\n chi_sq = np.apply_over_axes(np.sum, chi_sq, [1, 2]).ravel()\n return special.chdtrc(1, chi_sq)", "response": "Two - way chi - square test of independence."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef one_way(data, n):\n term = 
data.astype('float64')\n no_term = n - term\n t_exp = np.mean(term, 0)\n t_exp = np.array([t_exp, ] * data.shape[0])\n nt_exp = n - t_exp\n t_mss = (term - t_exp) ** 2 / t_exp\n nt_mss = (no_term - nt_exp) ** 2 / nt_exp\n chi2 = t_mss + nt_mss\n return special.chdtrc(1, chi2)", "response": "One - way chi - square test of independence."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef fdr(p, q=.05):\n s = np.sort(p)\n nvox = p.shape[0]\n null = np.array(range(1, nvox + 1), dtype='float') * q / nvox\n below = np.where(s <= null)[0]\n return s[max(below)] if len(below) else -1", "response": "Determine FDR threshold given a p value array and desired false\n discovery rate q."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef download(path='.', url=None, unpack=False):\n\n if url is None:\n url = 'https://github.com/neurosynth/neurosynth-data/blob/master/current_data.tar.gz?raw=true'\n if os.path.exists(path) and os.path.isdir(path):\n basename = os.path.basename(url).split('?')[0]\n filename = os.path.join(path, basename)\n else:\n filename = path\n\n f = open(filename, 'wb')\n\n u = urlopen(url)\n file_size = int(u.headers[\"Content-Length\"][0])\n print(\"Downloading the latest Neurosynth files: {0} bytes: {1}\".format(\n url, file_size))\n\n bytes_dl = 0\n block_size = 8192\n while True:\n buffer = u.read(block_size)\n if not buffer:\n break\n bytes_dl += len(buffer)\n f.write(buffer)\n p = float(bytes_dl) / file_size\n status = r\"{0} [{1:.2%}]\".format(bytes_dl, p)\n status = status + chr(8) * (len(status) + 1)\n sys.stdout.write(status)\n\n f.close()\n\n if unpack:\n import tarfile\n tarfile.open(filename, 'r:gz').extractall(os.path.dirname(filename))", "response": "Download the latest data files."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef download_abstracts(dataset, path='.', email=None, out_file=None):\n 
try:\n from Bio import Entrez, Medline\n except:\n raise Exception(\n 'Module biopython is required for downloading abstracts from PubMed.')\n\n if email is None:\n raise Exception('No email address provided.')\n Entrez.email = email\n\n if isinstance(dataset, Dataset):\n pmids = dataset.image_table.ids.astype(str).tolist()\n elif isinstance(dataset, list):\n pmids = [str(pmid) for pmid in dataset]\n else:\n raise Exception(\n 'Dataset type not recognized: {0}'.format(type(dataset)))\n\n records = []\n # PubMed only allows you to search ~1000 at a time. I chose 900 to be safe.\n chunks = [pmids[x: x + 900] for x in range(0, len(pmids), 900)]\n for chunk in chunks:\n h = Entrez.efetch(db='pubmed', id=chunk, rettype='medline',\n retmode='text')\n records += list(Medline.parse(h))\n\n # Pull data for studies with abstracts\n data = [[study['PMID'], study['AB']]\n for study in records if study.get('AB', None)]\n df = pd.DataFrame(columns=['pmid', 'abstract'], data=data)\n if out_file is not None:\n df.to_csv(os.path.join(os.path.abspath(path), out_file), index=False)\n return df", "response": "Download the abstracts for a dataset."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nloads the activations from a text file.", "response": "def _load_activations(self, filename):\n \"\"\" Load activation data from a text file.\n\n Args:\n filename (str): a string pointing to the location of the txt file\n to read from.\n \"\"\"\n logger.info(\"Loading activation data from %s...\" % filename)\n\n activations = pd.read_csv(filename, sep='\\t')\n activations.columns = [col.lower()\n for col in list(activations.columns)]\n\n # Make sure all mandatory columns exist\n mc = ['x', 'y', 'z', 'id', 'space']\n if (set(mc) - set(list(activations.columns))):\n logger.error(\n \"At least one of mandatory columns (x, y, z, id, and space) \"\n \"is missing from input file.\")\n return\n\n # Transform to target space where needed\n spaces = 
activations['space'].unique()\n xyz = activations[['x', 'y', 'z']].values\n for s in spaces:\n if s != self.transformer.target:\n inds = activations['space'] == s\n xyz[inds] = self.transformer.apply(s, xyz[inds])\n activations[['x', 'y', 'z']] = xyz\n\n # xyz --> ijk\n ijk = pd.DataFrame(\n transformations.xyz_to_mat(xyz), columns=['i', 'j', 'k'])\n activations = pd.concat([activations, ijk], axis=1)\n return activations"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create_image_table(self, r=None):\n logger.info(\"Creating image table...\")\n if r is not None:\n self.r = r\n self.image_table = ImageTable(self)", "response": "Create and store an ImageTable instance based on the current\n dataset."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting IDs or data for studies that meet specific criteria. If multiple criteria are passed, the set intersection is returned. For example, passing expression='emotion' and mask='my_mask.nii.gz' would return only those studies that are associated with emotion AND report activation within the voxels indicated in the passed image. Args: ids (list): A list of IDs of studies to retrieve. features (list or str): The name of a feature, or a list of features, to use for selecting studies. expression (str): A string expression to pass to the PEG for study retrieval. mask: the mask image (see Masker documentation for valid data types). peaks (ndarray or list): Either an n x 3 numpy array, or a list of lists or tuples (e.g., [(-10, 22, 14)]) specifying the world (x/y/z) coordinates of the target location(s). frequency_threshold (float): For feature-based or expression-based selection, the threshold for selecting studies--i.e., the cut-off for a study to be included. Must be a float in range [0, 1]. 
activation_threshold (int or float): For mask-based selection, threshold for a study to be included based on amount of activation displayed. If an integer, represents the absolute number of voxels that must be active within the mask in order for a study to be selected. If a float, it represents the proportion of voxels that must be active. func (Callable): The function to use when aggregating over the list of features. See documentation in FeatureTable.get_ids() for a full explanation. Only used for feature- or expression-based selection. return_type (str): A string specifying what data to return. Valid options are: 'ids': returns a list of IDs of selected studies. 'images': returns a voxel x study matrix of data for all selected studies. 'weights': returns a dict where the keys are study IDs and the values are the computed weights. Only valid when performing feature-based selection. r (int): For peak-based selection, the distance cut-off (in mm) for inclusion (i.e., only studies with one or more activations within r mm of one of the passed foci will be returned). Returns: When return_type is 'ids' (default), returns a list of IDs of the selected studies. When return_type is 'data', returns a 2D numpy array, with voxels in rows and studies in columns. When return_type is 'weights' (valid only for expression-based selection), returns a dict, where the keys are study IDs, and the values are the computed weights. 
Examples -------- Select all studies tagged with the feature 'emotion': >>> ids = dataset.get_studies(features='emotion') Select all studies that activate at least 20% of voxels in an amygdala mask, and retrieve activation data rather than IDs: >>> data = dataset.get_studies(mask='amygdala_mask.nii.gz', threshold=0.2, return_type='images') Select studies that report at least one activation within 12 mm of at least one of three specific foci: >>> ids = dataset.get_studies(peaks=[[12, -20, 30], [-26, 22, 22], [0, 36, -20]], r=12)", "response": "def get_studies(self, features=None, expression=None, mask=None,\n peaks=None, frequency_threshold=0.001,\n activation_threshold=0.0, func=np.sum, return_type='ids',\n r=6\n ):\n \"\"\" Get IDs or data for studies that meet specific criteria.\n\n If multiple criteria are passed, the set intersection is returned. For\n example, passing expression='emotion' and mask='my_mask.nii.gz' would\n return only those studies that are associated with emotion AND report\n activation within the voxels indicated in the passed image.\n\n Args:\n ids (list): A list of IDs of studies to retrieve.\n features (list or str): The name of a feature, or a list of\n features, to use for selecting studies.\n expression (str): A string expression to pass to the PEG for study\n retrieval.\n mask: the mask image (see Masker documentation for valid data\n types).\n peaks (ndarray or list): Either an n x 3 numpy array, or a list of\n lists or tuples (e.g., [(-10, 22, 14)]) specifying the world\n (x/y/z) coordinates of the target location(s).\n frequency_threshold (float): For feature-based or expression-based\n selection, the threshold for selecting studies--i.e., the\n cut-off for a study to be included. Must be a float in range\n [0, 1].\n activation_threshold (int or float): For mask-based selection,\n threshold for a study to be included based on amount of\n activation displayed. 
If an integer, represents the absolute\n number of voxels that must be active within the mask in order\n for a study to be selected. If a float, it represents the\n proportion of voxels that must be active.\n func (Callable): The function to use when aggregating over the list\n of features. See documentation in FeatureTable.get_ids() for a\n full explanation. Only used for feature- or expression-based\n selection.\n return_type (str): A string specifying what data to return. Valid\n options are:\n 'ids': returns a list of IDs of selected studies.\n 'images': returns a voxel x study matrix of data for all\n selected studies.\n 'weights': returns a dict where the keys are study IDs and the\n values are the computed weights. Only valid when performing\n feature-based selection.\n r (int): For peak-based selection, the distance cut-off (in mm)\n for inclusion (i.e., only studies with one or more activations\n within r mm of one of the passed foci will be returned).\n\n Returns:\n When return_type is 'ids' (default), returns a list of IDs of the\n selected studies. When return_type is 'data', returns a 2D numpy\n array, with voxels in rows and studies in columns. 
When return_type\n is 'weights' (valid only for expression-based selection), returns\n a dict, where the keys are study IDs, and the values are the\n computed weights.\n\n Examples\n --------\n Select all studies tagged with the feature 'emotion':\n\n >>> ids = dataset.get_studies(features='emotion')\n\n Select all studies that activate at least 20% of voxels in an amygdala\n mask, and retrieve activation data rather than IDs:\n\n >>> data = dataset.get_studies(mask='amygdala_mask.nii.gz',\n threshold=0.2, return_type='images')\n\n Select studies that report at least one activation within 12 mm of at\n least one of three specific foci:\n\n >>> ids = dataset.get_studies(peaks=[[12, -20, 30], [-26, 22, 22],\n [0, 36, -20]], r=12)\n\n \"\"\"\n results = []\n\n # Feature-based selection\n if features is not None:\n # Need to handle weights as a special case, because we can't\n # retrieve the weights later using just the IDs.\n if return_type == 'weights':\n if expression is not None or mask is not None or \\\n peaks is not None:\n raise ValueError(\n \"return_type cannot be 'weights' when feature-based \"\n \"search is used in conjunction with other search \"\n \"modes.\")\n return self.feature_table.get_ids(\n features, frequency_threshold, func, get_weights=True)\n else:\n results.append(self.feature_table.get_ids(\n features, frequency_threshold, func))\n\n # Logical expression-based selection\n if expression is not None:\n _ids = self.feature_table.get_ids_by_expression(\n expression, frequency_threshold, func)\n results.append(list(_ids))\n\n # Mask-based selection\n if mask is not None:\n mask = self.masker.mask(mask, in_global_mask=True).astype(bool)\n num_vox = np.sum(mask)\n prop_mask_active = self.image_table.data.T.dot(mask).astype(float)\n if isinstance(activation_threshold, float):\n prop_mask_active /= num_vox\n indices = np.where(prop_mask_active > activation_threshold)[0]\n results.append([self.image_table.ids[ind] for ind in indices])\n\n # Peak-based 
selection\n if peaks is not None:\n r = float(r)\n found = set()\n for p in peaks:\n xyz = np.array(p, dtype=float)\n x = self.activations['x']\n y = self.activations['y']\n z = self.activations['z']\n dists = np.sqrt(np.square(x - xyz[0]) + np.square(y - xyz[1]) +\n np.square(z - xyz[2]))\n found |= set(self.activations[dists <= r]['id'].unique())\n results.append(found)\n\n # Get intersection of all sets\n ids = list(reduce(lambda x, y: set(x) & set(y), results))\n\n if return_type == 'ids':\n return ids\n elif return_type == 'data':\n return self.get_image_data(ids)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef add_features(self, features, append=True, merge='outer',\n duplicates='ignore', min_studies=0.0, threshold=0.001):\n \"\"\" Construct a new FeatureTable from file.\n\n Args:\n features: Feature data to add. Can be:\n (a) A text file containing the feature data, where each row is\n a study in the database, with features in columns. The first\n column must contain the IDs of the studies to match up with the\n image data.\n (b) A pandas DataFrame, where studies are in rows, features are\n in columns, and the index provides the study IDs.\n append (bool): If True, adds new features to existing ones\n incrementally. 
If False, replaces old features.\n merge, duplicates, min_studies, threshold: Additional arguments\n passed to FeatureTable.add_features().\n \"\"\"\n if (not append) or not hasattr(self, 'feature_table'):\n self.feature_table = FeatureTable(self)\n\n self.feature_table.add_features(features, merge=merge,\n duplicates=duplicates,\n min_studies=min_studies,\n threshold=threshold)", "response": "Add features to the feature table."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn names of features.", "response": "def get_feature_names(self, features=None):\n \"\"\" Returns names of features. If features is None, returns all\n features. Otherwise assumes the user is trying to find the order of the\n features. \"\"\"\n if features:\n return self.feature_table.get_ordered_names(features)\n else:\n return self.feature_table.feature_names"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_feature_counts(self, threshold=0.001):\n counts = np.sum(self.get_feature_data() >= threshold, 0)\n return dict(zip(self.get_feature_names(), list(counts)))", "response": "Returns a dictionary where the keys are the feature names and the values are the number of studies tagged with the feature."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef load(cls, filename):\n try:\n dataset = pickle.load(open(filename, 'rb'))\n except UnicodeDecodeError:\n # Need to try this for python3\n dataset = pickle.load(open(filename, 'rb'), encoding='latin')\n\n if hasattr(dataset, 'feature_table'):\n dataset.feature_table._csr_to_sdf()\n return dataset", "response": "Load a pickled Dataset instance from file."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef save(self, filename):\n if hasattr(self, 'feature_table'):\n self.feature_table._sdf_to_csr()\n\n pickle.dump(self, open(filename, 'wb'), -1)\n\n if 
hasattr(self, 'feature_table'):\n self.feature_table._csr_to_sdf()", "response": "Save the Dataset instance to a file."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_image_data(self, ids=None, voxels=None, dense=True):\n if dense and ids is None and voxels is None:\n logger.warning(\n \"Warning: get_image_data() is being called without specifying \"\n \"a subset of studies or voxels to retrieve. This may result in\"\n \" a very large amount of data (several GB) being read into \"\n \"memory. If you experience any problems, consider returning a \"\n \"sparse matrix by passing dense=False, or pass in a list of \"\n \"ids of voxels to retrieve only a portion of the data.\")\n\n result = self.data\n if ids is not None:\n idxs = np.where(np.in1d(np.array(self.ids), np.array(ids)))[0]\n result = result[:, idxs]\n if voxels is not None:\n result = result[voxels, :]\n return result.toarray() if dense else result", "response": "Retrieves a subset of image data for the specified studies or voxels."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ntrimming ImageTable to keep only the passed studies.", "response": "def trim(self, ids):\n \"\"\" Trim ImageTable to keep only the passed studies. This is a\n convenience method, and should generally be avoided in favor of\n non-destructive alternatives that don't require slicing (e.g.,\n matrix multiplication). \"\"\"\n self.data = self.get_image_data(ids, dense=False) # .tocoo()\n idxs = np.where(np.in1d(np.array(self.ids), np.array(ids)))[0]\n self.ids = [self.ids[i] for i in idxs]"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef add_features(self, features, merge='outer', duplicates='ignore',\n min_studies=0, threshold=0.0001):\n \"\"\" Add new features to FeatureTable.\n Args:\n features (str, DataFrame): A filename to load data from, or a\n pandas DataFrame. 
In either case, studies are in rows and\n features are in columns. Values in cells reflect the weight of\n the intersecting feature for the intersecting study. Feature\n names and study IDs should be included as the first column\n and first row, respectively.\n merge (str): The merge strategy to use when merging new features\n with old. This is passed to pandas.merge, so can be 'left',\n 'right', 'outer', or 'inner'. Defaults to outer (i.e., all data\n in both new and old will be kept, and missing values will be\n assigned zeros.)\n duplicates (str): string indicating how to handle features whose\n name matches an existing feature. Valid options:\n 'ignore' (default): ignores the new feature, keeps old data\n 'replace': replace the old feature's data with the new data\n 'merge': keeps both features, renaming them so they're\n different\n min_studies (int): minimum number of studies that pass threshold in\n order to add feature.\n threshold (float): minimum frequency threshold each study must\n exceed in order to count towards min_studies.\n \"\"\"\n if isinstance(features, string_types):\n if not os.path.exists(features):\n raise ValueError(\"%s cannot be found.\" % features)\n try:\n features = pd.read_csv(features, sep='\\t', index_col=0)\n except Exception as e:\n logger.error(\"%s cannot be parsed: %s\" % (features, e))\n\n if min_studies:\n valid = np.where(\n (features.values >= threshold).sum(0) >= min_studies)[0]\n features = features.iloc[:, valid]\n\n # Warn user if no/few IDs match between the FeatureTable and the\n # Dataset. 
This most commonly happens because older database.txt files\n        # used doi's as IDs whereas we now use PMIDs throughout.\n        n_studies = len(features)\n        n_common_ids = len(\n            set(features.index) & set(self.dataset.image_table.ids))\n        if float(n_common_ids) / n_studies < 0.01: # Minimum 1% overlap\n            msg = \"Only %d\" % n_common_ids if n_common_ids else \"None of the\"\n            logger.warning(\n                msg + \" studies in the feature file matched studies currently in \"\n                \"the Dataset. The most likely cause for this is that you're \"\n                \"pairing a newer feature set with an older, incompatible \"\n                \"database file. You may want to try regenerating the Dataset \"\n                \"instance from a newer database file that uses PMIDs rather \"\n                \"than doi's as the study identifiers in the first column.\")\n\n        old_data = self.data.to_dense()\n        # Handle features with duplicate names\n        common_features = list(set(old_data.columns) & set(features.columns))\n        if duplicates == 'ignore':\n            features = features.drop(common_features, axis=1)\n        elif duplicates == 'replace':\n            old_data = old_data.drop(common_features, axis=1)\n\n        data = old_data.merge(\n            features, how=merge, left_index=True, right_index=True)\n        self.data = data.fillna(0.0).to_sparse()", "response": "Add new features to FeatureTable."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nretrieve a subset of feature data for the specified study IDs and features.", "response": "def get_feature_data(self, ids=None, features=None, dense=True):\n        \"\"\" Slices and returns a subset of feature data.\n\n        Args:\n            ids (list, array): A list or 1D numpy array of study ids to\n                return rows for. If None, returns data for all studies\n                (i.e., all rows in array).\n            features (list, array): A list or 1D numpy array of named features\n                to return. If None, returns data for all features (i.e., all\n                columns in array).\n            dense (bool): Optional boolean. When True (default), convert the\n                result to a dense array before returning. 
When False, keep as\n                sparse matrix. Note that if ids is not None, the returned array\n                will always be dense.\n        Returns:\n            A pandas DataFrame with study IDs in rows and features in columns.\n        \"\"\"\n        result = self.data\n\n        if ids is not None:\n            result = result.ix[ids]\n\n        if features is not None:\n            result = result.ix[:, features]\n\n        return result.to_dense() if dense else result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngiving a list of features returns features in order that they appear in the database.", "response": "def get_ordered_names(self, features):\n        \"\"\" Given a list of features, returns features in order that they\n        appear in database.\n\n        Args:\n            features (list): A list or 1D numpy array of named features to\n                return.\n\n        Returns:\n            A list of features in order they appear in database.\n        \"\"\"\n\n        idxs = np.where(\n            np.in1d(self.data.columns.values, np.array(features)))[0]\n        return list(self.data.columns[idxs].values)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_ids(self, features, threshold=0.0, func=np.sum, get_weights=False):\n        if isinstance(features, str):\n            features = [features]\n        features = self.search_features(features) # Expand wild cards\n        feature_weights = self.data.ix[:, features]\n        weights = feature_weights.apply(func, 1)\n        above_thresh = weights[weights >= threshold]\n        # ids_to_keep = self.ids[above_thresh]\n        return above_thresh if get_weights else list(above_thresh.index)", "response": "Returns a list of all studies in the table that meet the desired feature-based criteria."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef search_features(self, search):\n        ''' Returns all features that match any of the elements in the input\n        list.\n\n        Args:\n            search (str, list): A string or list of strings defining the query.\n\n        Returns:\n            A list of matching feature names.\n        '''\n        if 
isinstance(search, string_types):\n search = [search]\n search = [s.replace('*', '.*') for s in search]\n cols = list(self.data.columns)\n results = []\n for s in search:\n results.extend([f for f in cols if re.match(s + '$', f)])\n return list(set(results))", "response": "Returns all features that match any of the elements in the input\n list."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_ids_by_expression(self, expression, threshold=0.001, func=np.sum):\n lexer = lp.Lexer()\n lexer.build()\n parser = lp.Parser(\n lexer, self.dataset, threshold=threshold, func=func)\n parser.build()\n return parser.parse(expression).keys().values", "response": "Use a PEG to parse expression and return study IDs."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_features_by_ids(self, ids=None, threshold=0.0001, func=np.mean,\n get_weights=False):\n ''' Returns features for which the mean loading across all specified\n studies (in ids) is >= threshold. '''\n weights = self.data.ix[ids].apply(func, 0)\n above_thresh = weights[weights >= threshold]\n return above_thresh if get_weights else list(above_thresh.index)", "response": "Returns features for all specified identifiers."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nconvert FeatureTable to SciPy CSR matrix.", "response": "def _sdf_to_csr(self):\n \"\"\" Convert FeatureTable to SciPy CSR matrix. \"\"\"\n data = self.data.to_dense()\n self.data = {\n 'columns': list(data.columns),\n 'index': list(data.index),\n 'values': sparse.csr_matrix(data.values)\n }"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ndeprecate decorator. Takes optional deprecation message.", "response": "def deprecated(*args):\n \"\"\" Deprecation warning decorator. Takes optional deprecation message,\n otherwise will use a generic warning. 
\"\"\"\n def wrap(func):\n def wrapped_func(*args, **kwargs):\n warnings.warn(msg, category=DeprecationWarning)\n return func(*args, **kwargs)\n return wrapped_func\n\n if len(args) == 1 and callable(args[0]):\n msg = \"Function '%s' will be deprecated in future versions of \" \\\n \"Neurosynth.\" % args[0].__name__\n return wrap(args[0])\n else:\n msg = args[0]\n return wrap"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef magic(dataset, method='coactivation', roi_mask=None,\n coactivation_mask=None, features=None, feature_threshold=0.05,\n min_voxels_per_study=None, min_studies_per_voxel=None,\n reduce_reference='pca', n_components=100,\n distance_metric='correlation', clustering_algorithm='kmeans',\n n_clusters=5, clustering_kwargs={}, output_dir=None, filename=None,\n coactivation_images=False, coactivation_threshold=0.1):\n ''' Execute a full clustering analysis pipeline.\n Args:\n dataset: a Dataset instance to extract all data from.\n method (str): the overall clustering approach to use. Valid options:\n 'coactivation' (default): Clusters voxel within the ROI mask based\n on shared pattern of coactivation with the rest of the brain.\n 'studies': Treat each study as a feature in an n-dimensional space.\n I.e., voxels will be assigned to the same cluster if they tend\n to be co-reported in similar studies.\n 'features': Voxels will be assigned to the same cluster if they\n tend to have similar feature vectors (i.e., the studies that\n activate those voxels tend to use similar terms).\n roi_mask: A string, nibabel image, or numpy array providing an\n inclusion mask of voxels to cluster. If None, the default mask\n in the Dataset instance is used (typically, all in-brain voxels).\n coactivation_mask: If method='coactivation', this mask defines the\n voxels to use when generating the pairwise distance matrix. 
For\n example, if a PFC mask is passed, all voxels in the roi_mask will\n be clustered based on how similar their patterns of coactivation\n with PFC voxels are. Can be a str, nibabel image, or numpy array.\n features (str or list): Optional string or list of strings specifying\n any feature names to use for study selection. E.g., passing\n ['emotion', 'reward'] would retain for analysis only those studies\n associated with the features emotion or reward at a frequency\n greater than feature_threshold.\n feature_threshold (float): The threshold to use when selecting studies\n on the basis of features.\n min_voxels_per_study (int): Minimum number of active voxels a study\n must report in order to be retained in the dataset. By default,\n all studies are used.\n min_studies_per_voxel (int): Minimum number of studies a voxel must be\n active in in order to be retained in analysis. By default, all\n voxels are used.\n reduce_reference (str, scikit-learn object or None): The dimensionality\n reduction algorithm to apply to the feature space prior to the\n computation of pairwise distances. If a string is passed (either\n 'pca' or 'ica'), n_components must be specified. If None, no\n dimensionality reduction will be applied. Otherwise, must be a\n scikit-learn-style object that exposes a transform() method.\n n_components (int): Number of components to extract during the\n dimensionality reduction step. Only used if reduce_reference is\n a string.\n distance_metric (str): The distance metric to use when computing\n pairwise distances on the to-be-clustered voxels. Can be any of the\n metrics supported by sklearn.metrics.pairwise_distances.\n clustering_algorithm (str or scikit-learn object): the clustering\n algorithm to use. 
If a string, must be one of 'kmeans' or 'minik'.\n Otherwise, any sklearn class that exposes a fit_predict() method.\n n_clusters (int): If clustering_algorithm is a string, the number of\n clusters to extract.\n clustering_kwargs (dict): Additional keywords to pass to the clustering\n object.\n output_dir (str): The directory to write results to. If None (default),\n returns the cluster label image rather than saving to disk.\n filename (str): Name of cluster label image file. Defaults to\n cluster_labels_k{k}.nii.gz, where k is the number of clusters.\n coactivation_images (bool): If True, saves a meta-analytic coactivation\n map for every ROI in the resulting cluster map.\n coactivation_threshold (float or int): If coactivation_images is True,\n this is the threshold used to define whether or not a study is\n considered to activation within a cluster ROI. Integer values are\n interpreted as minimum number of voxels within the ROI; floats\n are interpreted as the proportion of voxels. 
Defaults to 0.1 (i.e.,\n 10% of all voxels within ROI must be active).\n '''\n\n roi = Clusterable(dataset, roi_mask, min_voxels=min_voxels_per_study,\n min_studies=min_studies_per_voxel, features=features,\n feature_threshold=feature_threshold)\n\n if method == 'coactivation':\n reference = Clusterable(dataset, coactivation_mask,\n min_voxels=min_voxels_per_study,\n min_studies=min_studies_per_voxel,\n features=features,\n feature_threshold=feature_threshold)\n elif method == 'features':\n reference = deepcopy(roi)\n feature_data = dataset.feature_table.data\n n_studies = len(feature_data)\n reference.data = reference.data.dot(feature_data.values) / n_studies\n elif method == 'studies':\n reference = roi\n\n if reduce_reference is not None:\n\n if isinstance(reduce_reference, string_types):\n\n # Number of components can't exceed feature count or cluster count\n n_feat = reference.data.shape[1]\n n_components = min(n_components, n_feat)\n\n reduce_reference = {\n 'pca': sk_decomp.PCA,\n 'ica': sk_decomp.FastICA\n }[reduce_reference](n_components)\n\n # For non-coactivation-based approaches, transpose the data matrix\n transpose = (method == 'coactivation')\n reference = reference.transform(reduce_reference, transpose=transpose)\n\n if method == 'coactivation':\n distances = pairwise_distances(roi.data, reference.data,\n metric=distance_metric)\n else:\n distances = reference.data\n\n # TODO: add additional clustering methods\n if isinstance(clustering_algorithm, string_types):\n clustering_algorithm = {\n 'kmeans': sk_cluster.KMeans,\n 'minik': sk_cluster.MiniBatchKMeans\n }[clustering_algorithm](n_clusters, **clustering_kwargs)\n\n labels = clustering_algorithm.fit_predict(distances) + 1.\n header = roi.masker.get_header()\n header['cal_max'] = labels.max()\n header['cal_min'] = labels.min()\n voxel_labels = roi.masker.unmask(labels)\n img = nifti1.Nifti1Image(voxel_labels, None, header)\n\n if output_dir is not None:\n if not exists(output_dir):\n 
makedirs(output_dir)\n        if filename is None:\n            filename = 'cluster_labels_k%d.nii.gz' % n_clusters\n        outfile = join(output_dir, filename)\n        img.to_filename(outfile)\n\n        # Write coactivation images\n        if coactivation_images:\n            for l in np.unique(voxel_labels):\n                roi_mask = np.copy(voxel_labels)\n                roi_mask[roi_mask != l] = 0\n                ids = dataset.get_studies(\n                    mask=roi_mask, activation_threshold=coactivation_threshold)\n                ma = meta.MetaAnalysis(dataset, ids)\n                ma.save_results(output_dir=join(output_dir, 'coactivation'),\n                                prefix='cluster_%d_coactivation' % l)\n    else:\n        return img", "response": "This function is used to run a full clustering analysis pipeline."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef transform(self, transformer, transpose=False):\n        ''' Apply a transformation to the Clusterable instance. Accepts any\n        scikit-learn-style class that implements a fit_transform() method. '''\n        data = self.data.T if transpose else self.data\n        data = transformer.fit_transform(data)\n        self.data = data.T if transpose else data\n        return self", "response": "Apply a transformation to the Clusterable instance. Accepts any scikit-learn-style class that implements a fit_transform() method; pass transpose=True to transform the transposed data."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef analyze_features(dataset, features=None, image_type='association-test_z',\n                     threshold=0.001, q=0.01, output_dir=None, prefix=None):\n    \"\"\" Generate meta-analysis images for a set of features.\n    Args:\n        dataset: A Dataset instance containing feature and activation data.\n        features: A list of named features to generate meta-analysis maps for.\n            If None, analyzes all features in the current dataset.\n        image_type: The type of image to return. 
Specify one of the extensions\n            generated by the MetaAnalysis procedure--e.g., association-test_z,\n            uniformity-test_z, etc. By default, will use\n            association-test_z (i.e., z-scores reflecting the association\n            between presence of activation and presence of feature).\n        threshold: The threshold for determining whether or not a Mappable has\n            a feature. By default, this is 0.001, which is only sensible in the\n            case of term-based features (so be sure to specify it for other\n            kinds).\n        q: The FDR rate to use for multiple comparisons correction (default =\n            0.01).\n        output_dir: Directory to save all meta-analysis images to. If None,\n            returns all the data as a matrix.\n        prefix: All output images will be prepended with this string (if None,\n            defaults to the name of the feature).\n    Returns:\n        If output_dir is None, an n_voxels x n_features 2D numpy array.\n    \"\"\"\n    if features is None:\n        features = dataset.get_feature_names()\n    if output_dir is None:\n        result = np.zeros((dataset.masker.n_vox_in_mask, len(features)))\n\n    for i, f in enumerate(features):\n        ids = dataset.get_studies(features=f, frequency_threshold=threshold)\n        ma = MetaAnalysis(dataset, ids, q=q)\n        if output_dir is None:\n            result[:, i] = ma.images[image_type]\n        else:\n            pfx = f if prefix is None else prefix + '_' + f\n            ma.save_results(output_dir=output_dir, prefix=pfx)\n\n    if output_dir is None:\n        return result", "response": "Generate meta-analysis images for a set of features."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nsave the results of the meta-analysis to a directory.", "response": "def save_results(self, output_dir='.', prefix='', prefix_sep='_',\n                     image_list=None):\n        \"\"\" Write out any images generated by the meta-analysis.\n        Args:\n            output_dir (str): folder to write images to\n            prefix (str): all image files will be prepended with this string\n            prefix_sep (str): glue between the prefix and rest of filename\n            image_list (list): optional list of images to 
save--e.g.,\n ['pFgA_z', 'pAgF']. If image_list is None (default), will save\n all images.\n \"\"\"\n\n if prefix == '':\n prefix_sep = ''\n\n if not exists(output_dir):\n makedirs(output_dir)\n\n logger.debug(\"Saving results...\")\n if image_list is None:\n image_list = self.images.keys()\n for suffix, img in self.images.items():\n if suffix in image_list:\n filename = prefix + prefix_sep + suffix + '.nii.gz'\n outpath = join(output_dir, filename)\n imageutils.save_img(img, outpath, self.dataset.masker)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nconvert coordinates from one space to another using provided matrix.", "response": "def transform(foci, mat):\n \"\"\" Convert coordinates from one space to another using provided\n transformation matrix. \"\"\"\n t = linalg.pinv(mat)\n foci = np.hstack((foci, np.ones((foci.shape[0], 1))))\n return np.dot(foci, t)[:, 0:3]"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef xyz_to_mat(foci, xyz_dims=None, mat_dims=None):\n foci = np.hstack((foci, np.ones((foci.shape[0], 1))))\n mat = np.array([[-0.5, 0, 0, 45], [0, 0.5, 0, 63], [0, 0, 0.5, 36]]).T\n result = np.dot(foci, mat)[:, ::-1] # multiply and reverse column order\n return np.round_(result).astype(int)", "response": "Convert an N x 3 array of XYZ coordinates to matrix indices."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef apply(self, name, foci):\n if name in self.transformations:\n return transform(foci, self.transformations[name])\n else:\n logger.info(\n \"No transformation named '%s' found; coordinates left \"\n \"untransformed.\" % name)\n return foci", "response": "Apply a named transformation to a set of foci objects."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nresets all layers and stack to empty.", "response": "def reset(self):\n \"\"\" Reset/remove all 
layers, keeping only the initial volume. \"\"\"\n self.layers = {}\n self.stack = []\n self.set_mask()\n self.n_vox_in_vol = len(np.where(self.current_mask)[0])"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef add(self, layers, above=None, below=None):\n\n def add_named_layer(name, image):\n image = self.get_image(image, output='vector')\n if above is not None:\n image[image < above] = 0.\n if below is not None:\n image[image > below] = 0.\n self.layers[name] = image\n self.stack.append(name)\n\n if isinstance(layers, dict):\n for (name, image) in layers.items():\n add_named_layer(name, image)\n\n else:\n if not isinstance(layers, list):\n layers = [layers]\n for image in layers:\n name = 'layer_%d' % len(self.stack)\n add_named_layer(name, image)\n\n self.set_mask()", "response": "Adds one or more layers to the stack of masking layers."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef remove(self, layers):\n if not isinstance(layers, list):\n layers = [layers]\n for l in layers:\n if isinstance(l, string_types):\n if l not in self.layers:\n raise ValueError(\"There's no image/layer named '%s' in \"\n \"the masking stack!\" % l)\n self.stack.remove(l)\n else:\n l = self.stack.pop(l)\n del self.layers[l]\n\n self.set_mask()", "response": "Removes one or more layers from the stack of masking layers."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef get_image(self, image, output='vector'):\n if isinstance(image, string_types):\n image = nb.load(image)\n\n if type(image).__module__.startswith('nibabel'):\n if output == 'image':\n return image\n image = image.get_data()\n\n if not type(image).__module__.startswith('numpy'):\n raise ValueError(\"Input image must be a string, a NiBabel image, \"\n \"or a numpy array.\")\n\n if image.shape[:3] == self.volume.shape:\n if 
output == 'image':\n                return nb.nifti1.Nifti1Image(image, None, self.get_header())\n            elif output == 'array':\n                return image\n            else:\n                image = image.ravel()\n\n        if output == 'vector':\n            return image.ravel()\n\n        image = np.reshape(image, self.volume.shape)\n\n        if output == 'array':\n            return image\n\n        return nb.nifti1.Nifti1Image(image, None, self.get_header())", "response": "A flexible method for transforming between different representations of image data."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreconstructing a masked vector into the original 3D volume.", "response": "def unmask(self, data, layers=None, output='array'):\n        \"\"\" Reconstruct a masked vector into the original 3D volume.\n        Args:\n            data: The 1D vector to reconstruct. (Can also be a 2D vector where\n                the second dimension is time, but then output will always\n                be set to 'array'--i.e., a 4D image will be returned.)\n            layers: Which mask layers to use (specified as int, string, or list\n                of ints and strings). When None, applies the conjunction of all\n                layers. Note that the layers specified here must exactly match\n                the layers used in the mask() operation, otherwise the shape of\n                the mask will be incorrect and bad things will happen.\n            output: What kind of object to return. 
See options in get_image().\n By default, returns an N-dimensional array of reshaped data.\n \"\"\"\n self.set_mask(layers)\n if data.ndim == 2:\n n_volumes = data.shape[1]\n # Assume 1st dimension is voxels, 2nd is time\n # but we generate x,y,z,t volume\n image = np.zeros(self.full.shape + (n_volumes,))\n image[self.current_mask, :] = data\n image = np.reshape(image, self.volume.shape + (n_volumes,))\n else:\n # img = self.full.copy()\n image = np.zeros(self.full.shape)\n image[self.current_mask] = data\n return self.get_image(image, output)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef get_mask(self, layers=None, output='vector', in_global_mask=True):\n if in_global_mask:\n output = 'vector'\n\n if layers is None:\n layers = self.layers.keys()\n elif not isinstance(layers, list):\n layers = [layers]\n\n layers = map(lambda x: x if isinstance(x, string_types)\n else self.stack[x], layers)\n layers = [self.layers[l] for l in layers if l in self.layers]\n\n # Always include the original volume\n layers.append(self.full)\n layers = np.vstack(layers).T.astype(bool)\n mask = layers.all(axis=1)\n mask = self.get_image(mask, output)\n return mask[self.global_mask] if in_global_mask else mask", "response": "Get the current mask by taking the conjunction of all specified layers."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreturns all points within r mm of coordinates.", "response": "def get_sphere(coords, r=4, vox_dims=(2, 2, 2), dims=(91, 109, 91)):\n \"\"\" # Return all points within r mm of coordinates. Generates a cube\n and then discards all points outside sphere. 
Only returns values that\n fall within the dimensions of the image.\"\"\"\n r = float(r)\n xx, yy, zz = [slice(-r / vox_dims[i], r / vox_dims[\n i] + 0.01, 1) for i in range(len(coords))]\n cube = np.vstack([row.ravel() for row in np.mgrid[xx, yy, zz]])\n sphere = cube[:, np.sum(np.dot(np.diag(\n vox_dims), cube) ** 2, 0) ** .5 <= r]\n sphere = np.round(sphere.T + coords)\n return sphere[(np.min(sphere, 1) >= 0) &\n (np.max(np.subtract(sphere, dims), 1) <= -1), :].astype(int)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nmapping a set of peaks to an image.", "response": "def map_peaks_to_image(peaks, r=4, vox_dims=(2, 2, 2), dims=(91, 109, 91),\n header=None):\n \"\"\" Take a set of discrete foci (i.e., 2-D array of xyz coordinates)\n and generate a corresponding image, convolving each focus with a\n hard sphere of radius r.\"\"\"\n data = np.zeros(dims)\n for p in peaks:\n valid = get_sphere(p, r, vox_dims, dims)\n valid = valid[:, ::-1]\n data[tuple(valid.T)] = 1\n return nifti1.Nifti1Image(data, None, header=header)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef load_imgs(filenames, masker, nan_to_num=True):\n if isinstance(filenames, string_types):\n filenames = [filenames]\n data = np.zeros((masker.n_vox_in_mask, len(filenames)))\n for i, f in enumerate(filenames):\n data[:, i] = masker.mask(f, nan_to_num)\n return data", "response": "Load multiple images from file into an ndarray."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef save_img(data, filename, masker, header=None):\n if not header:\n header = masker.get_header()\n header.set_data_dtype(data.dtype) # Avoids loss of precision\n # Update min/max -- this should happen on save, but doesn't seem to\n header['cal_max'] = data.max()\n header['cal_min'] = data.min()\n img = nifti1.Nifti1Image(masker.unmask(data), None, header)\n img.to_filename(filename)", 
"response": "Save a vectorized image to file."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef threshold_img(data, threshold, mask=None, mask_out='below'):\n if mask is not None:\n mask = threshold_img(mask, threshold, mask_out=mask_out)\n return data * mask.astype(bool)\n if mask_out.startswith('b'):\n data[data < threshold] = 0\n elif mask_out.startswith('a'):\n data[data > threshold] = 0\n return data", "response": "Thresholds the data array to be within threshold."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef create_grid(image, scale=4, apply_mask=True, save_file=None):\n if isinstance(image, string_types):\n image = nb.load(image)\n\n # create a list of cluster centers\n centers = []\n x_length, y_length, z_length = image.shape\n for x in range(0, x_length, scale):\n for y in range(0, y_length, scale):\n for z in range(0, z_length, scale):\n centers.append((x, y, z))\n\n # create a box around each center with the diameter equal to the scaling\n # factor\n grid = np.zeros(image.shape)\n for (i, (x, y, z)) in enumerate(centers):\n for mov_x in range((-scale + 1) // 2, (scale + 1) // 2):\n for mov_y in range((-scale + 1) // 2, (scale + 1) // 2):\n for mov_z in range((-scale + 1) // 2, (scale + 1) // 2):\n try: # Ignore voxels outside bounds of image\n grid[x + mov_x, y + mov_y, z + mov_z] = i + 1\n except:\n pass\n\n if apply_mask:\n mask = image\n if isinstance(mask, string_types):\n mask = nb.load(mask)\n if type(mask).__module__ != np.__name__:\n mask = mask.get_data()\n grid[~mask.astype(bool)] = 0.0\n\n grid = nb.Nifti1Image(grid, image.get_affine(), image.get_header())\n\n if save_file is not None:\n nb.save(grid, save_file)\n\n return grid", "response": "Creates a 3D grid of labeled cells."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef set_logging_level(level=None):\n 
if level is None:\n level = os.environ.get('NEUROSYNTH_LOGLEVEL', 'warn')\n if level is not None:\n logger.setLevel(getattr(logging, level.upper()))\n return logger.getEffectiveLevel()", "response": "Set the logging level of the neurosynth."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef expand_address(address, languages=None, **kw):\n address = safe_decode(address, 'utf-8')\n return _expand.expand_address(address, languages=languages, **kw)", "response": "Expand the given address into one or more normalized strings."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nnormalizing a string tokenizes and normalizes each token with string and token - level options.", "response": "def normalized_tokens(s, string_options=DEFAULT_STRING_OPTIONS,\n token_options=DEFAULT_TOKEN_OPTIONS,\n strip_parentheticals=True, whitespace=False,\n languages=None):\n '''\n Normalizes a string, tokenizes, and normalizes each token\n with string and token-level options.\n\n This version only uses libpostal's deterministic normalizations\n i.e. methods with a single output. 
The string tree version will\n return multiple normalized strings, each with tokens.\n\n Usage:\n normalized_tokens(u'St.-Barth\u00e9lemy')\n '''\n s = safe_decode(s)\n normalized_tokens = _normalize.normalized_tokens(s, string_options, token_options, whitespace, languages=languages)\n\n if strip_parentheticals:\n normalized_tokens = remove_parens(normalized_tokens)\n\n return [(s, token_types.from_id(token_type)) for s, token_type in normalized_tokens]"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nparses an address into a sequence of components.", "response": "def parse_address(address, language=None, country=None):\n \"\"\"\n Parse address into components.\n\n @param address: the address as either Unicode or a UTF-8 encoded string\n @param language (optional): language code\n @param country (optional): country code\n \"\"\"\n address = safe_decode(address, 'utf-8')\n return _parser.parse_address(address, language=language, country=country)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning a list of strings that can be used to group similar addresses together for more detailed pairwise comparison.", "response": "def near_dupe_hashes(labels, values, languages=None, **kw):\n \"\"\"\n Hash the given address into normalized strings that can be used to group similar\n addresses together for more detailed pairwise comparison. This can be thought of\n as the blocking function in record linkage or locally-sensitive hashing in the\n document near-duplicate detection.\n\n Required\n --------\n @param labels: array of component labels as either Unicode or UTF-8 encoded strings\n e.g. [\"house_number\", \"road\", \"postcode\"]\n @param values: array of component values as either Unicode or UTF-8 encoded strings\n e.g. [\"123\", \"Broadway\", \"11216\"]. Note len(values) must be equal to\n len(labels).\n\n Options\n -------\n @param languages: a tuple or list of ISO language code strings (e.g. 
\"en\", \"fr\", \"de\", etc.)\n to use in expansion. If None is passed, use language classifier\n to detect language automatically.\n @param with_name: use name in the hashes\n @param with_address: use house_number & street in the hashes\n @param with_unit: use secondary unit as part of the hashes\n @param with_city_or_equivalent: use the city, city_district, suburb, or island name as one of\n the geo qualifiers\n @param with_small_containing_boundaries: use small containing boundaries (currently state_district)\n as one of the geo qualifiers\n @param with_postal_code: use postal code as one of the geo qualifiers\n @param with_latlon: use geohash + neighbors as one of the geo qualifiers\n @param latitude: latitude (Y coordinate)\n @param longitude: longitude (X coordinate)\n @param geohash_precision: geohash tile size (default = 6)\n @param name_and_address_keys: include keys with name + address + geo\n @param name_only_keys: include keys with name + geo\n @param address_only_keys: include keys with address + geo\n \"\"\"\n return _near_dupe.near_dupe_hashes(labels, values, languages=languages, **kw)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef has_api_key(file_name):\n f = open(file_name, 'r')\n text = f.read()\n if re.search(real_api_regex, text) is not None and \\\n re.search(zero_api_regex, text) is None:\n return True\n return False", "response": "Detects whether the file contains an api key that is not 40*'0."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nremove the api key from the Token object.", "response": "def remove_api_key(file_name):\n \"\"\"\n Change the api key in the Token object to 40*'0'. 
See issue #86.\n :param file: path-to-file to change\n \"\"\"\n with open(file_name, 'r') as fp:\n text = fp.read()\n text = re.sub(real_api_regex, zero_token_string, text)\n with open(file_name, 'w') as fp:\n fp.write(text)\n return"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nconverts a python dict to a namedtuple saving memory.", "response": "def dict_to_object(item, object_name):\n \"\"\"Converts a python dict to a namedtuple, saving memory.\"\"\"\n fields = item.keys()\n values = item.values()\n return json.loads(json.dumps(item),\n object_hook=lambda d:\n namedtuple(object_name, fields)(*values))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef list_tickers(self, assetType):\n listing_file_url = \"https://apimedia.tiingo.com/docs/tiingo/daily/supported_tickers.zip\"\n response = requests.get(listing_file_url)\n zipdata = get_zipfile_from_response(response)\n raw_csv = get_buffer_from_zipfile(zipdata, 'supported_tickers.csv')\n reader = csv.DictReader(raw_csv)\n\n return [row for row in reader\n if row.get('assetType') == assetType]", "response": "Return a list of dicts of metadata tickers for all supported tickers\n of the specified assetType."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_ticker_metadata(self, ticker, fmt='json'):\n url = \"tiingo/daily/{}\".format(ticker)\n response = self._request('GET', url)\n data = response.json()\n if fmt == 'json':\n return data\n elif fmt == 'object':\n return dict_to_object(data, \"Ticker\")", "response": "Return metadata for 1 ticker in the specified format"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking to see if frequency is specified correctly.", "response": "def _invalid_frequency(self, frequency):\n \"\"\"\n Check to see that frequency was specified correctly\n :param frequency 
(string): frequency string\n :return (boolean):\n \"\"\"\n is_valid = self._is_eod_frequency(frequency) or re.match(self._frequency_pattern, frequency)\n return not is_valid"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _get_url(self, ticker, frequency):\n if self._invalid_frequency(frequency):\n etext = (\"Error: {} is an invalid frequency. Check Tiingo API documentation \"\n \"for valid EOD or intraday frequency format.\")\n raise InvalidFrequencyError(etext.format(frequency))\n else:\n if self._is_eod_frequency(frequency):\n return \"tiingo/daily/{}/prices\".format(ticker)\n else:\n return \"iex/{}/prices\".format(ticker)", "response": "Return url based on frequency"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ngets latest EOD Composite Price for a stock ticker.", "response": "def get_ticker_price(self, ticker,\n startDate=None, endDate=None,\n fmt='json', frequency='daily'):\n \"\"\"By default, return latest EOD Composite Price for a stock ticker.\n On average, each feed contains 3 data sources.\n\n Supported tickers + Available Day Ranges are here:\n https://apimedia.tiingo.com/docs/tiingo/daily/supported_tickers.zip\n\n Args:\n ticker (string): Unique identifier for stock ticker\n startDate (string): Start of ticker range in YYYY-MM-DD format\n endDate (string): End of ticker range in YYYY-MM-DD format\n fmt (string): 'csv' or 'json'\n frequency (string): Resample frequency\n \"\"\"\n url = self._get_url(ticker, frequency)\n params = {\n 'format': fmt if fmt != \"object\" else 'json', # conversion local\n 'resampleFreq': frequency\n }\n\n if startDate:\n params['startDate'] = startDate\n if endDate:\n params['endDate'] = endDate\n\n # TODO: evaluate whether to stream CSV to cache on disk, or\n # load as array in memory, or just pass plain text\n response = self._request('GET', url, params=params)\n if fmt == \"json\":\n return response.json()\n elif fmt == 
\"object\":\n data = response.json()\n return [dict_to_object(item, \"TickerPrice\") for item in data]\n else:\n return response.content.decode(\"utf-8\")"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_dataframe(self, tickers,\n startDate=None, endDate=None, metric_name=None, frequency='daily'):\n\n \"\"\" Return a pandas.DataFrame of historical prices for one or more ticker symbols.\n\n By default, return latest EOD Composite Price for a list of stock tickers.\n On average, each feed contains 3 data sources.\n\n Supported tickers + Available Day Ranges are here:\n https://apimedia.tiingo.com/docs/tiingo/daily/supported_tickers.zip\n or from the TiingoClient.list_tickers() method.\n\n Args:\n tickers (string/list): One or more unique identifiers for a stock ticker.\n startDate (string): Start of ticker range in YYYY-MM-DD format.\n endDate (string): End of ticker range in YYYY-MM-DD format.\n metric_name (string): Optional parameter specifying metric to be returned for each\n ticker. In the event of a single ticker, this is optional and if not specified\n all of the available data will be returned. 
In the event of a list of tickers,\n this parameter is required.\n frequency (string): Resample frequency (defaults to daily).\n \"\"\"\n\n valid_columns = ['open', 'high', 'low', 'close', 'volume', 'adjOpen', 'adjHigh', 'adjLow',\n 'adjClose', 'adjVolume', 'divCash', 'splitFactor']\n\n if metric_name is not None and metric_name not in valid_columns:\n raise APIColumnNameError('Valid data items are: ' + str(valid_columns))\n\n params = {\n 'format': 'json',\n 'resampleFreq': frequency\n }\n if startDate:\n params['startDate'] = startDate\n if endDate:\n params['endDate'] = endDate\n\n if pandas_is_installed:\n if type(tickers) is str:\n stock = tickers\n url = self._get_url(stock, frequency)\n response = self._request('GET', url, params=params)\n df = pd.DataFrame(response.json())\n if metric_name is not None:\n prices = df[metric_name]\n prices.index = df['date']\n else:\n prices = df\n prices.index = df['date']\n del (prices['date'])\n else:\n prices = pd.DataFrame()\n for stock in tickers:\n url = self._get_url(stock, frequency)\n response = self._request('GET', url, params=params)\n df = pd.DataFrame(response.json())\n df.index = df['date']\n df.rename(index=str, columns={metric_name: stock}, inplace=True)\n prices = pd.concat([prices, df[stock]], axis=1)\n prices.index = pd.to_datetime(prices.index)\n return prices\n else:\n error_message = (\"Pandas is not installed, but .get_ticker_price() was \"\n \"called with fmt=pandas. In order to install tiingo with \"\n \"pandas, reinstall with pandas as an optional dependency. \\n\"\n \"Install tiingo with pandas dependency: \\'pip install tiingo[pandas]\\'\\n\"\n \"Alternatively, just install pandas: pip install pandas.\")\n raise InstallPandasException(error_message)", "response": "Returns a pandas. 
DataFrame of historical prices for one or more stock symbols."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef get_news(self, tickers=[], tags=[], sources=[], startDate=None,\n endDate=None, limit=100, offset=0, sortBy=\"publishedDate\",\n fmt='json'):\n \"\"\"Return list of news articles matching given search terms\n https://api.tiingo.com/docs/tiingo/news\n\n # Dates are in YYYY-MM-DD Format.\n\n Args:\n tickers [string] : List of unique Stock Tickers to search\n tags [string] : List of topics tagged by Tiingo Algorithms\n sources [string]: List of base urls to include as news sources\n startDate, endDate [date]: Boundaries of news search window\n limit (int): Max results returned. Default 100, max 1000\n offset (int): Search results offset, used for paginating\n sortBy (string): \"publishedDate\" OR (#TODO: UPDATE THIS)\n \"\"\"\n url = \"tiingo/news\"\n params = {\n 'limit': limit,\n 'offset': offset,\n 'sortBy': sortBy,\n 'tickers': tickers,\n 'sources': sources,\n 'tags': tags,\n 'startDate': startDate,\n 'endDate': endDate\n }\n response = self._request('GET', url, params=params)\n data = response.json()\n if fmt == 'json':\n return data\n elif fmt == 'object':\n return [dict_to_object(item, \"NewsArticle\") for item in data]", "response": "Returns list of news articles matching given search terms\n "} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a list of available file_ids.", "response": "def get_bulk_news(self, file_id=None, fmt='json'):\n \"\"\"Only available to institutional clients.\n If ID is NOT provided, return array of available file_ids.\n If ID is provided, provides URL which you can use to download your\n file, as well as some metadata about that file.\n \"\"\"\n if file_id:\n url = \"tiingo/news/bulk_download/{}\".format(file_id)\n else:\n url = \"tiingo/news/bulk_download\"\n\n response = self._request('GET', url)\n data = 
response.json()\n if fmt == 'json':\n return data\n elif fmt == 'object':\n return dict_to_object(data, \"BulkNews\")"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nmake HTTP request and return response object", "response": "def _request(self, method, url, **kwargs):\n \"\"\"Make HTTP request and return response object\n\n Args:\n method (str): GET, POST, PUT, DELETE\n url (str): path appended to the base_url to create request\n **kwargs: passed directly to a requests.request object\n \"\"\"\n resp = self._session.request(method,\n '{}/{}'.format(self._base_url, url),\n headers=self._headers,\n **kwargs)\n\n try:\n resp.raise_for_status()\n except HTTPError as e:\n logging.error(resp.content)\n raise RestClientError(e)\n\n return resp"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\nasync def get_bearer_info(self):\n if self.client_id is None:\n raise SpotifyException(_GET_BEARER_ERR % 'client_id')\n\n elif self.client_secret is None:\n raise SpotifyException(_GET_BEARER_ERR % 'client_secret')\n\n token = b64encode(':'.join((self.client_id, self.client_secret)).encode())\n\n kwargs = {\n 'url': 'https://accounts.spotify.com/api/token',\n 'data': {'grant_type': 'client_credentials'},\n 'headers': {'Authorization': 'Basic ' + token.decode()}\n }\n\n async with self._session.post(**kwargs) as resp:\n return json.loads(await resp.text(encoding='utf-8'))", "response": "Get the application bearer token from client_id and client_secret."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nmakes a request to the spotify API with the current bearer credentials.", "response": "async def request(self, route, **kwargs):\n \"\"\"Make a request to the spotify API with the current bearer credentials.\n\n Parameters\n ----------\n route : Union[tuple[str, str], Route]\n A tuple of the method and url or a :class:`Route` object.\n kwargs : 
Any\n keyword arguments to pass into :class:`aiohttp.ClientSession.request`\n \"\"\"\n if isinstance(route, tuple):\n method, url = route\n else:\n method = route.method\n url = route.url\n\n if self.bearer_info is None:\n self.bearer_info = bearer_info = await self.get_bearer_info()\n access_token = bearer_info['access_token']\n else:\n access_token = self.bearer_info['access_token']\n\n headers = {\n 'Authorization': 'Bearer ' + access_token,\n 'Content-Type': kwargs.get('content_type', 'application/json'),\n **kwargs.pop('headers', {})\n }\n\n for _ in range(self.RETRY_AMOUNT):\n r = await self._session.request(method, url, headers=headers, **kwargs)\n try:\n status = r.status\n\n try:\n data = json.loads(await r.text(encoding='utf-8'))\n except json.decoder.JSONDecodeError:\n data = {}\n\n if 300 > status >= 200:\n return data\n\n if status == 401:\n self.bearer_info = bearer_info = await self.get_bearer_info()\n headers['Authorization'] = 'Bearer ' + bearer_info['access_token']\n continue\n\n if status == 429:\n # we're being rate limited.\n amount = r.headers.get('Retry-After')\n await asyncio.sleep(int(amount), loop=self.loop)\n continue\n\n if status in (502, 503):\n # unconditional retry\n continue\n\n if status == 403:\n raise Forbidden(r, data)\n elif status == 404:\n raise NotFound(r, data)\n finally:\n await r.release()\n else:\n raise HTTPException(r, data)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nget a spotify album by its ID.", "response": "def album(self, spotify_id, market='US'):\n \"\"\"Get a spotify album by its ID.\n\n Parameters\n ----------\n spotify_id : str\n The spotify_id to search by.\n market : Optional[str]\n An ISO 3166-1 alpha-2 country code.\n\n Returns\n -------\n album : Dict\n The album object.\n \"\"\"\n route = Route('GET', '/albums/{spotify_id}', spotify_id=spotify_id)\n payload = {}\n\n if market:\n payload['market'] = market\n\n return self.request(route, params=payload)"} {"SOURCE": 
"codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef album_tracks(self, spotify_id, limit=20, offset=0, market='US'):\n route = Route('GET', '/albums/{spotify_id}/tracks', spotify_id=spotify_id)\n payload = {'limit': limit, 'offset': offset}\n\n if market:\n payload['market'] = market\n\n return self.request(route, params=payload)", "response": "Get an albums tracks by an ID."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef albums(self, spotify_ids, market='US'):\n route = Route('GET', '/albums/')\n payload = {'ids': spotify_ids}\n\n if market:\n payload['market'] = market\n\n return self.request(route, params=payload)", "response": "Get a spotify album by its ID."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting a spotify artist by their ID.", "response": "def artist(self, spotify_id):\n \"\"\"Get a spotify artist by their ID.\n\n Parameters\n ----------\n spotify_id : str\n The spotify_id to search by.\n \"\"\"\n route = Route('GET', '/artists/{spotify_id}', spotify_id=spotify_id)\n return self.request(route)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef artist_albums(self, spotify_id, include_groups=None, limit=20, offset=0, market='US'):\n route = Route('GET', '/artists/{spotify_id}/albums', spotify_id=spotify_id)\n payload = {'limit': limit, 'offset': offset}\n\n if include_groups:\n payload['include_groups'] = include_groups\n\n if market:\n payload['market'] = market\n\n return self.request(route, params=payload)", "response": "Get an artists albums."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nget an artists top tracks per country with their ID.", "response": "def artist_top_tracks(self, spotify_id, country):\n \"\"\"Get an artists top tracks per country with their ID.\n\n Parameters\n ----------\n spotify_id : 
str\n The spotify_id to search by.\n country : COUNTRY_TP\n COUNTRY\n \"\"\"\n route = Route('GET', '/artists/{spotify_id}/top-tracks', spotify_id=spotify_id)\n payload = {'country': country}\n return self.request(route, params=payload)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef artist_related_artists(self, spotify_id):\n route = Route('GET', '/artists/{spotify_id}/related-artists', spotify_id=spotify_id)\n return self.request(route)", "response": "Get the related artists for an artist by their ID."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef artists(self, spotify_ids):\n route = Route('GET', '/artists')\n payload = {'ids': spotify_ids}\n return self.request(route, params=payload)", "response": "Get a spotify artists by their IDs."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget a single category used to tag items in Spotify.", "response": "def category(self, category_id, country=None, locale=None):\n \"\"\"Get a single category used to tag items in Spotify.\n\n Parameters\n ----------\n category_id : str\n The Spotify category ID for the category.\n country : COUNTRY_TP\n COUNTRY\n locale : LOCALE_TP\n LOCALE\n \"\"\"\n route = Route('GET', '/browse/categories/{category_id}', category_id=category_id)\n payload = {}\n\n if country:\n payload['country'] = country\n\n if locale:\n payload['locale'] = locale\n\n return self.request(route, params=payload)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef category_playlists(self, category_id, limit=20, offset=0, country=None):\n route = Route('GET', '/browse/categories/{category_id}/playlists', category_id=category_id)\n payload = {'limit': limit, 'offset': offset}\n\n if country:\n payload['country'] = country\n\n return self.request(route, params=payload)", "response": "Get a list of 
Spotify playlists tagged with a particular category."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef categories(self, limit=20, offset=0, country=None, locale=None):\n route = Route('GET', '/browse/categories')\n payload = {'limit': limit, 'offset': offset}\n\n if country:\n payload['country'] = country\n\n if locale:\n payload['locale'] = locale\n\n return self.request(route, params=payload)", "response": "Get a list of categories used to tag items in Spotify."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting a list of Spotify featured playlists.", "response": "def featured_playlists(self, locale=None, country=None, timestamp=None, limit=20, offset=0):\n \"\"\"Get a list of Spotify featured playlists.\n\n Parameters\n ----------\n locale : LOCALE_TP\n LOCALE\n country : COUNTRY_TP\n COUNTRY\n timestamp : TIMESTAMP_TP\n TIMESTAMP\n limit : Optional[int]\n The maximum number of items to return. Default: 20. Minimum: 1. Maximum: 50.\n offset : Optional[int]\n The index of the first item to return. 
Default: 0\n \"\"\"\n route = Route('GET', '/browse/featured-playlists')\n payload = {'limit': limit, 'offset': offset}\n\n if country:\n payload['country'] = country\n\n if locale:\n payload['locale'] = locale\n\n if timestamp:\n payload['timestamp'] = timestamp\n\n return self.request(route, params=payload)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef new_releases(self, *, country=None, limit=20, offset=0):\n route = Route('GET', '/browse/new-releases')\n payload = {'limit': limit, 'offset': offset}\n\n if country:\n payload['country'] = country\n\n return self.request(route, params=payload)", "response": "Get a list of new album releases featured in Spotify."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ngetting Recommendations Based on Seeds.", "response": "def recommendations(self, seed_artists, seed_genres, seed_tracks, *, limit=20, market=None, **filters):\n \"\"\"Get Recommendations Based on Seeds.\n\n Parameters\n ----------\n seed_artists : str\n A comma separated list of Spotify IDs for seed artists. Up to 5 seed values may be provided.\n seed_genres : str\n A comma separated list of any genres in the set of available genre seeds. Up to 5 seed values may be provided.\n seed_tracks : str\n A comma separated list of Spotify IDs for a seed track. Up to 5 seed values may be provided.\n limit : Optional[int]\n The maximum number of items to return. Default: 20. Minimum: 1. 
Maximum: 50.\n market : Optional[str]\n An ISO 3166-1 alpha-2 country code.\n max_* : Optional[Keyword arguments]\n For each tunable track attribute, a hard ceiling on the selected track attribute\u2019s value can be provided.\n min_* : Optional[Keyword arguments]\n For each tunable track attribute, a hard floor on the selected track attribute\u2019s value can be provided.\n target_* : Optional[Keyword arguments]\n For each of the tunable track attributes (below) a target value may be provided.\n \"\"\"\n route = Route('GET', '/recommendations')\n payload = {'seed_artists': seed_artists, 'seed_genres': seed_genres, 'seed_tracks': seed_tracks, 'limit': limit}\n\n if market:\n payload['market'] = market\n\n if filters:\n payload.update(filters)\n\n return self.request(route, param=payload)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef following_artists_or_users(self, ids, *, type='artist'):\n route = Route('GET', '/me/following/contains')\n payload = {'ids': ids, 'type': type}\n\n return self.request(route, params=payload)", "response": "Checks to see if the current user is following one or more artists or other Spotify users."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\nasync def get_albums(self, *, limit: Optional[int] = 20, offset: Optional[int] = 0, include_groups=None, market: Optional[str] = None) -> List[Album]:\n from .album import Album\n\n data = await self.__client.http.artist_albums(self.id, limit=limit, offset=offset, include_groups=include_groups, market=market)\n return list(Album(self.__client, item) for item in data['items'])", "response": "Get the albums of a Spotify artist."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nload all of the artists albums depending on how many the artist has this.", "response": "async def get_all_albums(self, *, market='US') -> List[Album]:\n \"\"\"loads all of the artists 
albums, depending on how many the artist has this may be a long operation.\n\n Parameters\n ----------\n market : Optional[str]\n An ISO 3166-1 alpha-2 country code.\n\n Returns\n -------\n albums : List[Album]\n The albums of the artist.\n \"\"\"\n from .album import Album\n\n albums = []\n offset = 0\n total = await self.total_albums(market=market)\n\n while len(albums) < total:\n data = await self.__client.http.artist_albums(self.id, limit=50, offset=offset, market=market)\n\n offset += 50\n albums += list(Album(self.__client, item) for item in data['items'])\n\n return albums"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\nasync def total_albums(self, *, market: str = None) -> int:\n data = await self.__client.http.artist_albums(self.id, limit=1, offset=0, market=market)\n return data['total']", "response": "get the total amout of tracks in the album."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\nasync def top_tracks(self, country: str = 'US') -> List[Track]:\n from .track import Track\n\n top = await self.__client.http.artist_top_tracks(self.id, country=country)\n return list(Track(self.__client, item) for item in top['tracks'])", "response": "Get Spotify s top tracks by country."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ngetting the list of artists that are similar to this artist.", "response": "async def related_artists(self) -> List[Artist]:\n \"\"\"Get Spotify catalog information about artists similar to a given artist.\n\n Similarity is based on analysis of the Spotify community\u2019s listening history.\n\n Returns\n -------\n artists : List[Artits]\n The artists deemed similar.\n \"\"\"\n related = await self.__client.http.artist_related_artists(self.id)\n return list(Artist(self.__client, item) for item in related['artists'])"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following 
Python 3 function doing\nasync def currently_playing(self) -> Tuple[Context, Track]:\n data = await self.http.currently_playing()\n\n if data.get('item'):\n data['Context'] = Context(data.get('context'))\n data['item'] = Track(self.__client, data.get('item'))\n\n return data", "response": "Get the users currently playing track."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\nasync def get_player(self) -> Player:\n self._player = player = Player(self.__client, self, await self.http.current_player())\n return player", "response": "Get information about the users current playback."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets information about the users avaliable devices.", "response": "async def get_devices(self) -> List[Device]:\n \"\"\"Get information about the users avaliable devices.\n\n Returns\n -------\n devices : List[Device]\n The devices the user has available.\n \"\"\"\n data = await self.http.available_devices()\n return [Device(item) for item in data['devices']]"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets tracks from the current users recently played tracks.", "response": "async def recently_played(self) -> List[Dict[str, Union[Track, Context, str]]]:\n \"\"\"Get tracks from the current users recently played tracks.\n\n Returns\n -------\n playlist_history : List[Dict[str, Union[Track, Context, str]]]\n A list of playlist history object.\n Each object is a dict with a timestamp, track and context field.\n \"\"\"\n data = await self.http.recently_played()\n f = lambda data: {'context': Context(data.get('context')), 'track': Track(self.__client, data.get('track'))}\n # List[T] where T: {'track': Track, 'content': Context: 'timestamp': ISO8601}\n return [{'timestamp': track['timestamp'], **f(track)} for track in data['items']]"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the 
following Python 3 function does\nasync def add_tracks(self, playlist: Union[str, Playlist], *tracks) -> str:\n tracks = [str(track) for track in tracks]\n data = await self.http.add_playlist_tracks(self.id, str(playlist), tracks=','.join(tracks))\n return data['snapshot_id']", "response": "Add one or more tracks to a user s playlist."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreplace all the tracks in a playlist.", "response": "async def replace_tracks(self, playlist, *tracks) -> str:\n \"\"\"Replace all the tracks in a playlist, overwriting its existing tracks. \n This powerful request can be useful for replacing tracks, re-ordering existing tracks, or clearing the playlist.\n\n Parameters\n ----------\n playlist : Union[str, PLaylist]\n The playlist to modify\n tracks : Sequence[Union[str, Track]]\n Tracks to place in the playlist\n \"\"\"\n tracks = [str(track) for track in tracks]\n await self.http.replace_playlist_tracks(self.id, str(playlist), tracks=','.join(tracks))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nremoving one or more tracks from a user s playlist.", "response": "async def remove_tracks(self, playlist, *tracks):\n \"\"\"Remove one or more tracks from a user\u2019s playlist.\n\n Parameters\n ----------\n playlist : Union[str, Playlist]\n The playlist to modify\n tracks : Sequence[Union[str, Track]]\n Tracks to remove from the playlist\n\n Returns\n -------\n snapshot_id : str\n The snapshot id of the playlist.\n \"\"\"\n tracks = [str(track) for track in tracks]\n data = await self.http.remove_playlist_tracks(self.id, str(playlist), tracks=','.join(tracks))\n return data['snapshot_id']"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreorder a playlist or a group of tracks in a playlist.", "response": "async def reorder_tracks(self, playlist, start, insert_before, length=1, *, snapshot_id=None):\n \"\"\"Reorder a track or a group of tracks in a 
playlist.\n\n Parameters\n ----------\n playlist : Union[str, Playlist]\n The playlist to modify\n start : int\n The position of the first track to be reordered.\n insert_before : int\n The position where the tracks should be inserted.\n length : Optional[int]\n The amount of tracks to be reordered. Defaults to 1 if not set.\n snapshot_id : str\n The playlist\u2019s snapshot ID against which you want to make the changes.\n\n Returns\n -------\n snapshot_id : str\n The snapshot id of the playlist.\n \"\"\"\n data = await self.http.reorder_playlists_tracks(self.id, str(playlist), start, length, insert_before, snapshot_id=snapshot_id)\n return data['snapshot_id']"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchanges a playlist\u2019s name and public/private, collaborative state and description. Parameters ---------- playlist : Union[str, Playlist] The playlist to modify name : Optional[str] The new name of the playlist. public : Optional[bool] The public/private status of the playlist. `True` for public, `False` for private. collaborative : Optional[bool] If `True`, the playlist will become collaborative and other users will be able to modify the playlist. 
description : Optional[str] The new playlist description", "response": "async def edit_playlist(self, playlist, *, name=None, public=None, collaborative=None, description=None):\n \"\"\"Change a playlist\u2019s name and public/private, collaborative state and description.\n\n Parameters\n ----------\n playlist : Union[str, Playlist]\n The playlist to modify\n name : Optional[str]\n The new name of the playlist.\n public : Optional[bool]\n The public/private status of the playlist.\n `True` for public, `False` for private.\n collaborative : Optional[bool]\n If `True`, the playlist will become collaborative and other users will be able to modify the playlist.\n description : Optional[str]\n The new playlist description\n \"\"\"\n data = {}\n\n if name:\n data['name'] = name\n\n if public:\n data['public'] = public\n\n if collaborative:\n data['collaborative'] = collaborative\n\n if description:\n data['description'] = description\n\n await self.http.change_playlist_details(self.id, str(playlist), data)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\nasync def create_playlist(self, name, *, public=True, collaborative=False, description=None):\n data = {\n 'name': name,\n 'public': public,\n 'collaborative': collaborative\n }\n\n if description:\n data['description'] = description\n\n playlist_data = await self.http.create_playlist(self.id, data)\n return Playlist(self.__client, playlist_data)", "response": "Create a playlist for a Spotify user."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nget the users playlists from spotify.", "response": "async def get_playlists(self, *, limit=20, offset=0):\n \"\"\"get the users playlists from spotify.\n\n Parameters\n ----------\n limit : Optional[int]\n The limit on how many playlists to retrieve for this user (default is 20).\n offset : Optional[int]\n The offset from where the api should start from in the playlists.\n\n Returns\n 
-------\n playlists : List[Playlist]\n A list of the user\u2019s playlists.\n \"\"\"\n if hasattr(self, 'http'):\n http = self.http\n else:\n http = self.__client.http\n\n data = await http.get_playlists(self.id, limit=limit, offset=offset)\n return [Playlist(self.__client, playlist_data) for playlist_data in data['items']]"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget the album\u2019s tracks from Spotify.", "response": "async def get_tracks(self, *, limit: Optional[int] = 20, offset: Optional[int] = 0) -> List[Track]:\n \"\"\"Get the album\u2019s tracks from Spotify.\n\n Parameters\n ----------\n limit : Optional[int]\n The limit on how many tracks to retrieve for this album (default is 20).\n offset : Optional[int]\n The offset from where the API should start in the tracks.\n \n Returns\n -------\n tracks : List[Track]\n The tracks of the album.\n \"\"\"\n data = await self.__client.http.album_tracks(self.id, limit=limit, offset=offset)\n return list(Track(self.__client, item) for item in data['items'])"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\nasync def get_all_tracks(self, *, market: Optional[str] = 'US') -> List[Track]:\n tracks = []\n offset = 0\n total = self.total_tracks or None\n\n while True:\n data = await self.__client.http.album_tracks(self.id, limit=50, offset=offset, market=market)\n\n if total is None:\n total = data['total']\n\n offset += 50\n tracks += list(Track(self.__client, item) for item in data['items'])\n\n if len(tracks) >= total:\n break\n\n return tracks", "response": "Load all of the album\u2019s tracks, however many the album has, for the given country code."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef oauth2_url(self, redirect_uri: str, scope: Optional[str] = None, state: Optional[str] = None) -> str:\n return OAuth2.url_(self.http.client_id, redirect_uri, scope=scope, state=state)", 
"response": "Generate an outh2 url for user authentication."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\nasync def get_album(self, spotify_id: str, *, market: str = 'US') -> Album:\n data = await self.http.album(to_id(spotify_id), market=market)\n return Album(self, data)", "response": "Retrive an album with a spotify ID."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\nasync def get_artist(self, spotify_id: str) -> Artist:\n data = await self.http.artist(to_id(spotify_id))\n return Artist(self, data)", "response": "Retrive an artist with a spotify ID."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\nasync def get_track(self, spotify_id: str) -> Track:\n data = await self.http.track(to_id(spotify_id))\n return Track(self, data)", "response": "Retrive an existing track with a spotify ID."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\nasync def get_artists(self, *ids: List[str]) -> List[Artist]:\n data = await self.http.artists(','.join(to_id(_id) for _id in ids))\n return list(Artist(self, artist) for artist in data['artists'])", "response": "Retrive multiple artists with a list of spotify IDs."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\nasync def search(self, q: str, *, types: Optional[Iterable[str]] = ['track', 'playlist', 'artist', 'album'], limit: Optional[int] = 20, offset: Optional[int] = 0, market: Optional[str] = None) -> Dict[str, List[Union[Track, Playlist, Artist, Album]]]:\n if not hasattr(types, '__iter__'):\n raise TypeError('types must be an iterable.')\n\n elif not isinstance(types, list):\n types = list(item for item in types)\n\n types_ = set(types)\n\n if not types_.issubset(_SEARCH_TYPES):\n raise ValueError(_SEARCH_TYPE_ERR % 
types_.difference(_SEARCH_TYPES).pop())\n\n kwargs = {\n 'q': q.replace(' ', '+'),\n 'queary_type': ','.join(tp.strip() for tp in types),\n 'market': market,\n 'limit': limit,\n 'offset': offset\n }\n\n data = await self.http.search(**kwargs)\n\n return {key: [_TYPES[obj['type']](self, obj) for obj in value['items']] for key, value in data.items()}", "response": "Search Spotify for tracks, playlists, artists, and albums matching a query."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if one or more albums are already saved in the current Spotify user\u2019s \u2018Your Music\u2019 library.", "response": "async def contains_albums(self, *albums: Sequence[Union[str, Album]]) -> List[bool]:\n \"\"\"Check if one or more albums are already saved in the current Spotify user\u2019s \u2018Your Music\u2019 library.\n\n Parameters\n ----------\n albums : Union[Album, str]\n A sequence of album objects or Spotify IDs\n \"\"\"\n _albums = [(obj if isinstance(obj, str) else obj.id) for obj in albums]\n return await self.user.http.is_saved_album(_albums)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking if one or more tracks are already saved in the current Spotify user\u2019s \u2018Your Music\u2019 library.", "response": "async def contains_tracks(self, *tracks: Sequence[Union[str, Track]]) -> List[bool]:\n \"\"\"Check if one or more tracks are already saved in the current Spotify user\u2019s \u2018Your Music\u2019 library.\n\n Parameters\n ----------\n tracks : Union[Track, str]\n A sequence of track objects or Spotify IDs\n \"\"\"\n _tracks = [(obj if isinstance(obj, str) else obj.id) for obj in tracks]\n return await self.user.http.is_saved_track(_tracks)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets a list of the songs saved in the current Spotify user\u2019s \u2018Your Music\u2019 library.", "response": "async def get_tracks(self, *, limit=20, offset=0) -> List[Track]:\n \"\"\"Get a list of the songs saved 
in the current Spotify user\u2019s \u2018Your Music\u2019 library.\n\n Parameters\n ----------\n limit : Optional[int]\n The maximum number of items to return. Default: 20. Minimum: 1. Maximum: 50.\n offset : Optional[int]\n The index of the first item to return. Default: 0\n \"\"\"\n data = await self.user.http.saved_tracks(limit=limit, offset=offset)\n\n return [Track(self.__client, item['track']) for item in data['items']]"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\nasync def get_albums(self, *, limit=20, offset=0) -> List[Album]:\n data = await self.user.http.saved_albums(limit=limit, offset=offset)\n\n return [Album(self.__client, item['album']) for item in data['items']]", "response": "Get a list of the albums saved in the current Spotify user\u2019s \u2018Your Music\u2019 library."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nremoving one or more albums from the current user\u2019s \u2018Your Music\u2019 library.", "response": "async def remove_albums(self, *albums):\n \"\"\"Remove one or more albums from the current user\u2019s \u2018Your Music\u2019 library.\n\n Parameters\n ----------\n albums : Sequence[Union[Album, str]]\n A sequence of album objects or Spotify IDs\n \"\"\"\n _albums = [(obj if isinstance(obj, str) else obj.id) for obj in albums]\n await self.user.http.delete_saved_albums(','.join(_albums))"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\nasync def remove_tracks(self, *tracks):\n _tracks = [(obj if isinstance(obj, str) else obj.id) for obj in tracks]\n await self.user.http.delete_saved_tracks(','.join(_tracks))", "response": "Remove one or more tracks from the current user\u2019s \u2018Your Music\u2019 library."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\nasync def save_albums(self, *albums):\n _albums = [(obj if isinstance(obj, str) else 
obj.id) for obj in albums]\n await self.user.http.save_albums(','.join(_albums))", "response": "Save one or more albums to the current user s Music\u2019 library."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\nasync def save_tracks(self, *tracks):\n _tracks = [(obj if isinstance(obj, str) else obj.id) for obj in tracks]\n await self.user.http.save_tracks(','.join(_tracks))", "response": "Save one or more tracks to the current user s Music\u2019 library."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef to_id(string: str) -> str:\n string = string.strip()\n\n match = _URI_RE.match(string)\n\n if match is None:\n match = _OPEN_RE.match(string)\n\n if match is None:\n return string\n else:\n return match.group(2)\n else:\n return match.group(1)", "response": "Get a spotify ID from a string."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nconstructing a OAuth2 object from a spotify. 
Client object.", "response": "def from_client(cls, client, *args, **kwargs):\n \"\"\"Construct a OAuth2 object from a `spotify.Client`.\"\"\"\n return cls(client.http.client_id, *args, **kwargs)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef url_(client_id: str, redirect_uri: str, *, scope: str = None, state: str = None, secure: bool = True) -> str:\n attrs = {\n 'client_id': client_id,\n 'redirect_uri': quote(redirect_uri)\n }\n\n if scope is not None:\n attrs['scope'] = quote(scope)\n\n if state is not None:\n attrs['state'] = state\n\n parameters = '&'.join('{0}={1}'.format(*item) for item in attrs.items())\n\n return OAuth2._BASE.format(parameters=parameters)", "response": "Construct a OAuth2 URL instead of an OAuth2 object."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef attrs(self):\n data = {\n 'client_id': self.client_id,\n 'redirect_uri': quote(self.redirect_uri),\n }\n\n if self.scope is not None:\n data['scope'] = quote(self.scope)\n\n if self.state is not None:\n data['state'] = self.state\n\n return data", "response": "Return a dictionary of attributes used when constructing url parameters."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets the track object for each link in the partial tracks data Returns ------- tracks : List[Track] The tracks", "response": "async def build(self):\n \"\"\"get the track object for each link in the partial tracks data\n\n Returns\n -------\n tracks : List[Track]\n The tracks\n \"\"\"\n data = await self.__func()\n return list(PlaylistTrack(self.__client, track) for track in data['items'])"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\nasync def get_all_tracks(self) -> List[PlaylistTrack]:\n if isinstance(self._tracks, PartialTracks):\n return await self._tracks.build()\n\n _tracks = []\n offset = 0\n while 
len(self.tracks) < self.total_tracks:\n data = await self.__client.http.get_playlist_tracks(self.owner.id, self.id, limit=50, offset=offset)\n\n _tracks += [PlaylistTrack(self.__client, item) for item in data['items']]\n offset += 50\n\n self.total_tracks = len(self._tracks)\n return list(self._tracks)", "response": "Get all playlist tracks."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\npausing playback on the user s account.", "response": "async def pause(self, *, device: Optional[SomeDevice] = None):\n \"\"\"Pause playback on the user\u2019s account.\n\n Parameters\n ----------\n device : Optional[:obj:`SomeDevice`]\n The Device object or id of the device this command is targeting.\n If not supplied, the user\u2019s currently active device is the target.\n \"\"\"\n await self._user.http.pause_playback(device_id=str(device))"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\nasync def resume(self, *, device: Optional[SomeDevice] = None):\n await self._user.http.play_playback(None, device_id=str(device))", "response": "Resume playback on the user s account."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nseeking to the given position in the user s currently playing track.", "response": "async def seek(self, pos, *, device: Optional[SomeDevice] = None):\n \"\"\"Seeks to the given position in the user\u2019s currently playing track.\n\n Parameters\n ----------\n pos : int\n The position in milliseconds to seek to.\n Must be a positive number.\n Passing in a position that is greater than the length of the track will cause the player to start playing the next song.\n device : Optional[:obj:`SomeDevice`]\n The Device object or id of the device this command is targeting.\n If not supplied, the user\u2019s currently active device is the target.\n \"\"\"\n await self._user.http.seek_playback(pos, device_id=str(device))"} {"SOURCE": "codesearchnet", 
"instruction": "Can you create a Python 3 function that\nsets the repeat mode for the user.", "response": "async def set_repeat(self, state, *, device: Optional[SomeDevice] = None):\n \"\"\"Set the repeat mode for the user\u2019s playback.\n\n Parameters\n ----------\n state : str\n Options are repeat-track, repeat-context, and off\n device : Optional[:obj:`SomeDevice`]\n The Device object or id of the device this command is targeting.\n If not supplied, the user\u2019s currently active device is the target.\n \"\"\"\n await self._user.http.repeat_playback(state, device_id=str(device))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\nasync def set_volume(self, volume: int, *, device: Optional[SomeDevice] = None):\n await self._user.http.set_playback_volume(volume, device_id=str(device))", "response": "Set the volume for the user s currently active device."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nskips to next track in the user s queue.", "response": "async def next(self, *, device: Optional[SomeDevice] = None):\n \"\"\"Skips to next track in the user\u2019s queue.\n\n Parameters\n ----------\n device : Optional[:obj:`SomeDevice`]\n The Device object or id of the device this command is targeting.\n If not supplied, the user\u2019s currently active device is the target.\n \"\"\"\n await self._user.http.skip_next(device_id=str(device))"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\nasync def previous(self, *, device: Optional[SomeDevice] = None):\n return await self._user.http.skip_previous(device_id=str(device))", "response": "Skip to the previous track in the queue."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\nasync def play(self, *uris: SomeURIs, offset: Optional[Offset] = 0, device: Optional[SomeDevice] = None):\n if 
len(uris) > 1:\n # Regular uris parameter\n context_uri = list(str(uri) for uri in uris)\n else:\n # Treat it as a context URI\n context_uri = str(uris[0])\n\n if device is not None:\n if not isinstance(device, (Device, str)):\n raise TypeError('Expected `device` to either be a spotify.Device or a string, got {0!r}.'.format(type(device)))\n else:\n device = device.id\n\n await self._user.http.play_playback(context_uri, offset=offset, device_id=device)", "response": "Start a new context or resume current playback on the user\u2019s currently active device."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nshuffle the user\u2019s playback.", "response": "async def shuffle(self, state: Optional[bool] = None, *, device: Optional[SomeDevice] = None):\n \"\"\"Turn shuffle on or off for the user\u2019s playback.\n\n Parameters\n ----------\n state : Optional[bool]\n if `True` then shuffle the user\u2019s playback,\n else if `False` do not shuffle the user\u2019s playback.\n device : Optional[:obj:`SomeDevice`]\n The Device object or id of the device this command is targeting.\n If not supplied, the user\u2019s currently active device is the target.\n \"\"\"\n await self._user.http.shuffle_playback(state)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\nasync def transfer(self, device: SomeDevice, ensure_playback: bool = False):\n await self._user.http.transfer_player(str(device), play=ensure_playback)", "response": "Transfer playback to a new device and determine if it should start playing."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\nasync def from_href(self):\n if not hasattr(self, 'href'):\n raise TypeError('Spotify object has no `href` attribute, therefore cannot be retrieved')\n\n elif hasattr(self, 'http'):\n return await self.http.request(('GET', self.href))\n\n else:\n cls = type(self)\n\n try:\n client = getattr(self, 
'_{0}__client'.format(cls.__name__))\n except AttributeError:\n raise TypeError('Spotify object has no way to access a HTTPClient.')\n else:\n http = client.http\n\n data = await http.request(('GET', self.href))\n\n return cls(client, data)", "response": "Get the full object from Spotify with an href attribute."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ngetting the status of the currently tested element.", "response": "def get(self): # pragma: no cover\n \"\"\"\n Execute the logic behind the meaning of ExpirationDate + return the matched status.\n\n :return:\n The status of the tested domain.\n Can be one of the official statuses.\n :rtype: str\n \"\"\"\n\n # We get the status of the domain validation.\n domain_validation = self.checker.is_domain_valid()\n # We get the status of the IPv4 validation.\n ip_validation = self.checker.is_ip_valid()\n\n if \"current_test_data\" in PyFunceble.INTERN:\n # The end-user wants more information with his test.\n\n # We update some index.\n PyFunceble.INTERN[\"current_test_data\"].update(\n {\n \"domain_syntax_validation\": domain_validation,\n \"ip4_syntax_validation\": ip_validation,\n }\n )\n\n if (\n domain_validation\n and not ip_validation\n or domain_validation\n or PyFunceble.CONFIGURATION[\"local\"]\n ):\n # * The element is a valid domain.\n # and\n # * The element is not a valid IPv4.\n # or\n # * The element is a valid domain.\n\n # * We get the HTTP status code of the currently tested element.\n # and\n # * We try to get the element status from the IANA database.\n PyFunceble.INTERN.update(\n {\"http_code\": HTTPCode().get(), \"referer\": Referer().get()}\n )\n\n if not PyFunceble.INTERN[\"referer\"]:\n # We could not get the referer.\n\n # We parse the referer status into the upstream call.\n return PyFunceble.INTERN[\"referer\"]\n\n # The WHOIS record status is not in our list of official statuses.\n\n if PyFunceble.INTERN[\"referer\"] and not self.checker.is_subdomain():\n # * The 
iana database comparison status is not None.\n # and\n # * The domain we are testing is not a subdomain.\n\n # We try to extract the expiration date from the WHOIS record.\n # And we return the matched status.\n return self._extract()\n\n # The iana database comparison status is None.\n\n # We log our whois record if the debug mode is activated.\n Logs().whois(self.whois_record)\n\n # And we return None, we could not extract the expiration date.\n return None\n\n if (\n ip_validation\n and not domain_validation\n or ip_validation\n or PyFunceble.CONFIGURATION[\"local\"]\n ):\n # * The element is a valid IPv4.\n # and\n # * The element is not a valid domain.\n # or\n # * The element is a valid IPv4.\n\n # We get the HTTP status code.\n PyFunceble.INTERN[\"http_code\"] = HTTPCode().get()\n\n # We log our whois record if the debug mode is activated.\n Logs().whois(self.whois_record)\n\n # And we return None, there is no expiration date to look for.\n return None\n\n # The validation was not passed.\n\n # We log our whois record if the debug mode is activated.\n Logs().whois(self.whois_record)\n\n # And we return False, the domain could not pass the IP and domains syntax validation.\n return False"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nconvert a given month into our unified format.", "response": "def _convert_or_shorten_month(cls, data):\n \"\"\"\n Convert a given month into our unified format.\n\n :param data: The month to convert or shorten.\n :type data: str\n\n :return: The unified month name.\n :rtype: str\n \"\"\"\n\n # We map the different month and their possible representation.\n short_month = {\n \"jan\": [str(1), \"01\", \"Jan\", \"January\"],\n \"feb\": [str(2), \"02\", \"Feb\", \"February\"],\n \"mar\": [str(3), \"03\", \"Mar\", \"March\"],\n \"apr\": [str(4), \"04\", \"Apr\", \"April\"],\n \"may\": [str(5), \"05\", \"May\"],\n \"jun\": [str(6), \"06\", \"Jun\", \"June\"],\n \"jul\": [str(7), \"07\", \"Jul\", 
\"July\"],\n \"aug\": [str(8), \"08\", \"Aug\", \"August\"],\n \"sep\": [str(9), \"09\", \"Sep\", \"September\"],\n \"oct\": [str(10), \"Oct\", \"October\"],\n \"nov\": [str(11), \"Nov\", \"November\"],\n \"dec\": [str(12), \"Dec\", \"December\"],\n }\n\n for month in short_month:\n # We loop through our map.\n\n if data in short_month[month]:\n # If the parsed data (or month if you prefer) is into our map.\n\n # We return the element (or key if you prefer) assigned to\n # the month.\n return month\n\n # The element is not into our map.\n\n # We return the parsed element (or month if you prefer).\n return data"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _cases_management(self, regex_number, matched_result):\n\n # We map our regex numbers with with the right group order.\n # Note: please report to the method note for more information about the mapping.\n cases = {\n \"first\": [[1, 2, 3, 10, 11, 22, 26, 27, 28, 29, 32, 34, 38], [0, 1, 2]],\n \"second\": [[14, 15, 31, 33, 36, 37], [1, 0, 2]],\n \"third\": [\n [4, 5, 6, 7, 8, 9, 12, 13, 16, 17, 18, 19, 20, 21, 23, 24, 25, 30, 35],\n [2, 1, 0],\n ],\n }\n\n for case in cases:\n # We loop through the cases.\n\n # We get the case data.\n case_data = cases[case]\n\n if int(regex_number) in case_data[0]:\n # The regex number is into the currently read case data.\n\n # We return a list with the formatted elements.\n # 1. We convert the day to 2 digits.\n # 2. We convert the month to the unified format.\n # 3. 
We return the year.\n return [\n self._convert_1_to_2_digits(matched_result[case_data[1][0]]),\n self._convert_or_shorten_month(matched_result[case_data[1][1]]),\n str(matched_result[case_data[1][2]]),\n ]\n\n # The regex number is not already mapped.\n\n # We return the parsed data.\n return matched_result", "response": "This method maps the regex number to the most appropriate case."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nformats the expiration date into an unified format.", "response": "def _format(self, date_to_convert=None):\n \"\"\"\n Format the expiration date into an unified format (01-jan-1970).\n\n :param date_to_convert:\n The date to convert. In other words, the extracted date.\n :type date_to_convert: str\n\n :return: The formatted expiration date.\n :rtype: str\n \"\"\"\n\n if not date_to_convert: # pragma: no cover\n # The date to conver is given.\n\n # We initiate the date we are working with.\n date_to_convert = self.expiration_date\n\n # We map the different possible regex.\n # The regex index represent a unique number which have to be reported\n # to the self._case_management() method.\n regex_dates = {\n # Date in format: 02-jan-2017\n \"1\": r\"([0-9]{2})-([a-z]{3})-([0-9]{4})\",\n # Date in format: 02.01.2017 // Month: jan\n \"2\": r\"([0-9]{2})\\.([0-9]{2})\\.([0-9]{4})$\",\n # Date in format: 02/01/2017 // Month: jan\n \"3\": r\"([0-3][0-9])\\/(0[1-9]|1[012])\\/([0-9]{4})\",\n # Date in format: 2017-01-02 // Month: jan\n \"4\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})$\",\n # Date in format: 2017.01.02 // Month: jan\n \"5\": r\"([0-9]{4})\\.([0-9]{2})\\.([0-9]{2})$\",\n # Date in format: 2017/01/02 // Month: jan\n \"6\": r\"([0-9]{4})\\/([0-9]{2})\\/([0-9]{2})$\",\n # Date in format: 2017.01.02 15:00:00\n \"7\": r\"([0-9]{4})\\.([0-9]{2})\\.([0-9]{2})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\",\n # Date in format: 20170102 15:00:00 // Month: jan\n \"8\": 
r\"([0-9]{4})([0-9]{2})([0-9]{2})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\",\n # Date in format: 2017-01-02 15:00:00 // Month: jan\n \"9\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\",\n # Date in format: 02.01.2017 15:00:00 // Month: jan\n \"10\": r\"([0-9]{2})\\.([0-9]{2})\\.([0-9]{4})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\",\n # Date in format: 02-Jan-2017 15:00:00 UTC\n \"11\": r\"([0-9]{2})-([A-Z]{1}[a-z]{2})-([0-9]{4})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\\s[A-Z]{1}.*\", # pylint: disable=line-too-long\n # Date in format: 2017/01/02 01:00:00 (+0900) // Month: jan\n \"12\": r\"([0-9]{4})\\/([0-9]{2})\\/([0-9]{2})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\\s\\(.*\\)\",\n # Date in format: 2017/01/02 01:00:00 // Month: jan\n \"13\": r\"([0-9]{4})\\/([0-9]{2})\\/([0-9]{2})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}$\",\n # Date in format: Mon Jan 02 15:00:00 GMT 2017\n \"14\": r\"[a-zA-Z]{3}\\s([a-zA-Z]{3})\\s([0-9]{2})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\\s[A-Z]{3}\\s([0-9]{4})\", # pylint: disable=line-too-long\n # Date in format: Mon Jan 02 2017\n \"15\": r\"[a-zA-Z]{3}\\s([a-zA-Z]{3})\\s([0-9]{2})\\s([0-9]{4})\",\n # Date in format: 2017-01-02T15:00:00 // Month: jan\n \"16\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}$\",\n # Date in format: 2017-01-02T15:00:00Z // Month: jan${'7}\n \"17\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}[A-Z].*\",\n # Date in format: 2017-01-02T15:00:00+0200 // Month: jan\n \"18\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}[+-][0-9]{4}\",\n # Date in format: 2017-01-02T15:00:00+0200.622265+03:00 //\n # Month: jan\n \"19\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}\\.[0-9].*[+-][0-9]{2}:[0-9]{2}\", # pylint: disable=line-too-long\n # Date in format: 2017-01-02T15:00:00+0200.622265 // Month: jan\n \"20\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}\\.[0-9]{6}$\",\n # Date in format: 2017-01-02T23:59:59.0Z // Month: jan\n \"21\": 
r\"([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}\\.[0-9].*[A-Z]\",\n # Date in format: 02-01-2017 // Month: jan\n \"22\": r\"([0-9]{2})-([0-9]{2})-([0-9]{4})\",\n # Date in format: 2017. 01. 02. // Month: jan\n \"23\": r\"([0-9]{4})\\.\\s([0-9]{2})\\.\\s([0-9]{2})\\.\",\n # Date in format: 2017-01-02T00:00:00+13:00 // Month: jan\n \"24\": r\"([0-9]{4})-([0-9]{2})-([0-9]{2})T[0-9]{2}:[0-9]{2}:[0-9]{2}[+-][0-9]{2}:[0-9]{2}\", # pylint: disable=line-too-long\n # Date in format: 20170102 // Month: jan\n \"25\": r\"(?=[0-9]{8})(?=([0-9]{4})([0-9]{2})([0-9]{2}))\",\n # Date in format: 02-Jan-2017\n \"26\": r\"([0-9]{2})-([A-Z]{1}[a-z]{2})-([0-9]{4})$\",\n # Date in format: 02.1.2017 // Month: jan\n \"27\": r\"([0-9]{2})\\.([0-9]{1})\\.([0-9]{4})\",\n # Date in format: 02 Jan 2017\n \"28\": r\"([0-9]{1,2})\\s([A-Z]{1}[a-z]{2})\\s([0-9]{4})\",\n # Date in format: 02-January-2017\n \"29\": r\"([0-9]{2})-([A-Z]{1}[a-z]*)-([0-9]{4})\",\n # Date in format: 2017-Jan-02.\n \"30\": r\"([0-9]{4})-([A-Z]{1}[a-z]{2})-([0-9]{2})\\.\",\n # Date in format: Mon Jan 02 15:00:00 2017\n \"31\": r\"[a-zA-Z]{3}\\s([a-zA-Z]{3})\\s([0-9]{1,2})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\\s([0-9]{4})\", # pylint: disable=line-too-long\n # Date in format: Mon Jan 2017 15:00:00\n \"32\": r\"()[a-zA-Z]{3}\\s([a-zA-Z]{3})\\s([0-9]{4})\\s[0-9]{2}:[0-9]{2}:[0-9]{2}\",\n # Date in format: January 02 2017-Jan-02\n \"33\": r\"([A-Z]{1}[a-z]*)\\s([0-9]{1,2})\\s([0-9]{4})\",\n # Date in format: 2.1.2017 // Month: jan\n \"34\": r\"([0-9]{1,2})\\.([0-9]{1,2})\\.([0-9]{4})\",\n # Date in format: 20170102000000 // Month: jan\n \"35\": r\"([0-9]{4})([0-9]{2})([0-9]{2})[0-9]+\",\n # Date in format: 01/02/2017 // Month: jan\n \"36\": r\"(0[1-9]|1[012])\\/([0-3][0-9])\\/([0-9]{4})\",\n # Date in format: January 2 2017\n \"37\": r\"([A-Z]{1}[a-z].*)\\s\\s([0-9]{1,2})\\s([0-9]{4})\",\n # Date in format: 2nd January 2017\n \"38\": r\"([0-9]{1,})[a-z]{1,}\\s([A-Z].*)\\s(2[0-9]{3})\",\n }\n\n for regx in regex_dates:\n 
# We loop through our map.\n\n # We try to get the matched groups if the date to convert matches the currently\n # read regex.\n matched_result = Regex(\n date_to_convert, regex_dates[regx], return_data=True, rematch=True\n ).match()\n\n if matched_result:\n # The matched result is not None or an empty list.\n\n # We get the date.\n date = self._cases_management(regx, matched_result)\n\n if date:\n # The date is given.\n\n # We return the formatted date.\n return \"-\".join(date)\n\n # We return an empty string as we were not able to match the date format.\n return \"\""} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nextract the expiration date from the whois record.", "response": "def _extract(self): # pragma: no cover\n \"\"\"\n Extract the expiration date from the whois record.\n\n :return: The status of the domain.\n :rtype: str\n \"\"\"\n\n # We try to get the expiration date from the database.\n expiration_date_from_database = Whois().get_expiration_date()\n\n if expiration_date_from_database:\n # The hash of the current whois record did not change and the\n # expiration date from the database is not empty and not equal to\n # None or False.\n\n # We generate the files and print the status.\n # It's an active element!\n Generate(\n PyFunceble.STATUS[\"official\"][\"up\"],\n \"WHOIS\",\n expiration_date_from_database,\n ).status_file()\n\n # We handle and return the official up status.\n return PyFunceble.STATUS[\"official\"][\"up\"]\n\n # We get the whois record.\n self.whois_record = Lookup().whois(PyFunceble.INTERN[\"referer\"])\n\n # We list the regex patterns which will help us get an unformatted expiration date.\n to_match = [\n r\"expire:(.*)\",\n r\"expire on:(.*)\",\n r\"Expiry Date:(.*)\",\n r\"free-date(.*)\",\n r\"expires:(.*)\",\n r\"Expiration date:(.*)\",\n r\"Expiry date:(.*)\",\n r\"Expire Date:(.*)\",\n r\"renewal date:(.*)\",\n r\"Expires:(.*)\",\n r\"validity:(.*)\",\n r\"Expiration Date :(.*)\",\n r\"Expiry :(.*)\",\n 
r\"expires at:(.*)\",\n r\"domain_datebilleduntil:(.*)\",\n r\"Data de expira\u00e7\u00e3o \\/ Expiration Date \\(dd\\/mm\\/yyyy\\):(.*)\",\n r\"Fecha de expiraci\u00f3n \\(Expiration date\\):(.*)\",\n r\"\\[Expires on\\](.*)\",\n r\"Record expires on(.*)(\\(YYYY-MM-DD\\))\",\n r\"status: OK-UNTIL(.*)\",\n r\"renewal:(.*)\",\n r\"expires............:(.*)\",\n r\"expire-date:(.*)\",\n r\"Exp date:(.*)\",\n r\"Valid-date(.*)\",\n r\"Expires On:(.*)\",\n r\"Fecha de vencimiento:(.*)\",\n r\"Expiration:.........(.*)\",\n r\"Fecha de Vencimiento:(.*)\",\n r\"Registry Expiry Date:(.*)\",\n r\"Expires on..............:(.*)\",\n r\"Expiration Time:(.*)\",\n r\"Expiration Date:(.*)\",\n r\"Expired:(.*)\",\n r\"Date d'expiration:(.*)\",\n r\"expiration date:(.*)\",\n ]\n\n if self.whois_record:\n # The whois record is not empty.\n\n if \"current_test_data\" in PyFunceble.INTERN:\n # The end-user want more information whith his test.\n\n # We update the whois_record index.\n PyFunceble.INTERN[\"current_test_data\"][\n \"whois_record\"\n ] = self.whois_record\n\n for string in to_match:\n # We loop through the list of regex.\n\n # We try tro extract the expiration date from the WHOIS record.\n expiration_date = Regex(\n self.whois_record, string, return_data=True, rematch=True, group=0\n ).match()\n\n if expiration_date:\n # The expiration date could be extracted.\n\n # We get the extracted expiration date.\n self.expiration_date = expiration_date[0].strip()\n\n # We initate a regex which will help us know if a number\n # is present into the extracted expiration date.\n regex_rumbers = r\"[0-9]\"\n\n if Regex(\n self.expiration_date, regex_rumbers, return_data=False\n ).match():\n # The extracted expiration date has a number.\n\n # We format the extracted expiration date.\n self.expiration_date = self._format()\n\n if (\n self.expiration_date\n and not Regex(\n self.expiration_date,\n r\"[0-9]{2}\\-[a-z]{3}\\-2[0-9]{3}\",\n return_data=False,\n ).match()\n ):\n # The formatted 
expiration date does not match our unified format.\n\n # We log the problem.\n Logs().expiration_date(self.expiration_date)\n\n # We log the whois record.\n Logs().whois(self.whois_record)\n\n if \"current_test_data\" in PyFunceble.INTERN:\n # The end-user want more information whith his test.\n\n # We update the expiration_date index.\n PyFunceble.INTERN[\"current_test_data\"][\n \"expiration_date\"\n ] = self.expiration_date\n\n # We generate the files and print the status.\n # It's an active element!\n Generate(\n PyFunceble.STATUS[\"official\"][\"up\"],\n \"WHOIS\",\n self.expiration_date,\n ).status_file()\n\n # We log the whois record.\n Logs().whois(self.whois_record)\n\n # We save the whois record into the database.\n Whois(expiration_date=self.expiration_date).add()\n\n # We handle und return the official up status.\n return PyFunceble.STATUS[\"official\"][\"up\"]\n\n # The extracted expiration date does not have a number.\n\n # We log the whois record.\n Logs().whois(self.whois_record)\n\n # We return None, we could not get the expiration date.\n return None\n\n # The whois record is empty.\n\n # We return None, we could not get the whois record.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nread the code and update all links.", "response": "def _update_code_urls(self):\n \"\"\"\n Read the code and update all links.\n \"\"\"\n\n to_ignore = [\".gitignore\", \".keep\"]\n\n for root, _, files in PyFunceble.walk(\n PyFunceble.CURRENT_DIRECTORY\n + PyFunceble.directory_separator\n + \"PyFunceble\"\n + PyFunceble.directory_separator\n ):\n # We loop through every directories and files in the `PyFunceble` directory.\n\n for file in files:\n # We loop through the list of files of the currently read directory.\n\n if file not in to_ignore and \"__pycache__\" not in root:\n # * The filename is not into the list of file to ignore.\n # and\n # * The directory we are reading is not `__pycache__`.\n\n if 
root.endswith(PyFunceble.directory_separator):\n # The root directory ends with the directory separator.\n\n # We fix the path in the currently read file.\n self._update_docs(root + file)\n else:\n # The root directory does not end with the directory separator.\n\n # We fix the path in the currently read file.\n # (after appending the directory separator between the root and file)\n self._update_docs(root + PyFunceble.directory_separator + file)\n\n for root, _, files in PyFunceble.walk(\n PyFunceble.CURRENT_DIRECTORY\n + PyFunceble.directory_separator\n + \"tests\"\n + PyFunceble.directory_separator\n ):\n # We loop through every directory and file in the `tests` directory.\n for file in files:\n # We loop through the list of files of the currently read directory.\n\n if file not in to_ignore and \"__pycache__\" not in root:\n # * The filename is not in the list of files to ignore.\n # and\n # * The directory we are reading is not `__pycache__`.\n\n if root.endswith(PyFunceble.directory_separator):\n # The root directory ends with the directory separator.\n\n # We fix the path in the currently read file.\n self._update_docs(root + file)\n else:\n # The root directory does not end with the directory separator.\n\n # We fix the path in the currently read file.\n # (after appending the directory separator between the root and file)\n self._update_docs(root + PyFunceble.directory_separator + file)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking if the current version is greater than the older one.", "response": "def _is_version_greater(self):\n \"\"\"\n Check if the current version is greater than the older one.\n \"\"\"\n\n # We compare the 2 versions.\n checked = Version(True).check_versions(\n self.current_version[0], self.version_yaml\n )\n\n if checked is not None and not checked:\n # The current version is greater than the older one.\n\n # We return True.\n return True\n\n # We return False.\n return False"} 
{"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ncheck if the current branch is dev.", "response": "def is_dev_version(cls):\n \"\"\"\n Check if the current branch is `dev`.\n \"\"\"\n\n # We initiate the command we have to run in order to\n # get the branch we are currently working with.\n command = \"git branch\"\n\n # We execute and get the command output.\n command_result = Command(command).execute()\n\n for branch in command_result.split(\"\\n\"):\n # We loop through each line of the command output.\n\n if branch.startswith(\"*\") and \"dev\" in branch:\n # The current branch is `dev`.\n\n # We return True.\n return True\n\n # The current branch is not `dev`.\n\n # We return False.\n return False"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck if we need to deprecate the current version number.", "response": "def _does_require_deprecation(self):\n \"\"\"\n Check if we have to put the previous version into the deprecated list.\n \"\"\"\n\n for index, version_number in enumerate(self.current_version[0][:2]):\n # We loop through the 2 last elements of the version.\n\n if version_number > self.version_yaml[index]:\n # The currently read version number is greater than the one we have in\n # the version.yaml.\n\n # We return True.\n return True\n\n # We return False, we do not need to deprecate anything.\n return False"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nupdate the given file or README. rst so that it gives branch related URL and informations.", "response": "def _update_docs(self, file_to_update):\n \"\"\"\n Update the given documentation file or :code:`README.rst` so that\n it always gives branch related URL and informations.\n\n .. 
note::\n This only apply to :code:`dev` and :code:`master` branch.\n\n :param file_to_update: The file to update.\n :type file_to_update: str\n \"\"\"\n\n if self.is_dev_version():\n # The current version is the dev version.\n\n # We map what we have to replace.\n # Format: {match:replacement}\n regexes = {\n \"/%s/\" % \"dev\": r\"\\/%s\\/\" % \"master\",\n \"=%s\" % \"dev\": \"=%s\" % \"master\",\n }\n elif self.is_master_version():\n # The current version is the master version.\n\n # We map what we have to replace.\n regexes = {\n \"/%s/\" % \"master\": r\"\\/%s\\/\" % \"dev\",\n \"=%s\" % \"master\": \"=%s\" % \"dev\",\n }\n else:\n # The current version is not the master nor the dev version.\n\n # We raise an exception as the branch we are currently is not meaned\n # for production.\n raise Exception(\"Please switch to `dev` or `master` branch.\")\n\n # We get the content of the file to fix.\n to_update = File(file_to_update).read()\n\n for replacement, regex in regexes.items():\n # We loop through reach element of the map.\n\n # We process the replacement.\n to_update = Regex(to_update, regex, replace_with=replacement).replace()\n\n # We finally overwrite the file to fix with the filtered.\n # content.\n File(file_to_update).write(to_update, overwrite=True)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nupdating the setup. 
py file so that it always has the right name.", "response": "def _update_setup_py(self):\n \"\"\"\n Update :code:`setup.py` so that it always has the right name.\n \"\"\"\n\n # We initiate the path to the file we have to filter.\n setup_py_path = PyFunceble.CURRENT_DIRECTORY + \"setup.py\"\n\n if self.is_dev_version():\n # The current version is the `dev` version.\n\n # We map what we have to replace.\n # Format: {match:replacement}\n regexes = {\n 'name=\"PyFunceble-dev\"': r'name=\".*\"',\n '\"Development Status :: 4 - Beta\"': r'\"Development\\sStatus\\s::.*\"',\n }\n elif self.is_master_version():\n # The current version is the `master` version.\n\n # We map what we have to replace.\n regexes = {\n 'name=\"PyFunceble\"': r'name=\".*\"',\n '\"Development Status :: 5 - Production/Stable\"': r'\"Development\\sStatus\\s::.*\"',\n }\n else:\n # The current version is not the `dev` nor the `master` version.\n\n # We raise an exception to the user, the current branch is not meant for\n # production.\n raise Exception(\"Please switch to `dev` or `master` branch.\")\n\n # We get the file content.\n to_update = File(setup_py_path).read()\n\n for replacement, regex in regexes.items():\n # We loop through our map.\n\n # And we process the replacement.\n to_update = Regex(to_update, regex, replace_with=replacement).replace()\n\n # We finally replace the content of the file with the filtered\n # version.\n File(setup_py_path).write(to_update, overwrite=True)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nupdate the travis. 
yml file according to current branch.", "response": "def _update_travis_yml(self):\n \"\"\"\n Update :code:`.travis.yml` according to current branch.\n \"\"\"\n\n # We initiate the file we have to filter/update.\n travis_yml_path = PyFunceble.CURRENT_DIRECTORY + \".travis.yml\"\n\n if self.is_dev_version():\n # The current version is the `dev` version.\n\n # We map what we have to replace.\n # Format: {match:replacement}\n regexes = {\n \"pip3 install pyfunceble-dev\": r\"pip3\\sinstall\\spyfunceble.*\",\n \"pip-autoremove pyfunceble-dev \": r\"pip-autoremove\\spyfunceble\\s\",\n }\n elif self.is_master_version():\n # The current version is the `master` version.\n\n # We map what we have to replace.\n regexes = {\n \"pip3 install pyfunceble\": r\"pip3\\sinstall\\spyfunceble.*\",\n \"pip-autoremove pyfunceble \": r\"pip-autoremove\\spyfunceble[a-z-_]+\\s\",\n }\n else:\n # The current version is not the `master` nor the `dev` version.\n\n # We raise an exception, the current branch is not meant for production.\n raise Exception(\"Please switch to `dev` or `master` branch.\")\n\n # We get the file content.\n to_update = File(travis_yml_path).read()\n\n for replacement, regex in regexes.items():\n # We loop through the map.\n\n # And we process the replacement.\n to_update = Regex(to_update, regex, replace_with=replacement).replace()\n\n # We finaly replace the file content with the filtered\n # content.\n File(travis_yml_path).write(to_update, overwrite=True)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef restore(self):\n\n if PyFunceble.CONFIGURATION[\"auto_continue\"] and self.backup_content:\n # The auto_continue subsystem is activated and the backup_content\n # is not empty.\n\n # We get the file we have to restore.\n file_to_restore = PyFunceble.INTERN[\"file_to_test\"]\n\n if file_to_restore in self.backup_content:\n # The file we are working with is already into the backup content.\n\n # We initiate the 
different status to set.\n to_initiate = [\"up\", \"down\", \"invalid\", \"tested\"]\n\n # Because at some point it was not the current status, we have to map\n # the new with the old. This way, if someone is running the latest\n # version but with old data, we still continue like nothing happened.\n alternatives = {\n \"up\": \"number_of_up\",\n \"down\": \"number_of_down\",\n \"invalid\": \"number_of_invalid\",\n \"tested\": \"number_of_tested\",\n }\n\n for string in to_initiate:\n # We loop over the status we have to initiate.\n\n try:\n # We try to update the counters by using the currently read status.\n PyFunceble.INTERN[\"counter\"][\"number\"].update(\n {string: self.backup_content[file_to_restore][string]}\n )\n except KeyError:\n # But if the status is not present, we try with the older index\n # we mapped previously.\n PyFunceble.INTERN[\"counter\"][\"number\"].update(\n {\n string: self.backup_content[file_to_restore][\n alternatives[string]\n ]\n }\n )", "response": "This function is called by the backup code when the backup data is restored."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nchecks if we have to ignore the given line.", "response": "def _is_to_ignore(cls, line):\n \"\"\"\n Check if we have to ignore the given line.\n\n :param line: The line from the file.\n :type line: str\n \"\"\"\n\n # We set the list of regex to match to be\n # considered as ignored.\n to_ignore = [r\"(^!|^@@|^\\/|^\\[|^\\.|^-|^_|^\\?|^&)\"] # , r\"(\\$|,)(image)\"]\n\n for element in to_ignore:\n # We loop through the list of regex.\n\n if Regex(line, element, return_data=False).match():\n # The currently read line matches the currently read\n # regex.\n\n # We return True, it has to be ignored.\n return True\n\n # We return False, it does not have to be ignored.\n return False"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _handle_options(self, options):\n\n # We initiate a 
variable which will save our result\n result = []\n\n # We initiate the regex which will be used to extract the domain listed\n # under the option domain=\n regex_domain_option = r\"domain=(.*)\"\n\n for option in options:\n # We loop through the list of option.\n try:\n # We try to extract the list of domains from the currently read\n # option.\n domains = Regex(\n option, regex_domain_option, return_data=True, rematch=True, group=0\n ).match()[-1]\n\n if domains:\n # We could extract something.\n\n if self.aggressive: # pragma: no cover\n result.extend(\n [\n x\n for x in domains.split(\"|\")\n if x and not x.startswith(\"~\")\n ]\n )\n else:\n # We return True.\n return True\n except TypeError:\n pass\n\n # We return the result.\n return result", "response": "Handle the data from the options."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _extract_base(self, element):\n\n if isinstance(element, list):\n # The given element is a list.\n\n # We get the base of each element of the list.\n return [self._extract_base(x) for x in element]\n\n # We get the base if it is an URL.\n base = self.checker.is_url_valid(url=element, return_base=True)\n\n if base:\n # It is an URL.\n\n # We return the extracted base.\n return base\n\n if \"/\" in element:\n # / is in the given element.\n\n # We return the first element before the\n # first /\n return element.split(\"/\")[0]\n\n # / is not in the given element.\n\n # We return the given element.\n return element", "response": "Extract the base of the given element."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef decode(self):\n\n # We initiate a variable which will save what we are going to return.\n result = []\n\n # We initiate the first regex we are going to use to get\n # the element to format.\n regex = r\"^(?:.*\\|\\|)([^\\/\\$\\^]{1,}).*$\"\n\n # We initiate the third regex we are 
going to use to get\n # the element to format.\n regex_v3 = (\n r\"(?:#+(?:[a-z]+?)?\\[[a-z]+(?:\\^|\\*)\\=(?:\\'|\\\"))(.*\\..*)(?:(?:\\'|\\\")\\])\"\n )\n\n # We initiate the fourth regex we are going to use to get\n # the element to format.\n regex_v4 = r\"^\\|(.*\\..*)\\|$\"\n\n for line in self.to_format:\n # We loop through the different line.\n\n rematch = rematch_v3 = rematch_v4 = None\n\n # We extract the different group from our first regex.\n rematch = Regex(\n line, regex, return_data=True, rematch=True, group=0\n ).match()\n\n # We extract the different group from our fourth regex.\n #\n # Note: We execute the following in second because it is more\n # specific that others.\n rematch_v4 = Regex(\n line, regex_v4, return_data=True, rematch=True, group=0\n ).match()\n\n # We extract the different group from our third regex.\n rematch_v3 = Regex(\n line, regex_v3, return_data=True, rematch=True, group=0\n ).match()\n\n if rematch:\n # The first extraction was successfull.\n\n if self.options_separator in line:\n options = line.split(self.options_separator)[-1].split(\n self.option_separator\n )\n\n if (\n not options[-1]\n or \"third-party\" in options\n or \"script\" in options\n or \"popup\" in options\n or \"xmlhttprequest\" in options\n ):\n # We extend the result with the extracted elements.\n result.extend(self._extract_base(rematch))\n\n extra = self._handle_options(options)\n\n if extra and isinstance(extra, list): # pragma: no cover\n extra.extend(self._extract_base(rematch))\n result.extend(self._extract_base(extra))\n elif extra:\n result.extend(self._extract_base(rematch))\n\n else:\n # We extend the result with the extracted elements.\n result.extend(self._extract_base(rematch))\n\n if rematch_v4:\n # The fourth extraction was successfull.\n\n # We extend the formatted elements from the extracted elements.\n result.extend(List(self._format_decoded(rematch_v4)).format())\n\n if rematch_v3:\n # The second extraction was successfull.\n\n # We 
extend the formatted elements from the extracted elements.\n result.extend(List(self._format_decoded(rematch_v3)).format())\n\n # We return the result.\n return List(result).format()", "response": "Decode the list of domains to test from the adblock formatted file."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nformatting the adblock line and returning the list of domains or IP to test.", "response": "def _format_decoded(self, to_format, result=None): # pragma: no cover\n \"\"\"\n Format the extracted adblock line before passing it to the system.\n\n :param to_format: The extracted line from the file.\n :type to_format: str\n\n :param result: A list of the result of this method.\n :type result: list\n\n :return: The list of domains or IP to test.\n :rtype: list\n \"\"\"\n\n if not result:\n # The result is not given.\n\n # We set the result as an empty list.\n result = []\n\n for data in List(to_format).format():\n # We loop through the different lines to format.\n\n if data:\n # The currently read line is not empty.\n\n if \"^\" in data:\n # There is a caret in the currently read line.\n\n # We recall this method but with the current result state\n # and split data.\n return self._format_decoded(data.split(\"^\"), result)\n\n if \"#\" in data:\n # There is a hash in the currently read line.\n\n # We recall this method but with the current result state\n # and split data.\n return self._format_decoded(data.split(\"#\"), result)\n\n if \",\" in data:\n # There is a comma in the currently read line.\n\n # We recall this method but with the current result state\n # and split data.\n return self._format_decoded(data.split(\",\"), result)\n\n if \"!\" in data:\n # There is an exclamation mark in the currently read line.\n\n # We recall this method but with the current result state\n # and split data.\n return self._format_decoded(data.split(\"!\"), result)\n\n if \"|\" in data:\n # There is a vertical bar in the currently read 
line.\n\n # We recall this method but with the current result state\n # and splited data.\n return self._format_decoded(data.split(\"|\"), result)\n\n if data:\n # The currently read line is not empty.\n\n data = self._extract_base(data)\n\n if data and (\n self.checker.is_domain_valid(data)\n or self.checker.is_ip_valid(data)\n ):\n # The extraced base is not empty.\n # and\n # * The currently read line is a valid domain.\n # or\n # * The currently read line is a valid IP.\n\n # We append the currently read line to the result.\n result.append(data)\n elif data:\n # * The currently read line is not a valid domain.\n # or\n # * The currently read line is not a valid IP.\n\n # We try to get the url base.\n url_base = self.checker.is_url_valid(data, return_base=True)\n\n if url_base:\n # The url_base is not empty or equal to False or None.\n\n # We append the url base to the result.\n result.append(url_base)\n\n # We return the result element.\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ngets the HTTP status code of the current object.", "response": "def _access(self): # pragma: no cover\n \"\"\"\n Get the HTTP code status.\n\n :return: The matched HTTP status code.\n :rtype: int|None\n \"\"\"\n\n try:\n # We try to get the HTTP status code.\n\n if PyFunceble.INTERN[\"to_test_type\"] == \"url\":\n # We are globally testing a URL.\n\n # We get the head of the URL.\n req = PyFunceble.requests.head(\n self.to_get,\n timeout=PyFunceble.CONFIGURATION[\"seconds_before_http_timeout\"],\n headers=self.headers,\n verify=PyFunceble.CONFIGURATION[\"verify_ssl_certificate\"],\n )\n else:\n # We are not globally testing a URL.\n\n # We get the head of the constructed URL.\n req = PyFunceble.requests.head(\n self.to_get,\n timeout=PyFunceble.CONFIGURATION[\"seconds_before_http_timeout\"],\n headers=self.headers,\n )\n\n # And we try to get the status code.\n return req.status_code\n\n except (\n 
PyFunceble.requests.exceptions.InvalidURL,\n PyFunceble.socket.timeout,\n PyFunceble.requests.exceptions.Timeout,\n PyFunceble.requests.ConnectionError,\n urllib3_exceptions.InvalidHeader,\n UnicodeDecodeError, # The probability that this happened in production is minimal.\n ):\n # If one of the listed exceptions is matched, that means that something\n # went wrong and we were unable to extract the status code.\n\n # We return None.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nreturn the HTTP status code.", "response": "def get(self):\n \"\"\"\n Return the HTTP status code.\n\n :return: The matched and formatted status code.\n :rtype: str|int|None\n \"\"\"\n if PyFunceble.HTTP_CODE[\"active\"]:\n # The http status code extraction is activated.\n\n # We get the http status code.\n http_code = self._access()\n\n # We initiate a variable which will save the list of allowed\n # http status codes.\n list_of_valid_http_code = []\n\n for codes in [\n PyFunceble.HTTP_CODE[\"list\"][\"up\"],\n PyFunceble.HTTP_CODE[\"list\"][\"potentially_down\"],\n PyFunceble.HTTP_CODE[\"list\"][\"potentially_up\"],\n ]:\n # We loop through the list of http status codes.\n\n # We extend the list of valid codes with the currently read\n # codes.\n list_of_valid_http_code.extend(codes)\n\n if http_code not in list_of_valid_http_code or http_code is None:\n # * The extracted http code is not in the list of valid http codes.\n # or\n # * The extracted http code is equal to `None`.\n\n # We return 3 stars in order to mention that we were not able to extract\n # the http status code.\n return \"*\" * 3\n\n # * The extracted http code is in the list of valid http codes.\n # and\n # * The extracted http code is not equal to `None`.\n\n # We return the extracted http status code.\n return http_code\n\n # The http status code extraction is not activated.\n\n # We return None.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function 
for\nchecking the syntax of the given domain.", "response": "def syntax_check(domain): # pragma: no cover\n \"\"\"\n Check the syntax of the given domain.\n\n :param domain: The domain to check the syntax for.\n :type domain: str\n\n :return: The syntax validity.\n :rtype: bool\n\n .. warning::\n If an empty or a non-string :code:`domain` is given, we return :code:`None`.\n \"\"\"\n\n if domain and isinstance(domain, str):\n # * The given domain is not empty nor None.\n # and\n # * The given domain is a string.\n\n # We silently load the configuration.\n load_config(True)\n\n return Check(domain).is_domain_valid()\n\n # We return None, there is nothing to check.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ncheck if the given domain is a subdomain.", "response": "def is_subdomain(domain): # pragma: no cover\n \"\"\"\n Check if the given domain is a subdomain.\n\n :param domain: The domain we are checking.\n :type domain: str\n\n :return: The subdomain state.\n :rtype: bool\n\n .. warning::\n If an empty or a non-string :code:`domain` is given, we return :code:`None`.\n \"\"\"\n\n if domain and isinstance(domain, str):\n # * The given domain is not empty nor None.\n # and\n # * The given domain is a string.\n\n # We silently load the configuration.\n load_config(True)\n\n return Check(domain).is_subdomain()\n\n # We return None, there is nothing to check.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncheck the syntax of the given IPv4.", "response": "def ipv4_syntax_check(ip): # pragma: no cover\n \"\"\"\n Check the syntax of the given IPv4.\n\n :param ip: The IPv4 to check the syntax for.\n :type ip: str\n\n :return: The syntax validity.\n :rtype: bool\n\n .. 
warning::\n If an empty or a non-string :code:`ip` is given, we return :code:`None`.\n \"\"\"\n\n if ip and isinstance(ip, str):\n # The given IP is not empty nor None.\n # and\n # * The given IP is a string.\n\n # We silently load the configuration.\n load_config(True)\n\n return Check(ip).is_ip_valid()\n\n # We return None, there is nothing to check.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nchecks if the given IP is an IPv4 range.", "response": "def is_ipv4_range(ip): # pragma: no cover\n \"\"\"\n Check if the given IP is an IP range.\n\n :param ip: The IP we are checking.\n :type ip: str\n\n :return: The IPv4 range state.\n :rtype: bool\n\n .. warning::\n If an empty or a non-string :code:`ip` is given, we return :code:`None`.\n \"\"\"\n\n if ip and isinstance(ip, str):\n # The given IP is not empty nor None.\n # and\n # * The given IP is a string.\n\n # We silently load the configuration.\n load_config(True)\n\n return Check(ip).is_ip_range()\n\n # We return None, there is nothing to check.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking the syntax of the given URL.", "response": "def url_syntax_check(url): # pragma: no cover\n \"\"\"\n Check the syntax of the given URL.\n\n :param url: The URL to check the syntax for.\n :type url: str\n\n :return: The syntax validity.\n :rtype: bool\n\n .. 
warning::\n If an empty or a non-string :code:`url` is given, we return :code:`None`.\n \"\"\"\n\n if url and isinstance(url, str):\n # The given URL is not empty nor None.\n # and\n # * The given URL is a string.\n\n # We silently load the configuration.\n load_config(True)\n\n return Check(url).is_url_valid()\n\n # We return None, there is nothing to check.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nloading the configuration file and update the index.", "response": "def load_config(under_test=False, custom=None): # pragma: no cover\n \"\"\"\n Load the configuration.\n\n :param under_test:\n Tell us if we only have to load the configuration file (True)\n or load the configuration file and initate the output directory\n if it does not exist (False).\n :type under_test: bool\n\n :param custom:\n A dict with the configuration index (from .PyFunceble.yaml) to update.\n :type custom: dict\n\n .. warning::\n If :code:`custom` is given, the given :code:`dict` overwrite\n the last value of the given configuration indexes.\n \"\"\"\n\n if \"config_loaded\" not in INTERN:\n # The configuration was not already loaded.\n\n # We load and download the different configuration file if they are non\n # existant.\n Load(CURRENT_DIRECTORY)\n\n if not under_test:\n # If we are not under test which means that we want to save informations,\n # we initiate the directory structure.\n DirectoryStructure()\n\n # We save that the configuration was loaded.\n INTERN.update({\"config_loaded\": True})\n\n if custom and isinstance(custom, dict):\n # The given configuration is not None or empty.\n # and\n # It is a dict.\n\n # We update the configuration index.\n CONFIGURATION.update(custom)"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef stay_safe(): # pragma: no cover\n\n random = int(choice(str(int(time()))))\n\n if not CONFIGURATION[\"quiet\"] and random % 3 == 0:\n print(\"\\n\" + Fore.GREEN 
+ Style.BRIGHT + \"Thanks for using PyFunceble!\")\n print(\n Fore.YELLOW\n + Style.BRIGHT\n + \"Share your experience on \"\n + Fore.CYAN\n + \"Twitter\"\n + Fore.YELLOW\n + \" with \"\n + Fore.CYAN\n + \"#PyFunceble\"\n + Fore.YELLOW\n + \"!\"\n )\n print(\n Fore.GREEN\n + Style.BRIGHT\n + \"Have a feedback, an issue or an improvement idea ?\"\n )\n print(\n Fore.YELLOW\n + Style.BRIGHT\n + \"Let us know on \"\n + Fore.CYAN\n + \"GitHub\"\n + Fore.YELLOW\n + \"!\"\n )", "response": "Print a friendly message."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _command_line(): # pragma: no cover pylint: disable=too-many-branches,too-many-statements\n\n if __name__ == \"PyFunceble\":\n # We initiate the end of the coloration at the end of each line.\n initiate(autoreset=True)\n\n # We load the configuration and the directory structure.\n load_config(True)\n try:\n # The following handle the command line argument.\n\n try:\n PARSER = argparse.ArgumentParser(\n epilog=\"Crafted with %s by %s\"\n % (\n Fore.RED + \"\u2665\" + Fore.RESET,\n Style.BRIGHT\n + Fore.CYAN\n + \"Nissar Chababy (Funilrys) \"\n + Style.RESET_ALL\n + \"with the help of \"\n + Style.BRIGHT\n + Fore.GREEN\n + \"https://pyfunceble.rtfd.io/en/master/contributors.html \"\n + Style.RESET_ALL\n + \"&& \"\n + Style.BRIGHT\n + Fore.GREEN\n + \"https://pyfunceble.rtfd.io/en/master/special-thanks.html\",\n ),\n add_help=False,\n )\n\n CURRENT_VALUE_FORMAT = (\n Fore.YELLOW + Style.BRIGHT + \"Configured value: \" + Fore.BLUE\n )\n\n PARSER.add_argument(\n \"-ad\",\n \"--adblock\",\n action=\"store_true\",\n help=\"Switch the decoding of the adblock format. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"adblock\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-a\",\n \"--all\",\n action=\"store_false\",\n help=\"Output all available information on the screen. 
%s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"less\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"\" \"-c\",\n \"--auto-continue\",\n \"--continue\",\n action=\"store_true\",\n help=\"Switch the value of the auto continue mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"auto_continue\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--autosave-minutes\",\n type=int,\n help=\"Update the minimum of minutes before we start \"\n \"committing to upstream under Travis CI. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"travis_autosave_minutes\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--clean\", action=\"store_true\", help=\"Clean all files under output.\"\n )\n\n PARSER.add_argument(\n \"--clean-all\",\n action=\"store_true\",\n help=\"Clean all files under output and all file generated by PyFunceble.\",\n )\n\n PARSER.add_argument(\n \"--cmd\",\n type=str,\n help=\"Pass a command to run before each commit \"\n \"(except the final one) under the Travis mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"command_before_end\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--cmd-before-end\",\n type=str,\n help=\"Pass a command to run before the results \"\n \"(final) commit under the Travis mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"command_before_end\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--commit-autosave-message\",\n type=str,\n help=\"Replace the default autosave commit message. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"travis_autosave_commit\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--commit-results-message\",\n type=str,\n help=\"Replace the default results (final) commit message. 
%s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"travis_autosave_final_commit\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-d\", \"--domain\", type=str, help=\"Set and test the given domain.\"\n )\n\n PARSER.add_argument(\n \"-db\",\n \"--database\",\n action=\"store_true\",\n help=\"Switch the value of the usage of a database to store \"\n \"inactive domains of the currently tested list. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"inactive_database\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-dbr\",\n \"--days-between-db-retest\",\n type=int,\n help=\"Set the numbers of days between each retest of domains present \"\n \"into inactive-db.json. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"days_between_db_retest\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--debug\",\n action=\"store_true\",\n help=\"Switch the value of the debug mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"debug\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--directory-structure\",\n action=\"store_true\",\n help=\"Generate the directory and files that are needed and which does \"\n \"not exist in the current directory.\",\n )\n\n PARSER.add_argument(\n \"-ex\",\n \"--execution\",\n action=\"store_true\",\n help=\"Switch the default value of the execution time showing. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"show_execution_time\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-f\",\n \"--file\",\n type=str,\n help=\"Read the given file and test all domains inside it. 
\"\n \"If a URL is given we download and test the content of the given URL.\", # pylint: disable=line-too-long\n )\n\n PARSER.add_argument(\n \"--filter\", type=str, help=\"Domain to filter (regex).\"\n )\n\n PARSER.add_argument(\n \"--help\",\n action=\"help\",\n default=argparse.SUPPRESS,\n help=\"Show this help message and exit.\",\n )\n\n PARSER.add_argument(\n \"--hierarchical\",\n action=\"store_true\",\n help=\"Switch the value of the hierarchical sorting of the tested file. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"hierarchical_sorting\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-h\",\n \"--host\",\n action=\"store_true\",\n help=\"Switch the value of the generation of hosts file. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"generate_hosts\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--http\",\n action=\"store_true\",\n help=\"Switch the value of the usage of HTTP code. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(HTTP_CODE[\"active\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--iana\",\n action=\"store_true\",\n help=\"Update/Generate `iana-domains-db.json`.\",\n )\n\n PARSER.add_argument(\n \"--idna\",\n action=\"store_true\",\n help=\"Switch the value of the IDNA conversion. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"idna_conversion\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-ip\",\n type=str,\n help=\"Change the IP to print in the hosts files with the given one. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"custom_ip\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--json\",\n action=\"store_true\",\n help=\"Switch the value of the generation \"\n \"of the JSON formatted list of domains. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"generate_json\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--less\",\n action=\"store_true\",\n help=\"Output less informations on screen. 
%s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(Core.switch(\"less\"))\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--local\",\n action=\"store_true\",\n help=\"Switch the value of the local network testing. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(Core.switch(\"local\"))\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--link\", type=str, help=\"Download and test the given file.\"\n )\n\n PARSER.add_argument(\n \"-m\",\n \"--mining\",\n action=\"store_true\",\n help=\"Switch the value of the mining subsystem usage. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"mining\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-n\",\n \"--no-files\",\n action=\"store_true\",\n help=\"Switch the value of the production of output files. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"no_files\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-nl\",\n \"--no-logs\",\n action=\"store_true\",\n help=\"Switch the value of the production of logs files \"\n \"in the case we encounter some errors. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(not CONFIGURATION[\"logs\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-ns\",\n \"--no-special\",\n action=\"store_true\",\n help=\"Switch the value of the usage of the SPECIAL rules. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"no_special\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-nu\",\n \"--no-unified\",\n action=\"store_true\",\n help=\"Switch the value of the production unified logs \"\n \"under the output directory. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"unified\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-nw\",\n \"--no-whois\",\n action=\"store_true\",\n help=\"Switch the value the usage of whois to test domain's status. 
%s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"no_whois\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-p\",\n \"--percentage\",\n action=\"store_true\",\n help=\"Switch the value of the percentage output mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"show_percentage\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--plain\",\n action=\"store_true\",\n help=\"Switch the value of the generation \"\n \"of the plain list of domains. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"plain_list_domain\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--production\",\n action=\"store_true\",\n help=\"Prepare the repository for production.\",\n )\n\n PARSER.add_argument(\n \"-psl\",\n \"--public-suffix\",\n action=\"store_true\",\n help=\"Update/Generate `public-suffix.json`.\",\n )\n\n PARSER.add_argument(\n \"-q\",\n \"--quiet\",\n action=\"store_true\",\n help=\"Run the script in quiet mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"quiet\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--share-logs\",\n action=\"store_true\",\n help=\"Switch the value of the sharing of logs. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"share_logs\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-s\",\n \"--simple\",\n action=\"store_true\",\n help=\"Switch the value of the simple output mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"simple\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--split\",\n action=\"store_true\",\n help=\"Switch the value of the split of the generated output files. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"inactive_database\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--syntax\",\n action=\"store_true\",\n help=\"Switch the value of the syntax test mode. 
%s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"syntax\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-t\",\n \"--timeout\",\n type=int,\n default=3,\n help=\"Switch the value of the timeout. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"seconds_before_http_timeout\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--travis\",\n action=\"store_true\",\n help=\"Switch the value of the Travis mode. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"travis\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"--travis-branch\",\n type=str,\n default=\"master\",\n help=\"Switch the branch name where we are going to push. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"travis_branch\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-u\", \"--url\", type=str, help=\"Analyze the given URL.\"\n )\n\n PARSER.add_argument(\n \"-uf\",\n \"--url-file\",\n type=str,\n help=\"Read and test the list of URL of the given file. \"\n \"If a URL is given we download and test the content of the given URL.\", # pylint: disable=line-too-long\n )\n\n PARSER.add_argument(\n \"-ua\",\n \"--user-agent\",\n type=str,\n help=\"Set the user-agent to use and set every time we \"\n \"interact with everything which is not our logs sharing system.\", # pylint: disable=line-too-long\n )\n\n PARSER.add_argument(\n \"-v\",\n \"--version\",\n help=\"Show the version of PyFunceble and exit.\",\n action=\"version\",\n version=\"%(prog)s \" + VERSION,\n )\n\n PARSER.add_argument(\n \"-vsc\",\n \"--verify-ssl-certificate\",\n action=\"store_true\",\n help=\"Switch the value of the verification of the \"\n \"SSL/TLS certificate when testing for URL. 
%s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"verify_ssl_certificate\"])\n + Style.RESET_ALL\n ),\n )\n\n PARSER.add_argument(\n \"-wdb\",\n \"--whois-database\",\n action=\"store_true\",\n help=\"Switch the value of the usage of a database to store \"\n \"whois data in order to avoid whois servers rate limit. %s\"\n % (\n CURRENT_VALUE_FORMAT\n + repr(CONFIGURATION[\"whois_database\"])\n + Style.RESET_ALL\n ),\n )\n\n ARGS = PARSER.parse_args()\n\n if ARGS.less:\n CONFIGURATION.update({\"less\": ARGS.less})\n elif not ARGS.all:\n CONFIGURATION.update({\"less\": ARGS.all})\n\n if ARGS.adblock:\n CONFIGURATION.update({\"adblock\": Core.switch(\"adblock\")})\n\n if ARGS.auto_continue:\n CONFIGURATION.update(\n {\"auto_continue\": Core.switch(\"auto_continue\")}\n )\n\n if ARGS.autosave_minutes:\n CONFIGURATION.update(\n {\"travis_autosave_minutes\": ARGS.autosave_minutes}\n )\n\n if ARGS.clean:\n Clean(None)\n\n if ARGS.clean_all:\n Clean(None, ARGS.clean_all)\n\n if ARGS.cmd:\n CONFIGURATION.update({\"command\": ARGS.cmd})\n\n if ARGS.cmd_before_end:\n CONFIGURATION.update({\"command_before_end\": ARGS.cmd_before_end})\n\n if ARGS.commit_autosave_message:\n CONFIGURATION.update(\n {\"travis_autosave_commit\": ARGS.commit_autosave_message}\n )\n\n if ARGS.commit_results_message:\n CONFIGURATION.update(\n {\"travis_autosave_final_commit\": ARGS.commit_results_message}\n )\n\n if ARGS.database:\n CONFIGURATION.update(\n {\"inactive_database\": Core.switch(\"inactive_database\")}\n )\n\n if ARGS.days_between_db_retest:\n CONFIGURATION.update(\n {\"days_between_db_retest\": ARGS.days_between_db_retest}\n )\n\n if ARGS.debug:\n CONFIGURATION.update({\"debug\": Core.switch(\"debug\")})\n\n if ARGS.directory_structure:\n DirectoryStructure()\n\n if ARGS.execution:\n CONFIGURATION.update(\n {\"show_execution_time\": Core.switch(\"show_execution_time\")}\n )\n\n if ARGS.filter:\n CONFIGURATION.update({\"filter\": ARGS.filter})\n\n if ARGS.hierarchical:\n 
CONFIGURATION.update(\n {\"hierarchical_sorting\": Core.switch(\"hierarchical_sorting\")}\n )\n\n if ARGS.host:\n CONFIGURATION.update(\n {\"generate_hosts\": Core.switch(\"generate_hosts\")}\n )\n\n if ARGS.http:\n HTTP_CODE.update({\"active\": Core.switch(HTTP_CODE[\"active\"], True)})\n\n if ARGS.iana:\n IANA().update()\n\n if ARGS.idna:\n CONFIGURATION.update(\n {\"idna_conversion\": Core.switch(\"idna_conversion\")}\n )\n\n if ARGS.ip:\n CONFIGURATION.update({\"custom_ip\": ARGS.ip})\n\n if ARGS.json:\n CONFIGURATION.update(\n {\"generate_json\": Core.switch(\"generate_json\")}\n )\n\n if ARGS.local:\n CONFIGURATION.update({\"local\": Core.switch(\"local\")})\n\n if ARGS.mining:\n CONFIGURATION.update({\"mining\": Core.switch(\"mining\")})\n\n if ARGS.no_files:\n CONFIGURATION.update({\"no_files\": Core.switch(\"no_files\")})\n\n if ARGS.no_logs:\n CONFIGURATION.update({\"logs\": Core.switch(\"logs\")})\n\n if ARGS.no_special:\n CONFIGURATION.update({\"no_special\": Core.switch(\"no_special\")})\n\n if ARGS.no_unified:\n CONFIGURATION.update({\"unified\": Core.switch(\"unified\")})\n\n if ARGS.no_whois:\n CONFIGURATION.update({\"no_whois\": Core.switch(\"no_whois\")})\n\n if ARGS.percentage:\n CONFIGURATION.update(\n {\"show_percentage\": Core.switch(\"show_percentage\")}\n )\n\n if ARGS.plain:\n CONFIGURATION.update(\n {\"plain_list_domain\": Core.switch(\"plain_list_domain\")}\n )\n\n if ARGS.production:\n Production()\n\n if ARGS.public_suffix:\n PublicSuffix().update()\n\n if ARGS.quiet:\n CONFIGURATION.update({\"quiet\": Core.switch(\"quiet\")})\n\n if ARGS.share_logs:\n CONFIGURATION.update({\"share_logs\": Core.switch(\"share_logs\")})\n\n if ARGS.simple:\n CONFIGURATION.update(\n {\"simple\": Core.switch(\"simple\"), \"quiet\": Core.switch(\"quiet\")}\n )\n\n if ARGS.split:\n CONFIGURATION.update({\"split\": Core.switch(\"split\")})\n\n if ARGS.syntax:\n CONFIGURATION.update({\"syntax\": Core.switch(\"syntax\")})\n\n if ARGS.timeout and ARGS.timeout % 
3 == 0:\n CONFIGURATION.update({\"seconds_before_http_timeout\": ARGS.timeout})\n\n if ARGS.travis:\n CONFIGURATION.update({\"travis\": Core.switch(\"travis\")})\n\n if ARGS.travis_branch:\n CONFIGURATION.update({\"travis_branch\": ARGS.travis_branch})\n\n if ARGS.user_agent:\n CONFIGURATION.update({\"user_agent\": ARGS.user_agent})\n\n if ARGS.verify_ssl_certificate:\n CONFIGURATION.update(\n {\"verify_ssl_certificate\": ARGS.verify_ssl_certificate}\n )\n\n if ARGS.whois_database:\n CONFIGURATION.update(\n {\"whois_database\": Core.switch(\"whois_database\")}\n )\n\n if not CONFIGURATION[\"quiet\"]:\n Core.colorify_logo(home=True)\n\n # We compare the versions (upstream and local) and in between.\n Version().compare()\n\n # We call our Core which will handle all case depending of the configuration or\n # the used command line arguments.\n Core(\n domain_or_ip_to_test=ARGS.domain,\n file_path=ARGS.file,\n url_to_test=ARGS.url,\n url_file=ARGS.url_file,\n link_to_test=ARGS.link,\n )\n except KeyError as e:\n if not Version(True).is_cloned():\n # We are not into the cloned version.\n\n # We merge the local with the upstream configuration.\n Merge(CURRENT_DIRECTORY)\n else:\n # We are in the cloned version.\n\n # We raise the exception.\n #\n # Note: The purpose of this is to avoid having\n # to search for a mistake while developing.\n raise e\n except KeyboardInterrupt:\n stay_safe()", "response": "This function is called by the command line interface."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nchecking if the given string is a URL and download the content of the link and update the location of the file to test.", "response": "def _entry_management_url_download(self, passed):\n \"\"\"\n Check if the given information is a URL.\n If it is the case, it download and update the location of file to test.\n\n :param passed: The url passed to the system.\n :type passed: str\n\n :return: The state of the check.\n :rtype: bool\n \"\"\"\n\n if 
passed and self.checker.is_url_valid(passed):\n # The passed string is an URL.\n\n # We get the file name based on the URL.\n # We actually just get the string after the last `/` in the URL.\n file_to_test = passed.split(\"/\")[-1]\n\n if (\n not PyFunceble.path.isfile(file_to_test)\n or PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] == 0\n ):\n # The filename does not exist in the current directory\n # or the currently number of tested is equal to 0.\n\n # We download the content of the link.\n Download(passed, file_to_test).text()\n\n # The files does exist or the currently number of tested is greater than\n # 0.\n\n # We initiate the file we have to test.\n PyFunceble.INTERN[\"file_to_test\"] = file_to_test\n\n # We return true to say that everything goes right.\n return True\n\n # The passed string is not an URL.\n\n # We do not need to do anything else.\n return False"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _entry_management_url(self):\n\n if (\n self.url_file # pylint: disable=no-member\n and not self._entry_management_url_download(\n self.url_file # pylint: disable=no-member\n )\n ): # pylint: disable=no-member\n # The current url_file is not a URL.\n\n # We initiate the filename as the file we have to test.\n PyFunceble.INTERN[\n \"file_to_test\"\n ] = self.url_file", "response": "Manage the loading of the url system."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\navoids to have 1 millions line into self.__init__()", "response": "def _entry_management(self): # pylint: disable=too-many-branches\n \"\"\"\n Avoid to have 1 millions line into self.__init__()\n \"\"\"\n\n if not self.modulo_test: # pylint: disable=no-member\n # We are not in a module usage.\n\n # We set the file_path as the file we have to test.\n PyFunceble.INTERN[\n \"file_to_test\"\n ] = self.file_path # pylint: disable=no-member\n\n # We check if the given file_path is an 
url.\n # If it is an URL we update the file to test and download\n # the given URL.\n self._entry_management_url()\n\n # We fix the environnement permissions.\n AutoSave().travis_permissions()\n\n # We check if we need to bypass the execution of PyFunceble.\n self.bypass()\n\n # We set the start time.\n ExecutionTime(\"start\")\n\n if PyFunceble.CONFIGURATION[\"syntax\"]:\n # We are checking for syntax.\n\n # We deactivate the http status code.\n PyFunceble.HTTP_CODE[\"active\"] = False\n\n if self.domain_or_ip_to_test: # pylint: disable=no-member\n # The given domain is not empty or None.\n\n # We initiate a variable which will tell the system the type\n # of the tested element.\n PyFunceble.INTERN[\"to_test_type\"] = \"domain\"\n\n # We set the start time.\n ExecutionTime(\"start\")\n\n # We deactivate the showing of percentage as we are in a single\n # test run.\n PyFunceble.CONFIGURATION[\"show_percentage\"] = False\n\n # We deactivate the whois database as it is not needed.\n PyFunceble.CONFIGURATION[\"whois_database\"] = False\n\n if PyFunceble.CONFIGURATION[\"idna_conversion\"]:\n domain_or_ip_to_test = domain2idna(\n self.domain_or_ip_to_test.lower() # pylint: disable=no-member\n )\n else:\n domain_or_ip_to_test = (\n self.domain_or_ip_to_test.lower() # pylint: disable=no-member\n ) # pylint: disable=no-member\n\n # We test the domain after converting it to lower case.\n self.domain(domain_or_ip_to_test)\n elif self.url_to_test and not self.file_path: # pylint: disable=no-member\n # An url to test is given and the file path is empty.\n\n # We initiate a variable which will tell the system the type\n # of the tested element.\n PyFunceble.INTERN[\"to_test_type\"] = \"url\"\n\n # We set the start time.\n ExecutionTime(\"start\")\n\n # We deactivate the showing of percentage as we are in a single\n # test run.\n PyFunceble.CONFIGURATION[\"show_percentage\"] = False\n\n # We test the url to test after converting it if needed (IDNA).\n self.url(\n 
self.checker.is_url_valid(\n self.url_to_test, # pylint: disable=no-member\n return_formatted=True,\n )\n )\n elif (\n self._entry_management_url_download(\n self.url_file # pylint: disable=no-member\n )\n or self.url_file # pylint: disable=no-member\n ):\n # * A file full of URL is given.\n # or\n # * the given file full of URL is a URL.\n\n # * We deactivate the whois subsystem as it is not needed for url testing.\n # * We activate the generation of plain list element.\n # * We activate the generation of splited data instead of unified data.\n PyFunceble.CONFIGURATION[\"no_whois\"] = PyFunceble.CONFIGURATION[\n \"plain_list_domain\"\n ] = PyFunceble.CONFIGURATION[\"split\"] = True\n\n # We deactivate the generation of hosts file as it is not relevant for\n # url testing.\n PyFunceble.CONFIGURATION[\"generate_hosts\"] = False\n\n # We initiate a variable which will tell the system the type\n # of the tested element.\n PyFunceble.INTERN[\"to_test_type\"] = \"url\"\n\n # And we test the given or the downloaded file.\n self.file_url()\n elif (\n self._entry_management_url_download(\n self.link_to_test # pylint: disable=no-member\n )\n or self._entry_management_url_download(\n self.file_path # pylint: disable=no-member\n ) # pylint: disable=no-member\n or self.file_path # pylint: disable=no-member\n ):\n # * A file path is given.\n # or\n # * The given file path is an URL.\n # or\n # * A link to test is given.\n\n # We initiate a variable which will tell the system the type\n # of the tested element.\n PyFunceble.INTERN[\"to_test_type\"] = \"domain\"\n\n # We test the given or the downloaded file.\n self.file()\n else:\n # No file, domain, single url or file or url is given.\n\n # We print a message on screen.\n print(\n PyFunceble.Fore.CYAN + PyFunceble.Style.BRIGHT + \"Nothing to test.\"\n )\n\n if (\n self.domain_or_ip_to_test # pylint: disable=no-member\n or self.url_to_test # pylint: disable=no-member\n ):\n # We are testing a domain.\n\n # We stop and log the 
execution time.\n ExecutionTime(\"stop\", last=True)\n\n # We log the current percentage state.\n self.percentage.log()\n\n # We show the colored logo.\n self.colorify_logo()\n\n # We print our friendly message :)\n PyFunceble.stay_safe()\n else:\n # We are used as an imported module.\n\n # * We activate the simple mode as the table or any full\n # details on screen are irrelevant.\n # * We activate the quiet mode.\n # And we deactivate the generation of files.\n PyFunceble.CONFIGURATION[\"simple\"] = PyFunceble.CONFIGURATION[\n \"quiet\"\n ] = PyFunceble.CONFIGURATION[\"no_files\"] = True\n\n # * We deactivate the whois database as it is not needed.\n # * We deactivate the database as it is not needed.\n # * We deactivate the autocontinue subsystem as it is not needed.\n # * We deactivate the execution time subsystem as it is not needed.\n PyFunceble.CONFIGURATION[\"whois_database\"] = PyFunceble.CONFIGURATION[\n \"inactive_database\"\n ] = PyFunceble.CONFIGURATION[\"auto_continue\"] = PyFunceble.CONFIGURATION[\n \"show_execution_time\"\n ] = False\n\n if self.domain_or_ip_to_test: # pylint: disable=no-member\n # A domain is given.\n\n # We initiate a variable which will tell the system the type\n # of the tested element.\n PyFunceble.INTERN[\"to_test_type\"] = \"domain\"\n\n # We set the domain to test.\n PyFunceble.INTERN[\n \"to_test\"\n ] = self.domain_or_ip_to_test.lower() # pylint: disable=no-member\n elif self.url_to_test: # pylint: disable=no-member\n # A url is given,\n\n # We initiate a variable which will tell the system the type\n # of the tested element.\n PyFunceble.INTERN[\"to_test_type\"] = \"url\"\n\n # We set the url to test.\n PyFunceble.INTERN[\n \"to_test\"\n ] = self.url_to_test"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nexiting the script if :code:`[PyFunceble skip]` is matched into the latest commit message.", "response": "def bypass(cls):\n \"\"\"\n Exit the script if :code:`[PyFunceble skip]` is 
matched into the latest\n commit message.\n \"\"\"\n\n # We set the regex to match in order to bypass the execution of\n # PyFunceble.\n regex_bypass = r\"\\[PyFunceble\\sskip\\]\"\n\n if (\n PyFunceble.CONFIGURATION[\"travis\"]\n and Regex(\n Command(\"git log -1\").execute(), regex_bypass, return_data=False\n ).match()\n ):\n # * We are under Travis CI.\n # and\n # * The bypass marker is matched into the latest commit.\n\n # We save everything and stop PyFunceble.\n AutoSave(True, is_bypass=True)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nprint the header of the current class.", "response": "def _print_header(cls):\n \"\"\"\n Decide if we print or not the header.\n \"\"\"\n\n if (\n not PyFunceble.CONFIGURATION[\"quiet\"]\n and not PyFunceble.CONFIGURATION[\"header_printed\"]\n ):\n # * The quiet mode is not activated.\n # and\n # * The header has not been already printed.\n\n # We print a new line.\n print(\"\\n\")\n\n if PyFunceble.CONFIGURATION[\"less\"]:\n # We have to show less informations on screen.\n\n # We print the `Less` header.\n Prints(None, \"Less\").header()\n else:\n # We have to show every informations on screen.\n\n # We print the `Generic` header.\n Prints(None, \"Generic\").header()\n\n # The header was printed.\n\n # We initiate the variable which say that the header has been printed to True.\n PyFunceble.CONFIGURATION[\"header_printed\"] = True"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nmanaging the database, autosave and autocontinue systems for the case that we are reading a file. :param current: The currently tested element. :type current: str :param last: The last element of the list. :type last: str :param status: The status of the currently tested element. 
:type status: str", "response": "def _file_decision(self, current, last, status=None):\n \"\"\"\n Manage the database, autosave and autocontinue systems for the case that we are reading\n a file.\n\n :param current: The currently tested element.\n :type current: str\n\n :param last: The last element of the list.\n :type last: str\n\n :param status: The status of the currently tested element.\n :type status: str\n \"\"\"\n\n if (\n status\n and not PyFunceble.CONFIGURATION[\"simple\"]\n and PyFunceble.INTERN[\"file_to_test\"]\n ):\n # * The status is given.\n # and\n # * The simple mode is deactivated.\n # and\n # * A file to test is set.\n\n # We run the mining logic.\n self.mining.process()\n\n # We delete the currently tested element from the mining\n # database.\n # Indeed, as it is tested, it is already in our\n # testing process which means that we don't need it into\n # the mining database.\n self.mining.remove()\n\n if (\n status.lower() in PyFunceble.STATUS[\"list\"][\"up\"]\n or status.lower() in PyFunceble.STATUS[\"list\"][\"valid\"]\n ):\n # The status is in the list of up status.\n\n if self.inactive_database.is_present():\n # The currently tested element is in the database.\n\n # We generate the suspicious file(s).\n Generate(PyFunceble.STATUS[\"official\"][\"up\"]).analytic_file(\n \"suspicious\"\n )\n\n # We remove the currently tested element from the\n # database.\n self.inactive_database.remove()\n\n else:\n # The status is not in the list of up status.\n\n # We add the currently tested element to the\n # database.\n self.inactive_database.add()\n\n # We backup the current state of the file reading\n # for the case that we need to continue later.\n self.auto_continue.backup()\n\n if current != last:\n # The current element is not the last one.\n\n # We run the autosave logic.\n AutoSave()\n else:\n # The current element is the last one.\n\n # We stop and log the execution time.\n ExecutionTime(\"stop\", last=True)\n\n # We show/log the 
percentage.\n self.percentage.log()\n\n # We reset the counters as we end the process.\n self.reset_counters()\n\n # We backup the current state of the file reading\n # for the case that we need to continue later.\n self.auto_continue.backup()\n\n # We show the colored logo.\n self.colorify_logo()\n\n # We save and stop the script if we are under\n # Travis CI.\n AutoSave(True)\n\n for index in [\"http_code\", \"referer\"]:\n # We loop through some configuration index we have to empty.\n\n if index in PyFunceble.INTERN:\n # The index is in the configuration.\n\n # We empty the configuration index.\n PyFunceble.INTERN[index] = \"\""} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef domain(self, domain=None, last_domain=None):\n\n # We print the header.\n self._print_header()\n\n if domain:\n # A domain is given.\n\n # We format and set the domain we are testing and treating.\n PyFunceble.INTERN[\"to_test\"] = self._format_domain(domain)\n else:\n # A domain is not given.\n\n # We set the domain we are testing and treating to None.\n PyFunceble.INTERN[\"to_test\"] = None\n\n if PyFunceble.INTERN[\"to_test\"]:\n # The domain is given (Not None).\n\n if PyFunceble.CONFIGURATION[\"syntax\"]:\n # The syntax mode is activated.\n\n # We get the status from Syntax.\n status = self.syntax_status.get()\n else:\n # We test and get the status of the domain.\n status, _ = self.status.get()\n\n # We run the file decision logic.\n self._file_decision(PyFunceble.INTERN[\"to_test\"], last_domain, status)\n\n if PyFunceble.CONFIGURATION[\"simple\"]:\n # The simple mode is activated.\n\n # We print the domain and the status.\n print(PyFunceble.INTERN[\"to_test\"], status)\n\n # We return the tested domain and its status.\n return PyFunceble.INTERN[\"to_test\"], status\n\n # We return None, there is nothing to test.\n return None", "response": "Tests if a domain is in the list of domains."} {"SOURCE": 
"codesearchnet", "instruction": "Write a Python 3 function that can\ntest if a given url is in the list of URLs.", "response": "def url(self, url_to_test=None, last_url=None):\n \"\"\"\n Manage the case that we want to test only a given url.\n\n :param url_to_test: The url to test.\n :type url_to_test: str\n\n :param last_url:\n The last url of the file we are testing\n (if exist)\n :type last_url: str\n \"\"\"\n\n # We print the header.\n self._print_header()\n\n if url_to_test:\n # An url to test is given.\n\n # We set the url we are going to test.\n PyFunceble.INTERN[\"to_test\"] = url_to_test\n else:\n # An URL to test is not given.\n\n # We set the url we are going to test to None.\n PyFunceble.INTERN[\"to_test\"] = None\n\n if PyFunceble.INTERN[\"to_test\"]:\n # An URL to test is given.\n\n if PyFunceble.CONFIGURATION[\"syntax\"]:\n # The syntax mode is activated.\n\n # We get the status from Syntax.\n status = self.syntax_status.get()\n else:\n # The syntax mode is not activated.\n\n # We get the status from URL.\n status = self.url_status.get()\n\n # We run the file decision logic.\n self._file_decision(PyFunceble.INTERN[\"to_test\"], last_url, status)\n\n if PyFunceble.CONFIGURATION[\"simple\"]:\n # The simple mode is activated.\n\n # We print the URL informations.\n print(PyFunceble.INTERN[\"to_test\"], status)\n\n # We return the URL we tested and its status.\n return PyFunceble.INTERN[\"to_test\"], status\n\n # We return None, there is nothing to test.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef colorify_logo(cls, home=False):\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated.\n\n to_print = []\n\n if home:\n # We have to print the initial logo.\n\n for line in PyFunceble.ASCII_PYFUNCEBLE.split(\"\\n\"):\n # We loop through each lines of the ASCII representation\n # of PyFunceble.\n\n # And we append to the data to print the currently read\n 
# line with the right coloration.\n to_print.append(\n PyFunceble.Fore.YELLOW + line + PyFunceble.Fore.RESET\n )\n\n elif PyFunceble.INTERN[\"counter\"][\"percentage\"][\"up\"] >= 50:\n # The percentage of up is greater or equal to 50%.\n\n for line in PyFunceble.ASCII_PYFUNCEBLE.split(\"\\n\"):\n # We loop through each lines of the ASCII representation\n # of PyFunceble.\n\n # And we append to the data to print the currently read\n # line with the right coloration.\n to_print.append(\n PyFunceble.Fore.GREEN + line + PyFunceble.Fore.RESET\n )\n else:\n # The percentage of up is less than 50%.\n\n for line in PyFunceble.ASCII_PYFUNCEBLE.split(\"\\n\"):\n # We loop through each lines of the ASCII representation\n # of PyFunceble.\n\n # And we append to the data to print the currently read\n # line with the right coloration.\n to_print.append(PyFunceble.Fore.RED + line + PyFunceble.Fore.RESET)\n\n print(\"\\n\".join(to_print))", "response": "Print the colored logo based on global results."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _format_domain(cls, extracted_domain):\n\n if not extracted_domain.startswith(\"#\"):\n # The line is not a commented line.\n\n if \"#\" in extracted_domain:\n # There is a comment at the end of the line.\n\n # We delete the comment from the line.\n extracted_domain = extracted_domain[\n : extracted_domain.find(\"#\")\n ].strip()\n\n if \" \" in extracted_domain or \"\\t\" in extracted_domain:\n # A space or a tabs is in the line.\n\n # We remove all whitestring from the extracted line.\n splited_line = extracted_domain.split()\n\n # As there was a space or a tab in the string, we consider\n # that we are working with the hosts file format which means\n # that the domain we have to test is after the first string.\n # So we set the index to 1.\n index = 1\n\n while index < len(splited_line):\n # We loop until the index is greater than the length of\n # the splited 
line.\n\n if splited_line[index]:\n # The element at the current index is not an empty string.\n\n # We break the loop.\n break\n\n # The element at the current index is an empty string.\n\n # We increase the index number.\n index += 1\n\n # We return the last read element.\n return splited_line[index]\n\n # We return the extracted line.\n return extracted_domain\n\n # The extracted line is a comment line.\n\n # We return an empty string as we do not want to work with commented line.\n return \"\"", "response": "Format the extracted domain before passing it to the system."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nextract all non commented lines from the given file.", "response": "def _extract_domain_from_file(cls):\n \"\"\"\n Extract all non commented lines from the file we are testing.\n\n :return: The elements to test.\n :rtype: list\n \"\"\"\n\n # We initiate the variable which will save what we are going to return.\n result = []\n\n if PyFunceble.path.isfile(PyFunceble.INTERN[\"file_to_test\"]):\n # The give file to test exist.\n\n try:\n with open(PyFunceble.INTERN[\"file_to_test\"]) as file:\n # We open and read the file.\n\n for line in file:\n # We loop through each lines.\n\n if not line.startswith(\"#\"):\n # The currently read line is not a commented line.\n\n # We append the current read line to the result.\n result.append(line.rstrip(\"\\n\").strip())\n except UnicodeDecodeError:\n with open(PyFunceble.INTERN[\"file_to_test\"], encoding=\"utf-8\") as file:\n # We open and read the file.\n\n for line in file:\n # We loop through each lines.\n\n if not line.startswith(\"#\"):\n # The currently read line is not a commented line.\n\n # We append the current read line to the result.\n result.append(line.rstrip(\"\\n\").strip())\n\n else:\n # The given file to test does not exist.\n\n # We raise a FileNotFoundError exception.\n raise FileNotFoundError(PyFunceble.INTERN[\"file_to_test\"])\n\n # We return the result.\n 
return result"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nmanages the case that need to test each domain of a given file path.", "response": "def file(self):\n \"\"\"\n Manage the case that need to test each domain of a given file path.\n\n .. note::\n 1 domain per line.\n \"\"\"\n\n # We get, format, filter, clean the list to test.\n list_to_test = self._file_list_to_test_filtering()\n\n if PyFunceble.CONFIGURATION[\"idna_conversion\"]:\n # We have to convert domains to idna.\n\n # We convert if we need to convert.\n list_to_test = domain2idna(list_to_test)\n\n if PyFunceble.CONFIGURATION[\"hierarchical_sorting\"]:\n # The hierarchical sorting is desired by the user.\n\n # We format the list.\n list_to_test = List(list_to_test).custom_format(Sort.hierarchical)\n else:\n # The hierarchical sorting is not desired by the user.\n\n # We format the list.\n list_to_test = List(list_to_test).custom_format(Sort.standard)\n\n # We initiate a local variable which will save the current state of the list.\n not_filtered = list_to_test\n\n try:\n # We remove the element which are in the database from the\n # current list to test.\n list_to_test = List(\n list(\n set(\n list_to_test[PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] :]\n )\n - set(PyFunceble.INTERN[\"flatten_inactive_db\"])\n )\n ).format()\n _ = list_to_test[-1]\n except IndexError:\n # Our list to test is the one with the element from the database.\n list_to_test = not_filtered[\n PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] :\n ]\n\n # We delete the undesired variable.\n del not_filtered\n\n if PyFunceble.CONFIGURATION[\"hierarchical_sorting\"]:\n # The hierarchical sorting is desired by the user.\n\n # We format the list.\n list_to_test = List(list(list_to_test)).custom_format(Sort.hierarchical)\n\n try:\n # We test each element of the list to test.\n return [self.domain(x, list_to_test[-1]) for x in list_to_test if x]\n except 
IndexError:\n # We print a message on screen.\n print(PyFunceble.Fore.CYAN + PyFunceble.Style.BRIGHT + \"Nothing to test.\")"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning a list of file URLs for the current file.", "response": "def file_url(self):\n \"\"\"\n Manage the case that we have to test a file\n\n .. note::\n 1 URL per line.\n \"\"\"\n\n # We get, format, clean the list of URL to test.\n list_to_test = self._file_list_to_test_filtering()\n\n # We initiate a local variable which will save the current state of the list.\n not_filtered = list_to_test\n\n try:\n # We remove the element which are in the database from the\n # current list to test.\n list_to_test = List(\n list(\n set(\n list_to_test[PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] :]\n )\n - set(PyFunceble.INTERN[\"flatten_inactive_db\"])\n )\n ).format()\n _ = list_to_test[-1]\n except IndexError:\n # Our list to test is the one with the element from the database.\n list_to_test = not_filtered[\n PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] :\n ]\n\n # We delete the undesired variable.\n del not_filtered\n\n if PyFunceble.CONFIGURATION[\"hierarchical_sorting\"]:\n # The hierarchical sorting is desired by the user.\n\n # We format the list.\n list_to_test = List(list(list_to_test)).custom_format(Sort.hierarchical)\n\n try:\n # We test each URL from the list to test.\n return [self.url(x, list_to_test[-1]) for x in list_to_test if x]\n except IndexError:\n # We print a message on screen.\n print(PyFunceble.Fore.CYAN + PyFunceble.Style.BRIGHT + \"Nothing to test.\")"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef switch(\n cls, variable, custom=False\n ): # pylint: disable=inconsistent-return-statements\n \"\"\"\n Switch PyFunceble.CONFIGURATION variables to their opposite.\n\n :param variable:\n The variable name to switch.\n The variable should be an index our configuration system.\n If we 
want to switch a bool variable, we should parse\n it here.\n :type variable: str|bool\n\n :param custom:\n Let us know if have to switch the parsed variable instead\n of our configuration index.\n :type custom: bool\n\n :return:\n The opposite of the configuration index or the given variable.\n :rtype: bool\n\n :raises:\n :code:`Exception`\n When the configuration is not valid. In other words,\n if the PyFunceble.CONFIGURATION[variable_name] is not a bool.\n \"\"\"\n\n if not custom:\n # We are not working with custom variable which is not into\n # the configuration.\n\n # We get the current state.\n current_state = dict.get(PyFunceble.CONFIGURATION, variable)\n else:\n # We are working with a custom variable which is not into the\n # configuration\n current_state = variable\n\n if isinstance(current_state, bool):\n # The current state is a boolean.\n\n if current_state:\n # The current state is equal to True.\n\n # We return False.\n return False\n\n # The current state is equal to False.\n\n # We return True.\n return True\n\n # The current state is not a boolean.\n\n # We set the message to raise.\n to_print = \"Impossible to switch %s. Please post an issue to %s\"\n\n # We raise an exception inviting the user to report an issue.\n raise Exception(\n to_print % (repr(variable), PyFunceble.LINKS[\"repo\"] + \"/issues.\")\n )", "response": "Switch PyFunceble. CONFIGURATION variables to their opposite."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nreturn the status while testing for an IP or domain.", "response": "def get(cls):\n \"\"\"\n Get the status while testing for an IP or domain.\n\n .. 
note::\n We consider that the domain or IP we are currently testing\n is into :code:`PyFunceble.INTERN[\"to_test\"]`.\n \"\"\"\n\n if \"to_test\" in PyFunceble.INTERN and PyFunceble.INTERN[\"to_test\"]:\n expiration_date = ExpirationDate().get()\n\n if expiration_date is False:\n return cls.handle(status=\"invalid\")\n\n if expiration_date == PyFunceble.STATUS[\"official\"][\"up\"]:\n return expiration_date, \"WHOIS\"\n\n return cls.handle(status=\"inactive\")\n\n raise NotImplementedError(\"We expect `INTERN['to_test']` to be set.\")"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nhandling the lack of WHOIS and expiration date.", "response": "def handle(cls, status, invalid_source=\"IANA\"):\n \"\"\"\n Handle the lack of WHOIS and expiration date. :smile_cat:\n\n :param matched_status: The status that we have to handle.\n :type status: str\n\n :param invalid_source:\n The source to set when we handle INVALID element.\n :type invalid_source: str\n\n :return:\n The strus of the domain after generating the files desired\n by the user.\n :rtype: str\n \"\"\"\n\n if status.lower() not in PyFunceble.STATUS[\"list\"][\"invalid\"]:\n # The matched status is not in the list of invalid status.\n\n # We initiate the source we are going to parse to the Generate class.\n source = \"NSLOOKUP\"\n\n if Lookup().nslookup():\n # We could execute the nslookup logic.\n\n # We get the status and source after extra rules check.\n status, source = cls.extra_rules.handle(\n PyFunceble.STATUS[\"official\"][\"up\"], source\n )\n\n # We generate the status files with the up status.\n Generate(status, source).status_file()\n\n # We return the up status.\n return status, source\n\n # We could not execute the nslookup logic.\n\n # We get the status and source after extra rules check.\n status, source = cls.extra_rules.handle(\n PyFunceble.STATUS[\"official\"][\"down\"], source\n )\n\n # We generate the status file with the down status.\n Generate(status, 
source).status_file()\n\n # We return the down status.\n return status, source\n\n # The matched status is in the list of invalid status.\n\n # We get the status and source after extra rules check.\n status, source = cls.extra_rules.handle(\n PyFunceble.STATUS[\"official\"][\"invalid\"], invalid_source\n )\n\n # We generate the status file with the invalid status.\n Generate(status, source).status_file()\n\n # We return the status.\n return status, source"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef handle(self):\n\n # We initiate the source we are going to parse to the Generate class.\n source = \"URL\"\n\n if self.catched.lower() not in PyFunceble.STATUS[\"list\"][\"invalid\"]:\n # The parsed status is not in the list of invalid.\n\n # We generate the status file with the catched status.\n Generate(self.catched, source).status_file()\n else:\n # The parsed status is in the list of invalid.\n\n # We generate the status file with the parsed status.\n Generate(self.catched, \"SYNTAX\").status_file()\n\n # We return the parsed status.\n return self.catched", "response": "Handle the backend of the given status."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef backup(self):\n\n # We set the current output directory path.\n output_path = self.base + PyFunceble.OUTPUTS[\"parent_directory\"]\n\n # We initiate the structure base.\n result = {PyFunceble.OUTPUTS[\"parent_directory\"]: {}}\n\n for root, _, files in PyFunceble.walk(output_path):\n # We loop through the current output directory structure.\n\n # We get the currently read directory name.\n directories = Directory(root.split(output_path)[1]).fix_path()\n\n # We initiate a local variable which will get the structure of the subdirectory.\n local_result = result[PyFunceble.OUTPUTS[\"parent_directory\"]]\n\n for file in files:\n # We loop through the list of files.\n\n # We construct the file path.\n 
file_path = root + PyFunceble.directory_separator + file\n\n # We get the hash of the file.\n file_hash = Hash(file_path, \"sha512\", True).get()\n\n # We convert the file content to a list.\n lines_in_list = [line.rstrip(\"\\n\") for line in open(file_path)]\n\n # We convert the file content into a more flat format.\n # We use `@@@` as glue and an implicit replacement for `\\n`.\n formatted_content = \"@@@\".join(lines_in_list)\n\n # We update the local result (and implicitly the global result)\n # with the files and directory information/structure.\n local_result = local_result.setdefault(\n directories,\n {file: {\"sha512\": file_hash, \"content\": formatted_content}},\n )\n\n # We finally save the directory structure into the production file.\n Dict(result).to_json(self.base + \"dir_structure_production.json\")", "response": "This function backs up the developer state of the output directory structure and the production file structure."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if we need to replace .gitignore with .
keep.", "response": "def _restore_replace(self):\n \"\"\"\n Check if we need to replace \".gitignore\" with \".keep\".\n\n :return: The replacement status.\n :rtype: bool\n \"\"\"\n\n if PyFunceble.path.isdir(self.base + \".git\"):\n # The `.git` directory exists.\n\n if \"PyFunceble\" not in Command(\"git remote show origin\").execute():\n # PyFunceble is not in the origin.\n\n # We return True.\n return True\n\n # We return False.\n return False\n\n # The `.git` directory does not exist.\n\n # We return True.\n return True"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _update_structure_from_config(self, structure):\n\n # We initiate a variable which will map what we have to replace `output` to.\n # Indeed, as we allow the user to change directory names directly from the\n # configuration, here we initiate what we have to replace `output/` with.\n to_replace_base_map = {\"output/\": PyFunceble.OUTPUTS[\"parent_directory\"]}\n\n # We map the replacement of other directories.\n to_replace_map = {\n #########################################################################\n # The following part is there for historical reason. 
#\n #########################################################################\n # We get the replacement of the HTTP_Analytic directory from the\n # configuration file.\n \"HTTP_Analytic/\": PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"parent\"],\n # We get the replacement of the HTTP_Analytic/ACTIVE directory from the\n # configuration file.\n \"HTTP_Analytic/ACTIVE/\": PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\n \"parent\"\n ]\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"up\"],\n \"HTTP_Analytic/POTENTIALLY_ACTIVE/\": PyFunceble.OUTPUTS[\"analytic\"][\n \"directories\"\n ][\"parent\"]\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"potentially_up\"],\n # We get the replacement of the HTTP_Analytic/POTENTIALLY_INACTIVE directory\n # from the configuration file.\n \"HTTP_Analytic/POTENTIALLY_INACTIVE/\": PyFunceble.OUTPUTS[\"analytic\"][\n \"directories\"\n ][\"parent\"]\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"potentially_down\"],\n #########################################################################\n # The previous part is there for historical reason. 
#\n #########################################################################\n # We get the replacement of the Analytic directory from the\n # configuration file.\n \"Analytic/\": PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"parent\"],\n # We get the replacement of the Analytic/ACTIVE directory from the\n # configuration file.\n \"Analytic/ACTIVE/\": PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"parent\"]\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"up\"],\n \"Analytic/POTENTIALLY_ACTIVE/\": PyFunceble.OUTPUTS[\"analytic\"][\n \"directories\"\n ][\"parent\"]\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"potentially_up\"],\n # We get the replacement of the Analytic/POTENTIALLY_INACTIVE directory\n # from the configuration file.\n \"Analytic/POTENTIALLY_INACTIVE/\": PyFunceble.OUTPUTS[\"analytic\"][\n \"directories\"\n ][\"parent\"]\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"potentially_down\"],\n # We get the replacement of the Analytic/SUSPICIOUS directory\n # from the configuration file.\n \"Analytic/SUSPICIOUS/\": PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\n \"parent\"\n ]\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"suspicious\"],\n # We get the replacement of the domains directory from the\n # configuration file.\n \"domains/\": PyFunceble.OUTPUTS[\"domains\"][\"directory\"],\n # We get the replacement of the domains/ACTIVE directory from the\n # configuration file.\n \"domains/ACTIVE/\": PyFunceble.OUTPUTS[\"domains\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"up\"],\n # We get the replacement of the domains/INACTIVE directory from the\n # configuration file.\n \"domains/INACTIVE/\": PyFunceble.OUTPUTS[\"domains\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"down\"],\n # We get the replacement of the domains/INVALID directory from the\n # configuration file.\n \"domains/INVALID/\": PyFunceble.OUTPUTS[\"domains\"][\"directory\"]\n + 
PyFunceble.STATUS[\"official\"][\"invalid\"],\n # We get the replacement of the domains/VALID directory from the\n # configuration file.\n \"domains/VALID/\": PyFunceble.OUTPUTS[\"domains\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"valid\"],\n # We get the replacement of the hosts directory from the\n # configuration file.\n \"hosts/\": PyFunceble.OUTPUTS[\"hosts\"][\"directory\"],\n # We get the replacement of the hosts/ACTIVE directory from the\n # configuration file.\n \"hosts/ACTIVE/\": PyFunceble.OUTPUTS[\"hosts\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"up\"],\n # We get the replacement of the hosts/INACTIVE directory from the\n # configuration file.\n \"hosts/INACTIVE/\": PyFunceble.OUTPUTS[\"hosts\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"down\"],\n # We get the replacement of the hosts/INVALID directory from the\n # configuration file.\n \"hosts/INVALID/\": PyFunceble.OUTPUTS[\"hosts\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"invalid\"],\n # We get the replacement of the hosts/VALID directory from the\n # configuration file.\n \"hosts/VALID/\": PyFunceble.OUTPUTS[\"hosts\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"valid\"],\n # We get the replacement of the json directory from the\n # configuration file.\n \"json/\": PyFunceble.OUTPUTS[\"json\"][\"directory\"],\n # We get the replacement of the json/ACTIVE directory from the\n # configuration file.\n \"json/ACTIVE/\": PyFunceble.OUTPUTS[\"json\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"up\"],\n # We get the replacement of the json/INACTIVE directory from the\n # configuration file.\n \"json/INACTIVE/\": PyFunceble.OUTPUTS[\"json\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"down\"],\n # We get the replacement of the json/INVALID directory from the\n # configuration file.\n \"json/INVALID/\": PyFunceble.OUTPUTS[\"json\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"invalid\"],\n # We get the replacement of the 
json/VALID directory from the\n # configuration file.\n \"json/VALID/\": PyFunceble.OUTPUTS[\"json\"][\"directory\"]\n + PyFunceble.STATUS[\"official\"][\"valid\"],\n # We get the replacement of the logs directory from the\n # configuration file.\n \"logs/\": PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"parent\"],\n # We get the replacement of the logs/percentage directory from the\n # configuration file.\n \"logs/percentage/\": PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"parent\"]\n + PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"percentage\"],\n # We get the replacement of the splited directory from the\n # configuration file.\n \"splited/\": PyFunceble.OUTPUTS[\"splited\"][\"directory\"],\n }\n\n # We initiate the variable which will be used for the structure\n # update.\n to_replace = {}\n\n for mapped, declared in to_replace_map.items():\n # We loop through the declared mad.\n\n # We fix the path of the declared.\n declared = Directory(declared).fix_path()\n # print('dec', declared, 'map', mapped)\n\n # And we update our data.\n to_replace.update({mapped: declared})\n\n to_replace_base = {}\n for mapped, declared in to_replace_base_map.items():\n # We loop through the declared mad.\n\n # We fix the path of the declared.\n declared = Directory(declared).fix_path()\n\n # And we update our data.\n to_replace_base.update({mapped: declared})\n\n # We perform the replacement of the base directory.\n structure = Dict(structure).rename_key(to_replace_base)\n\n # We perform the replacement of every subdirectories.\n structure[PyFunceble.OUTPUTS[\"parent_directory\"]] = Dict(\n structure[PyFunceble.OUTPUTS[\"parent_directory\"]]\n ).rename_key(to_replace)\n\n try:\n # We try to save the structure into the right path.\n\n Dict(structure).to_json(self.structure)\n except FileNotFoundError:\n # But if we get a FileNotFoundError exception,\n\n # We create the directory where the directory structure should be saved.\n PyFunceble.mkdir(\n 
PyFunceble.directory_separator.join(\n self.structure.split(PyFunceble.directory_separator)[:-1]\n )\n )\n\n # And we retry to save the structure into the right path.\n Dict(structure).to_json(self.structure)\n\n # We finally return the new structure in case it's needed for other logic.\n return structure", "response": "Update the structure according to the configuration file."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _get_structure(self):\n\n # We initiate an empty variable which is going to save the location of\n # the file we are going to download.\n structure_file = \"\"\n\n # We initiate the variable which will save the request instance.\n req = \"\"\n\n if PyFunceble.path.isfile(self.structure):\n # The structure path file exists.\n\n # We set it as the destination file.\n structure_file = self.structure\n elif PyFunceble.path.isfile(self.base + \"dir_structure_production.json\"):\n # * The structure path file does not exist.\n # but\n # * The production structure path file exists.\n\n # We set it as the destination file.\n structure_file = self.base + \"dir_structure_production.json\"\n else:\n # * The structure path file does not exist.\n # and\n # * The production structure path file does not exist.\n\n if \"dev\" not in PyFunceble.VERSION:\n # `dev` is not in the local version name.\n\n # We get the production file from the master branch.\n req = PyFunceble.requests.get(\n PyFunceble.LINKS[\"dir_structure\"].replace(\"dev\", \"master\")\n )\n else:\n # `dev` is in the local version name.\n\n # We get the production file from the dev branch.\n req = PyFunceble.requests.get(\n PyFunceble.LINKS[\"dir_structure\"].replace(\"master\", \"dev\")\n )\n\n if structure_file.endswith(\"_production.json\"):\n # The destination is the production file.\n\n # And we return the updated structure from the last read file.\n # (with the names from the configuration file).\n return 
self._update_structure_from_config(\n Dict().from_json(File(structure_file).read())\n )\n\n # The destination is not the production file.\n\n if structure_file.endswith(\".json\"):\n # The destination ends with `.json`.\n\n # And we return the updated structure from the given file.\n # (with the names from the configuration file).\n return self._update_structure_from_config(\n Dict().from_json(File(structure_file).read())\n )\n\n # The destination does not end with `.json`.\n\n # We return the updated structure from the link we previously got.\n # (with the names from the configuration file).\n return self._update_structure_from_config(Dict().from_json(req.text))", "response": "Get the structure we are going to work with."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _create_directory(cls, directory, loop=False):\n\n if not loop and PyFunceble.directory_separator in directory:\n # * We are not in the loop.\n # and\n # * The directory separator is in the given directory.\n\n # We split at the directory separator.\n splited_directory = directory.split(PyFunceble.directory_separator)\n\n # We initiate a variable which will save the full path to create.\n full_path_to_create = \"\"\n\n for single_directory in splited_directory:\n # We loop through each directory.\n\n # We append the currently read directory to the full path.\n full_path_to_create += single_directory + PyFunceble.directory_separator\n\n # And we create the directory if it does not exist.\n cls._create_directory(full_path_to_create, True)\n\n if not PyFunceble.path.isdir(directory):\n # The given directory does not exist.\n\n # We update the permissions.\n # (Only if we are under Travis CI.)\n AutoSave.travis_permissions()\n\n # We create the directory.\n PyFunceble.mkdir(directory)\n\n # We update the permissions.\n # (Only if we are under Travis CI.)\n AutoSave.travis_permissions()", "response": "Create the given directory if it 
does not exist."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nrestore the output directory structure based on the dir_structure. json file.", "response": "def restore(self):\n \"\"\"\n Restore the 'output/' directory structure based on the `dir_structure.json` file.\n \"\"\"\n\n # We get the structure we have to create/apply.\n structure = self._get_structure()\n\n # We get the list of key which is implicitly the list of directory to recreate.\n list_of_key = list(structure.keys())\n\n # We move to the content of the parent as we know that we are creating only one directory.\n # Note: if one day we will have to create multiple directory, we will have to change\n # the following.\n structure = structure[list_of_key[0]]\n\n # We also set the parent directory as we are going to construct its childen.\n parent_path = list_of_key[0]\n\n if not parent_path.endswith(PyFunceble.directory_separator):\n parent_path += PyFunceble.directory_separator\n\n # We get if we have to replace `.gitignore` to `.keep` and versa.\n replacement_status = self._restore_replace()\n\n for directory in structure:\n # We loop through the list of directory to create.\n\n # We construct the full path.\n base = self.base + parent_path + directory\n\n if not base.endswith(PyFunceble.directory_separator):\n base += PyFunceble.directory_separator\n\n # We create the constructed path if it does not exist.\n self._create_directory(base)\n\n for file in structure[directory]:\n # We loop through the list of files in the currently read directory.\n\n # We construct the full file path.s\n file_path = base + file\n\n # We get the file content.\n content_to_write = structure[directory][file][\"content\"]\n\n # And its sha512 checksum.\n online_sha = structure[directory][file][\"sha512\"]\n\n # We update the content to write by replacing our glue with `\\n`.\n content_to_write = Regex(\n content_to_write, \"@@@\", escape=True, replace_with=\"\\\\n\"\n ).replace()\n\n # We get 
the file path as .keep.\n git_to_keep = file_path.replace(\"gitignore\", \"keep\")\n\n # We get the file path as .gitignore.\n keep_to_git = file_path.replace(\"keep\", \"gitignore\")\n\n if replacement_status:\n # We have to replace every .gitignore to .keep.\n\n if (\n PyFunceble.path.isfile(file_path)\n and Hash(file_path, \"sha512\", True).get() == online_sha\n ):\n # * The currently read file exist.\n # and\n # * Its sha512sum is equal to the one we have in our structure.\n\n # We rename the file.\n PyFunceble.rename(file_path, git_to_keep)\n\n # And we disallow the file writing.\n write = False\n else:\n # * The currently read file does not exist.\n # or\n # * Its sha512sum is not equal to the one we have in our structure.\n\n # We delere the file if it does exist.\n File(file_path).delete()\n\n # We update the file path.\n file_path = git_to_keep\n\n # And we allow the file writing.\n write = True\n else:\n # We have to replace every .keep to .gitignore.\n if (\n PyFunceble.path.isfile(keep_to_git)\n and Hash(file_path, \"sha512\", True).get() == online_sha\n ):\n # * The .keep file exist.\n # and\n # * Its sha512sum is equal to the one we have in our structure.\n\n # We rename the file.\n PyFunceble.rename(file_path, keep_to_git)\n\n # And we disallow the file writing.\n write = False\n else:\n # * The .keep file does not exist.\n # or\n # * Its sha512sum is not equal to the one we have in our structure.\n\n # We delete the file if it exist.\n File(keep_to_git).delete()\n\n # We update the file path\n file_path = keep_to_git\n\n # And we allow the file writing.\n write = True\n\n if write:\n # The file writing is allowed.\n\n # We write our file content into the file path.\n File(file_path).write(content_to_write + \"\\n\", True)\n self.delete_uneeded()"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ndeleting the uneeded directory.", "response": "def delete_uneeded(self):\n \"\"\"\n Delete the directory which are not 
registered into our structure.\n \"\"\"\n\n # We get the structure we have to apply.\n structure = self._get_structure()\n\n # We get the list of key which is implicitly the list of directory we do not bave to delete.\n list_of_key = list(structure.keys())\n\n # We move to the content of the parent as we know that we are creating only one directory.\n # Note: if one day we will have to create multiple directory, we will have to change\n # the following.\n structure = structure[list_of_key[0]]\n\n # We also set the parent directory as we are going to construct its childen.\n parent_path = list_of_key[0]\n\n if not parent_path.endswith(PyFunceble.directory_separator):\n parent_path += PyFunceble.directory_separator\n\n for root, _, _ in PyFunceble.walk(parent_path):\n # We loop through each directories of the parent path.\n\n # We fix the path in order to avoid issues.\n root = Directory(root).fix_path()\n\n if root.replace(parent_path, \"\") not in structure:\n # The currently read directory is not in our structure.\n\n # We delete it.\n PyFunceble.rmtree(root)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _set_path_to_configs(cls, path_to_config):\n\n if not path_to_config.endswith(PyFunceble.directory_separator):\n # The path to the config does not ends with the directory separator.\n\n # We initiate the default and the parsed variable with the directory separator.\n default = parsed = path_to_config + PyFunceble.directory_separator\n else:\n # The path to the config does ends with the directory separator.\n\n # We initiate the default and the parsed variable.\n default = parsed = path_to_config\n\n # We append the `CONFIGURATION_FILENAME` to the parsed variable.\n parsed += PyFunceble.CONFIGURATION_FILENAME\n # And we append the `DEFAULT_CONFIGURATION_FILENAME` to the default variable.\n default += PyFunceble.DEFAULT_CONFIGURATION_FILENAME\n\n # We finaly return a tuple which contain both 
values.\n return (parsed, default)", "response": "Set the paths to the configuration files."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _load_config_file(self):\n\n try:\n # We try to load the configuration file.\n\n PyFunceble.CONFIGURATION.update(\n Dict.from_yaml(File(self.path_to_config).read())\n )\n\n # We install the latest iana configuration file.\n self._install_iana_config()\n\n # We install the latest public suffix configuration file.\n self._install_psl_config()\n\n # We install the latest directory structure file.\n self._install_directory_structure_file()\n except FileNotFoundError as exception:\n # But if the configuration file is not found.\n\n if PyFunceble.path.isfile(self.path_to_default_config):\n # The `DEFAULT_CONFIGURATION_FILENAME` file exists.\n\n # We copy it as the configuration file.\n File(self.path_to_default_config).copy(self.path_to_config)\n\n # And we load the configuration file as it now exists.\n self._load_config_file()\n else:\n # The `DEFAULT_CONFIGURATION_FILENAME` file does not exist.\n\n # We raise the exception we were handling.\n raise exception", "response": "Load the configuration file into the system."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\ndownloads the production configuration and installs it in the current directory.", "response": "def _install_production_config(self):\n \"\"\"\n Download the production configuration and install it in the\n current directory.\n \"\"\"\n\n # We initiate the link to the production configuration.\n # It is not hard coded because this method is called only if we\n # are sure that the configuration file exists.\n production_config_link = \"https://raw.githubusercontent.com/funilrys/PyFunceble/master/.PyFunceble_production.yaml\" # pylint: disable=line-too-long\n\n # We update the link according to our current version.\n production_config_link = 
Version(True).right_url_from_version(\n production_config_link\n )\n\n if not Version(True).is_cloned():\n # The current version is not the cloned one.\n\n # We download the link content and save it inside the default location.\n #\n # Note: We add this one in order to allow the enduser to always have\n # a copy of our upstream configuration file.\n Download(production_config_link, self.path_to_default_config).text()\n\n # And we download the link content and return the download status.\n return Download(production_config_link, self.path_to_config).text()"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ndownload the iana configuration file and return the download status.", "response": "def _install_iana_config(cls):\n \"\"\"\n Download `iana-domains-db.json` if not present.\n \"\"\"\n\n # We initiate the link to the iana configuration.\n # It is not hard coded because this method is called only if we\n # are sure that the configuration file exist.\n iana_link = PyFunceble.CONFIGURATION[\"links\"][\"iana\"]\n\n # We update the link according to our current version.\n iana_link = Version(True).right_url_from_version(iana_link)\n\n # We set the destination of the downloaded file.\n destination = PyFunceble.CURRENT_DIRECTORY + \"iana-domains-db.json\"\n\n if not Version(True).is_cloned() or not PyFunceble.path.isfile(destination):\n # The current version is not the cloned version.\n\n # We Download the link content and return the download status.\n return Download(iana_link, destination).text()\n\n # We are in the cloned version.\n\n # We do not need to download the file, so we are returning None.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _install_psl_config(cls):\n\n # We initiate the link to the public suffix configuration.\n # It is not hard coded because this method is called only if we\n # are sure that the configuration file 
exists.\n psl_link = PyFunceble.CONFIGURATION[\"links\"][\"psl\"]\n\n # We update the link according to our current version.\n psl_link = Version(True).right_url_from_version(psl_link)\n\n # We set the destination of the downloaded file.\n destination = (\n PyFunceble.CURRENT_DIRECTORY\n + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"public_suffix\"]\n )\n\n if not Version(True).is_cloned() or not PyFunceble.path.isfile(destination):\n # The current version is not the cloned version.\n\n # We download the link content and return the download status.\n return Download(psl_link, destination).text()\n\n # We are in the cloned version.\n\n # We do not need to download the file, so we are returning None.\n return None", "response": "Download the public-suffix.json file and install it in the current directory."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ndownload the latest version of dir_structure.json and install it in the current directory.", "response": "def _install_directory_structure_file(cls):\n \"\"\"\n Download the latest version of `dir_structure_production.json`.\n \"\"\"\n\n # We initiate the link to the directory structure configuration.\n # It is not hard coded because this method is called only if we\n # are sure that the configuration file exists.\n dir_structure_link = PyFunceble.CONFIGURATION[\"links\"][\"dir_structure\"]\n\n # We update the link according to our current version.\n dir_structure_link = Version(True).right_url_from_version(dir_structure_link)\n\n # We set the destination of the downloaded file.\n destination = (\n PyFunceble.CURRENT_DIRECTORY\n + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"dir_structure\"]\n )\n\n if not Version(True).is_cloned() or not PyFunceble.path.isfile(destination):\n # The current version is not the cloned version.\n\n # We download the link content and return the download status.\n data = Download(dir_structure_link, destination, 
return_data=True).text()\n\n File(destination).write(data, overwrite=True)\n return True\n\n # We are in the cloned version.\n\n # We do not need to download the file, so we are returning None.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _merge_values(self):\n\n to_remove = []\n\n self.new_config = Dict(\n Dict(self.upstream_config).merge(PyFunceble.CONFIGURATION)\n ).remove_key(to_remove)", "response": "Merge the older configuration values into the new ones."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _load(self):\n\n if \"PYFUNCEBLE_AUTO_CONFIGURATION\" not in PyFunceble.environ:\n # The auto configuration environment variable is not set.\n\n while True:\n # We infinitely loop until we get a response which is `y|Y` or `n|N`.\n\n # We ask the user if we should install and load the default configuration.\n response = input(\n PyFunceble.Style.BRIGHT\n + PyFunceble.Fore.RED\n + \"A configuration key is missing.\\n\"\n + PyFunceble.Fore.RESET\n + \"Try to merge upstream configuration file into %s ? 
[y/n] \"\n % (\n PyFunceble.Style.BRIGHT\n + self.path_to_config\n + PyFunceble.Style.RESET_ALL\n )\n )\n\n if isinstance(response, str):\n # The response is a string\n\n if response.lower() == \"y\":\n # The response is a `y` or `Y`.\n\n # We merge the old values inside the new one.\n self._merge_values()\n\n # And we save.\n self._save()\n\n print(\n PyFunceble.Style.BRIGHT + PyFunceble.Fore.GREEN + \"Done!\\n\"\n \"Please try again, if it happens again,\"\n \" please fill a new issue.\"\n )\n\n # And we break the loop as we got a satisfied response.\n break\n\n elif response.lower() == \"n\":\n # The response is a `n` or `N`.\n\n # We inform the user that something went wrong.\n raise Exception(\"Configuration key still missing.\")\n else:\n # The auto configuration environment variable is set.\n\n # We merge the old values inside the new one.\n self._merge_values()\n\n # And we save.\n self._save()", "response": "Execute the logic behind the merging."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nconverts the version to a shorter one.", "response": "def split_versions(cls, version, return_non_digits=False):\n \"\"\"\n Convert the versions to a shorter one.\n\n :param version: The version to split.\n :type version: str\n\n :param return_non_digits:\n Activate the return of the non-digits parts of the splitted\n version.\n :type return_non_digits: bool\n\n :return: The splitted version name/numbers.\n :rtype: list\n \"\"\"\n\n # We split the version.\n splited_version = version.split(\".\")\n\n # We split the parsed version and keep the digits.\n digits = [x for x in splited_version if x.isdigit()]\n\n if not return_non_digits:\n # We do not have to return the non digits part of the version.\n\n # We return the digits part of the version.\n return digits\n\n # We have to return the non digit parts of the version.\n\n # We split the parsed version and keep the non digits.\n non_digits = [x for x in splited_version if not 
x.isdigit()]\n\n # We return a tuple with first the digits part and finally the non digit parts.\n return (digits, non_digits[0])"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\ncompare the given local and upstream versions.", "response": "def check_versions(cls, local, upstream):\n \"\"\"\n Compare the given versions.\n\n :param local: The local version converted by split_versions().\n :type local: list\n\n :param upstream: The upstream version converted by split_versions().\n :type upstream: list\n\n :return:\n - True: local < upstream\n - None: local == upstream\n - False: local > upstream\n :rtype: bool|None\n \"\"\"\n\n # A version should be in format [1,2,3] which is actually the version `1.2.3`\n # So as we only have 3 elements in the versioning,\n # we initiate the following variable in order to get the status of each part.\n status = [None, None, None]\n\n for index, version_number in enumerate(local):\n # We loop through the local version.\n\n if int(version_number) < int(upstream[index]):\n # The local version is less than the upstream version.\n\n # We initiate its status to True which means that we are in\n # an old version (for the current version part).\n status[index] = True\n elif int(version_number) > int(upstream[index]):\n # The local version is greater than the upstream version.\n\n # We initiate its status to False which means that we are in\n # a more recent version (for the current version part).\n status[index] = False\n\n # Otherwise the status stays None which means that there is no change\n # between both local and upstream.\n\n if False in status:\n # There is a False in the status.\n\n # We return False which means that we are in a more recent version.\n return False\n\n if True in status:\n # There is a True in the status.\n\n # We return True which means that we are in an older version.\n return True\n\n # There is no True or False in the status.\n\n # We return None which 
means that we are in the same version as upstream.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncompare the current version with the upstream saved version.", "response": "def compare(self):\n \"\"\"\n Compare the current version with the upstream saved version.\n \"\"\"\n\n if self.upstream_data[\"force_update\"][\"status\"]:\n # The force_update status is set to True.\n\n for minimal in self.upstream_data[\"force_update\"][\"minimal_version\"]:\n # We loop through the list of minimal version which trigger the\n # the force update message.\n\n # We compare the local with the currently read minimal version.\n checked = self.check_versions(\n self.local_splited, self.split_versions(minimal)\n )\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated.\n\n if checked or checked is not False and not checked:\n # The current version is less or equal to\n # the minimal version.\n\n # We initiate the message we are going to return to\n # the user.\n message = (\n PyFunceble.Style.BRIGHT\n + PyFunceble.Fore.RED\n + \"A critical issue has been fixed.\\n\"\n + PyFunceble.Style.RESET_ALL\n ) # pylint:disable=line-too-long\n message += (\n PyFunceble.Style.BRIGHT\n + PyFunceble.Fore.GREEN\n + \"Please take the time to update PyFunceble!\\n\"\n + PyFunceble.Style.RESET_ALL\n ) # pylint:disable=line-too-long\n\n # We print the message on screen.\n print(message)\n\n # We exit PyFunceble with the code 1.\n exit(1)\n elif checked or checked is not False and not checked:\n # The quiet mode is activated and the current version\n # is less or equal to the minimal version.\n\n # We raise an exception telling the user to update their\n # instance of PyFunceble.\n raise Exception(\n \"A critical issue has been fixed. 
Please take the time to update PyFunceble!\" # pylint:disable=line-too-long\n )\n\n for version in self.upstream_data[\"deprecated\"]:\n # We loop through the list of deprecated versions.\n\n # We compare the local with the currently read deprecated version.\n checked = self.check_versions(\n self.local_splited, self.split_versions(version)\n )\n\n if (\n not PyFunceble.CONFIGURATION[\"quiet\"]\n and checked\n or checked is not False\n and not checked\n ):\n # The quiet mode is not activated and the local version is\n # less or equal to the currently read deprecated version.\n\n # We initiate the message we are going to return to the user.\n message = (\n PyFunceble.Style.BRIGHT\n + PyFunceble.Fore.RED\n + \"Your current version is considered as deprecated.\\n\"\n + PyFunceble.Style.RESET_ALL\n ) # pylint:disable=line-too-long\n message += (\n PyFunceble.Style.BRIGHT\n + PyFunceble.Fore.GREEN\n + \"Please take the time to update PyFunceble!\\n\"\n + PyFunceble.Style.RESET_ALL\n ) # pylint:disable=line-too-long\n\n # We print the message.\n print(message)\n\n # And we continue to the next logic. There is no need to\n # shutdown PyFunceble as it's just for information.\n return\n\n # The quiet mode is activated.\n\n if checked or checked is not False and not checked:\n # The local version is less or equal to the currently\n # read deprecated version.\n print(\"Version deprecated.\")\n\n # And we continue to the next logic. 
There is no need to\n # shutdown PyFunceble as it's just for information.\n return\n\n # We compare the local version with the upstream version.\n status = self.check_versions(\n self.local_splited,\n self.split_versions(self.upstream_data[\"current_version\"]),\n )\n\n if status is not None and not status and not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated and the current version is greater than\n # the upstream version.\n\n # We initiate the message we are going to return to the user.\n message = (\n PyFunceble.Style.BRIGHT\n + PyFunceble.Fore.CYAN\n + \"Your version is more recent!\\nYou should really think about sharing your changes with the community!\\n\" # pylint:disable=line-too-long\n + PyFunceble.Style.RESET_ALL\n )\n message += (\n PyFunceble.Style.BRIGHT\n + \"Your version: \"\n + PyFunceble.Style.RESET_ALL\n + PyFunceble.VERSION\n + \"\\n\"\n )\n message += (\n PyFunceble.Style.BRIGHT\n + \"Upstream version: \"\n + PyFunceble.Style.RESET_ALL\n + self.upstream_data[\"current_version\"]\n + \"\\n\"\n )\n\n # We print the message.\n print(message)\n elif status:\n # The current version is less than the upstream version.\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated.\n\n # We initiate the message we are going to return to the user.\n message = (\n PyFunceble.Style.BRIGHT\n + PyFunceble.Fore.YELLOW\n + \"Please take the time to update PyFunceble!\\n\"\n + PyFunceble.Style.RESET_ALL\n ) # pylint:disable=line-too-long\n message += (\n PyFunceble.Style.BRIGHT\n + \"Your version: \"\n + PyFunceble.Style.RESET_ALL\n + PyFunceble.VERSION\n + \"\\n\"\n ) # pylint:disable=line-too-long\n message += (\n PyFunceble.Style.BRIGHT\n + \"Upstream version: \"\n + PyFunceble.Style.RESET_ALL\n + self.upstream_data[ # pylint:disable=line-too-long\n \"current_version\"\n ]\n + \"\\n\"\n )\n\n # We print the message.\n print(message)\n else:\n # The quiet mode is activated.\n\n # We print the message.\n 
print(\"New version available.\")\n\n # And we continue to the next app logic. There is no need to\n # shutdown PyFunceble as it's just for information.\n return"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef is_cloned(cls):\n\n if not PyFunceble.path.isdir(\".git\"):\n # The git directory does not exist.\n\n # We return False, the current version is not the cloned version.\n return False\n\n # We list the list of file which can be found only in a cloned version.\n list_of_file = [\n \".coveragerc\",\n \".coveralls.yml\",\n \".gitignore\",\n \".PyFunceble_production.yaml\",\n \".travis.yml\",\n \"CODE_OF_CONDUCT.md\",\n \"CONTRIBUTING.md\",\n \"dir_structure_production.json\",\n \"MANIFEST.in\",\n \"README.rst\",\n \"requirements.txt\",\n \"setup.py\",\n \"version.yaml\",\n ]\n\n # We list the list of directory which can be found only in a cloned\n # version.\n list_of_dir = [\"docs\", \"PyFunceble\", \"tests\"]\n\n for file in list_of_file:\n # We loop through the list of file.\n\n if not PyFunceble.path.isfile(file):\n # The file does not exist in the current directory.\n\n # We return False, the current version is not the cloned version.\n return False\n\n # All required files exist in the current directory.\n\n for directory in list_of_dir:\n # We loop through the list of directory.\n\n if not PyFunceble.path.isdir(directory):\n # The directory does not exist in the current directory.\n\n # We return False, the current version is not the cloned version.\n return False\n\n # All required directories exist in the current directory.\n\n # We return True, the current version is a cloned version.\n return True", "response": "Return True if the current version is a cloned version."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _handle_non_existant_index(cls):\n\n try:\n # We try to call the http code.\n PyFunceble.INTERN[\"http_code\"]\n except 
KeyError:\n # If it is not found.\n\n # We initiate an empty http code.\n PyFunceble.INTERN[\"http_code\"] = \"*\" * 3\n\n try:\n # We try to call the referer.\n PyFunceble.INTERN[\"referer\"]\n except KeyError:\n # If it is not found.\n\n # We initiate an `Unknown` referer.\n PyFunceble.INTERN[\"referer\"] = \"Unknown\"", "response": "Handle and check that some configuration index exists."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the path to the analytic host file directory depending on the matched status.", "response": "def _analytic_host_file_directory(self):\n \"\"\"\n Return the analytic directory to write depending on the matched\n status.\n \"\"\"\n\n # We construct the path to the analytic directory.\n output_dir = (\n self.output_parent_dir\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"parent\"]\n )\n\n if self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"potentially_up\"]:\n # The status is in the list of analytic up status.\n\n # We complete the output directory.\n output_dir += PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\n \"potentially_up\"\n ]\n elif (\n self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"potentially_down\"]\n ):\n # The status is in the list of analytic down status.\n\n # We complete the output directory.\n output_dir += PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\n \"potentially_down\"\n ]\n elif self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"suspicious\"]:\n # The status is in the list of analytic suspicious status.\n\n # We complete the output directory.\n output_dir += PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"suspicious\"]\n else:\n # The status is not in the list of analytic down or up status.\n\n # We complete the output directory.\n output_dir += PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"up\"]\n\n return output_dir"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 
that\ngenerates the hosts file, the plain list and the split lists.", "response": "def info_files(self): # pylint: disable=inconsistent-return-statements\n \"\"\"\n Generate the hosts file, the plain list and the split lists.\n \"\"\"\n\n if self._do_not_produce_file():\n # We do not have to produce a file.\n\n # We return False.\n return False\n\n if (\n \"file_to_test\" in PyFunceble.INTERN\n and PyFunceble.INTERN[\"file_to_test\"]\n and (\n PyFunceble.CONFIGURATION[\"generate_hosts\"]\n or PyFunceble.CONFIGURATION[\"plain_list_domain\"]\n or PyFunceble.CONFIGURATION[\"generate_json\"]\n )\n ):\n # * We are not testing as an imported module.\n # and\n # * The hosts file generation is activated.\n # or\n # * The plain list generation is activated.\n\n # We initiate a variable which will save the split destination.\n splited_destination = \"\"\n\n # We initiate the list of all analytic related statuses.\n http_list = []\n http_list.extend(PyFunceble.STATUS[\"list\"][\"potentially_up\"])\n http_list.extend(PyFunceble.STATUS[\"list\"][\"potentially_down\"])\n http_list.extend(PyFunceble.STATUS[\"list\"][\"http_active\"])\n http_list.extend(PyFunceble.STATUS[\"list\"][\"suspicious\"])\n\n # We partially initiate the path to the hosts file.\n output_hosts = (\n self.output_parent_dir\n + PyFunceble.OUTPUTS[\"hosts\"][\"directory\"]\n + \"%s\"\n + directory_separator\n + PyFunceble.OUTPUTS[\"hosts\"][\"filename\"]\n )\n\n # We partially initiate the path to the plain list file.\n output_domains = (\n self.output_parent_dir\n + PyFunceble.OUTPUTS[\"domains\"][\"directory\"]\n + \"%s\"\n + directory_separator\n + PyFunceble.OUTPUTS[\"domains\"][\"filename\"]\n )\n\n # We partially initiate the path to the json list file.\n output_json = (\n self.output_parent_dir\n + PyFunceble.OUTPUTS[\"json\"][\"directory\"]\n + \"%s\"\n + directory_separator\n + PyFunceble.OUTPUTS[\"json\"][\"filename\"]\n )\n\n if self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"up\"]:\n # The 
status is in the list of up list.\n\n # We complete the path to the hosts file.\n hosts_destination = output_hosts % PyFunceble.STATUS[\"official\"][\"up\"]\n\n # We complete the path to the plain list file.\n plain_destination = output_domains % PyFunceble.STATUS[\"official\"][\"up\"]\n\n # We complete the path to the json list file.\n json_destination = output_json % PyFunceble.STATUS[\"official\"][\"up\"]\n elif self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"valid\"]:\n # The status is in the list of valid list.\n\n # We complete the path to the hosts file.\n hosts_destination = (\n output_hosts % PyFunceble.STATUS[\"official\"][\"valid\"]\n )\n\n # We complete the path to the plain list file.\n plain_destination = (\n output_domains % PyFunceble.STATUS[\"official\"][\"valid\"]\n )\n\n # We complete the path to the json list file.\n json_destination = output_json % PyFunceble.STATUS[\"official\"][\"valid\"]\n elif self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"down\"]:\n # The status is in the list of down list.\n\n # We complete the path to the hosts file.\n hosts_destination = output_hosts % PyFunceble.STATUS[\"official\"][\"down\"]\n\n # We complete the path to the plain list file.\n plain_destination = (\n output_domains % PyFunceble.STATUS[\"official\"][\"down\"]\n )\n\n # We complete the path to the json list file.\n json_destination = output_json % PyFunceble.STATUS[\"official\"][\"down\"]\n elif self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"invalid\"]:\n # The status is in the list of invalid list.\n\n # We complete the path to the hosts file.\n hosts_destination = (\n output_hosts % PyFunceble.STATUS[\"official\"][\"invalid\"]\n )\n\n # We complete the path to the plain list file.\n plain_destination = (\n output_domains % PyFunceble.STATUS[\"official\"][\"invalid\"]\n )\n\n # We complete the path to the json list file.\n json_destination = (\n output_json % PyFunceble.STATUS[\"official\"][\"invalid\"]\n )\n elif 
self.domain_status.lower() in http_list:\n # The status is in the list of analytic status.\n\n # We construct the path to the analytic directory.\n output_dir = self._analytic_host_file_directory()\n\n if not output_dir.endswith(directory_separator):\n # The output directory does not ends with the directory separator.\n\n # We append the directory separator at the end of the output directory.\n output_dir += directory_separator\n\n # We initiate the hosts file path.\n hosts_destination = output_dir + PyFunceble.OUTPUTS[\"hosts\"][\"filename\"]\n\n # We initiate the plain list file path.\n plain_destination = (\n output_dir + PyFunceble.OUTPUTS[\"domains\"][\"filename\"]\n )\n\n # We complete the path to the json list file.\n json_destination = output_dir + PyFunceble.OUTPUTS[\"json\"][\"filename\"]\n\n # We initiate the path to the http code file.\n # Note: We generate the http code file so that\n # we can have each domain in a file which is the\n # extracted http code.\n splited_destination = output_dir + str(PyFunceble.INTERN[\"http_code\"])\n\n if PyFunceble.CONFIGURATION[\"generate_hosts\"]:\n # The hosts file generation is activated.\n\n # We generate/append the currently tested element in its\n # final location. (hosts file format)\n # We print on screen and on file.\n Prints(\n [PyFunceble.CONFIGURATION[\"custom_ip\"], self.tested],\n \"FullHosts\",\n hosts_destination,\n ).data()\n\n if PyFunceble.CONFIGURATION[\"plain_list_domain\"]:\n # The plain list generation is activated.\n\n # We generate/append the currently tested element in its\n # final location. (the plain list format)\n # We print on file.\n Prints([self.tested], \"PlainDomain\", plain_destination).data()\n\n if PyFunceble.CONFIGURATION[\"split\"] and splited_destination:\n # The splited list generation is activated.\n\n # We generate/append the currently tested element in its\n # final location. 
(the split list format)\n # We print on file.\n Prints([self.tested], \"PlainDomain\", splited_destination).data()\n\n if PyFunceble.CONFIGURATION[\"generate_json\"]:\n # The json list generation is activated.\n\n # We generate/append the currently tested element in its\n # final location. (the json format)\n # We print on file.\n Prints([self.tested], \"JSON\", json_destination).data()"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef unified_file(self):\n\n if (\n \"file_to_test\" in PyFunceble.INTERN\n and PyFunceble.INTERN[\"file_to_test\"]\n and PyFunceble.CONFIGURATION[\"unified\"]\n ):\n # * We are not testing as an imported module.\n # and\n # * The unified file generation is activated.\n\n # We construct the path of the unified file.\n output = (\n self.output_parent_dir + PyFunceble.OUTPUTS[\"default_files\"][\"results\"]\n )\n\n if PyFunceble.CONFIGURATION[\"less\"]:\n # We have to print less information.\n\n if PyFunceble.HTTP_CODE[\"active\"]:\n # The http status code request is activated.\n\n # We construct what we have to print.\n to_print = [\n self.tested,\n self.domain_status,\n PyFunceble.INTERN[\"http_code\"],\n ]\n else:\n # The http status code request is not activated.\n\n # We construct what we have to print.\n to_print = [self.tested, self.domain_status, self.source]\n\n # And we print the information on file.\n Prints(to_print, \"Less\", output, True).data()\n else:\n # We have to print all the information.\n\n # We construct what we have to print.\n to_print = [\n self.tested,\n self.domain_status,\n self.expiration_date,\n self.source,\n PyFunceble.INTERN[\"http_code\"],\n PyFunceble.CURRENT_TIME,\n ]\n\n # And we print the information on file.\n Prints(to_print, \"Generic_File\", output, True).data()", "response": "Generate unified file. 
Understand by that that we use a unified table\n instead of a separate table for each status which could result in a\n misunderstanding."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef analytic_file(self, new_status, old_status=None):\n\n if not old_status:\n # The old status is not given.\n\n # We set the old status as the one given globally.\n old_status = self.domain_status\n\n if \"file_to_test\" in PyFunceble.INTERN and PyFunceble.INTERN[\"file_to_test\"]:\n # We are not testing as an imported module.\n\n # We partially construct the path to the file to write/print.\n output = (\n self.output_parent_dir\n + PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"parent\"]\n + \"%s%s\"\n )\n\n if new_status.lower() in PyFunceble.STATUS[\"list\"][\"up\"]:\n # The new status is in the list of up status.\n\n # We complete the output directory.\n output = output % (\n PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"up\"],\n PyFunceble.OUTPUTS[\"analytic\"][\"filenames\"][\"up\"],\n )\n\n # We generate the hosts file.\n Generate(\"HTTP_Active\").info_files()\n elif new_status.lower() in PyFunceble.STATUS[\"list\"][\"potentially_up\"]:\n # The new status is in the list of potentially up status.\n\n # We complete the output directory.\n output = output % (\n PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"potentially_up\"],\n PyFunceble.OUTPUTS[\"analytic\"][\"filenames\"][\"potentially_up\"],\n )\n\n # We generate the hosts file.\n Generate(\"potentially_up\").info_files()\n elif new_status.lower() in PyFunceble.STATUS[\"list\"][\"suspicious\"]:\n # The new status is in the list of suspicious status.\n\n # We complete the output directory.\n output = output % (\n PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"suspicious\"],\n PyFunceble.OUTPUTS[\"analytic\"][\"filenames\"][\"suspicious\"],\n )\n\n # We generate the hosts file.\n Generate(\"suspicious\").info_files()\n else:\n # The new status 
is in the list of up and down status.\n\n # We complete the output directory.\n output = output % (\n PyFunceble.OUTPUTS[\"analytic\"][\"directories\"][\"potentially_down\"],\n PyFunceble.OUTPUTS[\"analytic\"][\"filenames\"][\"potentially_down\"],\n )\n\n # We generate the hosts files.\n Generate(\"potentially_down\").info_files()\n\n # We print the information on file.\n Prints(\n [\n self.tested,\n old_status,\n PyFunceble.INTERN[\"http_code\"],\n PyFunceble.CURRENT_TIME,\n ],\n \"HTTP\",\n output,\n True,\n ).data()", "response": "Generate an analytic file based on the given new and old status."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _prints_status_file(self): # pylint: disable=too-many-branches\n\n if PyFunceble.INTERN[\"file_to_test\"]:\n # We are testing a file.\n\n output = (\n self.output_parent_dir\n + PyFunceble.OUTPUTS[\"splited\"][\"directory\"]\n + self.domain_status\n )\n\n if PyFunceble.CONFIGURATION[\"less\"]:\n # We have to print less information.\n\n # We print the information on file.\n Prints(\n [self.tested, self.domain_status, self.source], \"Less\", output, True\n ).data()\n elif PyFunceble.CONFIGURATION[\"split\"]:\n # We have to split the information we print on file.\n\n if self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"up\"]:\n # The status is in the list of up status.\n\n if PyFunceble.HTTP_CODE[\"active\"]:\n # The http code extraction is activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n self.expiration_date,\n self.source,\n PyFunceble.INTERN[\"http_code\"],\n PyFunceble.CURRENT_TIME,\n ]\n else:\n # The http code extraction is not activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n self.expiration_date,\n self.source,\n PyFunceble.CURRENT_TIME,\n ]\n\n # We print the informations to print on file.\n Prints(\n data_to_print, PyFunceble.STATUS[\"official\"][\"up\"], output, True\n 
).data()\n elif self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"valid\"]:\n # The status is in the list of valid status.\n\n # We initiate the data to print.\n data_to_print = [self.tested, self.source, PyFunceble.CURRENT_TIME]\n\n # We print the information on file.\n Prints(\n data_to_print,\n PyFunceble.STATUS[\"official\"][\"valid\"],\n output,\n True,\n ).data()\n elif self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"down\"]:\n # The status is in the list of down status.\n\n if PyFunceble.HTTP_CODE[\"active\"]:\n # The http status code extraction is activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n PyFunceble.INTERN[\"referer\"],\n self.domain_status,\n self.source,\n PyFunceble.INTERN[\"http_code\"],\n PyFunceble.CURRENT_TIME,\n ]\n else:\n # The http status code extraction is not activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n PyFunceble.INTERN[\"referer\"],\n self.domain_status,\n self.source,\n PyFunceble.CURRENT_TIME,\n ]\n\n # We print the information on file.\n Prints(\n data_to_print,\n PyFunceble.STATUS[\"official\"][\"down\"],\n output,\n True,\n ).data()\n elif self.domain_status.lower() in PyFunceble.STATUS[\"list\"][\"invalid\"]:\n # The status is in the list of invalid status.\n\n if PyFunceble.HTTP_CODE[\"active\"]:\n # The http status code extraction is activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n self.source,\n PyFunceble.INTERN[\"http_code\"],\n PyFunceble.CURRENT_TIME,\n ]\n else:\n # The http status code extraction is not activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n self.source,\n PyFunceble.CURRENT_TIME,\n ]\n\n # We print the information on file.\n Prints(\n data_to_print,\n PyFunceble.STATUS[\"official\"][\"invalid\"],\n output,\n True,\n ).data()", "response": "This function is used to print the status of the current domain."} {"SOURCE": 
"codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _prints_status_screen(self):\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated.\n\n if PyFunceble.CONFIGURATION[\"less\"]:\n # We have to print less information.\n\n # We initiate the data to print.\n to_print = [\n self.tested,\n self.domain_status,\n PyFunceble.INTERN[\"http_code\"],\n ]\n\n if not PyFunceble.HTTP_CODE[\"active\"]:\n # The http status code is not activated.\n\n # We replace the last element to print with\n # the source.\n to_print[-1] = self.source\n\n # We print the informations on screen.\n Prints(to_print, \"Less\").data()\n else:\n # We have to print all informations on screen.\n\n if PyFunceble.HTTP_CODE[\"active\"]:\n # The http status code extraction is activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n self.domain_status,\n self.expiration_date,\n self.source,\n PyFunceble.INTERN[\"http_code\"],\n ]\n else:\n # The http status code extraction is not activated.\n\n # We initiate the data to print.\n data_to_print = [\n self.tested,\n self.domain_status,\n self.expiration_date,\n self.source,\n PyFunceble.CURRENT_TIME,\n ]\n\n # We print the information on screen.\n Prints(data_to_print, \"Generic\").data()", "response": "This method is used to print the status of the status file on screen."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef status_file(self): # pylint: disable=inconsistent-return-statements\n\n if \"file_to_test\" in PyFunceble.INTERN:\n # We are not testing as an imported module.\n\n # We generate the hosts file.\n Generate(self.domain_status, self.source, self.expiration_date).info_files()\n\n # We are testing a file content.\n\n # We increase the percentage count.\n Percentage(self.domain_status).count()\n\n # We print on screen if needed.\n self._prints_status_screen()\n\n if self._do_not_produce_file():\n return None\n\n if 
(\n not PyFunceble.CONFIGURATION[\"no_files\"]\n and PyFunceble.CONFIGURATION[\"split\"]\n ):\n # * The file non-generation of file is globaly deactivated.\n # and\n # * We have to split the outputs.\n\n # We print or generate the files.\n self._prints_status_file()\n else:\n # * The file non-generation of file is globaly activated.\n # or\n # * We do not have to split the outputs.\n\n # We print or generate the unified files.\n self.unified_file()", "response": "Generate a status file according to the domain status."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _do_not_produce_file(self):\n\n if (\n Inactive().is_present()\n and self.domain_status\n in [\n PyFunceble.STATUS[\"official\"][\"down\"],\n PyFunceble.STATUS[\"official\"][\"invalid\"],\n ]\n and PyFunceble.INTERN[\"to_test\"]\n not in PyFunceble.INTERN[\"extracted_list_to_test\"]\n ):\n return True\n return False", "response": "Check if we do not produce a file based from the given information."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nextracts the extension from the given line.", "response": "def _extensions(self, line):\n \"\"\"\n Extract the extension from the given line.\n\n :param line: The line from the official public suffix repository.\n :type line: str\n \"\"\"\n\n # We strip the parsed line.\n line = line.strip()\n\n if not line.startswith(\"//\") and \".\" in line:\n # * The parsed line is not a commented line.\n # and\n # * There is a point in the parsed line.\n line = line.encode(\"idna\").decode(\"utf-8\")\n\n if line.startswith(\"*.\"):\n # The parsed line start with `*.`.\n\n # We remove the first two characters.\n line = line[2:]\n\n # We we split the points and we get the last element.\n # Explanation: The idea behind this action is to\n # always get the extension.\n extension = line.split(\".\")[-1]\n\n if extension in self.public_suffix_db:\n # The extension is alrady in our 
database.\n\n # We update the content of the 1st level TDL with\n # the content of the suffix.\n # In between, we format so that we ensure that there is no\n # duplicate in the database index content.\n self.public_suffix_db[extension] = List(\n self.public_suffix_db[extension] + [line]\n ).format()\n else:\n # The extension is not already in our database.\n\n # We append the currently formatted extension and the line content.\n self.public_suffix_db.update({extension: [line]})"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nupdate the content of the public - suffix. json.", "response": "def update(self):\n \"\"\"\n Update of the content of the :code:`public-suffix.json`.\n \"\"\"\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated.\n\n # We print a message for the user on screen.\n print(\n \"Update of %s\" % PyFunceble.OUTPUTS[\"default_files\"][\"public_suffix\"],\n end=\" \",\n )\n\n # We loop through the line of the upstream file.\n list(map(self._extensions, self._data().split(\"\\n\")))\n\n # We save the content of our database in the final testination.\n Dict(self.public_suffix_db).to_json(self.destination)\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated.\n\n # We inform the user that everything goes right.\n print(PyFunceble.INTERN[\"done\"])"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef load(self):\n\n if not PyFunceble.INTERN[\"psl_db\"]:\n # The public database was not already loaded.\n\n # * We read, convert to dict and return the file content.\n # and\n # * We fill/create the database.\n PyFunceble.INTERN[\"psl_db\"] = Dict().from_json(\n File(self.destination).read()\n )", "response": "Load the public suffix database into the system."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef standard(cls, element):\n\n # We remove 
all special characters and return the formatted string.\n return (\n Regex(element, cls.regex_replace, replace_with=\"@funilrys\")\n .replace()\n .replace(\"@funilrys\", \"\")\n )", "response": "Implement the standard and alphabetical sorting of the elements."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef hierarchical(cls, element):\n\n # We initiate a variable which will save the element to sort without\n # the extension.\n to_sort = \"\"\n\n # We initiate a variable which will save the full extension.\n full_extension = \"\"\n\n # We convert the parsed element to lower case.\n element = element.lower()\n\n # We try to get the url base.\n url_base = Check().is_url_valid(element, return_base=True)\n\n if not isinstance(url_base, str):\n # The url base is not found.\n\n if \".\" in element:\n # There is point in the parsed element.\n\n # We get the position of the first letter of the extension.\n extension_index = element.rindex(\".\") + 1\n\n # We get the extension from the position of the first letter\n # of the extension.\n extension = element[extension_index:]\n\n if extension in PyFunceble.INTERN[\"psl_db\"]:\n # The extension is in the public suffix database.\n\n for suffix in PyFunceble.INTERN[\"psl_db\"][extension]:\n # We loop through the list of suffix of the extracted extension.\n\n # We suffix the sufix with a point.\n formatted_suffix = \".\" + suffix\n\n if element.endswith(formatted_suffix):\n # The elements ends with the suffix.\n\n # We get the position of the first character of the suffix in\n # the parsed element.\n suffix_index = element.rindex(formatted_suffix)\n\n # We update the to_sort variable with the element without the suffix.\n to_sort = element[:suffix_index]\n\n # We replace the full extension with the currently read suffix.\n full_extension = suffix\n\n # We break the loop, we got what we wanted.\n break\n\n if not full_extension:\n # The full extension is empty.\n\n # We 
initiate it with the extension.\n full_extension = element[extension_index:]\n\n # We update the to_sort variable with the element without the extension.\n to_sort = element[: extension_index - 1]\n\n # We append a point to the full extension because the point has to be\n # at the end and not at the begining of the extension.\n # To understand: Imagine a miror.\n full_extension += \".\"\n\n # We reverse the to_sort string.\n tros_ot = to_sort[::-1]\n\n if \".\" in tros_ot:\n # There is a point in the reversed string.\n\n # We prefix the full extension with the top level\n # domain name.\n full_extension = (\n tros_ot[: tros_ot.index(\".\")][::-1] + \".\" + full_extension\n )\n\n # We remove the tor level domain from the rest of\n # the reversed string.\n tros_ot = tros_ot[tros_ot.index(\".\") + 1 :]\n\n # * We reverse each level of the parsed element.\n # and\n # * We glue each level of the parsed element with each other.\n #\n # Note: after this, there is no point anymore.\n reversion = full_extension + \".\".join(\n [x[::-1] for x in tros_ot.split(\".\")]\n )\n\n # We remove all special characters and return the formatted string.\n return (\n Regex(reversion, cls.regex_replace, replace_with=\"@funilrys\")\n .replace()\n .replace(\"@funilrys\", \"\")\n )\n\n # We remove all special characters and return the formatted string.\n return (\n Regex(\n to_sort + full_extension,\n cls.regex_replace,\n replace_with=\"@funilrys\",\n )\n .replace()\n .replace(\"@funilrys\", \"\")\n )\n\n # There is no point in the parsed element.\n\n # We return the parsed element.\n return element\n\n # The url base is found.\n\n # We get the position of the element.\n protocol_position = element.rindex(url_base)\n\n # We extract the protocol from the element position.\n protocol = element[:protocol_position]\n\n # We return the output of this method but with the url base instead of the full url.\n return protocol + cls.hierarchical(url_base)", "response": "This method sorts a list of 
domain hierarchicaly."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef load(self):\n\n if \"iana_db\" not in PyFunceble.INTERN or not PyFunceble.INTERN[\"iana_db\"]:\n # The global database is empty, None or does not exist.\n\n # We update it with the database content.\n PyFunceble.INTERN[\"iana_db\"] = self.iana_db", "response": "Load the IANA database content into the database."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning the referer for the given extension.", "response": "def _referer(self, extension):\n \"\"\"\n Return the referer for the given extension.\n\n :param extension: A valid domain extension.\n :type extension: str\n\n :return: The whois server to use to get the WHOIS record.\n :rtype: str\n \"\"\"\n\n # We get the a copy of the page.\n iana_record = self.lookup.whois(\n PyFunceble.CONFIGURATION[\"iana_whois_server\"], \"hello.%s\" % extension\n )\n\n if iana_record and \"refer\" in iana_record:\n # The record is not empty.\n\n # We initiate a regex which will extract the referer.\n regex_referer = r\"(?s)refer\\:\\s+([a-zA-Z0-9._-]+)\\n\"\n\n # We try to extract the referer.\n matched = Regex(\n iana_record, regex_referer, return_data=True, group=1\n ).match()\n\n if matched:\n # The referer was extracted successfully.\n\n # We return the matched referer.\n return matched\n\n # * The referer was not extracted successfully.\n # or\n # * The iana record is empty.\n\n if extension in self.manual_server:\n # The extension is in the list of manual entries.\n\n # We return the server which we set manually.\n return self.manual_server[extension]\n\n # We return None because we weren't able to get the server to call for\n # the given extension.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nyield the extensions from the given block.", "response": "def _extensions(self):\n \"\"\"\n Extract the extention from the given 
block.\n Plus get its referer.\n \"\"\"\n\n upstream_lines = (\n Download(self.iana_url, return_data=True)\n .text()\n .split('')\n )\n\n # We extract the different extension from the currently readed line.\n regex_valid_extension = r\"(/domains/root/db/)(.*)(\\.html)\"\n\n for block in upstream_lines:\n if \"/domains/root/db/\" in block:\n # The link is in the line.\n\n # We try to extract the extension.\n matched = Regex(\n block, regex_valid_extension, return_data=True, rematch=True\n ).match()[1]\n\n if matched:\n # The extraction is not empty or None.\n\n # We get the referer.\n referer = self._referer(matched)\n\n # We yield the matched extension and its referer.\n yield (matched, referer)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nupdate the content of the iana - domains - db file.", "response": "def update(self):\n \"\"\"\n Update the content of the `iana-domains-db` file.\n \"\"\"\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # * The quiet mode is not activated.\n\n # We print on screen what we are doing.\n print(\"Update of iana-domains-db\", end=\" \")\n\n # We loop through the line of the iana website.\n for extension, referer in self._extensions():\n\n if extension not in self.iana_db or self.iana_db[extension] != referer:\n # We add the extension to the databae.\n self.iana_db[extension] = referer\n\n # We save the content of the constructed database.\n Dict(self.iana_db).to_json(self.destination)\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is not activated.\n\n # We indicate that the work is done without any issue.\n print(PyFunceble.INTERN[\"done\"])"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nsearches for domain or URL related to the original URL or domain. :return: The mined domains or URL. 
:rtype: dict", "response": "def mine(self): # pragma: no cover\n \"\"\"\n Search for domain or URL related to the original URL or domain.\n\n :return: The mined domains or URL.\n :rtype: dict\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"mining\"]:\n # The mining is activated.\n\n try:\n # We get the history.\n history = PyFunceble.requests.get(\n self.to_get,\n timeout=PyFunceble.CONFIGURATION[\"seconds_before_http_timeout\"],\n headers=self.headers,\n ).history\n\n # We initiate a dictionnary which will save the\n # list of mined links.\n mined = {self.to_get_bare: []}\n\n for element in history:\n # We loop through the history.\n\n # We update the element.\n element = element.url\n\n if PyFunceble.INTERN[\"to_test_type\"] == \"url\":\n # We are testing a full url.\n\n # We get the element to append.\n to_append = Check().is_url_valid(element, return_base=False)\n elif PyFunceble.INTERN[\"to_test_type\"] == \"domain\":\n # We are testing a domain.\n\n # We get the element to append.\n to_append = Check().is_url_valid(element, return_base=True)\n else:\n raise Exception(\"Unknown tested.\")\n\n if to_append:\n # There is something to append.\n\n if to_append.endswith(\":80\"):\n # The port is present.\n\n # We get rid of it.\n to_append = to_append[:-3]\n\n if to_append != self.to_get_bare:\n # The element to append is different as\n # the element we are globally testing.\n\n # We append the element to append to the\n # list of mined links.\n mined[self.to_get_bare].append(to_append)\n\n if mined[self.to_get_bare]:\n # There is something in the list of mined links.\n\n # We return the whole element.\n return mined\n\n # There is nothing in the list of mined links.\n\n # We return None.\n return None\n\n except (\n PyFunceble.requests.ConnectionError,\n PyFunceble.requests.exceptions.Timeout,\n PyFunceble.requests.exceptions.InvalidURL,\n PyFunceble.socket.timeout,\n urllib3_exceptions.InvalidHeader,\n UnicodeDecodeError, # The probability that this happend in 
production is minimal.\n ):\n # Something went wrong.\n\n # We return None.\n return None\n return None"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _retrieve(self):\n\n if PyFunceble.CONFIGURATION[\"mining\"]:\n # The mining is activated.\n\n if \"mined\" not in PyFunceble.INTERN:\n PyFunceble.INTERN[\"mined\"] = {}\n\n if PyFunceble.path.isfile(self.file):\n # Our backup file exist.\n\n # We return the information from our backup.\n data = Dict().from_json(File(self.file).read())\n\n # We clean the empty elements.\n for file_path in data:\n PyFunceble.INTERN[\"mined\"][file_path] = {}\n\n for element in data[file_path]:\n if data[file_path][element]:\n PyFunceble.INTERN[\"mined\"][file_path][element] = data[\n file_path\n ][element]\n\n return\n # * The mining is not activated.\n # or\n # * Our backup file does not exist.\n\n # We return nothing.\n PyFunceble.INTERN[\"mined\"] = {}\n\n return", "response": "Retrieve the mining informations."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nadding the currently mined information to the current mined database.", "response": "def _add(self, to_add):\n \"\"\"\n Add the currently mined information to the\n mined \"database\".\n\n :param to_add: The element to add.\n :type to_add: dict\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"mining\"]:\n # The mining is activated.\n\n if PyFunceble.INTERN[\"file_to_test\"] not in PyFunceble.INTERN[\"mined\"]:\n # Our tested file path is not into our mined database.\n\n # We initiate it.\n PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]] = {}\n\n for element in to_add:\n # We loop through the element to add.\n\n if (\n element\n in PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]]\n ):\n # The element is already into the tested file path database.\n\n # We extent it with our element to add.\n PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]][\n element\n 
].extend(to_add[element])\n else:\n # The element is already into the tested file path database.\n\n # We initiate it.\n PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]][\n element\n ] = to_add[element]\n\n # We format the added information in order to avoid duplicate.\n PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]][\n element\n ] = List(\n PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]][\n element\n ]\n ).format()\n\n # We backup everything.\n self._backup()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nremove the currently tested element from the mining database.", "response": "def remove(self):\n \"\"\"\n Remove the currently tested element from the mining\n data.\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"mining\"]:\n # The mining is activated.\n\n if PyFunceble.INTERN[\"file_to_test\"] in PyFunceble.INTERN[\"mined\"]:\n # The currently tested file is in our mined database.\n\n for element in PyFunceble.INTERN[\"mined\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ]:\n # We loop through the mined index.\n\n if (\n self.to_get_bare\n in PyFunceble.INTERN[\"mined\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][element]\n ):\n # The currently read element content.\n\n # We remove the globally tested element from the currently\n # read element content.\n PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]][\n element\n ].remove(self.to_get_bare)\n\n # We backup everything.\n self._backup()"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nprovide the list of mined so they can be added to the list queue. :return: The list of mined domains or URL. 
:rtype: list", "response": "def list_of_mined(cls):\n \"\"\"\n Provide the list of mined so they can be added to the list\n queue.\n\n :return: The list of mined domains or URL.\n :rtype: list\n \"\"\"\n\n # We initiate a variable which will return the result.\n result = []\n\n if PyFunceble.CONFIGURATION[\"mining\"]:\n # The mining is activated.\n\n if PyFunceble.INTERN[\"file_to_test\"] in PyFunceble.INTERN[\"mined\"]:\n # The file we are testing is into our mining database.\n\n for element in PyFunceble.INTERN[\"mined\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ]:\n # We loop through the list of index of the file we are testing.\n\n # We append the element of the currently read index to our result.\n result.extend(\n PyFunceble.INTERN[\"mined\"][PyFunceble.INTERN[\"file_to_test\"]][\n element\n ]\n )\n\n # We format our result.\n result = List(result).format()\n\n # We return the result.\n return result"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef process(self): # pragma: no cover\n\n if PyFunceble.CONFIGURATION[\"mining\"]:\n # The mining is activated.\n\n # We load the mining logic.\n mined = self.mine()\n\n if mined:\n # The mined data is not empty or None.\n\n # We add the mined data to the global database.\n self._add(mined)\n\n # And we finally backup everything.\n self._backup()", "response": "Process the logic and structuration of the mining database."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\ngets and return the content of the given log file.", "response": "def _get_content(cls, file):\n \"\"\"\n Get and return the content of the given log file.\n\n :param file: The file we have to get the content from.\n :type file: str\n\n :return The content of the given file.\n :rtype: dict\n \"\"\"\n\n if PyFunceble.path.isfile(file):\n return Dict().from_json(File(file).read())\n\n return {}"} {"SOURCE": "codesearchnet", "instruction": "How would 
you code a function in Python 3 to\nwrite the content into the given file.", "response": "def _write_content(cls, content, file):\n \"\"\"\n Write the content into the given file.\n\n :param content: The dict to write.\n :type content: dict\n\n :param file: The file to write.\n :type file: str\n \"\"\"\n\n if not PyFunceble.CONFIGURATION[\"no_files\"]:\n if not isinstance(content, dict):\n content = {}\n\n Dict(content).to_json(file)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nlogging the WHOIS record if needed.", "response": "def whois(self, record):\n \"\"\"\n Logs the WHOIS record if needed.\n\n :param record: The record to log.\n :type record: str\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"debug\"] and PyFunceble.CONFIGURATION[\"logs\"]:\n # The debug and the logs subsystem are activated.\n\n if PyFunceble.INTERN[\"referer\"]:\n referer = PyFunceble.INTERN[\"referer\"]\n else:\n referer = None\n\n to_write = {\n self.current_time: {\n \"domain\": PyFunceble.INTERN[\"to_test\"],\n \"record\": record,\n \"referer\": referer,\n }\n }\n\n if self.output:\n output = self.output\n else:\n output = PyFunceble.OUTPUT_DIRECTORY\n output += PyFunceble.OUTPUTS[\"parent_directory\"]\n output += PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"parent\"]\n output += PyFunceble.OUTPUTS[\"logs\"][\"filenames\"][\"whois\"]\n\n current_content = self._get_content(output)\n current_content.update(to_write)\n\n self._write_content(current_content, output)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nlog the extracted expiration date.", "response": "def expiration_date(self, extracted):\n \"\"\"\n Logs the extracted expiration date.\n\n :param extracted: The extracted expiration date (from WHOIS record).\n :type extracted: str\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"logs\"]:\n # The logs subsystem is activated.\n\n if PyFunceble.INTERN[\"referer\"]:\n referer = PyFunceble.INTERN[\"referer\"]\n else:\n referer = 
None\n\n to_write = {\n self.current_time: {\n \"domain\": PyFunceble.INTERN[\"to_test\"],\n \"expiration_date\": extracted,\n \"whois_server\": referer,\n }\n }\n\n if self.output:\n output = self.output\n else:\n output = PyFunceble.OUTPUT_DIRECTORY\n output += PyFunceble.OUTPUTS[\"parent_directory\"]\n output += PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"parent\"]\n output += PyFunceble.OUTPUTS[\"logs\"][\"filenames\"][\"date_format\"]\n\n current_content = self._get_content(output)\n current_content.update(to_write)\n\n self._write_content(current_content, output)\n\n if PyFunceble.CONFIGURATION[\"share_logs\"]:\n # The logs sharing is activated.\n\n # And we share the logs with the api.\n PyFunceble.requests.post(\n PyFunceble.LINKS[\"api_date_format\"],\n data=to_write[self.current_time],\n )"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef referer_not_found(self, extension):\n\n if PyFunceble.CONFIGURATION[\"logs\"]:\n # The logs subsystem is activated.\n\n to_write = {\n self.current_time: {\n \"domain\": PyFunceble.INTERN[\"to_test\"],\n \"extension\": extension,\n }\n }\n\n if self.output:\n output = self.output\n else:\n output = PyFunceble.OUTPUT_DIRECTORY\n output += PyFunceble.OUTPUTS[\"parent_directory\"]\n output += PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"parent\"]\n output += PyFunceble.OUTPUTS[\"logs\"][\"filenames\"][\"no_referer\"]\n\n current_content = self._get_content(output)\n current_content.update(to_write)\n\n self._write_content(current_content, output)\n\n if PyFunceble.CONFIGURATION[\"share_logs\"]:\n # The logs sharing is activated.\n\n # And we share the logs with the api.\n PyFunceble.requests.post(\n PyFunceble.LINKS[\"api_no_referer\"], data=to_write[self.current_time]\n )", "response": "Logs the case that the referer was not found."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef 
_before_header(self):\n\n if (\n not PyFunceble.CONFIGURATION[\"no_files\"]\n and self.output\n and not PyFunceble.path.isfile(self.output)\n ):\n # * We are allowed to generate files.\n # and\n # * And output is given.\n # and\n # * The given output does not exist.\n\n # We initiate the information about what generated the file.\n link = \"# File generated by %s\\n\" % PyFunceble.LINKS[\"repo\"]\n\n # We initiate the information about the generation date of this file.\n date_of_generation = (\n \"# Date of generation: %s \\n\\n\" % PyFunceble.CURRENT_TIME\n )\n\n # We initiate a variable which will save the list of\n # templates which have to meet in order to write the before\n # header informations.\n authorized_templates = [\n \"Generic_File\",\n PyFunceble.STATUS[\"official\"][\"up\"],\n PyFunceble.STATUS[\"official\"][\"down\"],\n PyFunceble.STATUS[\"official\"][\"invalid\"],\n PyFunceble.STATUS[\"official\"][\"valid\"],\n \"Less\",\n ]\n\n if self.template in authorized_templates:\n # The current header is in our list of authorized templated.\n\n # We get the header.\n header = (\n self._header_constructor(self.currently_used_header, None)[0] + \"\\n\"\n )\n\n try:\n # We try to print the link, the date of generation and the header in the\n # given file.\n File(self.output).write(link + date_of_generation + header)\n except UnboundLocalError:\n # We don't have any header.\n\n # We print the link and the date in the given file.\n File(self.output).write(link + date_of_generation)", "response": "This method is called by the base class to write the before header of a file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nconstruct header of the table according to template. :param data_to_print: The list of data to print into the header of the table. :type data_to_print: list :param header_separator: The separator to use between the table header and our data. 
:type header_separator: str :param colomn_separator: The separator to use between each colomns. :type colomn_separator: str :return: The data to print in a list format. :rtype: list", "response": "def _header_constructor(\n cls, data_to_print, header_separator=\"-\", column_separator=\" \"\n ):\n \"\"\"\n Construct header of the table according to template.\n\n :param data_to_print:\n The list of data to print into the header of the table.\n :type data_to_print: list\n\n :param header_separator:\n The separator to use between the table header and our data.\n :type header_separator: str\n\n :param colomn_separator: The separator to use between each colomns.\n :type colomn_separator: str\n\n :return: The data to print in a list format.\n :rtype: list\n \"\"\"\n\n # We initiate a variable which will save the header data.\n header_data = []\n\n # We initiate a variable which will save the header sizes.\n header_size = \"\"\n\n # We initiate the glue to set before the size.\n before_size = \"%-\"\n\n # We initiate the glue to set after the size.\n after_size = \"s\"\n\n if header_separator:\n # The header separator is not empty.\n\n # We initiate a variable which will save the list of\n # separator data.\n header_separator_data = []\n\n # We get the length of the data to print.\n length_data_to_print = len(data_to_print) - 1\n\n # We initiate an iterator.\n i = 0\n\n for data in data_to_print:\n # We loop through the list of data.\n\n # We get the size of the currently read data.\n size = data_to_print[data]\n\n # We append the data to the header data list.\n header_data.append(data)\n\n # We construct the header size.\n # Note: our header size is formatted line %s-sizes\n # (the s at the end is part of the formatting.)\n header_size += before_size + str(size) + after_size\n\n if i < length_data_to_print:\n # The iterator is less than the length of data to print.\n\n # We append the the colomn separator to the header size.\n header_size += column_separator\n\n if 
header_separator:\n # The header separator is given.\n\n # We append the right size of separator to the list of\n # separator data.\n header_separator_data.append(header_separator * size)\n\n # We increase the iterator.\n i += 1\n\n if header_separator:\n # The header separator is given.\n\n return [\n # We return the formatted header (like we will do with print('%s' % 'hello'))\n header_size % tuple(header_data),\n # We return the formatted header separator.\n header_size % tuple(header_separator_data),\n ]\n\n # The header separator is not given.\n\n # We return the formetted header.\n return [header_size % tuple(header_data)]"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nprint the header of the current page.", "response": "def header(\n self, do_not_print=False\n ): # pragma: no cover pylint: disable=too-many-branches\n \"\"\"\n Management and creation of templates of header.\n Please consider as \"header\" the title of each columns.\n\n :param do_not_print:\n Tell us if we have to print the header or not.\n :type do_not_print: bool\n \"\"\"\n\n if (\n not PyFunceble.CONFIGURATION[\"header_printed\"]\n or self.template == \"Percentage\"\n or do_not_print\n ):\n # * The header has not been already printed.\n # or\n # * The template is the `Percentage template`.\n # or\n # * We are authorized to print something.\n\n if (\n self.template.lower() in PyFunceble.STATUS[\"list\"][\"generic\"]\n or self.template == \"Generic_File\"\n ):\n # * The template is into the list of generic status.\n # or\n # * The template is equal to `Generic_File`.\n\n # The data to print is the Generic header.\n to_print = self.headers[\"Generic\"]\n\n if (\n self.template.lower() in PyFunceble.STATUS[\"list\"][\"generic\"]\n and PyFunceble.HTTP_CODE[\"active\"]\n ):\n # * The template is in the list of generic status.\n # and\n # * the http status code extraction is activated.\n\n # We remove the Analyze Date colomn from the data to print.\n to_print = 
Dict(to_print).remove_key(\"Analyze Date\")\n elif self.template.lower() in PyFunceble.STATUS[\"list\"][\"up\"]:\n # The template is in the list of up status.\n\n # We informations to print is the up header.\n to_print = self.headers[PyFunceble.STATUS[\"official\"][\"up\"]]\n elif self.template.lower() in PyFunceble.STATUS[\"list\"][\"valid\"]:\n # The template is in the list of valid status.\n\n # We informations to print is the valid header.\n to_print = self.headers[PyFunceble.STATUS[\"official\"][\"valid\"]]\n elif self.template.lower() in PyFunceble.STATUS[\"list\"][\"down\"]:\n # The template is in the list of down status.\n\n # We informations to print is the down header.\n to_print = self.headers[PyFunceble.STATUS[\"official\"][\"down\"]]\n elif self.template.lower() in PyFunceble.STATUS[\"list\"][\"invalid\"]:\n # The template is in the list of invalid status.\n\n # We informations to print is the invalid header.\n to_print = self.headers[PyFunceble.STATUS[\"official\"][\"invalid\"]]\n elif (\n self.template == \"Less\"\n or self.template == \"Percentage\"\n or self.template == \"HTTP\"\n ): # pylint: disable=line-too-long\n # * The template is equal to `Less`.\n # or\n # * The template is equal to `Percentage`.\n # or\n # * The template is equal to `HTTP`.\n\n # We get the header with the help of the template name.\n to_print = self.headers[self.template]\n\n if self.template == \"Less\" and not PyFunceble.HTTP_CODE[\"active\"]:\n # * The template is equal to `Less`.\n # and\n # * The http status code extraction is deactivated.\n\n # We append the source index to the header.\n to_print[\"Source\"] = 10\n\n if not PyFunceble.HTTP_CODE[\"active\"]:\n # * The http status code extraction is deactivated.\n\n # We remove the HTTP Code index from the data to print.\n to_print = Dict(to_print).remove_key(\"HTTP Code\")\n\n # We update the currently used header.\n self.currently_used_header = to_print\n\n if not do_not_print:\n # We are not authorized to print 
anything.\n\n # We generate the before header.\n self._before_header()\n\n for formatted_template in self._header_constructor(to_print):\n # We loop through the formatted template.\n\n if not self.only_on_file:\n # We do not have to print only on file.\n\n # We print on screen the formatted header template.\n print(formatted_template)\n\n if not PyFunceble.CONFIGURATION[\"no_files\"] and self.output:\n # An output destination is given.\n\n # We write the file with the formatted header template.\n File(self.output).write(formatted_template + \"\\n\")"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _data_constructor(self, size):\n\n # We initiate a variable which will save what we are going to\n # return.\n result = PyFunceble.OrderedDict()\n\n if len(self.data_to_print) == len(size):\n # The length of the data to print is equal to the length of the given size.\n\n for i in range(len(self.data_to_print)):\n # We loop until our iterator is less or equal to the length of the data\n # to print.\n\n # We initiate the result index and its size.\n result[self.data_to_print[i]] = size[i]\n else:\n # This should never happend. 
If it happens, then there is something\n # wrong with the input data.\n raise Exception(\n \"Inputed: \" + str(len(self.data_to_print)) + \"; Size: \" + str(len(size))\n )\n\n # We return the constructed result.\n return result", "response": "This method is called by the base class to construct the table of data according to the given size."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets the size of each column from the header.", "response": "def _size_from_header(cls, header):\n \"\"\"\n Get the size of each column from the header.\n\n :param header:\n The header template we have to get the size from.\n :type header: dict\n\n :return: The maximal size of each data to print.\n :rtype: list\n \"\"\"\n\n # We initiate the result we are going to return.\n result = []\n\n for data in header:\n # We loop through the header.\n\n # And we append the size to our result.\n result.append(header[data])\n\n # We return the result.\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _colorify(self, data):\n\n if self.template in [\"Generic\", \"Less\"]:\n # The template is in the list of templates that need coloration.\n\n if (\n self.data_to_print[1].lower() in PyFunceble.STATUS[\"list\"][\"up\"]\n or self.data_to_print[1].lower() in PyFunceble.STATUS[\"list\"][\"valid\"]\n ):\n # The status is in the list of up status.\n\n # We print the data with a green background.\n data = PyFunceble.Fore.BLACK + PyFunceble.Back.GREEN + data\n elif self.data_to_print[1].lower() in PyFunceble.STATUS[\"list\"][\"down\"]:\n # The status is in the list of down status.\n\n # We print the data with a red background.\n data = PyFunceble.Fore.BLACK + PyFunceble.Back.RED + data\n else:\n # The status is not in the list of up or down status.\n\n # We print the data with a cyan background.\n data = PyFunceble.Fore.BLACK + PyFunceble.Back.CYAN + 
data\n\n # We return the data.\n return data", "response": "Colorify the data for the related resource."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _json_print(self): # pragma: no cover\n\n if self.output:\n # The given output is not empty.\n\n if PyFunceble.path.isfile(self.output):\n # The given output already exist.\n\n # We get the content of the output.\n content = Dict().from_json(File(self.output).read())\n\n if isinstance(content, list):\n # The content is a list.\n\n # We extend the content with our data to print.\n content.extend(self.data_to_print)\n\n # We format our list.\n content = List(content).custom_format(Sort.standard)\n\n if PyFunceble.CONFIGURATION[\"hierarchical_sorting\"]:\n # The hierarchical sorting is activated.\n\n # We format our content hierarchicaly\n content = List(content).custom_format(Sort.hierarchical)\n\n # We finally save our content into the file.\n Dict(content).to_json(self.output)\n else:\n # The content is not a list.\n\n # We raise an exception.\n raise Exception(\"Output not correctly formatted.\")\n else:\n # The given output does not already exist.\n\n # We save our data to print into the output.\n #\n # Note: We do not have to take care if self.data_to_print is a list\n # formatted or not because this method should not be called if it is\n # not the case.\n Dict(self.data_to_print).to_json(self.output)\n else:\n # The given output is empty.\n\n # We raise an exception.\n raise Exception(\"Empty output given.\")", "response": "This method is used to print the json template."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef data(self): # pragma: no cover pylint: disable=inconsistent-return-statements\n\n if isinstance(self.data_to_print, list):\n # The data to print is a list.\n\n # We initiate the data we are going to print.\n to_print = {}\n\n # We initiate the size we are going to 
print.\n to_print_size = []\n\n # We initiate a variable which will list the list of\n # alone case.\n alone_cases = [\"Percentage\", \"HTTP\"]\n\n # we initiate a variable which will list the list of\n # template which does not need a header.\n without_header = [\"FullHosts\", \"PlainDomain\"]\n\n if self.template.lower() == \"json\":\n # The template is the json template.\n\n if not PyFunceble.CONFIGURATION[\"no_files\"] and self.output:\n # * We are allowed to generate file.\n # and\n # * The given output is not empty.\n\n # We print the json file.\n return self._json_print()\n\n # We return nothing.\n return None\n\n if self.template not in alone_cases and self.template not in without_header:\n # * The template is not in the list of alone case.\n # and\n # * THe template is not in the list of template without header.\n\n # We get the template we should use.\n # Note: We basically only need the self.currently_used_header to be filled.\n self.header(True)\n\n # And we get the size from the header.\n to_print_size = self._size_from_header(self.currently_used_header)\n elif self.template in without_header:\n # The template is in the list of template which does not need a header.\n\n for data in self.data_to_print:\n # We loop through the list of data to print.\n\n # And we construct the (spacement) size of the data to print.\n to_print_size.append(str(len(data)))\n else:\n # We get the size from the given template name.\n to_print_size = self._size_from_header(self.headers[self.template])\n\n # We construct and format the data to print.\n to_print = self._data_constructor(to_print_size)\n\n # We print the before header section.\n self._before_header()\n\n for data in self._header_constructor(to_print, False):\n # We loop through the formatted data.\n\n if self.template.lower() in PyFunceble.STATUS[\"list\"][\n \"generic\"\n ] or self.template in [\"Less\", \"Percentage\"]:\n # * The template is in the list of generic status.\n # or\n # * The template is in a 
specific list.\n\n if not self.only_on_file:\n # We are authorized to print on screen.\n\n # We colorify the data to print.\n colorified_data = self._colorify(data)\n\n # And we print the data.\n print(colorified_data)\n if not PyFunceble.CONFIGURATION[\"no_files\"] and self.output:\n # * We are authorized to print on any file.\n # and\n # * The output is given.\n\n # We write our data into the printed file.\n File(self.output).write(data + \"\\n\")\n else:\n # This should never happend. If it's happens then there's a big issue\n # around data_to_print.\n raise Exception(\"Please review Prints().data()\")", "response": "This method returns the data to be printed to the output file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nsave the current time to the file.", "response": "def _save(self, last=False): # pragma: no cover\n \"\"\"\n Save the current time to the file.\n\n :param last:\n Tell us if we are at the very end of the file testing.\n :type last: bool\n \"\"\"\n\n if (\n self._authorization()\n and PyFunceble.CONFIGURATION[\"logs\"]\n and \"file_to_test\" in PyFunceble.INTERN\n and PyFunceble.INTERN[\"file_to_test\"]\n ):\n # * We are authorized to work.\n # and\n # * The generation of logs is activated.\n # and\n # * We are not testing as an imported module.\n\n # We set the location of the file we are working with.\n self.file = (\n PyFunceble.OUTPUT_DIRECTORY\n + PyFunceble.OUTPUTS[\"parent_directory\"]\n + PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"parent\"]\n + PyFunceble.OUTPUTS[\"logs\"][\"filenames\"][\"execution_time\"]\n )\n\n if PyFunceble.path.isfile(self.file):\n # The file we are working with exist.\n\n # We get its content so we can directly work with it.\n content = Dict().from_json(File(self.file).read())\n else:\n # The file we are working with does not exist.\n\n # We generate a dummy content.\n content = {}\n\n if self.action == \"start\":\n # The action is equal to `start`.\n\n if \"final_total\" 
in content and content[\"final_total\"]:\n # The final total index exist.\n\n # We delete it.\n del content[\"final_total\"]\n\n if \"data\" in content:\n # The data index exist.\n\n # We append the current start time inside it at\n # a new sublist.\n content[\"data\"].append([PyFunceble.INTERN[\"start\"]])\n else:\n # The data index does not exist.\n\n # We create the index along with the current start time.\n content[\"data\"] = [[PyFunceble.INTERN[\"start\"]]]\n elif self.action == \"stop\":\n # The action is equal to `stop`.\n\n try:\n # We try to work with the data index.\n\n # We append the end time at the end of the last element\n # of data.\n #\n # Note: It is at the end because we should have as first\n # the star time.\n content[\"data\"][-1].append(PyFunceble.INTERN[\"end\"])\n\n # We get the start time.\n start = content[\"data\"][0][0]\n # We get the end time.\n end = content[\"data\"][-1][-1]\n\n # We calculate the execution time of the test.\n content[\"current_total\"] = self.format_execution_time(start, end)\n\n if last:\n # We are at the very end of the file testing.\n\n # We initiate the global execution time.\n content[\"final_total\"] = content[\"current_total\"]\n\n # We inform the user about the global execution time.\n print(\n PyFunceble.Fore.MAGENTA\n + PyFunceble.Style.BRIGHT\n + \"Global execution time: \"\n + content[\"final_total\"]\n )\n except KeyError:\n # It is not possible to work with the data index because\n # it does not exist.\n\n # We ignore the problem.\n pass\n\n try:\n # We try to save the whole data at its final location.\n Dict(content).to_json(self.file)\n except FileNotFoundError:\n # The directory was not found.\n\n # We construct the output directory\n DirectoryStructure()\n\n # And we retry to save the whole data at its final location.\n Dict(content).to_json(self.file)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncalculating the difference between starting and ending time. 
:param start: A starting time. :type start: int|str :param stop: A ending time. :type stop: int|str :return: A dict with following as index. * :code:`days` * :code:`hours` * :code:`minutes` * :code:`seconds` as index. :rtype: dict", "response": "def _calculate(cls, start=None, end=None):\n \"\"\"\n calculate the difference between starting and ending time.\n\n :param start: A starting time.\n :type start: int|str\n\n :param stop: A ending time.\n :type stop: int|str\n\n :return:\n A dict with following as index.\n\n * :code:`days`\n * :code:`hours`\n * :code:`minutes`\n * :code:`seconds`\n\n as index.\n :rtype: dict\n \"\"\"\n\n if start and end:\n # The start and end time is explicitly given.\n\n # We get the difference between the ending and the starting time.\n time_difference = int(end) - int(start)\n else:\n # The start and end time is not explicitly given.\n\n # We get the difference between the ending and the starting time.\n time_difference = PyFunceble.INTERN[\"end\"] - PyFunceble.INTERN[\"start\"]\n\n # We initiate an OrderedDict.\n # Indeed, we use an ordered dict because we want the structuration and the\n # order to stay always the same.\n # As a dictionnary is always unordered, we can use it. 
Otherwise the time will\n # not be shown correctly.\n data = PyFunceble.OrderedDict()\n\n # We calculate and append the day to our data.\n data[\"days\"] = str(time_difference // (24 * 60 * 60)).zfill(2)\n\n # We calculate and append the hours to our data.\n data[\"hours\"] = str((time_difference // (60 * 60)) % 24).zfill(2)\n\n # We calculate and append the minutes to our data.\n data[\"minutes\"] = str((time_difference % 3600) // 60).zfill(2)\n\n # We calculate and append the minutes to our data.\n data[\"seconds\"] = str(time_difference % 60).zfill(2)\n\n # We finaly return our data.\n return data"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef format_execution_time(self, start=None, end=None):\n\n # We return the formatted execution time.\n return \":\".join(list(self._calculate(start, end).values()))", "response": "Format the calculated time into a human readable format."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef file_to_delete(cls):\n\n # We initiate the directory we have to look for.\n directory = PyFunceble.OUTPUT_DIRECTORY + PyFunceble.OUTPUTS[\"parent_directory\"]\n\n if not directory.endswith(PyFunceble.directory_separator): # pragma: no cover\n # For safety, if it does not ends with the directory separator, we append it\n # to its end.\n directory += PyFunceble.directory_separator\n\n # We initiate a variable which will save the list of file to delete.\n result = []\n\n for root, _, files in PyFunceble.walk(directory):\n # We walk in the directory and get all files and sub-directories.\n\n for file in files:\n # If there is files in the current sub-directory, we loop\n # through the list of files.\n\n if file not in [\".gitignore\", \".keep\"]:\n # The file is not into our list of file we do not have to delete.\n\n if root.endswith(PyFunceble.directory_separator):\n # The root ends with the directory 
separator.\n\n # We construct the path and append the full path to the result.\n result.append(root + file)\n else:\n # The root directory does not ends with the directory separator.\n\n # We construct the path by appending the directory separator\n # between the root and the filename and append the full path to\n # the result.\n result.append(\n root + PyFunceble.directory_separator + file\n ) # pragma: no cover\n\n # We return our list of file to delete.\n return result", "response": "Return the list of files to delete."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef databases_to_delete(cls): # pragma: no cover\n\n # We initiate the directory we have to look for.\n directory = PyFunceble.CURRENT_DIRECTORY\n\n # We initate the result variable.\n result = []\n\n # We append the dir_structure file.\n result.append(\n directory\n + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"dir_structure\"]\n )\n\n # We append the iana file.\n result.append(\n directory + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"iana\"]\n )\n\n # We append the public suffix file.\n result.append(\n directory\n + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"public_suffix\"]\n )\n\n # We append the inactive database file.\n result.append(\n directory\n + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"inactive_db\"]\n )\n\n # We append the mining database file.\n result.append(\n directory + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"mining\"]\n )\n\n # We append the whois database file.\n result.append(\n directory + PyFunceble.CONFIGURATION[\"outputs\"][\"default_files\"][\"whois_db\"]\n )\n\n return result", "response": "Return the list of databases files to delete."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndeleting almost all discovered files. 
:param clean_all: Tell the subsystem if we have to clean everything instead of almost everything. :type clean_all: bool", "response": "def almost_everything(self, clean_all=False):\n \"\"\"\n Delete almost all discovered files.\n\n :param clean_all:\n Tell the subsystem if we have to clean everything instead\n of almost everything.\n :type clean_all: bool\n \"\"\"\n\n # We get the list of files to delete.\n to_delete = self.file_to_delete()\n\n if clean_all: # pragma: no cover\n to_delete.extend(self.databases_to_delete())\n\n for file in to_delete:\n # We loop through the list of files to delete.\n\n # And we delete the currently read file.\n File(file).delete()\n\n if clean_all: # pragma: no cover\n Load(PyFunceble.CURRENT_DIRECTORY)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _hash_file(self, algo):\n\n # We get the algorithm function.\n hash_data = getattr(hashlib, algo)()\n\n with open(self.path, \"rb\") as file:\n # We open and read the parsed path.\n\n # We read the content.\n content = file.read()\n\n # We parse the content to the hash algorithm.\n hash_data.update(content)\n\n # And we extract and return the hash.\n return hash_data.hexdigest()", "response": "Get the hash of the given file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nget the hash of the given data.", "response": "def _hash_data(self, algo):\n \"\"\"\n Get hash of the given data.\n\n :param algo: The algorithm to use.\n :type algo: str\n \"\"\"\n\n # We get the algorithm function.\n hash_data = getattr(hashlib, algo)()\n\n # We set the data into our hashlib.\n hash_data.update(self.data)\n\n # And we extract and return the hash.\n return hash_data.hexdigest()"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get(self):\n\n # We initiate a variable which will save the result we are going\n # to return.\n result = {}\n\n if self.algorithm in 
self.valid_algorithms:\n # * The parsed path exist.\n # and\n # * The parsed algorithm is in the list of valid algorithms.\n\n if self.algorithm == \"all\":\n # The parsed algorithm is `all`.\n\n # We remove `all` (the first element of the list) from\n # the list of valid algorithms because we are going to\n # loop through the list of valid algorithms.\n del self.valid_algorithms[0]\n\n for algo in self.valid_algorithms:\n # We loop through the list of valid algorithms.\n\n if self.path and path.isfile(self.path):\n # The file path exist.\n\n # We save the hash into the result variable.\n result[algo] = self._hash_file(algo)\n elif self.data:\n # * The path does not exist.\n # and\n # * The given data is not empty.\n\n # We save the hash into the result variable.\n result[algo] = self._hash_data(algo)\n else: # pragma: no cover\n # All other case are met.\n\n # We return None.\n return None\n else:\n # The parsed algorithm is a specific one.\n\n if self.path and path.isfile(self.path):\n # The file path exist.\n\n # We save the hash into the result variable.\n result[self.algorithm] = self._hash_file(self.algorithm)\n elif self.data:\n # * The path does not exist.\n # and\n # * The given data is not empty.\n\n # We save the hash into the result variable.\n result[self.algorithm] = self._hash_data(self.algorithm)\n else:\n # All the other case are met.\n\n # We return None.\n return None\n else: # pragma: no cover\n # The parsed algorithm is not in the list of valid algorithms.\n return None\n\n if self.algorithm != \"all\" and self.only_hash:\n # * The parsed algorithm is not equal to `all`.\n # and\n # * We only have to return the selected hash.\n\n # We return the selected algorithm.\n return result[self.algorithm]\n\n # * The parsed algorithm is equal to `all`.\n # or\n # * We do not have to return the selected hash.\n\n # We return all hashes.\n return result", "response": "Return the hash of the given file and the given algorithm."} {"SOURCE": "codesearchnet", 
"instruction": "Implement a Python 3 function for\nexecuting the given command and return the output of the command.", "response": "def execute(self):\n \"\"\"\n Execute the given command.\n\n :return: The output of the command.\n :rtype: str\n \"\"\"\n\n # We initiate a process and parse the command to it.\n process = Popen(self.command, stdout=PIPE, stderr=PIPE, shell=True)\n\n # We communicate the command and get the output and the error.\n (output, error) = process.communicate()\n\n if process.returncode != 0: # pragma: no cover\n # The return code is different to 0.\n\n # We return the decoded error.\n return self._decode_output(error)\n\n # The return code (or exit code if you prefer) if equal to 0.\n\n # We return the decoded output of the executed command.\n return self._decode_output(output)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef run(self):\n\n with Popen(self.command, stdout=PIPE, shell=True) as process:\n # We initiate a process and parse the command to it.\n\n while True:\n # We loop infinitly because we want to get the output\n # until there is none.\n\n # We get the current line from the process stdout.\n #\n # Note: we use rstrip() because we are paranoid :-)\n current_line = process.stdout.readline().rstrip()\n\n if not current_line:\n # The current line is empty or equal to None.\n\n # We break the loop.\n break\n\n # The line is not empty nor equal to None.\n\n # We encode and yield the current line\n yield self._decode_output(current_line)", "response": "Run the given command and yield each line of output one by one."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef remove_key(self, key_to_remove):\n\n if isinstance(self.main_dictionnary, dict):\n # The main dictionnary is a dictionnary\n\n if isinstance(key_to_remove, list):\n # The parsed key to remove is a list.\n\n for key in key_to_remove:\n # We 
loop through the list of key to remove.\n\n # We delete the key from the dictionnary.\n del self.main_dictionnary[key]\n else:\n # The parsed key to remove is not a list.\n\n try:\n # We delete the given key from the dictionnary.\n del self.main_dictionnary[key_to_remove]\n except KeyError:\n pass\n\n # We return the final dictionnary.\n return self.main_dictionnary\n\n # The main dictionnary is not a dictionnary.\n\n # We return None.\n return None", "response": "Removes a given key from a given dictionary."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nrenames the given keys from the given dictionary. :param key_to_rename: The key(s) to rename. Expected format: :code:`{old:new}` :type key_to_rename: dict :param strict: Tell us if we have to rename the exact index or the index which looks like the given key(s) :return: The well formatted dict. :rtype: dict|None", "response": "def rename_key(self, key_to_rename, strict=True):\n \"\"\"\n Rename the given keys from the given dictionary.\n\n :param key_to_rename:\n The key(s) to rename.\n Expected format: :code:`{old:new}`\n :type key_to_rename: dict\n\n :param strict:\n Tell us if we have to rename the exact index or\n the index which looks like the given key(s)\n\n :return: The well formatted dict.\n :rtype: dict|None\n \"\"\"\n\n if isinstance(self.main_dictionnary, dict) and isinstance(key_to_rename, dict):\n # * The given main directory is a dictionnary.\n # and\n # * The given key to rename is a dictionnary.\n\n for old, new in key_to_rename.items():\n # We loop through the key to raname.\n\n if strict:\n # The strict method is activated.\n if old in self.main_dictionnary:\n # The old key is in the main dictionnary.\n\n # We initiate the new with the old and remove the old content.\n self.main_dictionnary[new] = self.main_dictionnary.pop(old)\n else:\n # The strict method is not activated.\n\n # We initiate the elements to rename.\n to_rename = {}\n\n for index in 
self.main_dictionnary:\n # We loop throught the indexes of the main dictionnary.\n\n if old in index:\n # The old key is into the index name.\n\n # We append the index name and the new index to our\n # local list to rename.\n to_rename.update({index: new[:-1] + index.split(old)[-1]})\n\n # We run this method against the local list to rename in order\n # to rename the element.\n self.main_dictionnary = Dict(self.main_dictionnary).rename_key(\n to_rename, True\n )\n\n # We return the final list.\n return self.main_dictionnary\n\n # * The given main directory is not a dictionnary.\n # or\n # * The given key to rename is not a dictionnary.\n\n # We return None.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nmerges the content of to_merge into the given main dictionnary. :param to_merge: The dictionnary to merge. :type to_merge: dict :param strict: Tell us if we have to strictly merge lists. :code:`True`: We follow index :code`False`: We follow element (content) :type strict: bool :return: The merged dict. 
:rtype: dict", "response": "def merge(self, to_merge, strict=True):\n \"\"\"\n Merge the content of to_merge into the given main dictionnary.\n\n :param to_merge: The dictionnary to merge.\n :type to_merge: dict\n\n :param strict:\n Tell us if we have to strictly merge lists.\n\n :code:`True`: We follow index\n :code`False`: We follow element (content)\n :type strict: bool\n\n :return: The merged dict.\n :rtype: dict\n \"\"\"\n\n # We initiate a variable which will save our result.\n result = {}\n\n for element in to_merge:\n # We loop throught the given dict to merge.\n\n if element in self.main_dictionnary:\n # The currently read element is in the main dict.\n\n if isinstance(to_merge[element], dict) and isinstance(\n self.main_dictionnary[element], dict\n ):\n # They are in both side dict.\n\n # We merge the dict tree and save into result.\n result[element] = Dict(self.main_dictionnary[element]).merge(\n to_merge[element]\n )\n\n elif isinstance(to_merge[element], list) and isinstance(\n self.main_dictionnary[element], list\n ):\n # They are in both side list.\n\n # We merge the lists and save into result.\n result[element] = List(self.main_dictionnary[element]).merge(\n to_merge[element], strict\n )\n else:\n # They are not list, not dict.\n\n # We append the currently read element to the result.\n result.update({element: to_merge[element]})\n else:\n # The currently read element is not into the main\n # dict.\n\n # We append the currently read element to the result.\n result.update({element: to_merge[element]})\n\n for element in self.main_dictionnary:\n # We loop through each element of the main dict.\n\n if element not in result:\n # The currently read element is not into\n # the result.\n\n # We append it to the result.\n result[element] = self.main_dictionnary[element]\n\n # We return the result.\n return result"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsaving a dictionnary into a JSON file.", "response": "def 
to_json(self, destination):\n \"\"\"\n Save a dictionnary into a JSON file.\n\n :param destination:\n A path to a file where we're going to\n write the converted dict into a JSON format.\n :type destination: str\n \"\"\"\n\n try:\n with open(destination, \"w\") as file:\n # We open the file we are going to write.\n # Note: We always overwrite the destination.\n\n # We save the current dictionnary into a json format.\n dump(\n self.main_dictionnary,\n file,\n ensure_ascii=False,\n indent=4,\n sort_keys=True,\n )\n except UnicodeEncodeError: # pragma: no cover\n with open(destination, \"w\", encoding=\"utf-8\") as file:\n # We open the file we are going to write.\n # Note: We always overwrite the destination.\n\n # We save the current dictionnary into a json format.\n dump(\n self.main_dictionnary,\n file,\n ensure_ascii=False,\n indent=4,\n sort_keys=True,\n )"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef to_yaml(self, destination, flow_style=False):\n\n with open(destination, \"w\") as file:\n # We open the file we are going to write.\n # Note: We always overwrite the destination.\n\n # We save the current dictionnary into a json format.\n dump_yaml(\n self.main_dictionnary,\n file,\n encoding=\"utf-8\",\n allow_unicode=True,\n indent=4,\n default_flow_style=flow_style,\n )", "response": "Save a dictionnary into a YAML file."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nfixes the path of the given path. :param splited_path: A list to convert to the right path. :type splited_path: list :return: The fixed path. 
:rtype: str", "response": "def fix_path(self, splited_path=None):\n \"\"\"\n Fix the path of the given path.\n\n :param splited_path: A list to convert to the right path.\n :type splited_path: list\n\n :return: The fixed path.\n :rtype: str\n \"\"\"\n\n if not splited_path:\n # No split path was given.\n\n # We initiate a variable which will save the split path.\n split_path = []\n\n if self.directory:\n # The parsed directory is not empty or equal to None.\n\n if \"/\" in self.directory:\n # We split the separator.\n split_path = self.directory.split(\"/\")\n elif \"\\\\\" in self.directory:\n # We split the separator.\n split_path = self.directory.split(\"\\\\\")\n else:\n split_path = [self.directory]\n\n # We run the same function with the splited_path argument filled.\n return self.fix_path(splited_path=[x for x in split_path if x])\n\n # We return the directory.\n return self.directory\n\n # We join the split elements with the directory separator as glue.\n return directory_separator.join(splited_path) + directory_separator"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write(self, data_to_write, overwrite=False):\n\n if overwrite or not path.isfile(self.file):\n # * We have to overwrite the file data.\n # or\n # * The file path does not already exist.\n\n with open(self.file, \"w\", encoding=\"utf-8\", newline=\"\\n\") as file:\n # We prepare the file for writing.\n\n if data_to_write and isinstance(data_to_write, str):\n # * A data to write is given.\n # and\n # * The data to write is a string\n\n # We write the string into the file.\n file.write(data_to_write)\n else:\n # * We do not have to overwrite the file data.\n # or\n # * The file path already exists.\n\n with open(self.file, \"a\", encoding=\"utf-8\", newline=\"\\n\") as file:\n # We prepare the file for append writing.\n\n if data_to_write and isinstance(data_to_write, str):\n # * A data to write is given.\n # and\n 
# * The data to write is a string\n\n # We append the string into the file.\n file.write(data_to_write)", "response": "Writes or appends the given data into the given file path."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef read(self):\n\n try:\n with open(self.file, \"r\", encoding=\"utf-8\") as file:\n # We open and read a file.\n\n # We get the file content.\n funilrys = file.read()\n except UnicodeDecodeError: # pragma: no cover\n with open(self.file, \"r\") as file:\n # We open and read a file.\n\n # We get the file content.\n funilrys = file.read()\n\n # We return the file content.\n return funilrys", "response": "Read a given file path and return its content."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn a well formatted list of all the available user s entries.", "response": "def format(self):\n \"\"\"\n Return a well formatted list. Basicaly, it's sort a list and remove duplicate.\n\n :return: A sorted, without duplicate, list.\n :rtype: list\n \"\"\"\n\n try:\n return sorted(list(set(self.main_list)), key=str.lower)\n\n except TypeError: # pragma: no cover\n return self.main_list"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef custom_format(self, key_method, reverse=False):\n\n try:\n return sorted(list(set(self.main_list)), key=key_method, reverse=reverse)\n except TypeError: # pragma: no cover\n return self.main_list", "response": "Returns a well formatted list."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nmerge to_merge into the given main list. :param to_merge: The list to merge. :type to_merge: list :param strict: Tell us if we have to respect index (True) or not (False). :type strict: bool :return: The merged list. 
:rtype: list", "response": "def merge(self, to_merge, strict=True):\n \"\"\"\n Merge to_merge into the given main list.\n\n :param to_merge: The list to merge.\n :type to_merge: list\n\n :param strict:\n Tell us if we have to respect index (True)\n or not (False).\n :type strict: bool\n\n :return: The merged list.\n :rtype: list\n \"\"\"\n\n # We initiate a variable which will save the\n # result\n result = []\n\n if strict:\n # We are in strict mode.\n\n for index, element in enumerate(to_merge):\n # We loop through each element of the list to merge\n # to the main dict.\n\n try:\n if isinstance(element, dict) and isinstance(\n self.main_list[index], dict\n ):\n # The currently read element is a dict.\n\n # We merge its content into the main dict\n # and append into the result.\n result.append(Dict(self.main_list[index]).merge(element))\n elif isinstance(element, list) and isinstance(\n self.main_list[index], list\n ):\n # The currently read element is a list.\n\n # We loop through this method.\n result.append(List(self.main_list[index]).merge(element))\n else:\n # The currently read element is not a list\n # nor a dict.\n\n # We append the element to the result.\n result.append(element)\n except IndexError: # pragma: no cover\n # The index does not exist.\n # Which means that for example one list is bigger\n # than the other one.\n\n # We append the element to the result.\n result.append(element)\n else:\n # We are not is strict mode.\n\n # We initiate the result with the main list.\n result = self.main_list\n\n for element in to_merge:\n # We loop through the element to merge.\n\n if element not in result:\n # The currently read element is not\n # in the result.\n\n # We append it to the result\n result.append(element)\n\n # We return the result.\n return result"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef not_matching_list(self):\n\n pre_result = comp(self.regex)\n\n return [x for x in self.data 
if not pre_result.search(str(x))]", "response": "Return a list of strings which don't match the\n given regex."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning the data of the match status.", "response": "def match(self):\n \"\"\"\n Used to get an exploitable result of re.search\n\n :return: The data of the match status.\n :rtype: mixed\n \"\"\"\n\n # We initiate this variable which is going to contain the returned data\n result = []\n\n # We compile the regex string\n to_match = comp(self.regex)\n\n # In case we have to use the implementation of ${BASH_REMATCH} we use\n # re.findall otherwise, we use re.search\n if self.rematch: # pylint: disable=no-member\n pre_result = to_match.findall(self.data)\n else:\n pre_result = to_match.search(self.data)\n\n if self.return_data and pre_result: # pylint: disable=no-member\n if self.rematch: # pylint: disable=no-member\n for data in pre_result:\n if isinstance(data, tuple):\n result.extend(list(data))\n else:\n result.append(data)\n\n if self.group != 0: # pylint: disable=no-member\n return result[self.group] # pylint: disable=no-member\n\n else:\n result = pre_result.group(\n self.group # pylint: disable=no-member\n ).strip()\n\n return result\n\n if not self.return_data and pre_result: # pylint: disable=no-member\n return True\n\n return False"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef replace(self):\n\n if self.replace_with: # pylint: disable=no-member\n return substrings(\n self.regex,\n self.replace_with, # pylint: disable=no-member\n self.data,\n self.occurences, # pylint: disable=no-member\n )\n\n return self.data", "response": "Used to replace a matched string with another."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\ndownloads the given link and returns or saves its content at the given destination.", "response": "def text(self):\n \"\"\"\n Download the given link and return or 
save its :code:`requests.text`\n at the given destination.\n\n :rtype: mixed\n\n :raises:\n :code:`Exception`\n If the status code is not :code:`200`.\n \"\"\"\n\n try:\n # We request the link.\n req = requests.get(self.link, verify=self.verification)\n\n if req.status_code == 200:\n # The request http status code is equal to 200.\n\n if self.return_data:\n # We have to return the data.\n\n # We return the link content.\n return req.text\n\n # We save the link content to the parsed destination.\n File(self.destination).write(req.text, overwrite=True)\n\n # We return True.\n return True\n\n # The request http status code is not equal to 200.\n\n # We raise an exception saying that we were unable to download.\n raise Exception(\"Unable to download %s.\" % repr(self.link))\n except requests.exceptions.ConnectionError:\n print(Fore.RED + \"No Internet connection available.\" + Style.RESET_ALL)\n exit(1)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef count(self):\n\n if self.status:\n # The status is parsed.\n\n # We increase the number of tested.\n PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] += 1\n\n if (\n self.status.lower() in PyFunceble.STATUS[\"list\"][\"up\"]\n or self.status.lower() in PyFunceble.STATUS[\"list\"][\"valid\"]\n ):\n # The status is in the list of up status.\n\n # We increase the number of up.\n PyFunceble.INTERN[\"counter\"][\"number\"][\"up\"] += 1\n elif self.status.lower() in PyFunceble.STATUS[\"list\"][\"down\"]:\n # The status is in the list of down status.\n\n # We increase the number of down.\n PyFunceble.INTERN[\"counter\"][\"number\"][\"down\"] += 1\n else:\n # The status is not in the list of up nor down status.\n\n # We increase the number of invalid.\n PyFunceble.INTERN[\"counter\"][\"number\"][\"invalid\"] += 1", "response": "Count the number of domain for each status."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script 
to\ncalculate the percentage of each status.", "response": "def _calculate(cls):\n \"\"\"\n Calculate the percentage of each status.\n \"\"\"\n\n # We map the current state/counters of the different status.\n percentages = {\n \"up\": PyFunceble.INTERN[\"counter\"][\"number\"][\"up\"],\n \"down\": PyFunceble.INTERN[\"counter\"][\"number\"][\"down\"],\n \"invalid\": PyFunceble.INTERN[\"counter\"][\"number\"][\"invalid\"],\n }\n\n for percentage in percentages:\n # We loop through our map index.\n\n # We calculate the percentage.\n calculation = (\n percentages[percentage]\n * 100\n // PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"]\n )\n\n # And we update the percentage counter of the actual status.\n PyFunceble.INTERN[\"counter\"][\"percentage\"].update({percentage: calculation})"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nprint on screen and on file the percentage of each status.", "response": "def log(self):\n \"\"\"\n Print on screen and on file the percentages for each status.\n \"\"\"\n\n if (\n PyFunceble.CONFIGURATION[\"show_percentage\"]\n and PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] > 0\n ):\n # * We are allowed to show the percentage on screen.\n # and\n # * The number of tested is greater than 0.\n\n # We initiate the output file.\n output = (\n PyFunceble.OUTPUT_DIRECTORY\n + PyFunceble.OUTPUTS[\"parent_directory\"]\n + PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"parent\"]\n + PyFunceble.OUTPUTS[\"logs\"][\"directories\"][\"percentage\"]\n + PyFunceble.OUTPUTS[\"logs\"][\"filenames\"][\"percentage\"]\n )\n\n # We delete the output file if it does exist.\n File(output).delete()\n\n # We calculate the percentage of each statuses.\n self._calculate()\n\n if not PyFunceble.CONFIGURATION[\"quiet\"]:\n # The quiet mode is activated.\n\n # We print a new line.\n print(\"\\n\")\n\n # We print the percentage header on file and screen.\n Prints(None, \"Percentage\", output).header()\n\n # We 
construct the different lines/data to print on screen and file.\n lines_to_print = [\n [\n PyFunceble.STATUS[\"official\"][\"up\"],\n str(PyFunceble.INTERN[\"counter\"][\"percentage\"][\"up\"]) + \"%\",\n PyFunceble.INTERN[\"counter\"][\"number\"][\"up\"],\n ],\n [\n PyFunceble.STATUS[\"official\"][\"down\"],\n str(PyFunceble.INTERN[\"counter\"][\"percentage\"][\"down\"]) + \"%\",\n PyFunceble.INTERN[\"counter\"][\"number\"][\"down\"],\n ],\n [\n PyFunceble.STATUS[\"official\"][\"invalid\"],\n str(PyFunceble.INTERN[\"counter\"][\"percentage\"][\"invalid\"])\n + \"%\",\n PyFunceble.INTERN[\"counter\"][\"number\"][\"invalid\"],\n ],\n ]\n\n if PyFunceble.CONFIGURATION[\"syntax\"]:\n # We are checking for syntax.\n\n # We update the denomination of the UP.\n lines_to_print[0][0] = PyFunceble.STATUS[\"official\"][\"valid\"]\n\n # And we unset the INACTIVE line.\n del lines_to_print[1]\n\n for to_print in lines_to_print:\n # We loop throught the different line to print.\n # (one line for each status.)\n\n # And we print the current status line on file and screen.\n Prints(to_print, \"Percentage\", output).data()\n\n elif PyFunceble.INTERN[\"counter\"][\"number\"][\"tested\"] > 0:\n # * We are not allowed to show the percentage on screen.\n # but\n # * The number of tested is greater than 0.\n\n # We run the calculation.\n # Note: The following is needed, because all counter calculation are\n # done by this class.\n self._calculate()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef is_url_valid(self, url=None, return_base=False, return_formatted=False):\n\n # We initiate a variable which will save the initial base in case\n # we have to convert the base to IDNA.\n initial_base = None\n\n if url:\n # The given url is not empty.\n\n # We initiate the element to test.\n to_test = url\n elif self.element:\n # The globaly given url is not empty.\n\n # We initiate the element to test.\n to_test = self.element\n else:\n # The 
given url is empty.\n\n # We initiate the element to test from the globaly URl to test.\n to_test = PyFunceble.INTERN[\"to_test\"]\n\n if to_test.startswith(\"http\"):\n # The element to test starts with http.\n\n try:\n # We initiate a regex which will match the domain or the url base.\n regex = r\"(^(http:\\/\\/|https:\\/\\/)(.+?(?=\\/)|.+?$))\"\n\n # We extract the url base with the help of the initiated regex.\n initial_base = base = Regex(\n to_test, regex, return_data=True, rematch=True\n ).match()[2]\n\n if PyFunceble.CONFIGURATION[\"idna_conversion\"]:\n # We have to convert the domain to IDNA.\n\n # We convert the initial base to IDNA.\n base = domain2idna(base)\n\n # We check if the url base is a valid domain.\n domain_status = self.is_domain_valid(base)\n\n # We check if the url base is a valid IP.\n ip_status = self.is_ip_valid(base)\n\n if domain_status or ip_status:\n # * The url base is a valid domain.\n # and\n # * The url base is a valid IP.\n\n if PyFunceble.CONFIGURATION[\"idna_conversion\"] and return_formatted:\n # * We have to convert to IDNA.\n # and\n # * We have to return the converted full URL.\n\n # We return the converted full URL.\n return Regex(\n to_test,\n initial_base,\n escape=True,\n return_data=True,\n replace_with=base,\n occurences=1,\n ).replace()\n\n if return_formatted:\n # * We do not have to convert to IDNA.\n # but\n # * We have to return the full URL.\n\n # We return the initially given URL.\n return to_test\n\n if return_base:\n # We have to return the base of the URL.\n\n # We return the base of the URL.\n return base\n\n # We return True.\n return True\n except TypeError:\n pass\n\n if return_formatted:\n # We have to return an URL.\n\n # We return the initily given URL.\n return to_test\n\n # We return False.\n return False", "response": "Check if the given URL is valid."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking if the given domain is a valid sub - domain.", "response": 
"def is_domain_valid(\n self, domain=None, subdomain_check=False\n ): # pylint:disable=too-many-return-statements, too-many-branches\n \"\"\"\n Check if the given domain is a valid.\n\n :param domain: The domain to validate.\n :type domain: str\n\n :param subdomain_check:\n Activate the subdomain checking.\n :type subdomain_check: bool\n\n :return: The validity of the sub-domain.\n :rtype: bool\n \"\"\"\n\n # We initate our regex which will match for valid domains.\n regex_valid_domains = r\"^(?=.{0,253}$)(([a-z0-9][a-z0-9-]{0,61}[a-z0-9]|[a-z0-9])\\.)+((?=.*[^0-9])([a-z0-9][a-z0-9-]{0,61}[a-z0-9](?:\\.)?|[a-z0-9](?:\\.)?))$\" # pylint: disable=line-too-long\n\n # We initiate our regex which will match for valid subdomains.\n regex_valid_subdomains = r\"^(?=.{0,253}$)(([a-z0-9_][a-z0-9-_]{0,61}[a-z0-9_-]|[a-z0-9])\\.)+((?=.*[^0-9])([a-z0-9][a-z0-9-]{0,61}[a-z0-9]|[a-z0-9]))$\" # pylint: disable=line-too-long\n\n if domain:\n # A domain is given.\n\n # We set the element to test as the parsed domain.\n to_test = domain\n elif self.element:\n # A domain is globally given.\n\n # We set the globally parsed domain.\n to_test = self.element\n else:\n # A domain is not given.\n\n # We set the element to test as the currently tested element.\n to_test = PyFunceble.INTERN[\"to_test\"]\n\n try:\n # We get the position of the last point.\n last_point_index = to_test.rindex(\".\")\n # And with the help of the position of the last point, we get the domain extension.\n extension = to_test[last_point_index + 1 :]\n\n if not extension and to_test.endswith(\".\"):\n try:\n extension = [x for x in to_test.split(\".\") if x][-1]\n except IndexError:\n pass\n\n if not extension or extension not in PyFunceble.INTERN[\"iana_db\"]:\n # * The extension is not found.\n # or\n # * The extension is not into the IANA database.\n\n # We return false.\n return False\n\n if (\n Regex(to_test, regex_valid_domains, return_data=False).match()\n and not subdomain_check\n ):\n # * The element pass 
the domain validation.\n # and\n # * We are not checking if it is a subdomain.\n\n # We return True. The domain is valid.\n return True\n\n # The element did not pass the domain validation. That means that\n # it has invalid character or the position of - or _ are not right.\n\n if extension in PyFunceble.INTERN[\"psl_db\"]:\n # The extension is into the psl database.\n\n for suffix in PyFunceble.INTERN[\"psl_db\"][extension]:\n # We loop through the element of the extension into the psl database.\n\n try:\n # We try to get the position of the currently read suffix\n # in the element ot test.\n suffix_index = to_test.rindex(\".\" + suffix)\n\n # We get the element to check.\n # The idea here is to delete the suffix, then retest with our\n # subdomains regex.\n to_check = to_test[:suffix_index]\n\n if \".\" not in to_check and subdomain_check:\n # * There is no point into the new element to check.\n # and\n # * We are checking if it is a subdomain.\n\n # We return False, it is not a subdomain.\n return False\n\n if \".\" in to_check and subdomain_check:\n # * There is a point into the new element to check.\n # and\n # * We are checking if it is a subdomain.\n\n # We return True, it is a subdomain.\n return True\n\n # We are not checking if it is a subdomain.\n\n if \".\" in to_check:\n # There is a point into the new element to check.\n\n # We check if it passes our subdomain regex.\n # * True: It's a valid domain.\n # * False: It's an invalid domain.\n return Regex(\n to_check, regex_valid_subdomains, return_data=False\n ).match()\n\n except ValueError:\n # In case of a value error because the position is not found,\n # we continue to the next element.\n pass\n\n # * The extension is not into the psl database.\n # or\n # * there was no point into the suffix checking.\n\n # We get the element before the last point.\n to_check = to_test[:last_point_index]\n\n if \".\" in to_check and subdomain_check:\n # * There is a point in to_check.\n # and\n # * We are checking 
if it is a subdomain.\n\n # We return True, it is a subdomain.\n return True\n\n # We are not checking if it is a subdomain.\n\n if \".\" in to_check:\n # There is a point in to_check.\n\n # We check if it passes our subdomain regex.\n # * True: It's a valid domain.\n # * False: It's an invalid domain.\n return Regex(\n to_check, regex_valid_subdomains, return_data=False\n ).match()\n\n except (ValueError, AttributeError):\n # In case of a value or attribute error we ignore them.\n pass\n\n # And we return False, the domain is not valid.\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nchecks if the given domain is a subdomain.", "response": "def is_subdomain(self, domain=None):\n \"\"\"\n Check if the given domain is a subdomain.\n\n :param domain: The domain to validate.\n :type domain: str\n\n :return: The validity of the subdomain.\n :rtype: bool\n \"\"\"\n\n if domain:\n # A domain is given.\n\n # We set the element to test as the parsed domain.\n to_test = domain\n elif self.element:\n # A domain is globally given.\n\n # We set the globally parsed domain.\n to_test = self.element\n else:\n # A domain is not given.\n\n # We set the element to test as the currently tested element.\n to_test = PyFunceble.INTERN[\"to_test\"]\n\n # We return the status of the check.\n return self.is_domain_valid(to_test, subdomain_check=True)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if the given IP is a valid IPv4.", "response": "def is_ip_valid(self, ip_to_check=None):\n \"\"\"\n Check if the given IP is a valid IPv4.\n\n :param ip_to_check: The IP to test.\n :type ip_to_check: str\n\n :return: The validity of the IP.\n :rtype: bool\n\n .. 
note::\n We only test IPv4 because, for now, we only support them.\n \"\"\"\n\n # We initiate our regex which will match for valid IPv4.\n regex_ipv4 = r\"^(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?|[0-9]{1,}\\/[0-9]{1,})$\" # pylint: disable=line-too-long\n\n if ip_to_check:\n # An element is locally given.\n\n # We consider it as the element to test.\n to_test = ip_to_check\n elif self.element:\n # An element is given globally.\n\n # We consider it as the element to test.\n to_test = self.element\n else:\n # An element is not locally given.\n\n # We consider the global element to test as the element to test.\n to_test = PyFunceble.INTERN[\"to_test\"]\n\n # We check if it passes our IPv4 regex.\n # * True: It's a valid IPv4.\n # * False: It's an invalid IPv4.\n return Regex(to_test, regex_ipv4, return_data=False).match()"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef is_ip_range(self, ip_to_check=None):\n\n if ip_to_check:\n # An element is locally given.\n\n # We consider it as the element to test.\n to_test = ip_to_check\n elif self.element:\n # An element is given globally.\n\n # We consider it as the element to test.\n to_test = self.element\n else:\n # An element is not locally given.\n\n # We consider the global element to test as the element to test.\n to_test = PyFunceble.INTERN[\"to_test\"]\n\n if self.is_ip_valid(to_test):\n # We initiate our regex which will match for valid IPv4 ranges.\n regex_ipv4_range = r\"^(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.([0-9]{1,}\\/[0-9]{1,})$\" # pylint: disable=line-too-long\n\n # We check if it passes our regex.\n # * True: It's an IPv4 range.\n # * False: It's not an IPv4 range.\n return Regex(to_test, regex_ipv4_range, return_data=False).match()\n return False", "response": 
"Check if the given IP is a valid IPv4 range."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get(cls):\n\n if PyFunceble.INTERN[\"to_test_type\"] == \"domain\":\n # We are testing for domain or ip.\n\n if Check().is_domain_valid() or Check().is_ip_valid():\n # * The domain is valid.\n # or\n # * The IP is valid.\n\n # We handle and return the valid status.\n return SyntaxStatus(PyFunceble.STATUS[\"official\"][\"valid\"]).handle()\n elif PyFunceble.INTERN[\"to_test_type\"] == \"url\":\n # We are testing for URL.\n\n if Check().is_url_valid():\n # * The url is valid.\n\n # We handle and return the valid status.\n return SyntaxStatus(PyFunceble.STATUS[\"official\"][\"valid\"]).handle()\n else:\n raise Exception(\"Unknown test type.\")\n\n # We handle and return the invalid status.\n return SyntaxStatus(PyFunceble.STATUS[\"official\"][\"invalid\"]).handle()", "response": "Execute the logic behind the Syntax handling."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _reformat_historical_formating_error(self): # pragma: no cover\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n # We construct the possible path to an older version of the database.\n historical_formating_error = (\n PyFunceble.CURRENT_DIRECTORY + \"inactive-db.json\"\n )\n\n if PyFunceble.path.isfile(historical_formating_error):\n # The historical file already exists.\n\n # We get its content.\n data = Dict().from_json(File(historical_formating_error).read())\n\n # We initiate a variable which will save the data that is going\n # to be merged.\n data_to_parse = {}\n\n # We get the database's top keys.\n top_keys = data.keys()\n\n for top_key in top_keys:\n # We loop through the list of upper keys.\n\n # We get the lowest keys.\n low_keys = data[top_key].keys()\n\n # We initiate the data to parse.\n data_to_parse[top_key] = 
{}\n\n for low_key in low_keys:\n # We loop through the list of lower keys.\n\n if low_key.isdigit():\n # The current low key is a digit.\n\n # We parse its content (from the old) into the new format.\n # In between, we remove 30 days from the low_key so that\n # it ends up in the past. This way they will be retested\n # automatically.\n data_to_parse[top_key][\n int(low_key) - (self.one_day_in_seconds * 30)\n ] = data[top_key][low_key]\n else:\n # The current low key is not a digit.\n\n # We parse its content (from the old) into the new format.\n # In between, we remove 30 days from the current time so that\n # it ends up in the past. This way they will be retested\n # automatically.\n data_to_parse[top_key][\n int(PyFunceble.time()) - (self.one_day_in_seconds * 30)\n ] = data[top_key][low_key]\n\n if \"inactive_db\" in PyFunceble.INTERN:\n # The current (new) database is not empty.\n\n # We add the content of the old database into the current one.\n PyFunceble.INTERN[\"inactive_db\"].update(data_to_parse)\n else:\n # The current (new) database is empty.\n\n # We replace the content with the data_to_parse as it is compliant\n # with the new format.\n PyFunceble.INTERN[\"inactive_db\"] = data_to_parse\n\n # We delete the old database file.\n File(historical_formating_error).delete()", "response": "Reformat the historical formatting error into the new format."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _merge(self):\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n # We get the content of the database.\n database_content = Dict().from_json(File(self.inactive_db_path).read())\n\n # We get the database top keys.\n database_top_keys = database_content.keys()\n\n for database_top_key in database_top_keys:\n # We loop through the list of database top keys.\n\n if database_top_key not in PyFunceble.INTERN[\"inactive_db\"]:\n # The currently read 
top key is not already into the database.\n\n # We initiate the currently read key with the same key from\n # our database file.\n PyFunceble.INTERN[\"inactive_db\"][\n database_top_key\n ] = database_content[database_top_key]\n else:\n # The currently read top key is already into the database.\n\n # We get the list of lower indexes.\n database_low_keys = database_content[database_top_key].keys()\n\n for database_low_key in database_low_keys:\n # We loop through the lower keys.\n\n if (\n database_low_key\n not in PyFunceble.INTERN[\"inactive_db\"][database_top_key]\n ): # pragma: no cover\n # The lower key is not already into the database.\n\n # We initiate the currently read low and top key with the\n # same combinaison from our database file.\n PyFunceble.INTERN[\"inactive_db\"][database_top_key][\n database_low_key\n ] = database_content[database_top_key][database_low_key]\n else:\n # The lower key is not already into the database.\n\n # We exted the currently read low and top key combinaison\n # with the same combinaison from our database file.\n PyFunceble.INTERN[\"inactive_db\"][database_top_key][\n database_low_key\n ].extend(\n database_content[database_top_key][database_low_key]\n )\n\n # And we format the list of element to ensure that there is no\n # duplicate into the database content.\n PyFunceble.INTERN[\"inactive_db\"][database_top_key][\n database_low_key\n ] = List(\n PyFunceble.INTERN[\"inactive_db\"][database_top_key][\n database_low_key\n ]\n ).format()", "response": "Merge the real database with the older one which has already been set into the inactive database."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _retrieve(self):\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n # We get, format and initiate the historical database file.\n self._reformat_historical_formating_error()\n\n if 
PyFunceble.path.isfile(self.inactive_db_path):\n # The database file exists.\n\n # We merge our current database into already initiated one.\n self._merge()", "response": "Retrieve the current content of the inactive-db.json file."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nsaving the current database into the inactive-db.json file.", "response": "def _backup(self):\n \"\"\"\n Save the current database into the inactive-db.json file.\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n # We save the current database state into the database file.\n Dict(PyFunceble.INTERN[\"inactive_db\"]).to_json(self.inactive_db_path)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _timestamp(self):\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n if (\n \"inactive_db\" in PyFunceble.INTERN\n and PyFunceble.INTERN[\"file_to_test\"]\n in PyFunceble.INTERN[\"inactive_db\"]\n and PyFunceble.INTERN[\"inactive_db\"][PyFunceble.INTERN[\"file_to_test\"]]\n ):\n # The file we are testing is in the database and its content\n # is not empty.\n\n # We get the indexes of the current file (in the database).\n database_keys = [\n x\n for x in PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ].keys()\n if x.isdigit()\n ]\n\n if database_keys:\n # The list of keys is not empty.\n\n # We get the most recent date.\n recent_date = max(database_keys)\n else: # pragma: no cover\n # The list of keys is empty.\n\n # We return the current time.\n return int(PyFunceble.time())\n\n if int(PyFunceble.time()) > int(recent_date) + self.one_day_in_seconds:\n # The most recent time was more than one day ago.\n\n # We return the current time.\n return int(PyFunceble.time())\n\n # The most recent time was less than one day ago.\n\n if int(PyFunceble.time()) < int(recent_date) + 
self.days_in_seconds:\n # The most recent time was within the expected number of days for\n # retesting.\n\n # We return the most recent date.\n return int(recent_date)\n\n # The database subsystem is not activated.\n\n # We return the current time.\n return int(PyFunceble.time())", "response": "Return the timestamp where we are going to save our current list."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding the current time to the database.", "response": "def add(self):\n \"\"\"\n Save the current :code:`PyFunceble.CONFIGURATION['to_test']`\n into the current timestamp.\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n # We get the timestamp to use as index.\n timestamp = str(self._timestamp())\n\n if (\n \"inactive_db\" in PyFunceble.INTERN\n and PyFunceble.INTERN[\"file_to_test\"]\n in PyFunceble.INTERN[\"inactive_db\"]\n ):\n # * The file path is in the database.\n\n if (\n timestamp\n in PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ]\n ):\n # The timestamp is already in the database related to the file we\n # are testing.\n\n if (\n PyFunceble.INTERN[\"to_test\"]\n not in PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][timestamp]\n ):\n # The currently tested element is not in the database related\n # to the file we are testing.\n\n # We append the currently tested element into the database.\n PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][timestamp].append(PyFunceble.INTERN[\"to_test\"])\n else:\n # The timestamp is not in the database related to the file we\n # are testing.\n\n # We append the index and the database element into the database\n # related to the file we are testing.\n PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ].update({timestamp: [PyFunceble.INTERN[\"to_test\"]]})\n\n if (\n \"to_test\"\n in 
PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ]\n and PyFunceble.INTERN[\"to_test\"]\n in PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][\"to_test\"]\n ):\n # * The `to_test` index is in the database related to the file we\n # are testing.\n # and\n # * The element we are testing is in the `to_test` index related to\n # the file we are testing.\n\n # We remove the element from the list of elements to test.\n PyFunceble.INTERN[\"inactive_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n \"to_test\"\n ].remove(PyFunceble.INTERN[\"to_test\"])\n else:\n # The file path is not in the database.\n\n # We initiate the file path and its content into the database.\n PyFunceble.INTERN[\"inactive_db\"] = {\n PyFunceble.INTERN[\"file_to_test\"]: {\n timestamp: [PyFunceble.INTERN[\"to_test\"]]\n }\n }\n\n # And we save the data into the database.\n self._backup()"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nremoving all occurrences of to_test from the database.", "response": "def remove(self):\n \"\"\"\n Remove all occurrences of :code:`PyFunceble.CONFIGURATION['to_test']`\n from the database.\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n if PyFunceble.INTERN[\"file_to_test\"] in PyFunceble.INTERN[\"inactive_db\"]:\n # The file path is in the database.\n\n for data in PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ]:\n # We loop through the index of the file database.\n\n if (\n PyFunceble.INTERN[\"to_test\"]\n in PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][data]\n ):\n # The currently tested element is in the currently read index.\n\n # We remove the currently tested element from the read index.\n PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][data].remove(PyFunceble.INTERN[\"to_test\"])\n\n # And we save the data into the 
database.\n self._backup()"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ngets the content of the database.", "response": "def content(cls):\n \"\"\"\n Get the content of the database.\n\n :return: The content of the database.\n :rtype: list\n \"\"\"\n\n # We initiate a variable which will save what we are going to return.\n result = []\n\n if (\n PyFunceble.CONFIGURATION[\"inactive_database\"]\n and PyFunceble.INTERN[\"inactive_db\"]\n ):\n # * The database subsystem is activated.\n # and\n # * The database is not empty.\n\n for key in PyFunceble.INTERN[\"inactive_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ]:\n # We loop through the index of the current file database.\n\n if key == \"to_test\":\n # The current key is `to_test`.\n\n # We continue to the next element.\n continue\n\n # We extend the result with the content of the currently read index.\n result.extend(\n PyFunceble.INTERN[\"inactive_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n key\n ]\n )\n\n # We return the content of the database.\n return result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nchecking if the currently tested element is into the database.", "response": "def is_present(cls):\n \"\"\"\n Check if the currently tested element is into the database.\n \"\"\"\n\n if PyFunceble.CONFIGURATION[\"inactive_database\"]:\n # The database subsystem is activated.\n\n if PyFunceble.INTERN[\"to_test\"] in PyFunceble.INTERN[\n \"flatten_inactive_db\"\n ] or (\n PyFunceble.INTERN[\"file_to_test\"] in PyFunceble.INTERN[\"inactive_db\"]\n and PyFunceble.INTERN[\"inactive_db\"][PyFunceble.INTERN[\"file_to_test\"]]\n and \"to_test\"\n in PyFunceble.INTERN[\"inactive_db\"][PyFunceble.INTERN[\"file_to_test\"]]\n and PyFunceble.INTERN[\"to_test\"]\n in PyFunceble.INTERN[\"inactive_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n \"to_test\"\n ]\n ):\n return True\n\n return False"} {"SOURCE": "codesearchnet", "instruction": 
"Write a Python 3 function for\nretrieving the data from the database.", "response": "def _retrieve(self):\n \"\"\"\n Retrieve the data from the database.\n \"\"\"\n\n if self._authorization() and \"whois_db\" not in PyFunceble.INTERN:\n # The usage of the whois database is activated.\n\n if PyFunceble.path.isfile(self.whois_db_path):\n # The database file exist.\n\n # We merge our current database into already initiated one.\n PyFunceble.INTERN[\"whois_db\"] = Dict().from_json(\n File(self.whois_db_path).read()\n )\n else:\n # The database file does not exist.\n\n # We initiate an empty database.\n PyFunceble.INTERN[\"whois_db\"] = {}"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef is_in_database(self):\n\n if (\n self._authorization()\n and PyFunceble.INTERN[\"file_to_test\"] in PyFunceble.INTERN[\"whois_db\"]\n and PyFunceble.INTERN[\"to_test\"]\n in PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]]\n ):\n # * We are authorized to work.\n # and\n # * The given file path exist in the database.\n # and\n # * The element we are testing is in the database related to the\n # given file path.\n\n # We return True, the element we are testing is into the database.\n return True\n\n # * We are not authorized to work.\n # or\n # * The given file path does not exist in the database.\n # or\n # * The element we are testing is not in the database related to the\n # given file path.\n\n # We return False,the element we are testing is not into the database.\n return False", "response": "Check if the element is in the database."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nchecks if the current time is older than the one in the database.", "response": "def is_time_older(self):\n \"\"\"\n Check if the current time is older than the one in the database.\n \"\"\"\n\n if (\n self._authorization()\n and self.is_in_database()\n and int(\n 
PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n PyFunceble.INTERN[\"to_test\"]\n ][\"epoch\"]\n )\n < int(PyFunceble.time())\n ):\n # * We are authorized to work.\n # and\n # * The element we are testing is in the database.\n # and\n # * The epoch of the expiration date is less than our current epoch.\n\n # The expiration date is in the past, we return True.\n return True\n\n # The expiration date is in the future, we return False.\n return False"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nget the expiration date from the database.", "response": "def get_expiration_date(self):\n \"\"\"\n Get the expiration date from the database.\n\n :return: The expiration date from the database.\n :rtype: str|None\n \"\"\"\n\n if self._authorization() and self.is_in_database() and not self.is_time_older():\n # * We are authorized to work.\n # and\n # * The element we are testing is in the database.\n # and\n # * The expiration date is in the future.\n\n # We get the expiration date from the database.\n result = PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n PyFunceble.INTERN[\"to_test\"]\n ][\"expiration_date\"]\n\n if result:\n # The expiration date from the database is not empty nor\n # equal to None.\n\n # We return it.\n return result\n\n # We return None, there is no data to work with.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nadding the currently tested element into the database.", "response": "def add(self):\n \"\"\"\n Add the currently tested element into the database.\n \"\"\"\n\n if self._authorization():\n # We are authorized to work.\n\n if self.epoch < int(PyFunceble.time()):\n state = \"past\"\n else:\n state = \"future\"\n\n if self.is_in_database():\n # The element we are working with is in the database.\n\n if (\n str(self.epoch)\n != PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n 
PyFunceble.INTERN[\"to_test\"]\n ][\"epoch\"]\n ):\n # The given epoch is different from the one saved.\n\n # We update it.\n PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n PyFunceble.INTERN[\"to_test\"]\n ].update(\n {\n \"epoch\": str(self.epoch),\n \"state\": state,\n \"expiration_date\": self.expiration_date,\n }\n )\n\n elif self.is_time_older():\n # The expiration date from the database is in the past.\n\n if (\n PyFunceble.INTERN[\"whois_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][PyFunceble.INTERN[\"to_test\"]][\"state\"]\n != \"past\"\n ): # pragma: no cover\n # The state of the element in the database is not\n # equal to `past`.\n\n # We update it to `past`.\n PyFunceble.INTERN[\"whois_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ][PyFunceble.INTERN[\"to_test\"]].update({\"state\": \"past\"})\n elif (\n PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n PyFunceble.INTERN[\"to_test\"]\n ][\"state\"]\n != \"future\"\n ):\n # * The expiration date from the database is in the future.\n # and\n # * The state of the element in the database is not\n # equal to `future`.\n\n # We update it to `future`.\n PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]][\n PyFunceble.INTERN[\"to_test\"]\n ].update({\"state\": \"future\"})\n else:\n # The element we are working with is not in the database.\n\n if (\n not PyFunceble.INTERN[\"file_to_test\"]\n in PyFunceble.INTERN[\"whois_db\"]\n ):\n # The file path is not in the database.\n\n # We initiate it.\n PyFunceble.INTERN[\"whois_db\"][\n PyFunceble.INTERN[\"file_to_test\"]\n ] = {}\n\n # We create the first dataset.\n PyFunceble.INTERN[\"whois_db\"][PyFunceble.INTERN[\"file_to_test\"]].update(\n {\n PyFunceble.INTERN[\"to_test\"]: {\n \"epoch\": str(self.epoch),\n \"state\": state,\n \"expiration_date\": self.expiration_date,\n }\n }\n )\n\n # We do a safety backup of our database.\n self._backup()"} {"SOURCE": "codesearchnet", "instruction": "Here you 
have a function in Python 3, explain what it does\ndef travis_permissions(cls):\n\n if PyFunceble.CONFIGURATION[\"travis\"]:\n try:\n build_dir = PyFunceble.environ[\"TRAVIS_BUILD_DIR\"]\n commands = [\n \"sudo chown -R travis:travis %s\" % (build_dir),\n \"sudo chgrp -R travis %s\" % (build_dir),\n \"sudo chmod -R g+rwX %s\" % (build_dir),\n \"sudo chmod 777 -Rf %s.git\"\n % (build_dir + PyFunceble.directory_separator),\n r\"sudo find %s -type d -exec chmod g+x '{}' \\;\" % (build_dir),\n ]\n\n for command in commands:\n Command(command).execute()\n\n if Command(\"git config core.sharedRepository\").execute() == \"\":\n Command(\"git config core.sharedRepository group\").execute()\n except KeyError:\n pass", "response": "Set permissions for the travis project."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _travis(self):\n\n if PyFunceble.CONFIGURATION[\"travis\"]:\n try:\n _ = PyFunceble.environ[\"TRAVIS_BUILD_DIR\"]\n time_autorisation = False\n\n try:\n time_autorisation = int(PyFunceble.time()) >= int(\n PyFunceble.INTERN[\"start\"]\n ) + (int(PyFunceble.CONFIGURATION[\"travis_autosave_minutes\"]) * 60)\n except KeyError:\n if self.last and not self.bypass:\n raise Exception(\n \"Please review the way `ExecutionTime()` is called.\"\n )\n\n if self.last or time_autorisation or self.bypass:\n Percentage().log()\n self.travis_permissions()\n\n command = 'git add --all && git commit -a -m \"%s\"'\n\n if self.last or self.bypass:\n if PyFunceble.CONFIGURATION[\"command_before_end\"]:\n for line in Command(\n PyFunceble.CONFIGURATION[\"command_before_end\"]\n ).run():\n sys_stdout.write(\"{}\\n\".format(line))\n\n self.travis_permissions()\n\n message = (\n PyFunceble.CONFIGURATION[\"travis_autosave_final_commit\"]\n + \" [ci skip]\"\n )\n\n Command(command % message).execute()\n else:\n if PyFunceble.CONFIGURATION[\"command\"]:\n for line in Command(\n PyFunceble.CONFIGURATION[\"command\"]\n 
).run():\n sys_stdout.write(\"{}\\n\".format(line))\n\n self.travis_permissions()\n\n Command(\n command % PyFunceble.CONFIGURATION[\"travis_autosave_commit\"]\n ).execute()\n\n print(\n Command(\n \"git push origin %s\"\n % PyFunceble.CONFIGURATION[\"travis_branch\"]\n ).execute()\n )\n exit(0)\n except KeyError:\n pass", "response": "This function is called by the travis daemon when the application is running in Travis CI."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nexecutes the logic behind the URL handling.", "response": "def get(cls): # pragma: no cover\n \"\"\"\n Execute the logic behind the URL handling.\n\n :return: The status of the URL.\n :rtype: str\n \"\"\"\n\n if Check().is_url_valid() or PyFunceble.CONFIGURATION[\"local\"]:\n # * The url is valid.\n # or\n # * We are testing in/for a local or private network.\n\n if \"current_test_data\" in PyFunceble.INTERN:\n PyFunceble.INTERN[\"current_test_data\"][\"url_syntax_validation\"] = True\n\n # We initiate the HTTP status code.\n PyFunceble.INTERN.update({\"http_code\": HTTPCode().get()})\n\n # We initiate the list of active status code.\n active_list = []\n active_list.extend(PyFunceble.HTTP_CODE[\"list\"][\"potentially_up\"])\n active_list.extend(PyFunceble.HTTP_CODE[\"list\"][\"up\"])\n\n # We initiate the list of inactive status code.\n inactive_list = []\n inactive_list.extend(PyFunceble.HTTP_CODE[\"list\"][\"potentially_down\"])\n inactive_list.append(\"*\" * 3)\n\n if PyFunceble.INTERN[\"http_code\"] in active_list:\n # The extracted HTTP status code is in the list of active list.\n\n # We handle and return the up status.\n return URLStatus(PyFunceble.STATUS[\"official\"][\"up\"]).handle()\n\n if PyFunceble.INTERN[\"http_code\"] in inactive_list:\n # The extracted HTTP status code is in the list of inactive list.\n\n # We handle and return the down status.\n return URLStatus(PyFunceble.STATUS[\"official\"][\"down\"]).handle()\n\n # The extracted HTTP 
status code is not in the list of active nor inactive list.\n\n if \"current_test_data\" in PyFunceble.INTERN:\n # The end-user wants more information with his test.\n\n # We update the url_syntax_validation index.\n PyFunceble.INTERN[\"current_test_data\"][\"url_syntax_validation\"] = False\n\n # We handle and return the invalid down status.\n return URLStatus(PyFunceble.STATUS[\"official\"][\"invalid\"]).handle()"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns the referer aka the WHOIS server of the current domain extension.", "response": "def get(self):\n \"\"\"\n Return the referer aka the WHOIS server of the current domain extension.\n \"\"\"\n\n if not PyFunceble.CONFIGURATION[\"local\"]:\n # We are not running a test in a local network.\n\n if self.domain_extension not in self.ignored_extension:\n # The extension of the domain we are testing is not into\n # the list of ignored extensions.\n\n # We set the referer to None as we do not have any.\n referer = None\n\n if self.domain_extension in PyFunceble.INTERN[\"iana_db\"]:\n # The domain extension is in the iana database.\n\n if not PyFunceble.CONFIGURATION[\"no_whois\"]:\n # We are authorized to use WHOIS for the test result.\n\n # We get the referer from the database.\n referer = PyFunceble.INTERN[\"iana_db\"][self.domain_extension]\n\n if not referer:\n # The referer is not filled.\n\n # We log the case of the current extension.\n Logs().referer_not_found(self.domain_extension)\n\n # And we handle and return None status.\n return None\n\n # The referer is into the database.\n\n # We return the extracted referer.\n return referer\n\n # We are not authorized to use WHOIS for the test result.\n\n # We return None.\n return None\n\n # The domain extension is not in the iana database.\n\n # We return False, it is an invalid domain.\n return False\n\n # The extension of the domain we are testing is into\n # the list of ignored extensions.\n\n # We return 
None, the domain does not have a whois server.\n return None\n\n # We are running a test in a local network.\n\n # We return None.\n return None"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _register_servicer(self, servicer):\n name = servicer.__name__\n if name in self._servicers:\n raise exceptions.ConfigException(\n 'servicer duplicated: {}'.format(name))\n add_func = self._get_servicer_add_func(servicer)\n self._servicers[name] = (add_func, servicer)", "response": "register serviser\n\n :param servicer: servicer"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nregister an extension with the cache", "response": "def _register_extension(self, name, ext):\n \"\"\"register extension\n\n :param name: extension name\n :param ext: extension object\n \"\"\"\n ext.init_app(self)\n if name in self._extensions:\n raise exceptions.ConfigException(\n 'extension duplicated: {}'.format(name))\n self._extensions[name] = ext"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngetting current object. 
This is useful if you want the real object behind the proxy at a time for performance reasons or because you want to pass the object into a different context.", "response": "def _get_current_object(self):\n \"\"\"Get current object.\n This is useful if you want the real\n object behind the proxy at a time for performance reasons or because\n you want to pass the object into a different context.\n \"\"\"\n loc = object.__getattribute__(self, '_Proxy__local')\n if not hasattr(loc, '__release_local__'):\n return loc(*self.__args, **self.__kwargs)\n try: # pragma: no cover\n # not sure what this is about\n return getattr(loc, self.__name__)\n except AttributeError: # pragma: no cover\n raise RuntimeError('no object bound to {0.__name__}'.format(self))"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nyields paths to standard modules.", "response": "def standard_paths():\n \"\"\"Yield paths to standard modules.\"\"\"\n for is_plat_spec in [True, False]:\n path = distutils.sysconfig.get_python_lib(standard_lib=True,\n plat_specific=is_plat_spec)\n\n for name in os.listdir(path):\n yield name\n\n try:\n for name in os.listdir(os.path.join(path, 'lib-dynload')):\n yield name\n except OSError: # pragma: no cover\n pass"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nyields all standard module names.", "response": "def standard_package_names():\n \"\"\"Yield standard module names.\"\"\"\n for name in standard_paths():\n if name.startswith('_') or '-' in name:\n continue\n\n if '.' 
in name and name.rsplit('.')[-1] not in ['so', 'py', 'pyc']:\n continue\n\n yield name.split('.')[0]"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nyields line numbers of unused imports.", "response": "def unused_import_line_numbers(messages):\n \"\"\"Yield line numbers of unused imports.\"\"\"\n for message in messages:\n if isinstance(message, pyflakes.messages.UnusedImport):\n yield message.lineno"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nyields line number and module name of unused imports.", "response": "def unused_import_module_name(messages):\n \"\"\"Yield line number and module name of unused imports.\"\"\"\n pattern = r'\\'(.+?)\\''\n for message in messages:\n if isinstance(message, pyflakes.messages.UnusedImport):\n module_name = re.search(pattern, str(message))\n module_name = module_name.group()[1:-1]\n if module_name:\n yield (message.lineno, module_name)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nyields line numbers of star import used messages.", "response": "def star_import_used_line_numbers(messages):\n \"\"\"Yield line number of star import usage.\"\"\"\n for message in messages:\n if isinstance(message, pyflakes.messages.ImportStarUsed):\n yield message.lineno"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nyield line number undefined name and its possible origin module.", "response": "def star_import_usage_undefined_name(messages):\n \"\"\"Yield line number, undefined name, and its possible origin module.\"\"\"\n for message in messages:\n if isinstance(message, pyflakes.messages.ImportStarUsage):\n undefined_name = message.message_args[0]\n module_name = message.message_args[1]\n yield (message.lineno, undefined_name, module_name)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef 
unused_variable_line_numbers(messages):\n for message in messages:\n if isinstance(message, pyflakes.messages.UnusedVariable):\n yield message.lineno", "response": "Yield line numbers of unused variables."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef duplicate_key_line_numbers(messages, source):\n messages = [\n message for message in messages\n if isinstance(message, pyflakes.messages.MultiValueRepeatedKeyLiteral)]\n\n if messages:\n # Filter out complex cases. We don't want to bother trying to parse\n # this stuff and get it right. We can do it on a key-by-key basis.\n\n key_to_messages = create_key_to_messages_dict(messages)\n\n lines = source.split('\\n')\n\n for (key, messages) in key_to_messages.items():\n good = True\n for message in messages:\n line = lines[message.lineno - 1]\n key = message.message_args[0]\n\n if not dict_entry_has_key(line, key):\n good = False\n\n if good:\n for message in messages:\n yield message.lineno", "response": "Yield line numbers of duplicate keys."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nreturn dict mapping the key to list of messages.", "response": "def create_key_to_messages_dict(messages):\n \"\"\"Return dict mapping the key to list of messages.\"\"\"\n dictionary = collections.defaultdict(lambda: [])\n for message in messages:\n dictionary[message.message_args[0]].append(message)\n return dictionary"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef check(source):\n if sys.version_info[0] == 2 and isinstance(source, unicode):\n # Convert back to original byte string encoding, otherwise pyflakes\n # call to compile() will complain. See PEP 263. 
This only affects\n # Python 2.\n try:\n source = source.encode('utf-8')\n except UnicodeError: # pragma: no cover\n return []\n\n reporter = ListReporter()\n try:\n pyflakes.api.check(source, filename='', reporter=reporter)\n except (AttributeError, RecursionError, UnicodeDecodeError):\n pass\n return reporter.messages", "response": "Return messages from pyflakes."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef extract_package_name(line):\n assert '\\\\' not in line\n assert '(' not in line\n assert ')' not in line\n assert ';' not in line\n\n if line.lstrip().startswith(('import', 'from')):\n word = line.split()[1]\n else:\n # Ignore doctests.\n return None\n\n package = word.split('.')[0]\n assert ' ' not in package\n\n return package", "response": "Return package name in import statement."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns True if the import spans multiple lines.", "response": "def multiline_import(line, previous_line=''):\n \"\"\"Return True if the import spans multiple lines.\"\"\"\n for symbol in '()':\n if symbol in line:\n return True\n\n # Ignore doctests.\n if line.lstrip().startswith('>'):\n return True\n\n return multiline_statement(line, previous_line)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef multiline_statement(line, previous_line=''):\n for symbol in '\\\\:;':\n if symbol in line:\n return True\n\n sio = io.StringIO(line)\n try:\n list(tokenize.generate_tokens(sio.readline))\n return previous_line.rstrip().endswith('\\\\')\n except (SyntaxError, tokenize.TokenError):\n return True", "response": "Return True if this is part of a multiline statement."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef filter_from_import(line, unused_module):\n (indentation, imports) = re.split(pattern=r'\\bimport\\b',\n
string=line, maxsplit=1)\n base_module = re.search(pattern=r'\\bfrom\\s+([^ ]+)',\n string=indentation).group(1)\n\n # Create an imported module list with base module name\n # ex ``from a import b, c as d`` -> ``['a.b', 'a.c as d']``\n imports = re.split(pattern=r',', string=imports.strip())\n imports = [base_module + '.' + x.strip() for x in imports]\n\n # We compare full module name (``a.module`` not `module`) to\n # guarantee the exact same module as detected from pyflakes.\n filtered_imports = [x.replace(base_module + '.', '')\n for x in imports if x not in unused_module]\n\n # All of the import in this statement is unused\n if not filtered_imports:\n return get_indentation(line) + 'pass' + get_line_ending(line)\n\n indentation += 'import '\n\n return (\n indentation +\n ', '.join(sorted(filtered_imports)) +\n get_line_ending(line))", "response": "Parse and filter from something import a b c as d."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreturning line with imports on separate lines.", "response": "def break_up_import(line):\n \"\"\"Return line with imports on separate lines.\"\"\"\n assert '\\\\' not in line\n assert '(' not in line\n assert ')' not in line\n assert ';' not in line\n assert '#' not in line\n assert not line.lstrip().startswith('from')\n\n newline = get_line_ending(line)\n if not newline:\n return line\n\n (indentation, imports) = re.split(pattern=r'\\bimport\\b',\n string=line, maxsplit=1)\n\n indentation += 'import '\n assert newline\n\n return ''.join([indentation + i.strip() + newline\n for i in sorted(imports.split(','))])"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef filter_code(source, additional_imports=None,\n expand_star_imports=False,\n remove_all_unused_imports=False,\n remove_duplicate_keys=False,\n remove_unused_variables=False,\n ignore_init_module_imports=False,\n ):\n \"\"\"Yield code with unused imports removed.\"\"\"\n 
imports = SAFE_IMPORTS\n if additional_imports:\n imports |= frozenset(additional_imports)\n del additional_imports\n\n messages = check(source)\n\n if ignore_init_module_imports:\n marked_import_line_numbers = frozenset()\n else:\n marked_import_line_numbers = frozenset(\n unused_import_line_numbers(messages))\n marked_unused_module = collections.defaultdict(lambda: [])\n for line_number, module_name in unused_import_module_name(messages):\n marked_unused_module[line_number].append(module_name)\n\n if expand_star_imports and not (\n # See explanations in #18.\n re.search(r'\\b__all__\\b', source) or\n re.search(r'\\bdel\\b', source)\n ):\n marked_star_import_line_numbers = frozenset(\n star_import_used_line_numbers(messages))\n if len(marked_star_import_line_numbers) > 1:\n # Auto expanding only possible for single star import\n marked_star_import_line_numbers = frozenset()\n else:\n undefined_names = []\n for line_number, undefined_name, _ \\\n in star_import_usage_undefined_name(messages):\n undefined_names.append(undefined_name)\n if not undefined_names:\n marked_star_import_line_numbers = frozenset()\n else:\n marked_star_import_line_numbers = frozenset()\n\n if remove_unused_variables:\n marked_variable_line_numbers = frozenset(\n unused_variable_line_numbers(messages))\n else:\n marked_variable_line_numbers = frozenset()\n\n if remove_duplicate_keys:\n marked_key_line_numbers = frozenset(\n duplicate_key_line_numbers(messages, source))\n else:\n marked_key_line_numbers = frozenset()\n\n line_messages = get_messages_by_line(messages)\n\n sio = io.StringIO(source)\n previous_line = ''\n for line_number, line in enumerate(sio.readlines(), start=1):\n if '#' in line:\n yield line\n elif line_number in marked_import_line_numbers:\n yield filter_unused_import(\n line,\n unused_module=marked_unused_module[line_number],\n remove_all_unused_imports=remove_all_unused_imports,\n imports=imports,\n previous_line=previous_line)\n elif line_number in 
marked_variable_line_numbers:\n yield filter_unused_variable(line)\n elif line_number in marked_key_line_numbers:\n yield filter_duplicate_key(line, line_messages[line_number],\n line_number, marked_key_line_numbers,\n source)\n elif line_number in marked_star_import_line_numbers:\n yield filter_star_import(line, undefined_names)\n else:\n yield line\n\n previous_line = line", "response": "Yields code with unused imports removed."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreturning dictionary that maps line number to message.", "response": "def get_messages_by_line(messages):\n \"\"\"Return dictionary that maps line number to message.\"\"\"\n line_messages = {}\n for message in messages:\n line_messages[message.lineno] = message\n return line_messages"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns line with the star import expanded.", "response": "def filter_star_import(line, marked_star_import_undefined_name):\n \"\"\"Return line with the star import expanded.\"\"\"\n undefined_name = sorted(set(marked_star_import_undefined_name))\n return re.sub(r'\\*', ', '.join(undefined_name), line)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreturning line if used otherwise return None.", "response": "def filter_unused_import(line, unused_module, remove_all_unused_imports,\n imports, previous_line=''):\n \"\"\"Return line if used, otherwise return None.\"\"\"\n if multiline_import(line, previous_line):\n return line\n\n is_from_import = line.lstrip().startswith('from')\n\n if ',' in line and not is_from_import:\n return break_up_import(line)\n\n package = extract_package_name(line)\n if not remove_all_unused_imports and package not in imports:\n return line\n\n if ',' in line:\n assert is_from_import\n return filter_from_import(line, unused_module)\n else:\n # We need to replace import with \"pass\" in case the import is the\n # only line inside a 
block. For example,\n # \"if True:\\n import os\". In such cases, if the import is\n # removed, the block will be left hanging with no body.\n return (get_indentation(line) +\n 'pass' +\n get_line_ending(line))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef filter_unused_variable(line, previous_line=''):\n if re.match(EXCEPT_REGEX, line):\n return re.sub(r' as \\w+:$', ':', line, count=1)\n elif multiline_statement(line, previous_line):\n return line\n elif line.count('=') == 1:\n split_line = line.split('=')\n assert len(split_line) == 2\n value = split_line[1].lstrip()\n if ',' in split_line[0]:\n return line\n\n if is_literal_or_name(value):\n # Rather than removing the line, replace with it \"pass\" to avoid\n # a possible hanging block with no body.\n value = 'pass' + get_line_ending(line)\n\n return get_indentation(line) + value\n else:\n return line", "response": "Return line if used otherwise return None."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nreturn line if line_number is the first occurrence of the key otherwise return line.", "response": "def filter_duplicate_key(line, message, line_number, marked_line_numbers,\n source, previous_line=''):\n \"\"\"Return '' if first occurrence of the key otherwise return `line`.\"\"\"\n if marked_line_numbers and line_number == sorted(marked_line_numbers)[0]:\n return ''\n\n return line"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef dict_entry_has_key(line, key):\n if '#' in line:\n return False\n\n result = re.match(r'\\s*(.*)\\s*:\\s*(.*),\\s*$', line)\n if not result:\n return False\n\n try:\n candidate_key = ast.literal_eval(result.group(1))\n except (SyntaxError, ValueError):\n return False\n\n if multiline_statement(result.group(2)):\n return False\n\n return candidate_key == key", "response": "Return True if line is a dict entry that uses key."} 
{"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef is_literal_or_name(value):\n try:\n ast.literal_eval(value)\n return True\n except (SyntaxError, ValueError):\n pass\n\n if value.strip() in ['dict()', 'list()', 'set()']:\n return True\n\n # Support removal of variables on the right side. But make sure\n # there are no dots, which could mean an access of a property.\n return re.match(r'^\\w+\\s*$', value)", "response": "Return True if value is a literal or a name."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nyielding line numbers of unneeded pass statements.", "response": "def useless_pass_line_numbers(source):\n \"\"\"Yield line numbers of unneeded \"pass\" statements.\"\"\"\n sio = io.StringIO(source)\n previous_token_type = None\n last_pass_row = None\n last_pass_indentation = None\n previous_line = ''\n for token in tokenize.generate_tokens(sio.readline):\n token_type = token[0]\n start_row = token[2][0]\n line = token[4]\n\n is_pass = (token_type == tokenize.NAME and line.strip() == 'pass')\n\n # Leading \"pass\".\n if (start_row - 1 == last_pass_row and\n get_indentation(line) == last_pass_indentation and\n token_type in ATOMS and\n not is_pass):\n yield start_row - 1\n\n if is_pass:\n last_pass_row = start_row\n last_pass_indentation = get_indentation(line)\n\n # Trailing \"pass\".\n if (is_pass and\n previous_token_type != tokenize.INDENT and\n not previous_line.rstrip().endswith('\\\\')):\n yield start_row\n\n previous_token_type = token_type\n previous_line = line"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef filter_useless_pass(source):\n try:\n marked_lines = frozenset(useless_pass_line_numbers(source))\n except (SyntaxError, tokenize.TokenError):\n marked_lines = frozenset()\n\n sio = io.StringIO(source)\n for line_number, line in enumerate(sio.readlines(), start=1):\n if line_number not in marked_lines:\n 
yield line", "response": "Yield code with useless pass lines removed."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreturning code with all filtering run on it.", "response": "def fix_code(source, additional_imports=None, expand_star_imports=False,\n remove_all_unused_imports=False, remove_duplicate_keys=False,\n remove_unused_variables=False, ignore_init_module_imports=False):\n \"\"\"Return code with all filtering run on it.\"\"\"\n if not source:\n return source\n\n # pyflakes does not handle \"nonlocal\" correctly.\n if 'nonlocal' in source:\n remove_unused_variables = False\n\n filtered_source = None\n while True:\n filtered_source = ''.join(\n filter_useless_pass(''.join(\n filter_code(\n source,\n additional_imports=additional_imports,\n expand_star_imports=expand_star_imports,\n remove_all_unused_imports=remove_all_unused_imports,\n remove_duplicate_keys=remove_duplicate_keys,\n remove_unused_variables=remove_unused_variables,\n ignore_init_module_imports=ignore_init_module_imports,\n ))))\n\n if filtered_source == source:\n break\n source = filtered_source\n\n return filtered_source"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nrunning fix_code() on a file.", "response": "def fix_file(filename, args, standard_out):\n \"\"\"Run fix_code() on a file.\"\"\"\n encoding = detect_encoding(filename)\n with open_with_encoding(filename, encoding=encoding) as input_file:\n source = input_file.read()\n\n original_source = source\n\n isInitFile = os.path.basename(filename) == '__init__.py'\n\n if args.ignore_init_module_imports and isInitFile:\n ignore_init_module_imports = True\n else:\n ignore_init_module_imports = False\n\n filtered_source = fix_code(\n source,\n additional_imports=args.imports.split(',') if args.imports else None,\n expand_star_imports=args.expand_star_imports,\n remove_all_unused_imports=args.remove_all_unused_imports,\n remove_duplicate_keys=args.remove_duplicate_keys,\n 
remove_unused_variables=args.remove_unused_variables,\n ignore_init_module_imports=ignore_init_module_imports,\n )\n\n if original_source != filtered_source:\n if args.check:\n standard_out.write('Unused imports/variables detected.')\n sys.exit(1)\n if args.in_place:\n with open_with_encoding(filename, mode='w',\n encoding=encoding) as output_file:\n output_file.write(filtered_source)\n else:\n diff = get_diff_text(\n io.StringIO(original_source).readlines(),\n io.StringIO(filtered_source).readlines(),\n filename)\n standard_out.write(''.join(diff))\n else:\n if args.check:\n standard_out.write('No issues detected!')"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ndetects encoding of a file.", "response": "def detect_encoding(filename, limit_byte_check=-1):\n \"\"\"Return file encoding.\"\"\"\n try:\n with open(filename, 'rb') as input_file:\n encoding = _detect_encoding(input_file.readline)\n\n # Check for correctness of encoding.\n with open_with_encoding(filename, encoding) as input_file:\n input_file.read(limit_byte_check)\n\n return encoding\n except (LookupError, SyntaxError, UnicodeDecodeError):\n return 'latin-1'"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndetecting encoding of the file.", "response": "def _detect_encoding(readline):\n \"\"\"Return file encoding.\"\"\"\n try:\n from lib2to3.pgen2 import tokenize as lib2to3_tokenize\n encoding = lib2to3_tokenize.detect_encoding(readline)[0]\n return encoding\n except (LookupError, SyntaxError, UnicodeDecodeError):\n return 'latin-1'"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreturns a set of strings.", "response": "def _split_comma_separated(string):\n \"\"\"Return a set of strings.\"\"\"\n return set(text.strip() for text in string.split(',') if text.strip())"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef 
is_python_file(filename):\n    if filename.endswith('.py'):\n        return True\n\n    try:\n        with open_with_encoding(\n                filename,\n                None,\n                limit_byte_check=MAX_PYTHON_FILE_DETECTION_BYTES) as f:\n            text = f.read(MAX_PYTHON_FILE_DETECTION_BYTES)\n            if not text:\n                return False\n            first_line = text.splitlines()[0]\n    except (IOError, IndexError):\n        return False\n\n    if not PYTHON_SHEBANG_REGEX.match(first_line):\n        return False\n\n    return True", "response": "Return True if filename is a Python file."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef is_exclude_file(filename, exclude):\n    base_name = os.path.basename(filename)\n\n    if base_name.startswith('.'):\n        return True\n\n    for pattern in exclude:\n        if fnmatch.fnmatch(base_name, pattern):\n            return True\n        if fnmatch.fnmatch(filename, pattern):\n            return True\n    return False", "response": "Return True if file matches an exclude pattern."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef match_file(filename, exclude):\n    if is_exclude_file(filename, exclude):\n        return False\n\n    if not os.path.isdir(filename) and not is_python_file(filename):\n        return False\n\n    return True", "response": "Return True if file is okay for modifying or recursing."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nyielding filenames from the given paths, recursing into directories.", "response": "def find_files(filenames, recursive, exclude):\n    \"\"\"Yield filenames.\"\"\"\n    while filenames:\n        name = filenames.pop(0)\n        if recursive and os.path.isdir(name):\n            for root, directories, children in os.walk(name):\n                filenames += [os.path.join(root, f) for f in children\n                              if match_file(os.path.join(root, f),\n                                            exclude)]\n                directories[:] = [d for d in directories\n                                  if match_file(os.path.join(root, d),\n                                                exclude)]\n        else:\n            if not is_exclude_file(name, exclude):\n                yield name"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the 
following Python 3 function does\ndef _main(argv, standard_out, standard_error):\n import argparse\n parser = argparse.ArgumentParser(description=__doc__, prog='autoflake')\n parser.add_argument('-c', '--check', action='store_true',\n help='return error code if changes are needed')\n parser.add_argument('-i', '--in-place', action='store_true',\n help='make changes to files instead of printing diffs')\n parser.add_argument('-r', '--recursive', action='store_true',\n help='drill down directories recursively')\n parser.add_argument('--exclude', metavar='globs',\n help='exclude file/directory names that match these '\n 'comma-separated globs')\n parser.add_argument('--imports',\n help='by default, only unused standard library '\n 'imports are removed; specify a comma-separated '\n 'list of additional modules/packages')\n parser.add_argument('--expand-star-imports', action='store_true',\n help='expand wildcard star imports with undefined '\n 'names; this only triggers if there is only '\n 'one star import in the file; this is skipped if '\n 'there are any uses of `__all__` or `del` in the '\n 'file')\n parser.add_argument('--remove-all-unused-imports', action='store_true',\n help='remove all unused imports (not just those from '\n 'the standard library)')\n parser.add_argument('--ignore-init-module-imports', action='store_true',\n help='exclude __init__.py when removing unused '\n 'imports')\n parser.add_argument('--remove-duplicate-keys', action='store_true',\n help='remove all duplicate keys in objects')\n parser.add_argument('--remove-unused-variables', action='store_true',\n help='remove unused variables')\n parser.add_argument('--version', action='version',\n version='%(prog)s ' + __version__)\n parser.add_argument('files', nargs='+', help='files to format')\n\n args = parser.parse_args(argv[1:])\n\n if args.remove_all_unused_imports and args.imports:\n print('Using both --remove-all and --imports is redundant',\n file=standard_error)\n return 1\n\n if 
args.exclude:\n        args.exclude = _split_comma_separated(args.exclude)\n    else:\n        args.exclude = set([])\n\n    filenames = list(set(args.files))\n    failure = False\n    for name in find_files(filenames, args.recursive, args.exclude):\n        try:\n            fix_file(name, args=args, standard_out=standard_out)\n        except IOError as exception:\n            print(unicode(exception), file=standard_error)\n            failure = True\n\n    return 1 if failure else 0", "response": "Main function for the autoflake command line interface."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the data encoding the ObtainLease response payload and decodes it into its constituent parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n        \"\"\"\n        Read the data encoding the ObtainLease response payload and decode it\n        into its constituent parts.\n\n        Args:\n            input_stream (stream): A data stream containing encoded object\n                data, supporting a read method; usually a BytearrayStream\n                object.\n            kmip_version (KMIPVersion): An enumeration defining the KMIP\n                version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is missing from the\n encoded payload.\n \"\"\"\n super(ObtainLeaseResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.LEASE_TIME, local_stream):\n self._lease_time = primitives.Interval(\n tag=enums.Tags.LEASE_TIME\n )\n self._lease_time.read(local_stream, kmip_version=kmip_version)\n if self.is_tag_next(enums.Tags.LAST_CHANGE_DATE, local_stream):\n self._last_change_date = primitives.DateTime(\n tag=enums.Tags.LAST_CHANGE_DATE\n )\n self._last_change_date.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nwrites the data encoding the ObtainLease response payload to the output stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the ObtainLease response payload to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is not defined.\n \"\"\"\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._lease_time:\n self._lease_time.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._last_change_date:\n self._last_change_date.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(ObtainLeaseResponsePayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = utils.BytearrayStream()\n\n if self._asynchronous_correlation_value:\n self._asynchronous_correlation_value.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(CancelRequestPayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nread the data encoding the Cancel response payload and decode it into the object.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Cancel response payload and decode it into\n its constituent parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is missing from the\n encoded payload.\n \"\"\"\n super(CancelResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(\n enums.Tags.ASYNCHRONOUS_CORRELATION_VALUE,\n local_stream\n ):\n self._asynchronous_correlation_value = primitives.ByteString(\n tag=enums.Tags.ASYNCHRONOUS_CORRELATION_VALUE\n )\n self._asynchronous_correlation_value.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.CANCELLATION_RESULT, local_stream):\n self._cancellation_result = primitives.Enumeration(\n enums.CancellationResult,\n tag=enums.Tags.CANCELLATION_RESULT\n )\n self._cancellation_result.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef create(cls, name_value, name_type):\n '''\n Returns a Name object, populated with the given value and type\n '''\n if isinstance(name_value, Name.NameValue):\n value = name_value\n elif isinstance(name_value, str):\n value = cls.NameValue(name_value)\n else:\n name = 'Name'\n msg = exceptions.ErrorStrings.BAD_EXP_RECV\n member = 'name_value'\n raise TypeError(msg.format('{0}.{1}'.format(name, member),\n 'name_value', type(Name.NameValue),\n type(name_value)))\n\n if isinstance(name_type, Name.NameType):\n n_type = name_type\n elif isinstance(name_type, Enum):\n n_type = cls.NameType(name_type)\n else:\n name = 'Name'\n msg = exceptions.ErrorStrings.BAD_EXP_RECV\n member = 'name_type'\n raise TypeError(msg.format('{0}.{1}'.format(name, member),\n 'name_type', type(Name.NameType),\n type(name_type)))\n\n return Name(name_value=value,\n name_type=n_type)", "response": "Returns a Name object populated with the given value and type"} {"SOURCE": "codesearchnet", 
"instruction": "Given the following Python 3 function, write the documentation\ndef read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n        super(Digest, self).read(istream, kmip_version=kmip_version)\n        tstream = BytearrayStream(istream.read(self.length))\n\n        self.hashing_algorithm.read(tstream, kmip_version=kmip_version)\n        self.digest_value.read(tstream, kmip_version=kmip_version)\n        self.key_format_type.read(tstream, kmip_version=kmip_version)\n\n        self.is_oversized(tstream)\n        self.validate()", "response": "Reads the data encoding the Digest object and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwriting the data encoding the object to the data stream.", "response": "def write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n        \"\"\"\n        Write the data encoding the Digest object to a stream.\n\n        Args:\n            ostream (Stream): A data stream in which to encode object data,\n                supporting a write method; usually a BytearrayStream object.\n            kmip_version (KMIPVersion): An enumeration defining the KMIP\n                version with which the object will be encoded. Optional,\n                defaults to KMIP 1.0.\n        \"\"\"\n        tstream = BytearrayStream()\n\n        self.hashing_algorithm.write(tstream, kmip_version=kmip_version)\n        self.digest_value.write(tstream, kmip_version=kmip_version)\n        self.key_format_type.write(tstream, kmip_version=kmip_version)\n\n        self.length = tstream.length()\n        super(Digest, self).write(ostream, kmip_version=kmip_version)\n        ostream.write(tstream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef create(cls,\n               hashing_algorithm=HashingAlgorithmEnum.SHA_256,\n               digest_value=b'',\n               key_format_type=KeyFormatTypeEnum.RAW):\n        \"\"\"\n        Construct a Digest object from provided digest values.\n\n        Args:\n            hashing_algorithm (HashingAlgorithm): An enumeration representing\n                the hash algorithm used to compute the digest. 
Optional,\n defaults to HashingAlgorithm.SHA_256.\n digest_value (byte string): The bytes of the digest hash. Optional,\n defaults to the empty byte string.\n key_format_type (KeyFormatType): An enumeration representing the\n format of the key corresponding to the digest. Optional,\n defaults to KeyFormatType.RAW.\n\n Returns:\n Digest: The newly created Digest.\n\n Example:\n >>> x = Digest.create(HashingAlgorithm.MD5, b'\\x00',\n ... KeyFormatType.RAW)\n >>> x.hashing_algorithm\n HashingAlgorithm(value=HashingAlgorithm.MD5)\n >>> x.digest_value\n DigestValue(value=bytearray(b'\\x00'))\n >>> x.key_format_type\n KeyFormatType(value=KeyFormatType.RAW)\n \"\"\"\n algorithm = HashingAlgorithm(hashing_algorithm)\n value = DigestValue(bytearray(digest_value))\n format_type = KeyFormatType(key_format_type)\n\n return Digest(hashing_algorithm=algorithm,\n digest_value=value,\n key_format_type=format_type)", "response": "Constructs a new Digest object from provided digest values."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreading the data encoding the ApplicationSpecificInformation object and decode it into constituent parts.", "response": "def read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the ApplicationSpecificInformation object and\n decode it into its constituent parts.\n\n Args:\n istream (Stream): A data stream containing encoded object data,\n supporting a read method; usually a BytearrayStream object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(ApplicationSpecificInformation, self).read(\n istream,\n kmip_version=kmip_version\n )\n tstream = BytearrayStream(istream.read(self.length))\n\n self.application_namespace.read(tstream, kmip_version=kmip_version)\n self.application_data.read(tstream, kmip_version=kmip_version)\n\n self.is_oversized(tstream)\n self.validate()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nwrite the data encoding the ApplicationSpecificInformation object to an output stream.", "response": "def write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the ApplicationSpecificInformation object to a\n stream.\n\n Args:\n ostream (Stream): A data stream in which to encode object data,\n supporting a write method; usually a BytearrayStream object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n tstream = BytearrayStream()\n\n self.application_namespace.write(tstream, kmip_version=kmip_version)\n self.application_data.write(tstream, kmip_version=kmip_version)\n\n self.length = tstream.length()\n super(ApplicationSpecificInformation, self).write(\n ostream,\n kmip_version=kmip_version\n )\n ostream.write(tstream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef create(cls, application_namespace, application_data):\n namespace = ApplicationNamespace(application_namespace)\n data = ApplicationData(application_data)\n return ApplicationSpecificInformation(\n application_namespace=namespace, application_data=data)", "response": "Create an ApplicationSpecificInformation object from provided data and application namespace values."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef read(self, input_stream, 
kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(DerivationParameters, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(\n enums.Tags.CRYPTOGRAPHIC_PARAMETERS,\n local_stream\n ):\n self._cryptographic_parameters = CryptographicParameters()\n self._cryptographic_parameters.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.INITIALIZATION_VECTOR, local_stream):\n self._initialization_vector = ByteString(\n tag=enums.Tags.INITIALIZATION_VECTOR\n )\n self._initialization_vector.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.DERIVATION_DATA, local_stream):\n self._derivation_data = ByteString(tag=enums.Tags.DERIVATION_DATA)\n self._derivation_data.read(local_stream, kmip_version=kmip_version)\n\n if self.is_tag_next(enums.Tags.SALT, local_stream):\n self._salt = ByteString(tag=enums.Tags.SALT)\n self._salt.read(local_stream, kmip_version=kmip_version)\n\n if self.is_tag_next(Tags.ITERATION_COUNT, local_stream):\n self._iteration_count = Integer(tag=Tags.ITERATION_COUNT)\n self._iteration_count.read(local_stream, kmip_version=kmip_version)\n\n self.is_oversized(local_stream)", "response": "Reads the data encoding the DerivationParameters struct and decodes it into parts."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nwriting the DerivationParameters struct to a byte array stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the DerivationParameters struct to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_stream = BytearrayStream()\n\n if self._cryptographic_parameters:\n self._cryptographic_parameters.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._initialization_vector:\n self._initialization_vector.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._derivation_data:\n self._derivation_data.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._salt:\n self._salt.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._iteration_count:\n self._iteration_count.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(DerivationParameters, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(GetRequestPayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.KEY_FORMAT_TYPE, local_stream):\n self._key_format_type = primitives.Enumeration(\n enum=enums.KeyFormatType,\n tag=enums.Tags.KEY_FORMAT_TYPE\n )\n self._key_format_type.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.KEY_COMPRESSION_TYPE, local_stream):\n self._key_compression_type = primitives.Enumeration(\n enum=enums.KeyCompressionType,\n tag=enums.Tags.KEY_COMPRESSION_TYPE\n )\n self._key_compression_type.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(\n enums.Tags.KEY_WRAPPING_SPECIFICATION,\n 
local_stream\n ):\n self._key_wrapping_specification = \\\n objects.KeyWrappingSpecification()\n self._key_wrapping_specification.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)", "response": "Reads the data encoding the Get Request Payload and decodes it into its components."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nwrites the data encoding the object to the output stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Get request payload to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier is not None:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._key_format_type is not None:\n self._key_format_type.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._key_compression_type is not None:\n self._key_compression_type.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._key_wrapping_specification is not None:\n self._key_wrapping_specification.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(GetRequestPayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreading the data encoding the Get response payload and decode it into parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Get response payload and decode it\n into its constituent 
parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the object type, unique identifier, or\n secret attributes are missing from the encoded payload.\n \"\"\"\n super(GetResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.OBJECT_TYPE, local_stream):\n self._object_type = primitives.Enumeration(\n enum=enums.ObjectType,\n tag=enums.Tags.OBJECT_TYPE\n )\n self._object_type.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Parsed payload encoding is missing the object type field.\"\n )\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Parsed payload encoding is missing the unique identifier \"\n \"field.\"\n )\n\n self.secret = self.secret_factory.create(self.object_type)\n if self.is_tag_next(self._secret.tag, local_stream):\n self._secret.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Parsed payload encoding is missing the secret field.\"\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = utils.BytearrayStream()\n\n if self.object_type:\n self._object_type.write(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\"Payload is missing the object type field.\")\n\n if 
self.unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Payload is missing the unique identifier field.\"\n )\n\n if self.secret:\n self._secret.write(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\"Payload is missing the secret field.\")\n\n self.length = local_stream.length()\n super(GetResponsePayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the Get response payload to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(SignatureVerifyRequestPayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.CRYPTOGRAPHIC_PARAMETERS, local_stream):\n self._cryptographic_parameters = \\\n attributes.CryptographicParameters()\n self._cryptographic_parameters.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.DATA, local_stream):\n self._data = primitives.ByteString(tag=enums.Tags.DATA)\n self._data.read(local_stream, kmip_version=kmip_version)\n if self.is_tag_next(enums.Tags.DIGESTED_DATA, local_stream):\n self._digested_data = primitives.ByteString(\n tag=enums.Tags.DIGESTED_DATA\n )\n self._digested_data.read(local_stream, kmip_version=kmip_version)\n if self.is_tag_next(enums.Tags.SIGNATURE_DATA, local_stream):\n self._signature_data = primitives.ByteString(\n tag=enums.Tags.SIGNATURE_DATA\n )\n 
self._signature_data.read(local_stream, kmip_version=kmip_version)\n if self.is_tag_next(enums.Tags.CORRELATION_VALUE, local_stream):\n self._correlation_value = primitives.ByteString(\n tag=enums.Tags.CORRELATION_VALUE\n )\n self._correlation_value.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.INIT_INDICATOR, local_stream):\n self._init_indicator = primitives.Boolean(\n tag=enums.Tags.INIT_INDICATOR\n )\n self._init_indicator.read(local_stream, kmip_version=kmip_version)\n if self.is_tag_next(enums.Tags.FINAL_INDICATOR, local_stream):\n self._final_indicator = primitives.Boolean(\n tag=enums.Tags.FINAL_INDICATOR\n )\n self._final_indicator.read(local_stream, kmip_version=kmip_version)\n\n self.is_oversized(local_stream)", "response": "Reads the contents of the SignatureVerifyRequestPayload into the object."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nwrite the contents of this object to the output stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the SignatureVerify request payload to a\n stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is not defined.\n \"\"\"\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._cryptographic_parameters:\n self._cryptographic_parameters.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._data:\n self._data.write(local_stream, kmip_version=kmip_version)\n if self._digested_data:\n self._digested_data.write(local_stream, kmip_version=kmip_version)\n if self._signature_data:\n self._signature_data.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._correlation_value:\n self._correlation_value.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._init_indicator:\n self._init_indicator.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._final_indicator:\n self._final_indicator.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(SignatureVerifyRequestPayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(SignatureVerifyResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Parsed payload encoding is missing the unique identifier \"\n \"field.\"\n )\n if self.is_tag_next(enums.Tags.VALIDITY_INDICATOR, local_stream):\n self._validity_indicator = primitives.Enumeration(\n 
enums.ValidityIndicator,\n tag=enums.Tags.VALIDITY_INDICATOR\n )\n self._validity_indicator.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Parsed payload encoding is missing the validity indicator \"\n \"field.\"\n )\n if self.is_tag_next(enums.Tags.DATA, local_stream):\n self._data = primitives.ByteString(tag=enums.Tags.DATA)\n self._data.read(local_stream, kmip_version=kmip_version)\n if self.is_tag_next(enums.Tags.CORRELATION_VALUE, local_stream):\n self._correlation_value = primitives.ByteString(\n tag=enums.Tags.CORRELATION_VALUE\n )\n self._correlation_value.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)", "response": "Reads the contents of the SignatureVerify response payload and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nprocesses a KMIP request message. This routine is the main driver of the KmipEngine. It breaks apart and processes the request header, handles any message errors that may result, and then passes the set of request batch items on for processing. This routine is thread-safe, allowing multiple client connections to use the same KmipEngine. Args: request (RequestMessage): The request message containing the batch items to be processed. credential (string): Identifying information about the client obtained from the client certificate. Optional, defaults to None. Returns: ResponseMessage: The response containing all of the results from the request batch items.", "response": "def process_request(self, request, credential=None):\n \"\"\"\n Process a KMIP request message.\n\n This routine is the main driver of the KmipEngine. It breaks apart and\n processes the request header, handles any message errors that may\n result, and then passes the set of request batch items on for\n processing. 
This routine is thread-safe, allowing multiple client\n connections to use the same KmipEngine.\n\n Args:\n request (RequestMessage): The request message containing the batch\n items to be processed.\n credential (string): Identifying information about the client\n obtained from the client certificate. Optional, defaults to\n None.\n\n Returns:\n ResponseMessage: The response containing all of the results from\n the request batch items.\n \"\"\"\n self._client_identity = [None, None]\n header = request.request_header\n\n # Process the protocol version\n self._set_protocol_version(header.protocol_version)\n\n # Process the maximum response size\n max_response_size = None\n if header.maximum_response_size:\n max_response_size = header.maximum_response_size.value\n\n # Process the time stamp\n now = int(time.time())\n if header.time_stamp:\n then = header.time_stamp.value\n\n if (now >= then) and ((now - then) < 60):\n self._logger.info(\"Received request at time: {0}\".format(\n time.strftime(\n \"%Y-%m-%d %H:%M:%S\",\n time.gmtime(then)\n )\n ))\n else:\n if now < then:\n self._logger.warning(\n \"Received request with future timestamp. Received \"\n \"timestamp: {0}, Current timestamp: {1}\".format(\n then,\n now\n )\n )\n\n raise exceptions.InvalidMessage(\n \"Future request rejected by server.\"\n )\n else:\n self._logger.warning(\n \"Received request with old timestamp. Possible \"\n \"replay attack. 
Received timestamp: {0}, Current \"\n \"timestamp: {1}\".format(then, now)\n )\n\n raise exceptions.InvalidMessage(\n \"Stale request rejected by server.\"\n )\n else:\n self._logger.info(\"Received request at time: {0}\".format(\n time.strftime(\n \"%Y-%m-%d %H:%M:%S\",\n time.gmtime(now)\n )\n ))\n\n # Process the asynchronous indicator\n self.is_asynchronous = False\n if header.asynchronous_indicator is not None:\n self.is_asynchronous = header.asynchronous_indicator.value\n\n if self.is_asynchronous:\n raise exceptions.InvalidMessage(\n \"Asynchronous operations are not supported.\"\n )\n\n # Process the authentication credentials\n if header.authentication:\n if header.authentication.credentials:\n auth_credentials = header.authentication.credentials[0]\n else:\n auth_credentials = None\n else:\n auth_credentials = None\n\n self._verify_credential(auth_credentials, credential)\n\n # Process the batch error continuation option\n batch_error_option = enums.BatchErrorContinuationOption.STOP\n if header.batch_error_cont_option is not None:\n batch_error_option = header.batch_error_cont_option.value\n\n if batch_error_option == enums.BatchErrorContinuationOption.UNDO:\n raise exceptions.InvalidMessage(\n \"Undo option for batch handling is not supported.\"\n )\n\n # Process the batch order option\n batch_order_option = False\n if header.batch_order_option:\n batch_order_option = header.batch_order_option.value\n\n response_batch = self._process_batch(\n request.batch_items,\n batch_error_option,\n batch_order_option\n )\n response = self._build_response(\n header.protocol_version,\n response_batch\n )\n\n return response, max_response_size, header.protocol_version"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nbuild a simple ResponseMessage containing a single error result.", "response": "def build_error_response(self, version, reason, message):\n \"\"\"\n Build a simple ResponseMessage with a single error result.\n\n Args:\n version 
(ProtocolVersion): The protocol version the response\n should be addressed with.\n reason (ResultReason): An enumeration classifying the type of\n error occurred.\n message (str): A string providing additional information about\n the error.\n\n Returns:\n ResponseMessage: The simple ResponseMessage containing a\n single error result.\n \"\"\"\n batch_item = messages.ResponseBatchItem(\n result_status=contents.ResultStatus(\n enums.ResultStatus.OPERATION_FAILED\n ),\n result_reason=contents.ResultReason(reason),\n result_message=contents.ResultMessage(message)\n )\n return self._build_response(version, [batch_item])"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _process_template_attribute(self, template_attribute):\n attributes = {}\n\n if len(template_attribute.names) > 0:\n raise exceptions.ItemNotFound(\n \"Attribute templates are not supported.\"\n )\n\n for attribute in template_attribute.attributes:\n name = attribute.attribute_name.value\n\n if not self._attribute_policy.is_attribute_supported(name):\n raise exceptions.InvalidField(\n \"The {0} attribute is unsupported.\".format(name)\n )\n\n if self._attribute_policy.is_attribute_multivalued(name):\n values = attributes.get(name, list())\n if (not attribute.attribute_index) and len(values) > 0:\n raise exceptions.InvalidField(\n \"Attribute index missing from multivalued attribute.\"\n )\n\n values.append(attribute.attribute_value)\n attributes.update([(name, values)])\n else:\n if attribute.attribute_index:\n if attribute.attribute_index.value != 0:\n raise exceptions.InvalidField(\n \"Non-zero attribute index found for \"\n \"single-valued attribute.\"\n )\n value = attributes.get(name, None)\n if value:\n raise exceptions.IndexOutOfBounds(\n \"Cannot set multiple instances of the \"\n \"{0} attribute.\".format(name)\n )\n else:\n attributes.update([(name, attribute.attribute_value)])\n\n return attributes", "response": "Given a kmip.core TemplateAttribute object, extract the attribute data into a usable dictionary format."}
{"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function that,\ngiven a kmip.pie object and a list of attribute names, attempts to get all of the existing attributes from the object.", "response": "def _get_attributes_from_managed_object(self, managed_object, attr_names):\n \"\"\"\n Given a kmip.pie object and a list of attribute names, attempt to get\n all of the existing attribute values from the object.\n \"\"\"\n attr_factory = attribute_factory.AttributeFactory()\n retrieved_attributes = list()\n\n if not attr_names:\n attr_names = self._attribute_policy.get_all_attribute_names()\n\n for attribute_name in attr_names:\n object_type = managed_object._object_type\n\n if not self._attribute_policy.is_attribute_supported(\n attribute_name\n ):\n continue\n\n if self._attribute_policy.is_attribute_applicable_to_object_type(\n attribute_name,\n object_type\n ):\n try:\n attribute_value = self._get_attribute_from_managed_object(\n managed_object,\n attribute_name\n )\n except Exception:\n attribute_value = None\n\n if attribute_value is not None:\n if self._attribute_policy.is_attribute_multivalued(\n attribute_name\n ):\n for count, value in enumerate(attribute_value):\n attribute = attr_factory.create_attribute(\n enums.AttributeType(attribute_name),\n value,\n count\n )\n retrieved_attributes.append(attribute)\n else:\n attribute = attr_factory.create_attribute(\n enums.AttributeType(attribute_name),\n attribute_value\n )\n retrieved_attributes.append(attribute)\n\n return retrieved_attributes"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _get_attribute_from_managed_object(self, managed_object, attr_name):\n if attr_name == 'Unique Identifier':\n return str(managed_object.unique_identifier)\n elif attr_name == 'Name':\n names = list()\n for name in managed_object.names:\n name = 
attributes.Name(\n attributes.Name.NameValue(name),\n attributes.Name.NameType(\n enums.NameType.UNINTERPRETED_TEXT_STRING\n )\n )\n names.append(name)\n return names\n elif attr_name == 'Object Type':\n return managed_object._object_type\n elif attr_name == 'Cryptographic Algorithm':\n return managed_object.cryptographic_algorithm\n elif attr_name == 'Cryptographic Length':\n return managed_object.cryptographic_length\n elif attr_name == 'Cryptographic Parameters':\n return None\n elif attr_name == 'Cryptographic Domain Parameters':\n return None\n elif attr_name == 'Certificate Type':\n return managed_object.certificate_type\n elif attr_name == 'Certificate Length':\n return None\n elif attr_name == 'X.509 Certificate Identifier':\n return None\n elif attr_name == 'X.509 Certificate Subject':\n return None\n elif attr_name == 'X.509 Certificate Issuer':\n return None\n elif attr_name == 'Certificate Identifier':\n return None\n elif attr_name == 'Certificate Subject':\n return None\n elif attr_name == 'Certificate Issuer':\n return None\n elif attr_name == 'Digital Signature Algorithm':\n return None\n elif attr_name == 'Digest':\n return None\n elif attr_name == 'Operation Policy Name':\n return managed_object.operation_policy_name\n elif attr_name == 'Cryptographic Usage Mask':\n return managed_object.cryptographic_usage_masks\n elif attr_name == 'Lease Time':\n return None\n elif attr_name == 'Usage Limits':\n return None\n elif attr_name == 'State':\n return managed_object.state\n elif attr_name == 'Initial Date':\n return managed_object.initial_date\n elif attr_name == 'Activation Date':\n return None\n elif attr_name == 'Process Start Date':\n return None\n elif attr_name == 'Protect Stop Date':\n return None\n elif attr_name == 'Deactivation Date':\n return None\n elif attr_name == 'Destroy Date':\n return None\n elif attr_name == 'Compromise Occurrence Date':\n return None\n elif attr_name == 'Compromise Date':\n return None\n elif attr_name == 
'Revocation Reason':\n return None\n elif attr_name == 'Archive Date':\n return None\n elif attr_name == 'Object Group':\n return None\n elif attr_name == 'Fresh':\n return None\n elif attr_name == 'Link':\n return None\n elif attr_name == 'Application Specific Information':\n return None\n elif attr_name == 'Contact Information':\n return None\n elif attr_name == 'Last Change Date':\n return None\n else:\n # Since custom attribute names are possible, just return None\n # for unrecognized attributes. This satisfies the spec.\n return None", "response": "Get the attribute value from the kmip.pie managed object."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nattempting to set the attributes on the object.", "response": "def _set_attributes_on_managed_object(self, managed_object, attributes):\n \"\"\"\n Given a kmip.pie object and a dictionary of attributes, attempt to set\n the attribute values on the object.\n \"\"\"\n for attribute_name, attribute_value in six.iteritems(attributes):\n object_type = managed_object._object_type\n if self._attribute_policy.is_attribute_applicable_to_object_type(\n attribute_name,\n object_type):\n self._set_attribute_on_managed_object(\n managed_object,\n (attribute_name, attribute_value)\n )\n else:\n name = object_type.name\n raise exceptions.InvalidField(\n \"Cannot set {0} attribute on {1} object.\".format(\n attribute_name,\n ''.join([x.capitalize() for x in name.split('_')])\n )\n )"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _set_attribute_on_managed_object(self, managed_object, attribute):\n attribute_name = attribute[0]\n attribute_value = attribute[1]\n\n if self._attribute_policy.is_attribute_multivalued(attribute_name):\n if attribute_name == 'Name':\n managed_object.names.extend(\n [x.name_value.value for x in attribute_value]\n )\n for name in managed_object.names:\n if managed_object.names.count(name) > 1:\n raise 
exceptions.InvalidField(\n \"Cannot set duplicate name values.\"\n )\n else:\n # TODO (peterhamilton) Remove when all attributes are supported\n raise exceptions.InvalidField(\n \"The {0} attribute is unsupported.\".format(attribute_name)\n )\n else:\n field = None\n value = attribute_value.value\n\n if attribute_name == 'Cryptographic Algorithm':\n field = 'cryptographic_algorithm'\n elif attribute_name == 'Cryptographic Length':\n field = 'cryptographic_length'\n elif attribute_name == 'Cryptographic Usage Mask':\n field = 'cryptographic_usage_masks'\n value = list()\n for e in enums.CryptographicUsageMask:\n if e.value & attribute_value.value:\n value.append(e)\n elif attribute_name == 'Operation Policy Name':\n field = 'operation_policy_name'\n\n if field:\n existing_value = getattr(managed_object, field)\n if existing_value:\n if existing_value != value:\n raise exceptions.InvalidField(\n \"Cannot overwrite the {0} attribute.\".format(\n attribute_name\n )\n )\n else:\n setattr(managed_object, field, value)\n else:\n # TODO (peterhamilton) Remove when all attributes are supported\n raise exceptions.InvalidField(\n \"The {0} attribute is unsupported.\".format(attribute_name)\n )", "response": "Sets the attribute value on the kmip.pie managed object."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nlook up the relevant policy section.", "response": "def get_relevant_policy_section(self, policy_name, group=None):\n \"\"\"\n Look up the policy corresponding to the provided policy name and\n group (optional). 
Log any issues found during the look up.\n \"\"\"\n policy_bundle = self._operation_policies.get(policy_name)\n\n if not policy_bundle:\n self._logger.warning(\n \"The '{}' policy does not exist.\".format(policy_name)\n )\n return None\n\n if group:\n groups_policy_bundle = policy_bundle.get('groups')\n if not groups_policy_bundle:\n self._logger.debug(\n \"The '{}' policy does not support groups.\".format(\n policy_name\n )\n )\n return None\n else:\n group_policy = groups_policy_bundle.get(group)\n if not group_policy:\n self._logger.debug(\n \"The '{}' policy does not support group '{}'.\".format(\n policy_name,\n group\n )\n )\n return None\n else:\n return group_policy\n else:\n return policy_bundle.get('preset')"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ndetermine if object access is allowed for the provided policy and session settings.", "response": "def is_allowed(\n self,\n policy_name,\n session_user,\n session_group,\n object_owner,\n object_type,\n operation\n ):\n \"\"\"\n Determine if object access is allowed for the provided policy and\n session settings.\n \"\"\"\n policy_section = self.get_relevant_policy_section(\n policy_name,\n session_group\n )\n if policy_section is None:\n return False\n\n object_policy = policy_section.get(object_type)\n if not object_policy:\n self._logger.warning(\n \"The '{0}' policy does not apply to {1} objects.\".format(\n policy_name,\n self._get_enum_string(object_type)\n )\n )\n return False\n\n operation_object_policy = object_policy.get(operation)\n if not operation_object_policy:\n self._logger.warning(\n \"The '{0}' policy does not apply to {1} operations on {2} \"\n \"objects.\".format(\n policy_name,\n self._get_enum_string(operation),\n self._get_enum_string(object_type)\n )\n )\n return False\n\n if operation_object_policy == enums.Policy.ALLOW_ALL:\n return True\n elif operation_object_policy == enums.Policy.ALLOW_OWNER:\n if session_user == object_owner:\n return 
True\n else:\n return False\n elif operation_object_policy == enums.Policy.DISALLOW_ALL:\n return False\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._cryptographic_parameters:\n self._cryptographic_parameters.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self._data:\n self._data.write(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\"invalid payload missing the data attribute\")\n\n if self._iv_counter_nonce:\n self._iv_counter_nonce.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(DecryptRequestPayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the DecryptRequestPayload object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(RevokeRequestPayload, self).read(\n istream,\n kmip_version=kmip_version\n )\n tstream = BytearrayStream(istream.read(self.length))\n\n self.unique_identifier = attributes.UniqueIdentifier()\n self.unique_identifier.read(tstream, kmip_version=kmip_version)\n\n self.revocation_reason = objects.RevocationReason()\n self.revocation_reason.read(tstream, kmip_version=kmip_version)\n\n if self.is_tag_next(enums.Tags.COMPROMISE_OCCURRENCE_DATE, tstream):\n self.compromise_occurrence_date = primitives.DateTime(\n tag=enums.Tags.COMPROMISE_OCCURRENCE_DATE)\n self.compromise_occurrence_date.read(\n tstream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(tstream)\n 
self.validate()", "response": "Reads the data encoding the RevokeRequestPayload object and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n tstream = BytearrayStream()\n\n # Write the contents of the request payload\n if self.unique_identifier is not None:\n self.unique_identifier.write(tstream, kmip_version=kmip_version)\n\n self.revocation_reason.write(tstream, kmip_version=kmip_version)\n\n if self.compromise_occurrence_date is not None:\n self.compromise_occurrence_date.write(\n tstream,\n kmip_version=kmip_version\n )\n\n # Write the length and value of the request payload\n self.length = tstream.length()\n super(RevokeRequestPayload, self).write(\n ostream,\n kmip_version=kmip_version\n )\n ostream.write(tstream.buffer)", "response": "Writes the contents of the RevokeRequestPayload object to the data stream."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nvalidates the attributes of the ActivateRequestPayload object.", "response": "def validate(self):\n \"\"\"\n Error check the attributes of the ActivateRequestPayload object.\n \"\"\"\n if self.unique_identifier is not None:\n if not isinstance(self.unique_identifier,\n attributes.UniqueIdentifier):\n msg = \"invalid unique identifier\"\n raise TypeError(msg)\n if self.compromise_occurrence_date is not None:\n if not isinstance(self.compromise_occurrence_date,\n primitives.DateTime):\n msg = \"invalid compromise time\"\n raise TypeError(msg)\n if not isinstance(self.revocation_reason, objects.RevocationReason):\n msg = \"invalid revocation reason\"\n raise TypeError(msg)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(RevokeResponsePayload, self).read(\n istream,\n 
kmip_version=kmip_version\n )\n tstream = BytearrayStream(istream.read(self.length))\n\n self.unique_identifier = attributes.UniqueIdentifier()\n self.unique_identifier.read(tstream, kmip_version=kmip_version)\n\n self.is_oversized(tstream)\n self.validate()", "response": "Reads the data encoding the RevokeResponsePayload object and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\ncreating a new secret object of the specified type with the given value.", "response": "def create(self, secret_type, value=None):\n \"\"\"\n Create a secret object of the specified type with the given value.\n\n Args:\n secret_type (ObjectType): An ObjectType enumeration specifying the\n type of secret to create.\n value (dict): A dictionary containing secret data. Optional,\n defaults to None.\n\n Returns:\n secret: The newly constructed secret object.\n\n Raises:\n TypeError: If the provided secret type is unrecognized.\n\n Example:\n >>> factory.create(ObjectType.SYMMETRIC_KEY)\n SymmetricKey(...)\n \"\"\"\n if secret_type is ObjectType.CERTIFICATE:\n return self._create_certificate(value)\n elif secret_type is ObjectType.SYMMETRIC_KEY:\n return self._create_symmetric_key(value)\n elif secret_type is ObjectType.PUBLIC_KEY:\n return self._create_public_key(value)\n elif secret_type is ObjectType.PRIVATE_KEY:\n return self._create_private_key(value)\n elif secret_type is ObjectType.SPLIT_KEY:\n return self._create_split_key(value)\n elif secret_type is ObjectType.TEMPLATE:\n return self._create_template(value)\n elif secret_type is ObjectType.SECRET_DATA:\n return self._create_secret_data(value)\n elif secret_type is ObjectType.OPAQUE_DATA:\n return self._create_opaque_data(value)\n else:\n raise TypeError(\"Unrecognized secret type: {0}\".format(\n secret_type))"} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef set_setting(self, setting, value):\n if setting not in 
self._expected_settings + self._optional_settings:\n raise exceptions.ConfigurationError(\n \"Setting '{0}' is not supported.\".format(setting)\n )\n\n if setting == 'hostname':\n self._set_hostname(value)\n elif setting == 'port':\n self._set_port(value)\n elif setting == 'certificate_path':\n self._set_certificate_path(value)\n elif setting == 'key_path':\n self._set_key_path(value)\n elif setting == 'ca_path':\n self._set_ca_path(value)\n elif setting == 'auth_suite':\n self._set_auth_suite(value)\n elif setting == 'policy_path':\n self._set_policy_path(value)\n elif setting == 'enable_tls_client_auth':\n self._set_enable_tls_client_auth(value)\n elif setting == 'tls_cipher_suites':\n self._set_tls_cipher_suites(value)\n elif setting == 'logging_level':\n self._set_logging_level(value)\n else:\n self._set_database_path(value)", "response": "Sets the value of a specific setting."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nload the server configuration settings from the file pointed to by path.", "response": "def load_settings(self, path):\n \"\"\"\n Load configuration settings from the file pointed to by path.\n\n This will overwrite all current setting values.\n\n Args:\n path (string): The path to the configuration file containing\n the settings to load. 
Required.\n Raises:\n ConfigurationError: Raised if the path does not point to an\n existing file or if a setting value is invalid.\n \"\"\"\n if not os.path.exists(path):\n raise exceptions.ConfigurationError(\n \"The server configuration file ('{0}') could not be \"\n \"located.\".format(path)\n )\n\n self._logger.info(\n \"Loading server configuration settings from: {0}\".format(path)\n )\n\n parser = configparser.ConfigParser()\n parser.read(path)\n self._parse_settings(parser)\n self.parse_auth_settings(parser)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nprocessing the bind parameter value for the field.", "response": "def process_bind_param(self, value, dialect):\n \"\"\"\n Returns the integer value of the usage mask bitmask. This value is\n stored in the database.\n\n Args:\n value(list): list of enums in the\n usage mask\n dialect(string): SQL dialect\n \"\"\"\n bitmask = 0x00\n for e in value:\n bitmask = bitmask | e.value\n return bitmask"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreturns a new list of enums.CryptographicUsageMask Enums. This converts the integer value into the list of enums. Args: value(int): The integer value stored in the database that is used to create the list of enums.CryptographicUsageMask Enums. dialect(string): SQL dialect", "response": "def process_result_value(self, value, dialect):\n \"\"\"\n Returns a new list of enums.CryptographicUsageMask Enums. 
This converts\n the integer value into the list of enums.\n\n Args:\n value(int): The integer value stored in the database that is used\n to create the list of enums.CryptographicUsageMask Enums.\n dialect(string): SQL dialect\n \"\"\"\n masks = list()\n if value:\n for e in enums.CryptographicUsageMask:\n if e.value & value:\n masks.append(e)\n return masks"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nread the encoding of the LongInteger from the input stream.", "response": "def read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the encoding of the LongInteger from the input stream.\n\n Args:\n istream (stream): A buffer containing the encoded bytes of a\n LongInteger. Usually a BytearrayStream object. Required.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n InvalidPrimitiveLength: if the long integer encoding read in has\n an invalid encoded length.\n \"\"\"\n super(LongInteger, self).read(istream, kmip_version=kmip_version)\n\n if self.length is not LongInteger.LENGTH:\n raise exceptions.InvalidPrimitiveLength(\n \"invalid long integer length read; \"\n \"expected: {0}, observed: {1}\".format(\n LongInteger.LENGTH, self.length))\n\n self.value = unpack('!q', istream.read(self.length))[0]\n self.validate()"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(LongInteger, self).write(ostream, kmip_version=kmip_version)\n ostream.write(pack('!q', self.value))", "response": "Writes the encoding of the object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nverifies that the value of the LongInteger is valid.", "response": "def validate(self):\n \"\"\"\n Verify that the value of the LongInteger is valid.\n\n 
Raises:\n TypeError: if the value is not of type int or long\n ValueError: if the value cannot be represented by a signed 64-bit\n integer\n \"\"\"\n if self.value is not None:\n if not isinstance(self.value, six.integer_types):\n raise TypeError('expected (one of): {0}, observed: {1}'.format(\n six.integer_types, type(self.value)))\n else:\n if self.value > LongInteger.MAX:\n raise ValueError(\n 'long integer value greater than accepted max')\n elif self.value < LongInteger.MIN:\n raise ValueError(\n 'long integer value less than accepted min')"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(BigInteger, self).read(istream, kmip_version=kmip_version)\n\n # Check for a valid length before even trying to parse the value.\n if self.length % 8:\n raise exceptions.InvalidPrimitiveLength(\n \"invalid big integer length read; \"\n \"expected: multiple of 8, observed: {0}\".format(self.length))\n\n sign = 1\n binary = ''\n\n # Read the value byte by byte and convert it into binary, padding each\n # byte as needed.\n for _ in range(self.length):\n byte = struct.unpack('!B', istream.read(1))[0]\n bits = \"{0:b}\".format(byte)\n pad = len(bits) % 8\n if pad:\n bits = ('0' * (8 - pad)) + bits\n binary += bits\n\n # If the value is negative, convert via two's complement.\n if binary[0] == '1':\n sign = -1\n binary = binary.replace('1', 'i')\n binary = binary.replace('0', '1')\n binary = binary.replace('i', '0')\n\n pivot = binary.rfind('0')\n binary = binary[0:pivot] + '1' + ('0' * len(binary[pivot + 1:]))\n\n # Convert the value back to an integer and reapply the sign.\n self.value = int(binary, 2) * sign", "response": "Reads the encoding of the BigInteger from the input stream."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nwriting the encoding of the BigInteger object to the output stream.", "response": "def 
write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the encoding of the BigInteger to the output stream.\n\n Args:\n ostream (Stream): A buffer to contain the encoded bytes of a\n BigInteger object. Usually a BytearrayStream object.\n Required.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n # Convert the value to binary and pad it as needed.\n binary = \"{0:b}\".format(abs(self.value))\n binary = (\"0\" * (64 - (len(binary) % 64))) + binary\n\n # If the value is negative, convert via two's complement.\n if self.value < 0:\n binary = binary.replace('1', 'i')\n binary = binary.replace('0', '1')\n binary = binary.replace('i', '0')\n\n pivot = binary.rfind('0')\n binary = binary[0:pivot] + '1' + ('0' * len(binary[pivot + 1:]))\n\n # Convert each byte to hex and build the hex string for the value.\n hexadecimal = b''\n for i in range(0, len(binary), 8):\n byte = binary[i:i + 8]\n byte = int(byte, 2)\n hexadecimal += struct.pack('!B', byte)\n\n self.length = len(hexadecimal)\n super(BigInteger, self).write(ostream, kmip_version=kmip_version)\n ostream.write(hexadecimal)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef validate(self):\n if self.value is not None:\n if not isinstance(self.value, six.integer_types):\n raise TypeError('expected (one of): {0}, observed: {1}'.format(\n six.integer_types, type(self.value)))", "response": "Verify that the value of the BigInteger is valid."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nverifies that the value of the Enumeration is valid.", "response": "def validate(self):\n \"\"\"\n Verify that the value of the Enumeration is valid.\n\n Raises:\n TypeError: if the enum is not of type Enum\n ValueError: if the value is not of the expected Enum subtype or if\n the value cannot be represented by an 
unsigned 32-bit integer\n \"\"\"\n if not isinstance(self.enum, enumeration.EnumMeta):\n raise TypeError(\n 'enumeration type {0} must be of type EnumMeta'.format(\n self.enum))\n if self.value is not None:\n if not isinstance(self.value, self.enum):\n raise TypeError(\n 'enumeration {0} must be of type {1}'.format(\n self.value, self.enum))\n if type(self.value.value) not in six.integer_types:\n raise TypeError('enumeration value must be an int')\n else:\n if self.value.value > Enumeration.MAX:\n raise ValueError(\n 'enumeration value greater than accepted max')\n elif self.value.value < Enumeration.MIN:\n raise ValueError(\n 'enumeration value less than accepted min')"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef read_value(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n try:\n value = unpack('!Q', istream.read(self.LENGTH))[0]\n except Exception:\n self.logger.error(\"Error reading boolean value from buffer\")\n raise\n\n if value == 1:\n self.value = True\n elif value == 0:\n self.value = False\n else:\n raise ValueError(\"expected: 0 or 1, observed: {0}\".format(value))\n\n self.validate()", "response": "Reads the value of the KMIP Boolean object from the input stream."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nwrite the value of the object to the output stream.", "response": "def write_value(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the value of the Boolean object to the output stream.\n\n Args:\n ostream (Stream): A buffer to contain the encoded bytes of the\n value of a Boolean object. Usually a BytearrayStream object.\n Required.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n try:\n ostream.write(pack('!Q', self.value))\n except Exception:\n self.logger.error(\"Error writing boolean value to buffer\")\n raise"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(Boolean, self).write(ostream, kmip_version=kmip_version)\n self.write_value(ostream, kmip_version=kmip_version)", "response": "Writes the encoding of the object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef validate(self):\n if self.value:\n if not isinstance(self.value, bool):\n raise TypeError(\"expected: {0}, observed: {1}\".format(\n bool, type(self.value)))", "response": "Verify that the value of the Boolean object is valid."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nread the encoding of the Interval from the input stream.", "response": "def read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the encoding of the Interval from the input stream.\n\n Args:\n istream (stream): A buffer containing the encoded bytes of the\n value of an Interval. Usually a BytearrayStream object.\n Required.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n InvalidPrimitiveLength: if the Interval encoding read in has an\n invalid encoded length.\n InvalidPaddingBytes: if the Interval encoding read in does not use\n zeroes for its padding bytes.\n \"\"\"\n super(Interval, self).read(istream, kmip_version=kmip_version)\n\n # Check for a valid length before even trying to parse the value.\n if self.length != Interval.LENGTH:\n raise exceptions.InvalidPrimitiveLength(\n \"interval length must be {0}\".format(Interval.LENGTH))\n\n # Decode the Interval value and the padding bytes.\n self.value = unpack('!I', istream.read(Interval.LENGTH))[0]\n pad = unpack('!I', istream.read(Interval.LENGTH))[0]\n\n # Verify that the padding bytes are zero bytes.\n if pad != 0:\n raise exceptions.InvalidPaddingBytes(\"padding bytes must be zero\")\n\n self.validate()"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nverify that the value of the Interval is valid.", "response": "def validate(self):\n \"\"\"\n Verify that the value of the Interval is valid.\n\n Raises:\n TypeError: if the value is not of type int or long\n ValueError: if the value cannot be represented by an unsigned\n 32-bit integer\n \"\"\"\n if self.value is not None:\n if type(self.value) not in six.integer_types:\n raise TypeError('expected (one of): {0}, observed: {1}'.format(\n six.integer_types, type(self.value)))\n else:\n if self.value > Interval.MAX:\n raise ValueError(\n 'interval value greater than accepted max')\n elif self.value < Interval.MIN:\n raise ValueError('interval value less than accepted min')"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef key_wrapping_data(self):\n key_wrapping_data = {}\n encryption_key_info = {\n 'unique_identifier': self._kdw_eki_unique_identifier,\n 'cryptographic_parameters': {\n 'block_cipher_mode': self._kdw_eki_cp_block_cipher_mode,\n 'padding_method': 
self._kdw_eki_cp_padding_method,\n 'hashing_algorithm': self._kdw_eki_cp_hashing_algorithm,\n 'key_role_type': self._kdw_eki_cp_key_role_type,\n 'digital_signature_algorithm':\n self._kdw_eki_cp_digital_signature_algorithm,\n 'cryptographic_algorithm':\n self._kdw_eki_cp_cryptographic_algorithm,\n 'random_iv': self._kdw_eki_cp_random_iv,\n 'iv_length': self._kdw_eki_cp_iv_length,\n 'tag_length': self._kdw_eki_cp_tag_length,\n 'fixed_field_length': self._kdw_eki_cp_fixed_field_length,\n 'invocation_field_length':\n self._kdw_eki_cp_invocation_field_length,\n 'counter_length': self._kdw_eki_cp_counter_length,\n 'initial_counter_value':\n self._kdw_eki_cp_initial_counter_value\n }\n }\n if not any(encryption_key_info['cryptographic_parameters'].values()):\n encryption_key_info['cryptographic_parameters'] = {}\n if not any(encryption_key_info.values()):\n encryption_key_info = {}\n\n mac_sign_key_info = {\n 'unique_identifier': self._kdw_mski_unique_identifier,\n 'cryptographic_parameters': {\n 'block_cipher_mode': self._kdw_mski_cp_block_cipher_mode,\n 'padding_method': self._kdw_mski_cp_padding_method,\n 'hashing_algorithm': self._kdw_mski_cp_hashing_algorithm,\n 'key_role_type': self._kdw_mski_cp_key_role_type,\n 'digital_signature_algorithm':\n self._kdw_mski_cp_digital_signature_algorithm,\n 'cryptographic_algorithm':\n self._kdw_mski_cp_cryptographic_algorithm,\n 'random_iv': self._kdw_mski_cp_random_iv,\n 'iv_length': self._kdw_mski_cp_iv_length,\n 'tag_length': self._kdw_mski_cp_tag_length,\n 'fixed_field_length': self._kdw_mski_cp_fixed_field_length,\n 'invocation_field_length':\n self._kdw_mski_cp_invocation_field_length,\n 'counter_length': self._kdw_mski_cp_counter_length,\n 'initial_counter_value':\n self._kdw_mski_cp_initial_counter_value\n }\n }\n if not any(mac_sign_key_info['cryptographic_parameters'].values()):\n mac_sign_key_info['cryptographic_parameters'] = {}\n if not any(mac_sign_key_info.values()):\n mac_sign_key_info = {}\n\n 
key_wrapping_data['wrapping_method'] = self._kdw_wrapping_method\n key_wrapping_data['encryption_key_information'] = encryption_key_info\n key_wrapping_data['mac_signature_key_information'] = mac_sign_key_info\n key_wrapping_data['mac_signature'] = self._kdw_mac_signature\n key_wrapping_data['iv_counter_nonce'] = self._kdw_iv_counter_nonce\n key_wrapping_data['encoding_option'] = self._kdw_encoding_option\n if not any(key_wrapping_data.values()):\n key_wrapping_data = {}\n\n return key_wrapping_data", "response": "Retrieves all of the relevant key wrapping data fields and return them as a dictionary."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsetting the key wrapping data attributes using a dictionary.", "response": "def key_wrapping_data(self, value):\n \"\"\"\n Set the key wrapping data attributes using a dictionary.\n \"\"\"\n if value is None:\n value = {}\n elif not isinstance(value, dict):\n raise TypeError(\"Key wrapping data must be a dictionary.\")\n\n self._kdw_wrapping_method = value.get('wrapping_method')\n\n eki = value.get('encryption_key_information')\n if eki is None:\n eki = {}\n self._kdw_eki_unique_identifier = eki.get('unique_identifier')\n eki_cp = eki.get('cryptographic_parameters')\n if eki_cp is None:\n eki_cp = {}\n self._kdw_eki_cp_block_cipher_mode = eki_cp.get('block_cipher_mode')\n self._kdw_eki_cp_padding_method = eki_cp.get('padding_method')\n self._kdw_eki_cp_hashing_algorithm = eki_cp.get('hashing_algorithm')\n self._kdw_eki_cp_key_role_type = eki_cp.get('key_role_type')\n self._kdw_eki_cp_digital_signature_algorithm = \\\n eki_cp.get('digital_signature_algorithm')\n self._kdw_eki_cp_cryptographic_algorithm = \\\n eki_cp.get('cryptographic_algorithm')\n self._kdw_eki_cp_random_iv = eki_cp.get('random_iv')\n self._kdw_eki_cp_iv_length = eki_cp.get('iv_length')\n self._kdw_eki_cp_tag_length = eki_cp.get('tag_length')\n self._kdw_eki_cp_fixed_field_length = eki_cp.get('fixed_field_length')\n 
self._kdw_eki_cp_invocation_field_length = \\\n eki_cp.get('invocation_field_length')\n self._kdw_eki_cp_counter_length = eki_cp.get('counter_length')\n self._kdw_eki_cp_initial_counter_value = \\\n eki_cp.get('initial_counter_value')\n\n mski = value.get('mac_signature_key_information')\n if mski is None:\n mski = {}\n self._kdw_mski_unique_identifier = mski.get('unique_identifier')\n mski_cp = mski.get('cryptographic_parameters')\n if mski_cp is None:\n mski_cp = {}\n self._kdw_mski_cp_block_cipher_mode = mski_cp.get('block_cipher_mode')\n self._kdw_mski_cp_padding_method = mski_cp.get('padding_method')\n self._kdw_mski_cp_hashing_algorithm = mski_cp.get('hashing_algorithm')\n self._kdw_mski_cp_key_role_type = mski_cp.get('key_role_type')\n self._kdw_mski_cp_digital_signature_algorithm = \\\n mski_cp.get('digital_signature_algorithm')\n self._kdw_mski_cp_cryptographic_algorithm = \\\n mski_cp.get('cryptographic_algorithm')\n self._kdw_mski_cp_random_iv = mski_cp.get('random_iv')\n self._kdw_mski_cp_iv_length = mski_cp.get('iv_length')\n self._kdw_mski_cp_tag_length = mski_cp.get('tag_length')\n self._kdw_mski_cp_fixed_field_length = \\\n mski_cp.get('fixed_field_length')\n self._kdw_mski_cp_invocation_field_length = \\\n mski_cp.get('invocation_field_length')\n self._kdw_mski_cp_counter_length = mski_cp.get('counter_length')\n self._kdw_mski_cp_initial_counter_value = \\\n mski_cp.get('initial_counter_value')\n\n self._kdw_mac_signature = value.get('mac_signature')\n self._kdw_iv_counter_nonce = value.get('iv_counter_nonce')\n self._kdw_encoding_option = value.get('encoding_option')"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef validate(self):\n if not isinstance(self.value, bytes):\n raise TypeError(\"key value must be bytes\")\n elif not isinstance(self.cryptographic_algorithm,\n enums.CryptographicAlgorithm):\n raise TypeError(\"key algorithm must be a CryptographicAlgorithm \"\n 
\"enumeration\")\n elif not isinstance(self.cryptographic_length, six.integer_types):\n raise TypeError(\"key length must be an integer\")\n elif not isinstance(self.key_format_type, enums.KeyFormatType):\n raise TypeError(\"key format type must be a KeyFormatType \"\n \"enumeration\")\n elif self.key_format_type not in self._valid_formats:\n raise ValueError(\"key format type must be one of {0}\".format(\n self._valid_formats))\n\n # TODO (peter-hamilton) Verify that the key bytes match the key format\n\n mask_count = len(self.cryptographic_usage_masks)\n for i in range(mask_count):\n mask = self.cryptographic_usage_masks[i]\n if not isinstance(mask, enums.CryptographicUsageMask):\n position = \"({0} in list)\".format(i)\n raise TypeError(\n \"key mask {0} must be a CryptographicUsageMask \"\n \"enumeration\".format(position))\n\n name_count = len(self.names)\n for i in range(name_count):\n name = self.names[i]\n if not isinstance(name, six.string_types):\n position = \"({0} in list)\".format(i)\n raise TypeError(\"key name {0} must be a string\".format(\n position))", "response": "Validates that the contents of the PublicKey object are valid."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nensures that the contents of the SecretData object is valid.", "response": "def validate(self):\n \"\"\"\n Verify that the contents of the SecretData object are valid.\n\n Raises:\n TypeError: if the types of any SecretData attributes are invalid.\n \"\"\"\n if not isinstance(self.value, bytes):\n raise TypeError(\"secret value must be bytes\")\n elif not isinstance(self.data_type, enums.SecretDataType):\n raise TypeError(\"secret data type must be a SecretDataType \"\n \"enumeration\")\n\n mask_count = len(self.cryptographic_usage_masks)\n for i in range(mask_count):\n mask = self.cryptographic_usage_masks[i]\n if not isinstance(mask, enums.CryptographicUsageMask):\n position = \"({0} in list)\".format(i)\n raise TypeError(\n 
\"secret data mask {0} must be a CryptographicUsageMask \"\n \"enumeration\".format(position))\n\n name_count = len(self.names)\n for i in range(name_count):\n name = self.names[i]\n if not isinstance(name, six.string_types):\n position = \"({0} in list)\".format(i)\n raise TypeError(\"secret data name {0} must be a string\".format(\n position))"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nverifying that the contents of the OpaqueObject are valid.", "response": "def validate(self):\n \"\"\"\n Verify that the contents of the OpaqueObject are valid.\n\n Raises:\n TypeError: if the types of any OpaqueObject attributes are invalid.\n \"\"\"\n if not isinstance(self.value, bytes):\n raise TypeError(\"opaque value must be bytes\")\n elif not isinstance(self.opaque_type, enums.OpaqueDataType):\n raise TypeError(\"opaque data type must be an OpaqueDataType \"\n \"enumeration\")\n\n name_count = len(self.names)\n for i in range(name_count):\n name = self.names[i]\n if not isinstance(name, six.string_types):\n position = \"({0} in list)\".format(i)\n raise TypeError(\"opaque data name {0} must be a string\".format(\n position))"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get_bit_mask_from_enumerations(enumerations):\n return functools.reduce(\n lambda x, y: x | y, [z.value for z in enumerations]\n )", "response": "A utility function that computes a bit mask from a collection of enumeration values."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef get_enumerations_from_bit_mask(enumeration, mask):\n return [x for x in enumeration if (x.value & mask) == x.value]", "response": "This utility function creates a list of enumeration values corresponding to a specific bit mask enumeration class."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef is_bit_mask(enumeration, potential_mask):\n if 
not isinstance(potential_mask, six.integer_types):\n return False\n\n mask_enumerations = (\n CryptographicUsageMask,\n ProtectionStorageMask,\n StorageStatusMask\n )\n if enumeration not in mask_enumerations:\n return False\n\n mask = 0\n for value in [e.value for e in enumeration]:\n if (value & potential_mask) == value:\n mask |= value\n\n if mask != potential_mask:\n return False\n\n return True", "response": "A utility function that checks if the provided value is a composite bit mask of the specified enumeration class."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef is_attribute(tag, kmip_version=None):\n kmip_1_0_attribute_tags = [\n Tags.UNIQUE_IDENTIFIER,\n Tags.NAME,\n Tags.OBJECT_TYPE,\n Tags.CRYPTOGRAPHIC_ALGORITHM,\n Tags.CRYPTOGRAPHIC_LENGTH,\n Tags.CRYPTOGRAPHIC_PARAMETERS,\n Tags.CRYPTOGRAPHIC_DOMAIN_PARAMETERS,\n Tags.CERTIFICATE_TYPE,\n Tags.CERTIFICATE_IDENTIFIER,\n Tags.CERTIFICATE_SUBJECT,\n Tags.CERTIFICATE_ISSUER,\n Tags.DIGEST,\n Tags.OPERATION_POLICY_NAME,\n Tags.CRYPTOGRAPHIC_USAGE_MASK,\n Tags.LEASE_TIME,\n Tags.USAGE_LIMITS,\n Tags.STATE,\n Tags.INITIAL_DATE,\n Tags.ACTIVATION_DATE,\n Tags.PROCESS_START_DATE,\n Tags.PROTECT_STOP_DATE,\n Tags.DEACTIVATION_DATE,\n Tags.DESTROY_DATE,\n Tags.COMPROMISE_OCCURRENCE_DATE,\n Tags.COMPROMISE_DATE,\n Tags.REVOCATION_REASON,\n Tags.ARCHIVE_DATE,\n Tags.OBJECT_GROUP,\n Tags.LINK,\n Tags.APPLICATION_SPECIFIC_INFORMATION,\n Tags.CONTACT_INFORMATION,\n Tags.LAST_CHANGE_DATE,\n Tags.CUSTOM_ATTRIBUTE\n ]\n kmip_1_1_attribute_tags = copy.deepcopy(kmip_1_0_attribute_tags) + [\n Tags.CERTIFICATE_LENGTH,\n Tags.X_509_CERTIFICATE_IDENTIFIER,\n Tags.X_509_CERTIFICATE_SUBJECT,\n Tags.X_509_CERTIFICATE_ISSUER,\n Tags.DIGITAL_SIGNATURE_ALGORITHM,\n Tags.FRESH\n ]\n kmip_1_2_attribute_tags = copy.deepcopy(kmip_1_1_attribute_tags) + [\n Tags.ALTERNATIVE_NAME,\n Tags.KEY_VALUE_PRESENT,\n Tags.KEY_VALUE_LOCATION,\n Tags.ORIGINAL_CREATION_DATE\n ]\n 
kmip_1_3_attribute_tags = copy.deepcopy(kmip_1_2_attribute_tags) + [\n Tags.RANDOM_NUMBER_GENERATOR\n ]\n kmip_1_4_attribute_tags = copy.deepcopy(kmip_1_3_attribute_tags) + [\n Tags.PKCS12_FRIENDLY_NAME,\n Tags.DESCRIPTION,\n Tags.COMMENT,\n Tags.SENSITIVE,\n Tags.ALWAYS_SENSITIVE,\n Tags.EXTRACTABLE,\n Tags.NEVER_EXTRACTABLE\n ]\n kmip_2_0_attribute_tags = copy.deepcopy(kmip_1_4_attribute_tags) + [\n Tags.CERTIFICATE_SUBJECT_CN,\n Tags.CERTIFICATE_SUBJECT_O,\n Tags.CERTIFICATE_SUBJECT_OU,\n Tags.CERTIFICATE_SUBJECT_EMAIL,\n Tags.CERTIFICATE_SUBJECT_C,\n Tags.CERTIFICATE_SUBJECT_ST,\n Tags.CERTIFICATE_SUBJECT_L,\n Tags.CERTIFICATE_SUBJECT_UID,\n Tags.CERTIFICATE_SUBJECT_SERIAL_NUMBER,\n Tags.CERTIFICATE_SUBJECT_TITLE,\n Tags.CERTIFICATE_SUBJECT_DC,\n Tags.CERTIFICATE_SUBJECT_DN_QUALIFIER,\n Tags.CERTIFICATE_ISSUER_CN,\n Tags.CERTIFICATE_ISSUER_O,\n Tags.CERTIFICATE_ISSUER_OU,\n Tags.CERTIFICATE_ISSUER_EMAIL,\n Tags.CERTIFICATE_ISSUER_C,\n Tags.CERTIFICATE_ISSUER_ST,\n Tags.CERTIFICATE_ISSUER_L,\n Tags.CERTIFICATE_ISSUER_UID,\n Tags.CERTIFICATE_ISSUER_SERIAL_NUMBER,\n Tags.CERTIFICATE_ISSUER_TITLE,\n Tags.CERTIFICATE_ISSUER_DC,\n Tags.CERTIFICATE_ISSUER_DN_QUALIFIER,\n Tags.KEY_FORMAT_TYPE,\n Tags.NIST_KEY_TYPE,\n Tags.OPAQUE_DATA_TYPE,\n Tags.PROTECTION_LEVEL,\n Tags.PROTECTION_PERIOD,\n Tags.PROTECTION_STORAGE_MASK,\n Tags.QUANTUM_SAFE,\n Tags.SHORT_UNIQUE_IDENTIFIER,\n Tags.ATTRIBUTE\n ]\n kmip_2_0_attribute_tags.remove(Tags.CERTIFICATE_IDENTIFIER)\n kmip_2_0_attribute_tags.remove(Tags.CERTIFICATE_SUBJECT)\n kmip_2_0_attribute_tags.remove(Tags.CERTIFICATE_ISSUER)\n kmip_2_0_attribute_tags.remove(Tags.OPERATION_POLICY_NAME)\n kmip_2_0_attribute_tags.remove(Tags.CUSTOM_ATTRIBUTE)\n\n if kmip_version == KMIPVersion.KMIP_1_0:\n return tag in kmip_1_0_attribute_tags\n elif kmip_version == KMIPVersion.KMIP_1_1:\n return tag in kmip_1_1_attribute_tags\n elif kmip_version == KMIPVersion.KMIP_1_2:\n return tag in kmip_1_2_attribute_tags\n elif kmip_version == 
KMIPVersion.KMIP_1_3:\n return tag in kmip_1_3_attribute_tags\n elif kmip_version == KMIPVersion.KMIP_1_4:\n return tag in kmip_1_4_attribute_tags\n elif kmip_version == KMIPVersion.KMIP_2_0:\n return tag in kmip_2_0_attribute_tags\n else:\n all_attribute_tags = set(\n kmip_1_0_attribute_tags +\n kmip_1_1_attribute_tags +\n kmip_1_2_attribute_tags +\n kmip_1_3_attribute_tags +\n kmip_1_4_attribute_tags +\n kmip_2_0_attribute_tags\n )\n return tag in all_attribute_tags", "response": "A utility function that checks if the tag is a valid attribute tag."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(CreateKeyPairRequestPayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self.is_tag_next(\n enums.Tags.COMMON_TEMPLATE_ATTRIBUTE,\n local_buffer\n ):\n self._common_template_attribute = objects.TemplateAttribute(\n tag=enums.Tags.COMMON_TEMPLATE_ATTRIBUTE\n )\n self._common_template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n if self.is_tag_next(enums.Tags.COMMON_ATTRIBUTES, local_buffer):\n attributes = objects.Attributes(\n tag=enums.Tags.COMMON_ATTRIBUTES\n )\n attributes.read(local_buffer, kmip_version=kmip_version)\n self._common_template_attribute = \\\n objects.convert_attributes_to_template_attribute(\n attributes\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self.is_tag_next(\n enums.Tags.PRIVATE_KEY_TEMPLATE_ATTRIBUTE,\n local_buffer\n ):\n self._private_key_template_attribute = \\\n objects.TemplateAttribute(\n tag=enums.Tags.PRIVATE_KEY_TEMPLATE_ATTRIBUTE\n )\n self._private_key_template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n if self.is_tag_next(\n enums.Tags.PRIVATE_KEY_ATTRIBUTES,\n local_buffer\n ):\n 
attributes = objects.Attributes(\n tag=enums.Tags.PRIVATE_KEY_ATTRIBUTES\n )\n attributes.read(local_buffer, kmip_version=kmip_version)\n self._private_key_template_attribute = \\\n objects.convert_attributes_to_template_attribute(\n attributes\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self.is_tag_next(\n enums.Tags.PUBLIC_KEY_TEMPLATE_ATTRIBUTE,\n local_buffer\n ):\n self._public_key_template_attribute = \\\n objects.TemplateAttribute(\n tag=enums.Tags.PUBLIC_KEY_TEMPLATE_ATTRIBUTE\n )\n self._public_key_template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n if self.is_tag_next(\n enums.Tags.PUBLIC_KEY_ATTRIBUTES,\n local_buffer\n ):\n attributes = objects.Attributes(\n tag=enums.Tags.PUBLIC_KEY_ATTRIBUTES\n )\n attributes.read(local_buffer, kmip_version=kmip_version)\n self._public_key_template_attribute = \\\n objects.convert_attributes_to_template_attribute(\n attributes\n )\n\n self.is_oversized(local_buffer)", "response": "Reads the data encoding the CreateKeyPairRequestPayload and decodes it into constituent parts."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nwrites the data encoding the CreateKeyPair request payload to a stream.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the CreateKeyPair request payload to a buffer.\n\n Args:\n output_buffer (stream): A data buffer in which to encode object\n data, supporting a write method.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self._common_template_attribute is not None:\n self._common_template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n if self._common_template_attribute is not None:\n attributes = objects.convert_template_attribute_to_attributes(\n self._common_template_attribute\n )\n attributes.write(local_buffer, kmip_version=kmip_version)\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self._private_key_template_attribute is not None:\n self._private_key_template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n if self._private_key_template_attribute is not None:\n attributes = objects.convert_template_attribute_to_attributes(\n self._private_key_template_attribute\n )\n attributes.write(local_buffer, kmip_version=kmip_version)\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self._public_key_template_attribute is not None:\n self._public_key_template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n if self._public_key_template_attribute is not None:\n attributes = objects.convert_template_attribute_to_attributes(\n self._public_key_template_attribute\n )\n attributes.write(local_buffer, kmip_version=kmip_version)\n\n self.length = local_buffer.length()\n super(CreateKeyPairRequestPayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(CreateKeyPairResponsePayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(\n enums.Tags.PRIVATE_KEY_UNIQUE_IDENTIFIER,\n local_buffer\n ):\n 
self._private_key_unique_identifier = primitives.TextString(\n tag=enums.Tags.PRIVATE_KEY_UNIQUE_IDENTIFIER\n )\n self._private_key_unique_identifier.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The CreateKeyPair response payload encoding is missing the \"\n \"private key unique identifier.\"\n )\n\n if self.is_tag_next(\n enums.Tags.PUBLIC_KEY_UNIQUE_IDENTIFIER,\n local_buffer\n ):\n self._public_key_unique_identifier = primitives.TextString(\n tag=enums.Tags.PUBLIC_KEY_UNIQUE_IDENTIFIER\n )\n self._public_key_unique_identifier.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The CreateKeyPair response payload encoding is missing the \"\n \"public key unique identifier.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self.is_tag_next(\n enums.Tags.PRIVATE_KEY_TEMPLATE_ATTRIBUTE,\n local_buffer\n ):\n self._private_key_template_attribute = \\\n objects.TemplateAttribute(\n tag=enums.Tags.PRIVATE_KEY_TEMPLATE_ATTRIBUTE\n )\n self._private_key_template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(\n enums.Tags.PUBLIC_KEY_TEMPLATE_ATTRIBUTE,\n local_buffer\n ):\n self._public_key_template_attribute = \\\n objects.TemplateAttribute(\n tag=enums.Tags.PUBLIC_KEY_TEMPLATE_ATTRIBUTE\n )\n self._public_key_template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_buffer)", "response": "Reads the data encoding the CreateKeyPair response payload and decode it into constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_buffer = utils.BytearrayStream()\n\n if self._private_key_unique_identifier:\n self._private_key_unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise 
exceptions.InvalidField(\n \"The CreateKeyPair response payload is missing the private \"\n \"key unique identifier field.\"\n )\n\n if self._public_key_unique_identifier:\n self._public_key_unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The CreateKeyPair response payload is missing the public \"\n \"key unique identifier field.\"\n )\n\n if self._private_key_template_attribute:\n self._private_key_template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._public_key_template_attribute:\n self._public_key_template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(CreateKeyPairResponsePayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)", "response": "Writes the data encoding the CreateKeyPair response payload to the output_buffer."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nread the data encoding the GetAttributeList request payload and decode it into its constituent parts.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the GetAttributeList request payload and decode\n it into its constituent parts.\n\n Args:\n input_buffer (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(GetAttributeListRequestPayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_buffer):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n self._unique_identifier = None\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_buffer = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(GetAttributeListRequestPayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)", "response": "Writes the data encoding the GetAttributeList request payload to the output_buffer."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(GetAttributeListResponsePayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_buffer):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The GetAttributeList response payload encoding is missing \"\n \"the unique identifier.\"\n )\n\n names = list()\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n while 
self.is_tag_next(enums.Tags.ATTRIBUTE_NAME, local_buffer):\n name = primitives.TextString(tag=enums.Tags.ATTRIBUTE_NAME)\n name.read(local_buffer, kmip_version=kmip_version)\n names.append(name)\n if len(names) == 0:\n raise exceptions.InvalidKmipEncoding(\n \"The GetAttributeList response payload encoding is \"\n \"missing the attribute names.\"\n )\n self._attribute_names = names\n else:\n while self.is_tag_next(\n enums.Tags.ATTRIBUTE_REFERENCE,\n local_buffer\n ):\n if self.is_type_next(enums.Types.STRUCTURE, local_buffer):\n reference = objects.AttributeReference()\n reference.read(local_buffer, kmip_version=kmip_version)\n names.append(\n primitives.TextString(\n value=reference.attribute_name,\n tag=enums.Tags.ATTRIBUTE_NAME\n )\n )\n elif self.is_type_next(enums.Types.ENUMERATION, local_buffer):\n reference = primitives.Enumeration(\n enums.Tags,\n tag=enums.Tags.ATTRIBUTE_REFERENCE\n )\n reference.read(local_buffer, kmip_version=kmip_version)\n name = enums.convert_attribute_tag_to_name(reference.value)\n names.append(\n primitives.TextString(\n value=name,\n tag=enums.Tags.ATTRIBUTE_NAME\n )\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The GetAttributeList response payload encoding \"\n \"contains an invalid AttributeReference type.\"\n )\n self._attribute_names = names\n\n self.is_oversized(local_buffer)", "response": "Reads the data encoding the GetAttributeList response payload and decodes it into constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nwrite the data encoding the GetAttributeList response payload to the output_buffer.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the GetAttributeList response payload to a\n stream.\n\n Args:\n output_buffer (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining 
the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n InvalidField: Raised if the unique identifier or attribute name\n are not defined.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The GetAttributeList response payload is missing the unique \"\n \"identifier field.\"\n )\n\n if self._attribute_names:\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n for attribute_name in self._attribute_names:\n attribute_name.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n # NOTE (ph) This approach simplifies backwards compatible\n # issues but limits easy support for Attribute\n # Reference structures going forward, specifically\n # limiting the use of VendorIdentification for\n # custom attributes. If custom attributes need to\n # be retrieved using the GetAttributeList operation\n # for KMIP 2.0 applications this code will need to\n # change.\n for attribute_name in self._attribute_names:\n t = enums.convert_attribute_name_to_tag(\n attribute_name.value\n )\n e = primitives.Enumeration(\n enums.Tags,\n value=t,\n tag=enums.Tags.ATTRIBUTE_REFERENCE\n )\n e.write(local_buffer, kmip_version=kmip_version)\n\n else:\n raise exceptions.InvalidField(\n \"The GetAttributeList response payload is missing the \"\n \"attribute names field.\"\n )\n\n self.length = local_buffer.length()\n super(GetAttributeListResponsePayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nscan the provided directory for all JSON policy files.", "response": "def get_json_files(p):\n \"\"\"\n Scan the provided policy directory for all JSON policy files.\n \"\"\"\n f = [os.path.join(p, x) for x in os.listdir(p) if x.endswith(\".json\")]\n 
return sorted(f)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nscan the policy directory for policies and load them into memory.", "response": "def scan_policies(self):\n \"\"\"\n Scan the policy directory for policy data.\n \"\"\"\n policy_files = get_json_files(self.policy_directory)\n for f in set(policy_files) - set(self.policy_files):\n self.file_timestamps[f] = 0\n for f in set(self.policy_files) - set(policy_files):\n self.logger.info(\"Removing policies for file: {}\".format(f))\n self.file_timestamps.pop(f, None)\n for p in self.policy_cache.keys():\n self.disassociate_policy_and_file(p, f)\n for p in [k for k, v in self.policy_map.items() if v == f]:\n self.restore_or_delete_policy(p)\n self.policy_files = policy_files\n\n for f in sorted(self.file_timestamps.keys()):\n t = os.path.getmtime(f)\n if t > self.file_timestamps[f]:\n self.logger.info(\"Loading policies for file: {}\".format(f))\n self.file_timestamps[f] = t\n old_p = [k for k, v in self.policy_map.items() if v == f]\n try:\n new_p = operation_policy.read_policy_from_file(f)\n except ValueError:\n self.logger.error(\"Failure loading file: {}\".format(f))\n self.logger.debug(\"\", exc_info=True)\n continue\n for p in new_p.keys():\n self.logger.info(\"Loading policy: {}\".format(p))\n if p in self.reserved_policies:\n self.logger.warning(\n \"Policy '{}' overwrites a reserved policy and \"\n \"will be thrown out.\".format(p)\n )\n continue\n if p in sorted(self.policy_store.keys()):\n self.logger.debug(\n \"Policy '{}' overwrites an existing \"\n \"policy.\".format(p)\n )\n if f != self.policy_map.get(p):\n self.policy_cache.get(p).append(\n (\n time.time(),\n self.policy_map.get(p),\n self.policy_store.get(p)\n )\n )\n else:\n self.policy_cache[p] = []\n self.policy_store[p] = new_p.get(p)\n self.policy_map[p] = f\n for p in set(old_p) - set(new_p.keys()):\n self.disassociate_policy_and_file(p, f)\n self.restore_or_delete_policy(p)"} {"SOURCE": 
"codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef run(self):\n self.initialize_tracking_structures()\n\n if self.live_monitoring:\n self.logger.info(\"Starting up the operation policy file monitor.\")\n while not self.halt_trigger.is_set():\n time.sleep(1)\n self.scan_policies()\n self.logger.info(\"Stopping the operation policy file monitor.\")\n else:\n self.scan_policies()", "response": "Start monitoring the operation policy files."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nextracts an X.509 certificate from a socket connection.", "response": "def get_certificate_from_connection(connection):\n \"\"\"\n Extract an X.509 certificate from a socket connection.\n \"\"\"\n certificate = connection.getpeercert(binary_form=True)\n if certificate:\n return x509.load_der_x509_certificate(\n certificate,\n backends.default_backend()\n )\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_extended_key_usage_from_certificate(certificate):\n try:\n return certificate.extensions.get_extension_for_oid(\n x509.oid.ExtensionOID.EXTENDED_KEY_USAGE\n ).value\n except x509.ExtensionNotFound:\n return None", "response": "Given an X.509 certificate extract and return the extendedKeyUsage\n extension."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_common_names_from_certificate(certificate):\n\n common_names = certificate.subject.get_attributes_for_oid(\n x509.oid.NameOID.COMMON_NAME\n )\n return [common_name.value for common_name in common_names]", "response": "Given an X.509 certificate extract and return all common names."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\ngive an X.509 certificate extract and return the client identity.", "response": "def 
get_client_identity_from_certificate(certificate):\n \"\"\"\n Given an X.509 certificate, extract and return the client identity.\n \"\"\"\n client_ids = get_common_names_from_certificate(certificate)\n\n if len(client_ids) > 0:\n if len(client_ids) > 1:\n raise exceptions.PermissionDenied(\n \"Multiple client identities found.\"\n )\n return client_ids[0]\n else:\n raise exceptions.PermissionDenied(\n \"The certificate does not define any subject common names. \"\n \"Client identity unavailable.\"\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(CreateRequestPayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.OBJECT_TYPE, local_buffer):\n self._object_type = primitives.Enumeration(\n enums.ObjectType,\n tag=enums.Tags.OBJECT_TYPE\n )\n self._object_type.read(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The Create request payload encoding is missing the object \"\n \"type.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self.is_tag_next(enums.Tags.TEMPLATE_ATTRIBUTE, local_buffer):\n self._template_attribute = objects.TemplateAttribute()\n self._template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The Create request payload encoding is missing the \"\n \"template attribute.\"\n )\n else:\n # NOTE (ph) For now, leave attributes natively in TemplateAttribute\n # form and just convert to the KMIP 2.0 Attributes form as needed\n # for encoding/decoding purposes. 
Changing the payload to require\n # the new Attributes structure will trigger a bunch of second-order\n # effects across the client and server codebases that is beyond\n # the scope of updating the Create payloads to support KMIP 2.0.\n if self.is_tag_next(enums.Tags.ATTRIBUTES, local_buffer):\n attributes = objects.Attributes()\n attributes.read(local_buffer, kmip_version=kmip_version)\n value = objects.convert_attributes_to_template_attribute(\n attributes\n )\n self._template_attribute = value\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The Create request payload encoding is missing the \"\n \"attributes structure.\"\n )\n\n self.is_oversized(local_buffer)", "response": "Reads the data encoding the Create Request Payload and decode it into the object type and template attribute parts."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwriting the data encoding the object into a stream.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Create request payload to a buffer.\n\n Args:\n output_buffer (stream): A data buffer in which to encode object\n data, supporting a write method.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n InvalidField: Raised if the object type attribute or template\n attribute is not defined.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if self._object_type:\n self._object_type.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The Create request payload is missing the object type field.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self._template_attribute:\n self._template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The Create request payload is missing the template \"\n \"attribute field.\"\n )\n else:\n # NOTE (ph) For now, leave attributes natively in TemplateAttribute\n # form and just convert to the KMIP 2.0 Attributes form as needed\n # for encoding/decoding purposes. Changing the payload to require\n # the new Attributes structure will trigger a bunch of second-order\n # effects across the client and server codebases that is beyond\n # the scope of updating the Create payloads to support KMIP 2.0.\n if self._template_attribute:\n attributes = objects.convert_template_attribute_to_attributes(\n self._template_attribute\n )\n attributes.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The Create request payload is missing the template \"\n \"attribute field.\"\n )\n\n self.length = local_buffer.length()\n super(CreateRequestPayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(CreateResponsePayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.OBJECT_TYPE, local_buffer):\n 
self._object_type = primitives.Enumeration(\n enums.ObjectType,\n tag=enums.Tags.OBJECT_TYPE\n )\n self._object_type.read(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The Create response payload encoding is missing the object \"\n \"type.\"\n )\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_buffer):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The Create response payload encoding is missing the unique \"\n \"identifier.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self.is_tag_next(enums.Tags.TEMPLATE_ATTRIBUTE, local_buffer):\n self._template_attribute = objects.TemplateAttribute()\n self._template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_buffer)", "response": "Reads the data encoding the Create response payload and decode it into the corresponding parts."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_buffer = utils.BytearrayStream()\n\n if self._object_type:\n self._object_type.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The Create response payload is missing the object type field.\"\n )\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The Create response payload is missing the unique identifier \"\n \"field.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self._template_attribute:\n self._template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(CreateResponsePayload, self).write(\n 
output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)", "response": "Writes the data encoding the object into a stream."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nconvert a Pie object into a core secret object and vice versa.", "response": "def convert(self, obj):\n \"\"\"\n Convert a Pie object into a core secret object and vice versa.\n\n Args:\n obj (various): A Pie or core secret object to convert into the\n opposite object space. Required.\n\n Raises:\n TypeError: if the object type is unrecognized or unsupported.\n \"\"\"\n if isinstance(obj, pobjects.SymmetricKey):\n return self._build_core_key(obj, secrets.SymmetricKey)\n elif isinstance(obj, secrets.SymmetricKey):\n return self._build_pie_key(obj, pobjects.SymmetricKey)\n elif isinstance(obj, pobjects.PublicKey):\n return self._build_core_key(obj, secrets.PublicKey)\n elif isinstance(obj, secrets.PublicKey):\n return self._build_pie_key(obj, pobjects.PublicKey)\n elif isinstance(obj, pobjects.PrivateKey):\n return self._build_core_key(obj, secrets.PrivateKey)\n elif isinstance(obj, secrets.PrivateKey):\n return self._build_pie_key(obj, pobjects.PrivateKey)\n elif isinstance(obj, pobjects.Certificate):\n return self._build_core_certificate(obj)\n elif isinstance(obj, secrets.Certificate):\n return self._build_pie_certificate(obj)\n elif isinstance(obj, pobjects.SecretData):\n return self._build_core_secret_data(obj)\n elif isinstance(obj, secrets.SecretData):\n return self._build_pie_secret_data(obj)\n elif isinstance(obj, pobjects.OpaqueObject):\n return self._build_core_opaque_object(obj)\n elif isinstance(obj, secrets.OpaqueObject):\n return self._build_pie_opaque_object(obj)\n else:\n raise TypeError(\"object type unsupported and cannot be converted\")"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n 
super(EncryptResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"invalid payload missing the unique identifier attribute\"\n )\n\n if self.is_tag_next(enums.Tags.DATA, local_stream):\n self._data = primitives.ByteString(tag=enums.Tags.DATA)\n self._data.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\"invalid payload missing the data attribute\")\n\n if self.is_tag_next(enums.Tags.IV_COUNTER_NONCE, local_stream):\n self._iv_counter_nonce = primitives.ByteString(\n tag=enums.Tags.IV_COUNTER_NONCE\n )\n self._iv_counter_nonce.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)", "response": "Reads the data encoding the Encrypt response payload and decode it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the DeriveKey request payload and decodes it into its constituent parts.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the DeriveKey request payload and decode it\n into its constituent parts.\n\n Args:\n input_buffer (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is missing from the\n encoded payload.\n \"\"\"\n super(DeriveKeyRequestPayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.OBJECT_TYPE, local_buffer):\n self._object_type = primitives.Enumeration(\n enums.ObjectType,\n tag=enums.Tags.OBJECT_TYPE\n )\n self._object_type.read(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The DeriveKey request payload encoding is missing the object \"\n \"type.\"\n )\n\n unique_identifiers = []\n while self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_buffer):\n unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n unique_identifier.read(local_buffer, kmip_version=kmip_version)\n unique_identifiers.append(unique_identifier)\n if not unique_identifiers:\n raise exceptions.InvalidKmipEncoding(\n \"The DeriveKey request payload encoding is missing the unique \"\n \"identifiers.\"\n )\n else:\n self._unique_identifiers = unique_identifiers\n\n if self.is_tag_next(enums.Tags.DERIVATION_METHOD, local_buffer):\n self._derivation_method = primitives.Enumeration(\n enums.DerivationMethod,\n tag=enums.Tags.DERIVATION_METHOD\n )\n self._derivation_method.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The DeriveKey request payload encoding is missing the \"\n \"derivation method.\"\n )\n\n if self.is_tag_next(enums.Tags.DERIVATION_PARAMETERS, local_buffer):\n self._derivation_parameters = attributes.DerivationParameters()\n self._derivation_parameters.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The DeriveKey request payload encoding is missing the \"\n \"derivation parameters.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if 
self.is_tag_next(enums.Tags.TEMPLATE_ATTRIBUTE, local_buffer):\n self._template_attribute = objects.TemplateAttribute()\n self._template_attribute.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The DeriveKey request payload encoding is missing the \"\n \"template attribute.\"\n )\n else:\n if self.is_tag_next(enums.Tags.ATTRIBUTES, local_buffer):\n attrs = objects.Attributes()\n attrs.read(local_buffer, kmip_version=kmip_version)\n value = objects.convert_attributes_to_template_attribute(\n attrs\n )\n self._template_attribute = value\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The DeriveKey request payload encoding is missing the \"\n \"attributes structure.\"\n )\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nwrite the DeriveKey request payload to the output_buffer.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the DeriveKey request payload to a stream.\n\n Args:\n output_buffer (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is not defined.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if self._object_type:\n self._object_type.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The DeriveKey request payload is missing the object type \"\n \"field.\"\n )\n\n if self._unique_identifiers:\n for unique_identifier in self._unique_identifiers:\n unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The DeriveKey request payload is missing the unique \"\n \"identifiers field.\"\n )\n\n if self._derivation_method:\n self._derivation_method.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The DeriveKey request payload is missing the derivation \"\n \"method field.\"\n )\n\n if self._derivation_parameters:\n self._derivation_parameters.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The DeriveKey request payload is missing the derivation \"\n \"parameters field.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self._template_attribute:\n self._template_attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The DeriveKey request payload is missing the template \"\n \"attribute field.\"\n )\n else:\n if self._template_attribute:\n attrs = objects.convert_template_attribute_to_attributes(\n self._template_attribute\n )\n attrs.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The DeriveKey request payload is missing the template \"\n \"attribute field.\"\n )\n\n self.length = local_buffer.length()\n super(DeriveKeyRequestPayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 
3 function that can\ncheck if the attribute is supported by the current KMIP version.", "response": "def is_attribute_supported(self, attribute):\n \"\"\"\n Check if the attribute is supported by the current KMIP version.\n\n Args:\n attribute (string): The name of the attribute\n (e.g., 'Cryptographic Algorithm'). Required.\n Returns:\n bool: True if the attribute is supported by the current KMIP\n version. False otherwise.\n \"\"\"\n if attribute not in self._attribute_rule_sets.keys():\n return False\n\n rule_set = self._attribute_rule_sets.get(attribute)\n if self._version >= rule_set.version_added:\n return True\n else:\n return False"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef is_attribute_deprecated(self, attribute):\n rule_set = self._attribute_rule_sets.get(attribute)\n if rule_set.version_deprecated:\n if self._version >= rule_set.version_deprecated:\n return True\n else:\n return False\n else:\n return False", "response": "Checks if the attribute is deprecated by the current KMIP version."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef is_attribute_applicable_to_object_type(self, attribute, object_type):\n # TODO (peterhamilton) Handle applicability between certificate types\n rule_set = self._attribute_rule_sets.get(attribute)\n if object_type in rule_set.applies_to_object_types:\n return True\n else:\n return False", "response": "Checks if the given attribute is applicable to the given object type."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef is_attribute_multivalued(self, attribute):\n # TODO (peterhamilton) Handle multivalue swap between certificate types\n rule_set = self._attribute_rule_sets.get(attribute)\n return rule_set.multiple_instances_permitted", "response": "Checks if the attribute is allowed to have multiple instances."} 
{"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef get_valid_value(self, direct_value, config_section,\n config_option_name, default_value):\n \"\"\"Returns a value that can be used as a parameter in client or\n server. If a direct_value is given, that value will be returned\n instead of the value from the config file. If the appropriate config\n file option is not found, the default_value is returned.\n\n :param direct_value: represents a direct value that should be used.\n supercedes values from config files\n :param config_section: which section of the config file to use\n :param config_option_name: name of config option value\n :param default_value: default value to be used if other options not\n found\n :returns: a value that can be used as a parameter\n \"\"\"\n ARG_MSG = \"Using given value '{0}' for {1}\"\n CONF_MSG = \"Using value '{0}' from configuration file {1} for {2}\"\n DEFAULT_MSG = \"Using default value '{0}' for {1}\"\n if direct_value:\n return_value = direct_value\n self.logger.debug(ARG_MSG.format(direct_value, config_option_name))\n else:\n try:\n return_value = self.conf.get(config_section,\n config_option_name)\n self.logger.debug(CONF_MSG.format(return_value,\n CONFIG_FILE,\n config_option_name))\n except Exception:\n return_value = default_value\n self.logger.debug(DEFAULT_MSG.format(default_value,\n config_option_name))\n # TODO (peter-hamilton): Think about adding better value validation\n if return_value == self.NONE_VALUE:\n return None\n else:\n return return_value", "response": "Returns a value that can be used as a parameter in a client or a server."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreading the data encoding the Check response payload and decode it into the object.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Check response payload and decode it into\n its constituent 
parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is missing from the\n encoded payload.\n \"\"\"\n super(CheckResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.USAGE_LIMITS_COUNT, local_stream):\n self._usage_limits_count = primitives.LongInteger(\n tag=enums.Tags.USAGE_LIMITS_COUNT\n )\n self._usage_limits_count.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.CRYPTOGRAPHIC_USAGE_MASK, local_stream):\n self._cryptographic_usage_mask = primitives.Integer(\n tag=enums.Tags.CRYPTOGRAPHIC_USAGE_MASK\n )\n self._cryptographic_usage_mask.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.LEASE_TIME, local_stream):\n self._lease_time = primitives.Interval(\n tag=enums.Tags.LEASE_TIME\n )\n self._lease_time.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._usage_limits_count:\n self._usage_limits_count.write(\n local_stream,\n kmip_version=kmip_version\n )\n 
if self._cryptographic_usage_mask:\n self._cryptographic_usage_mask.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._lease_time:\n self._lease_time.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(CheckResponsePayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the Check response payload to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nread the contents of the object from the input stream and decodes the parts into the object.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_2_0):\n \"\"\"\n Read the data stream and decode the AttributeReference structure into\n its parts.\n\n Args:\n input_buffer (stream): A data stream containing encoded object\n data, supporting a read method.\n kmip_version (enum): A KMIPVersion enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 2.0.\n\n Raises:\n InvalidKmipEncoding: Raised if the vendor identification or\n attribute name is missing from the encoding.\n VersionNotSupported: Raised when a KMIP version is provided that\n does not support the AttributeReference structure.\n \"\"\"\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the AttributeReference \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n super(AttributeReference, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.VENDOR_IDENTIFICATION, local_buffer):\n self._vendor_identification = primitives.TextString(\n tag=enums.Tags.VENDOR_IDENTIFICATION\n )\n self._vendor_identification.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The AttributeReference encoding is missing the vendor \"\n \"identification string.\"\n )\n\n if self.is_tag_next(enums.Tags.ATTRIBUTE_NAME, local_buffer):\n self._attribute_name = primitives.TextString(\n tag=enums.Tags.ATTRIBUTE_NAME\n )\n self._attribute_name.read(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The AttributeReference encoding is missing the attribute \"\n \"name string.\"\n )\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_2_0):\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the AttributeReference \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n local_buffer = BytearrayStream()\n\n if self._vendor_identification:\n self._vendor_identification.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise 
exceptions.InvalidField(\n \"The AttributeReference is missing the vendor identification \"\n \"field.\"\n )\n\n if self._attribute_name:\n self._attribute_name.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The AttributeReference is missing the attribute name field.\"\n )\n\n self.length = local_buffer.length()\n super(AttributeReference, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)", "response": "Writes the contents of the object to the output_buffer."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_2_0):\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the Attributes object.\".format(\n kmip_version.value\n )\n )\n\n super(Attributes, self).read(input_stream, kmip_version=kmip_version)\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n while True:\n if len(local_stream) < 3:\n break\n tag = struct.unpack('!I', b'\\x00' + local_stream.peek(3))[0]\n if enums.is_enum_value(enums.Tags, tag):\n tag = enums.Tags(tag)\n if not enums.is_attribute(tag, kmip_version=kmip_version):\n raise exceptions.AttributeNotSupported(\n \"Attribute {} is not supported by KMIP {}.\".format(\n tag.name,\n kmip_version.value\n )\n )\n value = self._factory.create_attribute_value_by_enum(tag, None)\n value.read(local_stream, kmip_version=kmip_version)\n self._attributes.append(value)\n else:\n break\n\n self.is_oversized(local_stream)", "response": "Reads the contents of the object from the input stream and appends them to the internal list of attributes."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_2_0):\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n raise 
exceptions.VersionNotSupported(\n \"KMIP {} does not support the Attributes object.\".format(\n kmip_version.value\n )\n )\n\n local_stream = BytearrayStream()\n\n for attribute in self._attributes:\n tag = attribute.tag\n if not enums.is_attribute(tag, kmip_version=kmip_version):\n raise exceptions.AttributeNotSupported(\n \"Attribute {} is not supported by KMIP {}.\".format(\n tag.name,\n kmip_version.value\n )\n )\n attribute.write(local_stream, kmip_version=kmip_version)\n\n self.length = local_stream.length()\n super(Attributes, self).write(output_stream, kmip_version=kmip_version)\n output_stream.write(local_stream.buffer)", "response": "Writes the contents of the object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(Nonce, self).read(input_stream, kmip_version=kmip_version)\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.NONCE_ID, local_stream):\n self._nonce_id = primitives.ByteString(\n tag=enums.Tags.NONCE_ID\n )\n self._nonce_id.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Nonce encoding missing the nonce ID.\"\n )\n\n if self.is_tag_next(enums.Tags.NONCE_VALUE, local_stream):\n self._nonce_value = primitives.ByteString(\n tag=enums.Tags.NONCE_VALUE\n )\n self._nonce_value.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Nonce encoding missing the nonce value.\"\n )\n\n self.is_oversized(local_stream)", "response": "Reads the data encoding the object\n into the internal representation."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = BytearrayStream()\n\n if self._nonce_id:\n self._nonce_id.write(local_stream, 
kmip_version=kmip_version)\n else:\n raise ValueError(\"Nonce struct is missing the nonce ID.\")\n\n if self._nonce_value:\n self._nonce_value.write(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\"Nonce struct is missing the nonce value.\")\n\n self.length = local_stream.length()\n super(Nonce, self).write(output_stream, kmip_version=kmip_version)\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(UsernamePasswordCredential, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.USERNAME, local_stream):\n self._username = primitives.TextString(\n tag=enums.Tags.USERNAME\n )\n self._username.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Username/password credential encoding missing the username.\"\n )\n\n if self.is_tag_next(enums.Tags.PASSWORD, local_stream):\n self._password = primitives.TextString(\n tag=enums.Tags.PASSWORD\n )\n self._password.read(local_stream, kmip_version=kmip_version)\n\n self.is_oversized(local_stream)", "response": "Reads the data encoding the UsernamePasswordCredential struct and decodes it into constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = BytearrayStream()\n\n if self._username:\n self._username.write(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Username/password credential struct missing the username.\"\n )\n\n if self._password:\n self._password.write(local_stream, kmip_version=kmip_version)\n\n self.length = local_stream.length()\n 
super(UsernamePasswordCredential, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the UsernamePasswordCredential struct to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(DeviceCredential, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.DEVICE_SERIAL_NUMBER, local_stream):\n self._device_serial_number = primitives.TextString(\n tag=enums.Tags.DEVICE_SERIAL_NUMBER\n )\n self._device_serial_number.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.PASSWORD, local_stream):\n self._password = primitives.TextString(\n tag=enums.Tags.PASSWORD\n )\n self._password.read(local_stream, kmip_version=kmip_version)\n\n if self.is_tag_next(enums.Tags.DEVICE_IDENTIFIER, local_stream):\n self._device_identifier = primitives.TextString(\n tag=enums.Tags.DEVICE_IDENTIFIER\n )\n self._device_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.NETWORK_IDENTIFIER, local_stream):\n self._network_identifier = primitives.TextString(\n tag=enums.Tags.NETWORK_IDENTIFIER\n )\n self._network_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.MACHINE_IDENTIFIER, local_stream):\n self._machine_identifier = primitives.TextString(\n tag=enums.Tags.MACHINE_IDENTIFIER\n )\n self._machine_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.MEDIA_IDENTIFIER, local_stream):\n self._media_identifier = primitives.TextString(\n tag=enums.Tags.MEDIA_IDENTIFIER\n )\n self._media_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n 
self.is_oversized(local_stream)", "response": "Reads the data encoding the DeviceCredential struct and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n    local_stream = BytearrayStream()\n\n    if self._device_serial_number is not None:\n        self._device_serial_number.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n    if self._password is not None:\n        self._password.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n    if self._device_identifier is not None:\n        self._device_identifier.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n    if self._network_identifier is not None:\n        self._network_identifier.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n    if self._machine_identifier is not None:\n        self._machine_identifier.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n    if self._media_identifier is not None:\n        self._media_identifier.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n\n    self.length = local_stream.length()\n    super(DeviceCredential, self).write(\n        output_stream,\n        kmip_version=kmip_version\n    )\n    output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the DeviceCredential struct to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the contents of the object from the input stream and populates the attributes of the object.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n    \"\"\"\n    Read the data encoding the Credential struct and decode it into its\n    constituent parts.\n\n    Args:\n        input_stream (stream): A data stream containing encoded object\n            data, supporting a read method; usually a BytearrayStream\n            object.\n        kmip_version (KMIPVersion): An enumeration defining the KMIP\n            version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if either the credential type or value are\n missing from the encoding.\n \"\"\"\n super(Credential, self).read(input_stream, kmip_version=kmip_version)\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.CREDENTIAL_TYPE, local_stream):\n self._credential_type = primitives.Enumeration(\n enum=enums.CredentialType,\n tag=enums.Tags.CREDENTIAL_TYPE\n )\n self._credential_type.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Credential encoding missing the credential type.\"\n )\n\n if self.is_tag_next(enums.Tags.CREDENTIAL_VALUE, local_stream):\n if self.credential_type == \\\n enums.CredentialType.USERNAME_AND_PASSWORD:\n self._credential_value = UsernamePasswordCredential()\n elif self.credential_type == enums.CredentialType.DEVICE:\n self._credential_value = DeviceCredential()\n elif self.credential_type == enums.CredentialType.ATTESTATION:\n self._credential_value = AttestationCredential()\n else:\n raise ValueError(\n \"Credential encoding includes unrecognized credential \"\n \"type.\"\n )\n self._credential_value.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Credential encoding missing the credential value.\"\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nwriting the data encoding the object to the output stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Credential struct to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if either the credential type or value are not\n defined.\n \"\"\"\n local_stream = BytearrayStream()\n\n if self._credential_type:\n self._credential_type.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Credential struct missing the credential type.\"\n )\n\n if self._credential_value:\n self._credential_value.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Credential struct missing the credential value.\"\n )\n\n self.length = local_stream.length()\n super(Credential, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(MACSignatureKeyInformation, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Invalid struct missing the unique identifier attribute.\"\n )\n\n if self.is_tag_next(\n enums.Tags.CRYPTOGRAPHIC_PARAMETERS,\n local_stream\n ):\n self._cryptographic_parameters = CryptographicParameters()\n self._cryptographic_parameters.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)", "response": "Reads the data encoding the MACSignatureKeyInformation struct and decodes it into constituent parts."} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = 
BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Invalid struct missing the unique identifier attribute.\"\n )\n\n if self._cryptographic_parameters:\n self._cryptographic_parameters.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(MACSignatureKeyInformation, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the MACSignatureKeyInformation struct to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(KeyWrappingData, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.WRAPPING_METHOD, local_stream):\n self._wrapping_method = primitives.Enumeration(\n enum=enums.WrappingMethod,\n tag=enums.Tags.WRAPPING_METHOD\n )\n self._wrapping_method.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Invalid struct missing the wrapping method attribute.\"\n )\n\n if self.is_tag_next(\n enums.Tags.ENCRYPTION_KEY_INFORMATION,\n local_stream\n ):\n self._encryption_key_information = EncryptionKeyInformation()\n self._encryption_key_information.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(\n enums.Tags.MAC_SIGNATURE_KEY_INFORMATION,\n local_stream\n ):\n self._mac_signature_key_information = MACSignatureKeyInformation()\n self._mac_signature_key_information.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.MAC_SIGNATURE, local_stream):\n self._mac_signature = primitives.ByteString(\n tag=enums.Tags.MAC_SIGNATURE\n )\n self._mac_signature.read(\n local_stream,\n 
kmip_version=kmip_version\n        )\n\n    if self.is_tag_next(enums.Tags.IV_COUNTER_NONCE, local_stream):\n        self._iv_counter_nonce = primitives.ByteString(\n            tag=enums.Tags.IV_COUNTER_NONCE\n        )\n        self._iv_counter_nonce.read(\n            local_stream,\n            kmip_version=kmip_version\n        )\n\n    if self.is_tag_next(enums.Tags.ENCODING_OPTION, local_stream):\n        self._encoding_option = primitives.Enumeration(\n            enum=enums.EncodingOption,\n            tag=enums.Tags.ENCODING_OPTION\n        )\n        self._encoding_option.read(\n            local_stream,\n            kmip_version=kmip_version\n        )\n\n    self.is_oversized(local_stream)", "response": "Reads the data encoding the KeyWrappingData struct and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nwrite the data encoding the object to the output stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n    \"\"\"\n    Write the data encoding the KeyWrappingData struct to a stream.\n\n    Args:\n        output_stream (stream): A data stream in which to encode object\n            data, supporting a write method; usually a BytearrayStream\n            object.\n        kmip_version (KMIPVersion): An enumeration defining the KMIP\n            version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_stream = BytearrayStream()\n\n if self._wrapping_method:\n self._wrapping_method.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Invalid struct missing the wrapping method attribute.\"\n )\n\n if self._encryption_key_information:\n self._encryption_key_information.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._mac_signature_key_information:\n self._mac_signature_key_information.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._mac_signature:\n self._mac_signature.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._iv_counter_nonce:\n self._iv_counter_nonce.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._encoding_option:\n self._encoding_option.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(KeyWrappingData, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nread the data encoding the object containing the object and decodes it into its constituent parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the KeyWrappingSpecification struct and decode\n it into its constituent parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(KeyWrappingSpecification, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.WRAPPING_METHOD, local_stream):\n self._wrapping_method = primitives.Enumeration(\n enum=enums.WrappingMethod,\n tag=enums.Tags.WRAPPING_METHOD\n )\n self._wrapping_method.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"Invalid struct missing the wrapping method attribute.\"\n )\n\n if self.is_tag_next(\n enums.Tags.ENCRYPTION_KEY_INFORMATION,\n local_stream\n ):\n self._encryption_key_information = EncryptionKeyInformation()\n self._encryption_key_information.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(\n enums.Tags.MAC_SIGNATURE_KEY_INFORMATION,\n local_stream\n ):\n self._mac_signature_key_information = MACSignatureKeyInformation()\n self._mac_signature_key_information.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n attribute_names = []\n while self.is_tag_next(enums.Tags.ATTRIBUTE_NAME, local_stream):\n attribute_name = primitives.TextString(\n tag=enums.Tags.ATTRIBUTE_NAME\n )\n attribute_name.read(local_stream, kmip_version=kmip_version)\n attribute_names.append(attribute_name)\n self._attribute_names = attribute_names\n\n if self.is_tag_next(enums.Tags.ENCODING_OPTION, local_stream):\n self._encoding_option = primitives.Enumeration(\n enum=enums.EncodingOption,\n tag=enums.Tags.ENCODING_OPTION\n )\n self._encoding_option.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = BytearrayStream()\n\n if self._wrapping_method:\n self._wrapping_method.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise 
ValueError(\n            \"Invalid struct missing the wrapping method attribute.\"\n        )\n\n    if self._encryption_key_information:\n        self._encryption_key_information.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n    if self._mac_signature_key_information:\n        self._mac_signature_key_information.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n    if self._attribute_names:\n        for attribute_name in self._attribute_names:\n            attribute_name.write(\n                local_stream,\n                kmip_version=kmip_version\n            )\n    if self._encoding_option:\n        self._encoding_option.write(\n            local_stream,\n            kmip_version=kmip_version\n        )\n\n    self.length = local_stream.length()\n    super(KeyWrappingSpecification, self).write(\n        output_stream,\n        kmip_version=kmip_version\n    )\n    output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the KeyWrappingSpecification struct to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nread the data encoding the ExtensionInformation object and decode it into constituent parts.", "response": "def read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n    \"\"\"\n    Read the data encoding the ExtensionInformation object and decode it\n    into its constituent parts.\n\n    Args:\n        istream (Stream): A data stream containing encoded object data,\n            supporting a read method; usually a BytearrayStream object.\n        kmip_version (KMIPVersion): An enumeration defining the KMIP\n            version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(ExtensionInformation, self).read(\n istream,\n kmip_version=kmip_version\n )\n tstream = BytearrayStream(istream.read(self.length))\n\n self.extension_name.read(tstream, kmip_version=kmip_version)\n\n if self.is_tag_next(Tags.EXTENSION_TAG, tstream):\n self.extension_tag = ExtensionTag()\n self.extension_tag.read(tstream, kmip_version=kmip_version)\n if self.is_tag_next(Tags.EXTENSION_TYPE, tstream):\n self.extension_type = ExtensionType()\n self.extension_type.read(tstream, kmip_version=kmip_version)\n\n self.is_oversized(tstream)\n self.validate()"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nwrite the data encoding the object to the data stream.", "response": "def write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the ExtensionInformation object to a stream.\n\n Args:\n ostream (Stream): A data stream in which to encode object data,\n supporting a write method; usually a BytearrayStream object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n tstream = BytearrayStream()\n\n self.extension_name.write(tstream, kmip_version=kmip_version)\n\n if self.extension_tag is not None:\n self.extension_tag.write(tstream, kmip_version=kmip_version)\n if self.extension_type is not None:\n self.extension_type.write(tstream, kmip_version=kmip_version)\n\n self.length = tstream.length()\n super(ExtensionInformation, self).write(\n ostream,\n kmip_version=kmip_version\n )\n ostream.write(tstream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nconstructing an ExtensionInformation object from provided values.", "response": "def create(cls, extension_name=None, extension_tag=None,\n extension_type=None):\n \"\"\"\n Construct an ExtensionInformation object from provided extension\n values.\n\n Args:\n extension_name (str): The name of the extension. Optional,\n defaults to None.\n extension_tag (int): The tag number of the extension. Optional,\n defaults to None.\n extension_type (int): The type index of the extension. 
Optional,\n            defaults to None.\n\n    Returns:\n        ExtensionInformation: The newly created set of extension\n            information.\n\n    Example:\n        >>> x = ExtensionInformation.create('extension', 1, 1)\n        >>> x.extension_name\n        ExtensionName(value='extension')\n        >>> x.extension_tag\n        ExtensionTag(value=1)\n        >>> x.extension_type\n        ExtensionType(value=1)\n    \"\"\"\n    extension_name = ExtensionName(extension_name)\n    extension_tag = ExtensionTag(extension_tag)\n    extension_type = ExtensionType(extension_type)\n\n    return ExtensionInformation(\n        extension_name=extension_name,\n        extension_tag=extension_tag,\n        extension_type=extension_type)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreading the data encoding the RevocationReason object and decode it into its constituent parts.", "response": "def read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n    \"\"\"\n    Read the data encoding the RevocationReason object and decode it\n    into its constituent parts.\n\n    Args:\n        istream (Stream): A data stream containing encoded object data,\n            supporting a read method; usually a BytearrayStream object.\n        kmip_version (KMIPVersion): An enumeration defining the KMIP\n            version with which the object will be decoded. 
Optional,\n            defaults to KMIP 1.0.\n    \"\"\"\n    super(RevocationReason, self).read(istream, kmip_version=kmip_version)\n    tstream = BytearrayStream(istream.read(self.length))\n\n    self.revocation_code = RevocationReasonCode()\n    self.revocation_code.read(tstream, kmip_version=kmip_version)\n\n    if self.is_tag_next(Tags.REVOCATION_MESSAGE, tstream):\n        self.revocation_message = TextString()\n        self.revocation_message.read(tstream, kmip_version=kmip_version)\n\n    self.is_oversized(tstream)\n    self.validate()"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n    tstream = BytearrayStream()\n\n    self.revocation_code.write(tstream, kmip_version=kmip_version)\n    if self.revocation_message is not None:\n        self.revocation_message.write(tstream, kmip_version=kmip_version)\n\n    # Write the length and value\n    self.length = tstream.length()\n    super(RevocationReason, self).write(ostream, kmip_version=kmip_version)\n    ostream.write(tstream.buffer)", "response": "Writes the data encoding the RevocationReason object to the ostream."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef validate(self):\n    if not isinstance(self.revocation_code, RevocationReasonCode):\n        msg = \"RevocationReasonCode expected\"\n        raise TypeError(msg)\n    if self.revocation_message is not None:\n        if not isinstance(self.revocation_message, TextString):\n            msg = \"TextString expected\"\n            raise TypeError(msg)", "response": "Validates the field types of the RevocationReason object, raising TypeError on mismatch."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nreads the contents of the ObjectDefaults structure from the input_buffer and populates the internal state of the object.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_2_0):\n    \"\"\"\n    Read the data encoding the ObjectDefaults structure and decode it into\n    its constituent parts.\n\n    Args:\n        
input_buffer (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 2.0.\n\n Raises:\n InvalidKmipEncoding: Raised if the object type or attributes are\n missing from the encoding.\n VersionNotSupported: Raised when a KMIP version is provided that\n does not support the ObjectDefaults structure.\n \"\"\"\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the ObjectDefaults object.\".format(\n kmip_version.value\n )\n )\n\n super(ObjectDefaults, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.OBJECT_TYPE, local_buffer):\n self._object_type = primitives.Enumeration(\n enums.ObjectType,\n tag=enums.Tags.OBJECT_TYPE\n )\n self._object_type.read(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The ObjectDefaults encoding is missing the object type \"\n \"enumeration.\"\n )\n\n if self.is_tag_next(enums.Tags.ATTRIBUTES, local_buffer):\n self._attributes = Attributes()\n self._attributes.read(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The ObjectDefaults encoding is missing the attributes \"\n \"structure.\"\n )\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nwrites the object - level attributes of the object to the output stream.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_2_0):\n \"\"\"\n Write the ObjectDefaults structure encoding to the data stream.\n\n Args:\n output_buffer (stream): A data stream in which to encode\n Attributes structure data, supporting a write 
method.\n kmip_version (enum): A KMIPVersion enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 2.0.\n\n Raises:\n InvalidField: Raised if the object type or attributes fields are\n not defined.\n VersionNotSupported: Raised when a KMIP version is provided that\n does not support the ObjectDefaults structure.\n \"\"\"\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the ObjectDefaults object.\".format(\n kmip_version.value\n )\n )\n\n local_buffer = BytearrayStream()\n\n if self._object_type:\n self._object_type.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The ObjectDefaults structure is missing the object type \"\n \"field.\"\n )\n\n if self._attributes:\n self._attributes.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The ObjectDefaults structure is missing the attributes field.\"\n )\n\n self.length = local_buffer.length()\n super(ObjectDefaults, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_2_0):\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the DefaultsInformation \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n super(DefaultsInformation, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n object_defaults = []\n while self.is_tag_next(enums.Tags.OBJECT_DEFAULTS, local_buffer):\n object_default = ObjectDefaults()\n object_default.read(local_buffer, kmip_version=kmip_version)\n object_defaults.append(object_default)\n\n if len(object_defaults) == 0:\n raise 
exceptions.InvalidKmipEncoding(\n            \"The DefaultsInformation encoding is missing the object \"\n            \"defaults structure.\"\n        )\n    else:\n        self._object_defaults = object_defaults\n\n    self.is_oversized(local_buffer)", "response": "Reads the data encoding the DefaultsInformation structure and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_2_0):\n    if kmip_version < enums.KMIPVersion.KMIP_2_0:\n        raise exceptions.VersionNotSupported(\n            \"KMIP {} does not support the DefaultsInformation \"\n            \"object.\".format(\n                kmip_version.value\n            )\n        )\n\n    local_buffer = BytearrayStream()\n\n    if self._object_defaults:\n        for object_default in self._object_defaults:\n            object_default.write(local_buffer, kmip_version=kmip_version)\n    else:\n        raise exceptions.InvalidField(\n            \"The DefaultsInformation structure is missing the object \"\n            \"defaults field.\"\n        )\n\n    self.length = local_buffer.length()\n    super(DefaultsInformation, self).write(\n        output_buffer,\n        kmip_version=kmip_version\n    )\n    output_buffer.write(local_buffer.buffer)", "response": "Writes the data encoding the DefaultsInformation structure to the output buffer."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nread the contents of the object in the specified KMIP version from the input stream.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n    \"\"\"\n    Read the data encoding the RNGParameters structure and decode it\n    into its constituent parts.\n\n    Args:\n        input_buffer (stream): A data stream containing encoded object\n            data, supporting a read method; usually a BytearrayStream\n            object.\n        kmip_version (KMIPVersion): An enumeration defining the KMIP\n            version with which the object will be decoded. 
Optional,\n            defaults to KMIP 1.3.\n\n    Raises:\n        InvalidKmipEncoding: Raised if the RNG algorithm is missing from\n            the encoding.\n        VersionNotSupported: Raised when a KMIP version is provided that\n            does not support the RNGParameters structure.\n    \"\"\"\n    if kmip_version < enums.KMIPVersion.KMIP_1_3:\n        raise exceptions.VersionNotSupported(\n            \"KMIP {} does not support the RNGParameters object.\".format(\n                kmip_version.value\n            )\n        )\n\n    super(RNGParameters, self).read(\n        input_buffer,\n        kmip_version=kmip_version\n    )\n    local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n    if self.is_tag_next(enums.Tags.RNG_ALGORITHM, local_buffer):\n        rng_algorithm = primitives.Enumeration(\n            enums.RNGAlgorithm,\n            tag=enums.Tags.RNG_ALGORITHM\n        )\n        rng_algorithm.read(local_buffer, kmip_version=kmip_version)\n        self._rng_algorithm = rng_algorithm\n    else:\n        raise exceptions.InvalidKmipEncoding(\n            \"The RNGParameters encoding is missing the RNG algorithm.\"\n        )\n\n    if self.is_tag_next(enums.Tags.CRYPTOGRAPHIC_ALGORITHM, local_buffer):\n        cryptographic_algorithm = primitives.Enumeration(\n            enums.CryptographicAlgorithm,\n            tag=enums.Tags.CRYPTOGRAPHIC_ALGORITHM\n        )\n        cryptographic_algorithm.read(\n            local_buffer,\n            kmip_version=kmip_version\n        )\n        self._cryptographic_algorithm = cryptographic_algorithm\n\n    if self.is_tag_next(enums.Tags.CRYPTOGRAPHIC_LENGTH, local_buffer):\n        cryptographic_length = primitives.Integer(\n            tag=enums.Tags.CRYPTOGRAPHIC_LENGTH\n        )\n        cryptographic_length.read(local_buffer, kmip_version=kmip_version)\n        self._cryptographic_length = cryptographic_length\n\n    if self.is_tag_next(enums.Tags.HASHING_ALGORITHM, local_buffer):\n        hashing_algorithm = primitives.Enumeration(\n            enums.HashingAlgorithm,\n            tag=enums.Tags.HASHING_ALGORITHM\n        )\n        hashing_algorithm.read(local_buffer, kmip_version=kmip_version)\n        self._hashing_algorithm = hashing_algorithm\n\n    if self.is_tag_next(enums.Tags.DRBG_ALGORITHM, local_buffer):\n        drbg_algorithm = primitives.Enumeration(\n            
enums.DRBGAlgorithm,\n tag=enums.Tags.DRBG_ALGORITHM\n )\n drbg_algorithm.read(local_buffer, kmip_version=kmip_version)\n self._drbg_algorithm = drbg_algorithm\n\n if self.is_tag_next(enums.Tags.RECOMMENDED_CURVE, local_buffer):\n recommended_curve = primitives.Enumeration(\n enums.RecommendedCurve,\n tag=enums.Tags.RECOMMENDED_CURVE\n )\n recommended_curve.read(local_buffer, kmip_version=kmip_version)\n self._recommended_curve = recommended_curve\n\n if self.is_tag_next(enums.Tags.FIPS186_VARIATION, local_buffer):\n fips186_variation = primitives.Enumeration(\n enums.FIPS186Variation,\n tag=enums.Tags.FIPS186_VARIATION\n )\n fips186_variation.read(local_buffer, kmip_version=kmip_version)\n self._fips186_variation = fips186_variation\n\n if self.is_tag_next(enums.Tags.PREDICTION_RESISTANCE, local_buffer):\n prediction_resistance = primitives.Boolean(\n tag=enums.Tags.PREDICTION_RESISTANCE\n )\n prediction_resistance.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._prediction_resistance = prediction_resistance\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n if kmip_version < enums.KMIPVersion.KMIP_1_3:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the RNGParameters object.\".format(\n kmip_version.value\n )\n )\n\n local_buffer = BytearrayStream()\n\n if self._rng_algorithm:\n self._rng_algorithm.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The RNGParameters structure is missing the RNG algorithm \"\n \"field.\"\n )\n\n if self._cryptographic_algorithm:\n self._cryptographic_algorithm.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._cryptographic_length:\n self._cryptographic_length.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._hashing_algorithm:\n self._hashing_algorithm.write(\n 
local_buffer,\n kmip_version=kmip_version\n )\n\n if self._drbg_algorithm:\n self._drbg_algorithm.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._recommended_curve:\n self._recommended_curve.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._fips186_variation:\n self._fips186_variation.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._prediction_resistance:\n self._prediction_resistance.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(RNGParameters, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)", "response": "Writes the RNGParameters structure to the output_buffer."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the contents of the object in the input_buffer and populates the properties of the object with the corresponding parts.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n \"\"\"\n Read the data encoding the ProfileInformation structure and decode it\n into its constituent parts.\n\n Args:\n input_buffer (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n            defaults to KMIP 1.3.\n\n    Raises:\n        InvalidKmipEncoding: Raised if the profile name is missing from\n            the encoding.\n        VersionNotSupported: Raised when a KMIP version is provided that\n            does not support the ProfileInformation structure.\n    \"\"\"\n    if kmip_version < enums.KMIPVersion.KMIP_1_3:\n        raise exceptions.VersionNotSupported(\n            \"KMIP {} does not support the ProfileInformation \"\n            \"object.\".format(\n                kmip_version.value\n            )\n        )\n\n    super(ProfileInformation, self).read(\n        input_buffer,\n        kmip_version=kmip_version\n    )\n    local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n    if self.is_tag_next(enums.Tags.PROFILE_NAME, local_buffer):\n        profile_name = primitives.Enumeration(\n            enums.ProfileName,\n            tag=enums.Tags.PROFILE_NAME\n        )\n        profile_name.read(local_buffer, kmip_version=kmip_version)\n        self._profile_name = profile_name\n    else:\n        raise exceptions.InvalidKmipEncoding(\n            \"The ProfileInformation encoding is missing the profile name.\"\n        )\n\n    if self.is_tag_next(enums.Tags.SERVER_URI, local_buffer):\n        server_uri = primitives.TextString(tag=enums.Tags.SERVER_URI)\n        server_uri.read(local_buffer, kmip_version=kmip_version)\n        self._server_uri = server_uri\n\n    if self.is_tag_next(enums.Tags.SERVER_PORT, local_buffer):\n        server_port = primitives.Integer(tag=enums.Tags.SERVER_PORT)\n        server_port.read(local_buffer, kmip_version=kmip_version)\n        self._server_port = server_port\n\n    self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwriting the ProfileInformation structure to the output_buffer.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n    \"\"\"\n    Write the ProfileInformation structure encoding to the data stream.\n\n    Args:\n        output_buffer (stream): A data stream in which to encode\n            ProfileInformation structure data, supporting a write method.\n        kmip_version (enum): A KMIPVersion enumeration defining the KMIP\n            version with which the 
object will be encoded. Optional,\n defaults to KMIP 2.0.\n\n Raises:\n InvalidField: Raised if the profile name field is not defined.\n VersionNotSupported: Raised when a KMIP version is provided that\n does not support the ProfileInformation structure.\n \"\"\"\n if kmip_version < enums.KMIPVersion.KMIP_1_3:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the ProfileInformation \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n local_buffer = BytearrayStream()\n\n if self._profile_name:\n self._profile_name.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The ProfileInformation structure is missing the profile \"\n \"name field.\"\n )\n\n if self._server_uri:\n self._server_uri.write(local_buffer, kmip_version=kmip_version)\n\n if self._server_port:\n self._server_port.write(local_buffer, kmip_version=kmip_version)\n\n self.length = local_buffer.length()\n super(ProfileInformation, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n if kmip_version < enums.KMIPVersion.KMIP_1_3:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the ValidationInformation \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n super(ValidationInformation, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(\n enums.Tags.VALIDATION_AUTHORITY_TYPE,\n local_buffer\n ):\n validation_authority_type = primitives.Enumeration(\n enums.ValidationAuthorityType,\n tag=enums.Tags.VALIDATION_AUTHORITY_TYPE\n )\n validation_authority_type.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_authority_type = validation_authority_type\n else:\n raise 
exceptions.InvalidKmipEncoding(\n \"The ValidationInformation encoding is missing the \"\n \"validation authority type.\"\n )\n\n if self.is_tag_next(\n enums.Tags.VALIDATION_AUTHORITY_COUNTRY,\n local_buffer\n ):\n validation_authority_country = primitives.TextString(\n tag=enums.Tags.VALIDATION_AUTHORITY_COUNTRY\n )\n validation_authority_country.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_authority_country = validation_authority_country\n\n if self.is_tag_next(enums.Tags.VALIDATION_AUTHORITY_URI, local_buffer):\n validation_authority_uri = primitives.TextString(\n tag=enums.Tags.VALIDATION_AUTHORITY_URI\n )\n validation_authority_uri.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_authority_uri = validation_authority_uri\n\n if self.is_tag_next(\n enums.Tags.VALIDATION_VERSION_MAJOR,\n local_buffer\n ):\n validation_version_major = primitives.Integer(\n tag=enums.Tags.VALIDATION_VERSION_MAJOR\n )\n validation_version_major.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_version_major = validation_version_major\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The ValidationInformation encoding is missing the \"\n \"validation version major.\"\n )\n\n if self.is_tag_next(\n enums.Tags.VALIDATION_VERSION_MINOR,\n local_buffer\n ):\n validation_version_minor = primitives.Integer(\n tag=enums.Tags.VALIDATION_VERSION_MINOR\n )\n validation_version_minor.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_version_minor = validation_version_minor\n\n if self.is_tag_next(enums.Tags.VALIDATION_TYPE, local_buffer):\n validation_type = primitives.Enumeration(\n enums.ValidationType,\n tag=enums.Tags.VALIDATION_TYPE\n )\n validation_type.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_type = validation_type\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The ValidationInformation encoding is missing the \"\n \"validation type.\"\n )\n\n if 
self.is_tag_next(enums.Tags.VALIDATION_LEVEL, local_buffer):\n validation_level = primitives.Integer(\n tag=enums.Tags.VALIDATION_LEVEL\n )\n validation_level.read(local_buffer, kmip_version=kmip_version)\n self._validation_level = validation_level\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The ValidationInformation encoding is missing the \"\n \"validation level.\"\n )\n\n if self.is_tag_next(\n enums.Tags.VALIDATION_CERTIFICATE_IDENTIFIER,\n local_buffer\n ):\n validation_certificate_identifier = primitives.TextString(\n tag=enums.Tags.VALIDATION_CERTIFICATE_IDENTIFIER\n )\n validation_certificate_identifier.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_certificate_identifier = \\\n validation_certificate_identifier\n\n if self.is_tag_next(\n enums.Tags.VALIDATION_CERTIFICATE_URI,\n local_buffer\n ):\n validation_certificate_uri = primitives.TextString(\n tag=enums.Tags.VALIDATION_CERTIFICATE_URI\n )\n validation_certificate_uri.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._validation_certificate_uri = validation_certificate_uri\n\n if self.is_tag_next(enums.Tags.VALIDATION_VENDOR_URI, local_buffer):\n validation_vendor_uri = primitives.TextString(\n tag=enums.Tags.VALIDATION_VENDOR_URI\n )\n validation_vendor_uri.read(local_buffer, kmip_version=kmip_version)\n self._validation_vendor_uri = validation_vendor_uri\n\n validation_profiles = []\n while self.is_tag_next(enums.Tags.VALIDATION_PROFILE, local_buffer):\n validation_profile = primitives.TextString(\n tag=enums.Tags.VALIDATION_PROFILE\n )\n validation_profile.read(local_buffer, kmip_version=kmip_version)\n validation_profiles.append(validation_profile)\n self._validation_profiles = validation_profiles\n\n self.is_oversized(local_buffer)", "response": "Reads the data encoding the ValidationInformation structure and decode it into parts."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef write(self, 
output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n if kmip_version < enums.KMIPVersion.KMIP_1_3:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the ValidationInformation \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n local_buffer = BytearrayStream()\n\n if self._validation_authority_type:\n self._validation_authority_type.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The ValidationInformation structure is missing the \"\n \"validation authority type field.\"\n )\n\n if self._validation_authority_country:\n self._validation_authority_country.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._validation_authority_uri:\n self._validation_authority_uri.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._validation_version_major:\n self._validation_version_major.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The ValidationInformation structure is missing the \"\n \"validation version major field.\"\n )\n\n if self._validation_version_minor:\n self._validation_version_minor.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._validation_type:\n self._validation_type.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The ValidationInformation structure is missing the \"\n \"validation type field.\"\n )\n\n if self._validation_level:\n self._validation_level.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The ValidationInformation structure is missing the \"\n \"validation level field.\"\n )\n\n if self._validation_certificate_identifier:\n self._validation_certificate_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._validation_certificate_uri:\n self._validation_certificate_uri.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if 
self._validation_vendor_uri:\n self._validation_vendor_uri.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._validation_profiles:\n for validation_profile in self._validation_profiles:\n validation_profile.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(ValidationInformation, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)", "response": "Writes the KMIP object to the data stream."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nread the data encoding the CapabilityInformation structure and decode it into its constituent parts.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n \"\"\"\n Read the data encoding the CapabilityInformation structure and decode\n it into its constituent parts.\n\n Args:\n input_buffer (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 2.0.\n\n Raises:\n VersionNotSupported: Raised when a KMIP version is provided that\n does not support the CapabilityInformation structure.\n \"\"\"\n if kmip_version < enums.KMIPVersion.KMIP_1_3:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the CapabilityInformation \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n super(CapabilityInformation, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.STREAMING_CAPABILITY, local_buffer):\n streaming_capability = primitives.Boolean(\n tag=enums.Tags.STREAMING_CAPABILITY\n )\n streaming_capability.read(local_buffer, kmip_version=kmip_version)\n self._streaming_capability = streaming_capability\n\n if self.is_tag_next(enums.Tags.ASYNCHRONOUS_CAPABILITY, local_buffer):\n asynchronous_capability = primitives.Boolean(\n tag=enums.Tags.ASYNCHRONOUS_CAPABILITY\n )\n asynchronous_capability.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._asynchronous_capability = asynchronous_capability\n\n if self.is_tag_next(enums.Tags.ATTESTATION_CAPABILITY, local_buffer):\n attestation_capability = primitives.Boolean(\n tag=enums.Tags.ATTESTATION_CAPABILITY\n )\n attestation_capability.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._attestation_capability = attestation_capability\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_4:\n if self.is_tag_next(\n enums.Tags.BATCH_UNDO_CAPABILITY,\n local_buffer\n ):\n batch_undo_capability = primitives.Boolean(\n tag=enums.Tags.BATCH_UNDO_CAPABILITY\n )\n batch_undo_capability.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._batch_undo_capability = batch_undo_capability\n\n if self.is_tag_next(\n enums.Tags.BATCH_CONTINUE_CAPABILITY,\n local_buffer\n ):\n batch_continue_capability = primitives.Boolean(\n tag=enums.Tags.BATCH_CONTINUE_CAPABILITY\n )\n batch_continue_capability.read(\n 
local_buffer,\n kmip_version=kmip_version\n )\n self._batch_continue_capability = batch_continue_capability\n\n if self.is_tag_next(enums.Tags.UNWRAP_MODE, local_buffer):\n unwrap_mode = primitives.Enumeration(\n enums.UnwrapMode,\n tag=enums.Tags.UNWRAP_MODE\n )\n unwrap_mode.read(local_buffer, kmip_version=kmip_version)\n self._unwrap_mode = unwrap_mode\n\n if self.is_tag_next(enums.Tags.DESTROY_ACTION, local_buffer):\n destroy_action = primitives.Enumeration(\n enums.DestroyAction,\n tag=enums.Tags.DESTROY_ACTION\n )\n destroy_action.read(local_buffer, kmip_version=kmip_version)\n self._destroy_action = destroy_action\n\n if self.is_tag_next(enums.Tags.SHREDDING_ALGORITHM, local_buffer):\n shredding_algorithm = primitives.Enumeration(\n enums.ShreddingAlgorithm,\n tag=enums.Tags.SHREDDING_ALGORITHM\n )\n shredding_algorithm.read(local_buffer, kmip_version=kmip_version)\n self._shredding_algorithm = shredding_algorithm\n\n if self.is_tag_next(enums.Tags.RNG_MODE, local_buffer):\n rng_mode = primitives.Enumeration(\n enums.RNGMode,\n tag=enums.Tags.RNG_MODE\n )\n rng_mode.read(local_buffer, kmip_version=kmip_version)\n self._rng_mode = rng_mode\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nwriting the KMIP Capability Information structure to the output stream.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_3):\n \"\"\"\n Write the CapabilityInformation structure encoding to the data stream.\n\n Args:\n output_buffer (stream): A data stream in which to encode\n CapabilityInformation structure data, supporting a write\n method.\n kmip_version (enum): A KMIPVersion enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 2.0.\n\n Raises:\n VersionNotSupported: Raised when a KMIP version is provided that\n does not support the CapabilityInformation structure.\n \"\"\"\n if kmip_version < enums.KMIPVersion.KMIP_1_3:\n raise exceptions.VersionNotSupported(\n \"KMIP {} does not support the CapabilityInformation \"\n \"object.\".format(\n kmip_version.value\n )\n )\n\n local_buffer = BytearrayStream()\n\n if self._streaming_capability:\n self._streaming_capability.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._asynchronous_capability:\n self._asynchronous_capability.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._attestation_capability:\n self._attestation_capability.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_4:\n if self._batch_undo_capability:\n self._batch_undo_capability.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._batch_continue_capability:\n self._batch_continue_capability.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._unwrap_mode:\n self._unwrap_mode.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._destroy_action:\n self._destroy_action.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._shredding_algorithm:\n self._shredding_algorithm.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._rng_mode:\n self._rng_mode.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(CapabilityInformation, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef start(self):\n self.manager = multiprocessing.Manager()\n self.policies = self.manager.dict()\n policies = copy.deepcopy(operation_policy.policies)\n for policy_name, policy_set in six.iteritems(policies):\n 
self.policies[policy_name] = policy_set\n\n self.policy_monitor = monitor.PolicyDirectoryMonitor(\n self.config.settings.get('policy_path'),\n self.policies,\n self.live_policies\n )\n\n def interrupt_handler(trigger, frame):\n self.policy_monitor.stop()\n signal.signal(signal.SIGINT, interrupt_handler)\n signal.signal(signal.SIGTERM, interrupt_handler)\n\n self.policy_monitor.start()\n\n self._engine = engine.KmipEngine(\n policies=self.policies,\n database_path=self.config.settings.get('database_path')\n )\n\n self._logger.info(\"Starting server socket handler.\")\n\n # Create a TCP stream socket and configure it for immediate reuse.\n socket.setdefaulttimeout(10)\n self._socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\n self._socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)\n\n self._logger.debug(\n \"Configured cipher suites: {0}\".format(\n len(self.config.settings.get('tls_cipher_suites'))\n )\n )\n for cipher in self.config.settings.get('tls_cipher_suites'):\n self._logger.debug(cipher)\n auth_suite_ciphers = self.auth_suite.ciphers.split(':')\n self._logger.debug(\n \"Authentication suite ciphers to use: {0}\".format(\n len(auth_suite_ciphers)\n )\n )\n for cipher in auth_suite_ciphers:\n self._logger.debug(cipher)\n\n self._socket = ssl.wrap_socket(\n self._socket,\n keyfile=self.config.settings.get('key_path'),\n certfile=self.config.settings.get('certificate_path'),\n server_side=True,\n cert_reqs=ssl.CERT_REQUIRED,\n ssl_version=self.auth_suite.protocol,\n ca_certs=self.config.settings.get('ca_path'),\n do_handshake_on_connect=False,\n suppress_ragged_eofs=True,\n ciphers=self.auth_suite.ciphers\n )\n\n try:\n self._socket.bind(\n (\n self.config.settings.get('hostname'),\n int(self.config.settings.get('port'))\n )\n )\n except Exception as e:\n self._logger.exception(e)\n raise exceptions.NetworkingError(\n \"Server failed to bind socket handler to {0}:{1}\".format(\n self.config.settings.get('hostname'),\n 
self.config.settings.get('port')\n )\n )\n else:\n self._logger.info(\n \"Server successfully bound socket handler to {0}:{1}\".format(\n self.config.settings.get('hostname'),\n self.config.settings.get('port')\n )\n )\n self._is_serving = True", "response": "Start the server."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nstop the server and clean up any remaining connection threads.", "response": "def stop(self):\n \"\"\"\n Stop the server.\n\n Halt server client connections and clean up any existing connection\n threads.\n\n Raises:\n NetworkingError: Raised if a failure occurs while shutting down\n or closing the TLS server socket.\n \"\"\"\n self._logger.info(\"Cleaning up remaining connection threads.\")\n\n for thread in threading.enumerate():\n if thread is not threading.current_thread():\n try:\n thread.join(10.0)\n except Exception as e:\n self._logger.info(\n \"Error occurred while attempting to cleanup thread: \"\n \"{0}\".format(thread.name)\n )\n self._logger.exception(e)\n else:\n if thread.is_alive():\n self._logger.warning(\n \"Cleanup failed for thread: {0}. 
Thread is \"\n \"still alive\".format(thread.name)\n )\n else:\n self._logger.info(\n \"Cleanup succeeded for thread: {0}\".format(\n thread.name\n )\n )\n\n self._logger.info(\"Shutting down server socket handler.\")\n try:\n self._socket.shutdown(socket.SHUT_RDWR)\n self._socket.close()\n except Exception as e:\n self._logger.exception(e)\n raise exceptions.NetworkingError(\n \"Server failed to shutdown socket handler.\"\n )\n\n if hasattr(self, \"policy_monitor\"):\n try:\n self.policy_monitor.stop()\n self.policy_monitor.join()\n except Exception as e:\n self._logger.exception(e)\n raise exceptions.ShutdownError(\n \"Server failed to clean up the policy monitor.\"\n )"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef serve(self):\n self._socket.listen(5)\n\n def _signal_handler(signal_number, stack_frame):\n self._is_serving = False\n\n # Python3.5+ silently ignores SIGINT and retries system calls if\n # the signal handler does not raise an exception. Explicitly\n # detect SIGINT and raise a KeyboardInterrupt exception to regain\n # old functionality.\n if signal_number == signal.SIGINT:\n raise KeyboardInterrupt(\"SIGINT received\")\n\n signal.signal(signal.SIGINT, _signal_handler)\n signal.signal(signal.SIGTERM, _signal_handler)\n\n self._logger.info(\"Starting connection service...\")\n\n while self._is_serving:\n try:\n connection, address = self._socket.accept()\n except socket.timeout:\n # Setting the default socket timeout to break hung connections\n # will cause accept to periodically raise socket.timeout. 
This\n # is expected behavior, so ignore it and retry accept.\n pass\n except socket.error as e:\n self._logger.warning(\n \"Error detected while establishing new connection.\"\n )\n self._logger.exception(e)\n except KeyboardInterrupt:\n self._logger.warning(\"Interrupting connection service.\")\n self._is_serving = False\n break\n except Exception as e:\n self._logger.warning(\n \"Error detected while establishing new connection.\"\n )\n self._logger.exception(e)\n else:\n self._setup_connection_handler(connection, address)\n\n self._logger.info(\"Stopping connection service.\")", "response": "Start serving new client connections."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nreads the data encoding the Locate request payload and decode it into the attributes structure.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Locate request payload and decode it into\n its constituent parts.\n\n Args:\n input_buffer (stream): A data buffer containing encoded object\n data, supporting a read method.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n InvalidKmipEncoding: Raised if the attributes structure is missing\n from the encoded payload for KMIP 2.0+ encodings.\n \"\"\"\n super(LocateRequestPayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.MAXIMUM_ITEMS, local_buffer):\n self._maximum_items = primitives.Integer(\n tag=enums.Tags.MAXIMUM_ITEMS\n )\n self._maximum_items.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.OFFSET_ITEMS, local_buffer):\n self._offset_items = primitives.Integer(\n tag=enums.Tags.OFFSET_ITEMS\n )\n self._offset_items.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.STORAGE_STATUS_MASK, local_buffer):\n self._storage_status_mask = primitives.Integer(\n tag=enums.Tags.STORAGE_STATUS_MASK\n )\n self._storage_status_mask.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self.is_tag_next(enums.Tags.OBJECT_GROUP_MEMBER, local_buffer):\n self._object_group_member = primitives.Enumeration(\n enums.ObjectGroupMember,\n tag=enums.Tags.OBJECT_GROUP_MEMBER\n )\n self._object_group_member.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n while self.is_tag_next(enums.Tags.ATTRIBUTE, local_buffer):\n attribute = objects.Attribute()\n attribute.read(local_buffer, kmip_version=kmip_version)\n self._attributes.append(attribute)\n else:\n if self.is_tag_next(enums.Tags.ATTRIBUTES, local_buffer):\n attributes = objects.Attributes()\n attributes.read(local_buffer, kmip_version=kmip_version)\n # TODO (ph) Add a new utility to avoid using TemplateAttributes\n temp_attr = objects.convert_attributes_to_template_attribute(\n attributes\n )\n self._attributes = temp_attr.attributes\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The Locate request payload encoding is missing the \"\n 
\"attributes structure.\"\n )"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nwriting the data encoding the Locate Request Payload to the output_buffer.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Locate request payload to a buffer.\n\n Args:\n output_buffer (stream): A data buffer in which to encode object\n data, supporting a write method.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if self._maximum_items:\n self._maximum_items.write(local_buffer, kmip_version=kmip_version)\n\n if self._offset_items:\n self._offset_items.write(local_buffer, kmip_version=kmip_version)\n\n if self._storage_status_mask:\n self._storage_status_mask.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._object_group_member:\n self._object_group_member.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n if self._attributes:\n for attribute in self.attributes:\n attribute.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n if self._attributes:\n # TODO (ph) Add a new utility to avoid using TemplateAttributes\n template_attribute = objects.TemplateAttribute(\n attributes=self.attributes\n )\n attributes = objects.convert_template_attribute_to_attributes(\n template_attribute\n )\n attributes.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The Locate request payload is missing the attributes \"\n \"list.\"\n )\n\n self.length = local_buffer.length()\n super(LocateRequestPayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nread the data encoding the Locate 
response payload and decode it into its constituent parts.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Locate response payload and decode it\n into its constituent parts.\n\n Args:\n input_buffer (stream): A data buffer containing encoded object\n data, supporting a read method.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(LocateResponsePayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.LOCATED_ITEMS, local_buffer):\n self._located_items = primitives.Integer(\n tag=enums.Tags.LOCATED_ITEMS\n )\n self._located_items.read(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self._unique_identifiers = []\n while self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_buffer):\n unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n unique_identifier.read(local_buffer, kmip_version=kmip_version)\n self._unique_identifiers.append(unique_identifier)\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nwriting the data encoding the Locate response payload to a stream.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Locate response payload to a buffer.\n\n Args:\n output_buffer (stream): A data buffer in which to encode object\n data, supporting a write method.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if self._located_items:\n self._located_items.write(local_buffer, kmip_version=kmip_version)\n\n if self._unique_identifiers:\n for unique_identifier in self._unique_identifiers:\n unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(LocateResponsePayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\ncreate a symmetric key.", "response": "def create_symmetric_key(self, algorithm, length):\n \"\"\"\n Create a symmetric key.\n\n Args:\n algorithm(CryptographicAlgorithm): An enumeration specifying the\n algorithm for which the created key will be compliant.\n length(int): The length of the key to be created. This value must\n be compliant with the constraints of the provided algorithm.\n\n Returns:\n dict: A dictionary containing the key data, with the following\n key/value fields:\n * value - the bytes of the key\n * format - a KeyFormatType enumeration for the bytes format\n\n Raises:\n InvalidField: Raised when the algorithm is unsupported or the\n length is incompatible with the algorithm.\n CryptographicFailure: Raised when the key generation process\n fails.\n\n Example:\n >>> engine = CryptographyEngine()\n >>> key = engine.create_symmetric_key(\n ... 
CryptographicAlgorithm.AES, 256)\n \"\"\"\n if algorithm not in self._symmetric_key_algorithms.keys():\n raise exceptions.InvalidField(\n \"The cryptographic algorithm {0} is not a supported symmetric \"\n \"key algorithm.\".format(algorithm)\n )\n\n cryptography_algorithm = self._symmetric_key_algorithms.get(algorithm)\n\n if length not in cryptography_algorithm.key_sizes:\n raise exceptions.InvalidField(\n \"The cryptographic length ({0}) is not valid for \"\n \"the cryptographic algorithm ({1}).\".format(\n length, algorithm.name\n )\n )\n\n self.logger.info(\n \"Generating a {0} symmetric key with length: {1}\".format(\n algorithm.name, length\n )\n )\n\n key_bytes = os.urandom(length // 8)\n try:\n cryptography_algorithm(key_bytes)\n except Exception as e:\n self.logger.exception(e)\n raise exceptions.CryptographicFailure(\n \"Invalid bytes for the provided cryptographic algorithm.\")\n\n return {'value': key_bytes, 'format': enums.KeyFormatType.RAW}"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\ncreate an asymmetric key pair.", "response": "def create_asymmetric_key_pair(self, algorithm, length):\n \"\"\"\n Create an asymmetric key pair.\n\n Args:\n algorithm(CryptographicAlgorithm): An enumeration specifying the\n algorithm for which the created keys will be compliant.\n length(int): The length of the keys to be created. 
This value must\n be compliant with the constraints of the provided algorithm.\n\n Returns:\n dict: A dictionary containing the public key data, with at least\n the following key/value fields:\n * value - the bytes of the key\n * format - a KeyFormatType enumeration for the bytes format\n dict: A dictionary containing the private key data, identical in\n structure to the one above.\n\n Raises:\n InvalidField: Raised when the algorithm is unsupported or the\n length is incompatible with the algorithm.\n CryptographicFailure: Raised when the key generation process\n fails.\n\n Example:\n >>> engine = CryptographyEngine()\n >>> public, private = engine.create_asymmetric_key_pair(\n ... CryptographicAlgorithm.RSA, 2048)\n \"\"\"\n if algorithm not in self._asymmetric_key_algorithms.keys():\n raise exceptions.InvalidField(\n \"The cryptographic algorithm ({0}) is not a supported \"\n \"asymmetric key algorithm.\".format(algorithm)\n )\n\n engine_method = self._asymmetric_key_algorithms.get(algorithm)\n return engine_method(length)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef mac(self, algorithm, key, data):\n\n mac_data = None\n\n if algorithm in self._hash_algorithms.keys():\n self.logger.info(\n \"Generating a hash-based message authentication code using \"\n \"{0}\".format(algorithm.name)\n )\n hash_algorithm = self._hash_algorithms.get(algorithm)\n try:\n h = hmac.HMAC(key, hash_algorithm(), backend=default_backend())\n h.update(data)\n mac_data = h.finalize()\n except Exception as e:\n self.logger.exception(e)\n raise exceptions.CryptographicFailure(\n \"An error occurred while computing an HMAC. 
\"\n \"See the server log for more information.\"\n )\n elif algorithm in self._symmetric_key_algorithms.keys():\n self.logger.info(\n \"Generating a cipher-based message authentication code using \"\n \"{0}\".format(algorithm.name)\n )\n cipher_algorithm = self._symmetric_key_algorithms.get(algorithm)\n try:\n # ARC4 and IDEA algorithms will raise exception as CMAC\n # requires block ciphers\n c = cmac.CMAC(cipher_algorithm(key), backend=default_backend())\n c.update(data)\n mac_data = c.finalize()\n except Exception as e:\n raise exceptions.CryptographicFailure(\n \"An error occurred while computing a CMAC. \"\n \"See the server log for more information.\"\n )\n else:\n raise exceptions.InvalidField(\n \"The cryptographic algorithm ({0}) is not supported \"\n \"for a MAC operation.\".format(algorithm)\n )\n return mac_data", "response": "Generate a message authentication code (MAC) for the given data using the specified algorithm and secret key."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nencrypting the given plain text with the specified encryption key and returning the encrypted data.", "response": "def encrypt(self,\n encryption_algorithm,\n encryption_key,\n plain_text,\n cipher_mode=None,\n padding_method=None,\n iv_nonce=None,\n hashing_algorithm=None):\n \"\"\"\n Encrypt data using symmetric or asymmetric encryption.\n\n Args:\n encryption_algorithm (CryptographicAlgorithm): An enumeration\n specifying the encryption algorithm to use for encryption.\n encryption_key (bytes): The bytes of the encryption key to use for\n encryption.\n plain_text (bytes): The bytes to be encrypted.\n cipher_mode (BlockCipherMode): An enumeration specifying the\n block cipher mode to use with the encryption algorithm.\n Required in the general case. Optional if the encryption\n algorithm is RC4 (aka ARC4). 
If optional, defaults to None.\n padding_method (PaddingMethod): An enumeration specifying the\n padding method to use on the data before encryption. Required\n if the cipher mode is for block ciphers (e.g., CBC, ECB).\n Optional otherwise, defaults to None.\n iv_nonce (bytes): The IV/nonce value to use to initialize the mode\n of the encryption algorithm. Optional, defaults to None. If\n required and not provided, it will be autogenerated and\n returned with the cipher text.\n hashing_algorithm (HashingAlgorithm): An enumeration specifying\n the hashing algorithm to use with the encryption algorithm,\n if needed. Required for OAEP-based asymmetric encryption.\n Optional, defaults to None.\n\n Returns:\n dict: A dictionary containing the encrypted data, with at least\n the following key/value fields:\n * cipher_text - the bytes of the encrypted data\n * iv_nonce - the bytes of the IV/counter/nonce used if it\n was needed by the encryption scheme and if it was\n automatically generated for the encryption\n\n Raises:\n InvalidField: Raised when the algorithm is unsupported or the\n length is incompatible with the algorithm.\n CryptographicFailure: Raised when the key generation process\n fails.\n\n Example:\n >>> engine = CryptographyEngine()\n >>> result = engine.encrypt(\n ... encryption_algorithm=CryptographicAlgorithm.AES,\n ... encryption_key=(\n ... b'\\xF3\\x96\\xE7\\x1C\\xCF\\xCD\\xEC\\x1F'\n ... b'\\xFC\\xE2\\x8E\\xA6\\xF8\\x74\\x28\\xB0'\n ... ),\n ... plain_text=(\n ... b'\\x00\\x01\\x02\\x03\\x04\\x05\\x06\\x07'\n ... b'\\x08\\x09\\x0A\\x0B\\x0C\\x0D\\x0E\\x0F'\n ... ),\n ... cipher_mode=BlockCipherMode.CBC,\n ... padding_method=PaddingMethod.ANSI_X923,\n ... 
)\n >>> result.get('cipher_text')\n b'\\x18[\\xb9y\\x1bL\\xd1\\x8f\\x9a\\xa0e\\x02b\\xa3=c'\n >>> result.iv_counter_nonce\n b'8qA\\x05\\xc4\\x86\\x03\\xd9=\\xef\\xdf\\xb8ke\\x9a\\xa2'\n \"\"\"\n if encryption_algorithm is None:\n raise exceptions.InvalidField(\"Encryption algorithm is required.\")\n\n if encryption_algorithm == enums.CryptographicAlgorithm.RSA:\n return self._encrypt_asymmetric(\n encryption_algorithm,\n encryption_key,\n plain_text,\n padding_method,\n hashing_algorithm=hashing_algorithm\n )\n else:\n return self._encrypt_symmetric(\n encryption_algorithm,\n encryption_key,\n plain_text,\n cipher_mode=cipher_mode,\n padding_method=padding_method,\n iv_nonce=iv_nonce\n )"} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _encrypt_symmetric(\n self,\n encryption_algorithm,\n encryption_key,\n plain_text,\n cipher_mode=None,\n padding_method=None,\n iv_nonce=None):\n \"\"\"\n Encrypt data using symmetric encryption.\n\n Args:\n encryption_algorithm (CryptographicAlgorithm): An enumeration\n specifying the symmetric encryption algorithm to use for\n encryption.\n encryption_key (bytes): The bytes of the symmetric key to use for\n encryption.\n plain_text (bytes): The bytes to be encrypted.\n cipher_mode (BlockCipherMode): An enumeration specifying the\n block cipher mode to use with the encryption algorithm.\n Required in the general case. Optional if the encryption\n algorithm is RC4 (aka ARC4). If optional, defaults to None.\n padding_method (PaddingMethod): An enumeration specifying the\n padding method to use on the data before encryption. Required\n if the cipher mode is for block ciphers (e.g., CBC, ECB).\n Optional otherwise, defaults to None.\n iv_nonce (bytes): The IV/nonce value to use to initialize the mode\n of the encryption algorithm. Optional, defaults to None. 
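Block-cipher modes such as CBC and ECB require the plain text to be padded to a whole number of blocks, which is why `padding_method` is mandatory for them. pyKMIP delegates this to the `cryptography` package's padding module; as a rough pure-Python illustration of one supported scheme, ANSI X9.23 fills with zero bytes and records the pad length in the final byte (a sketch, not the project's `_handle_symmetric_padding`):

```python
def ansi_x923_pad(data: bytes, block_size: int = 16) -> bytes:
    # Pad to a multiple of block_size: zero fill, last byte = pad length.
    pad_len = block_size - (len(data) % block_size)  # always 1..block_size
    return data + b"\x00" * (pad_len - 1) + bytes([pad_len])

def ansi_x923_unpad(padded: bytes) -> bytes:
    # Strip the padding recorded in the final byte.
    pad_len = padded[-1]
    if pad_len < 1 or pad_len > len(padded):
        raise ValueError("invalid ANSI X9.23 padding")
    return padded[:-pad_len]
```

Note that input already aligned to the block size still gains a full block of padding, so unpadding is always unambiguous.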
If\n required and not provided, it will be autogenerated and\n returned with the cipher text.\n\n Returns:\n dict: A dictionary containing the encrypted data, with at least\n the following key/value fields:\n * cipher_text - the bytes of the encrypted data\n * iv_nonce - the bytes of the IV/counter/nonce used if it\n was needed by the encryption scheme and if it was\n automatically generated for the encryption\n\n Raises:\n InvalidField: Raised when the algorithm is unsupported or the\n encryption key is incompatible with the algorithm.\n CryptographicFailure: Raised when the key generation process\n fails.\n \"\"\"\n\n # Set up the algorithm\n algorithm = self._symmetric_key_algorithms.get(\n encryption_algorithm,\n None\n )\n if algorithm is None:\n raise exceptions.InvalidField(\n \"Encryption algorithm '{0}' is not a supported symmetric \"\n \"encryption algorithm.\".format(encryption_algorithm)\n )\n try:\n algorithm = algorithm(encryption_key)\n except Exception as e:\n self.logger.exception(e)\n raise exceptions.CryptographicFailure(\n \"Invalid key bytes for the specified encryption algorithm.\"\n )\n\n # Set up the cipher mode if needed\n return_iv_nonce = False\n if encryption_algorithm == enums.CryptographicAlgorithm.RC4:\n mode = None\n else:\n if cipher_mode is None:\n raise exceptions.InvalidField(\"Cipher mode is required.\")\n mode = self._modes.get(cipher_mode, None)\n if mode is None:\n raise exceptions.InvalidField(\n \"Cipher mode '{0}' is not a supported mode.\".format(\n cipher_mode\n )\n )\n if hasattr(mode, 'initialization_vector') or \\\n hasattr(mode, 'nonce'):\n if iv_nonce is None:\n iv_nonce = os.urandom(algorithm.block_size // 8)\n return_iv_nonce = True\n mode = mode(iv_nonce)\n else:\n mode = mode()\n\n # Pad the plain text if needed (separate methods for testing purposes)\n if cipher_mode in [\n enums.BlockCipherMode.CBC,\n enums.BlockCipherMode.ECB\n ]:\n plain_text = self._handle_symmetric_padding(\n 
self._symmetric_key_algorithms.get(encryption_algorithm),\n plain_text,\n padding_method\n )\n\n # Encrypt the plain text\n cipher = ciphers.Cipher(algorithm, mode, backend=default_backend())\n encryptor = cipher.encryptor()\n cipher_text = encryptor.update(plain_text) + encryptor.finalize()\n\n if return_iv_nonce:\n return {\n 'cipher_text': cipher_text,\n 'iv_nonce': iv_nonce\n }\n else:\n return {'cipher_text': cipher_text}", "response": "Encrypts the data using the specified symmetric key."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _encrypt_asymmetric(self,\n encryption_algorithm,\n encryption_key,\n plain_text,\n padding_method,\n hashing_algorithm=None):\n \"\"\"\n Encrypt data using asymmetric encryption.\n\n Args:\n encryption_algorithm (CryptographicAlgorithm): An enumeration\n specifying the asymmetric encryption algorithm to use for\n encryption. Required.\n encryption_key (bytes): The bytes of the public key to use for\n encryption. Required.\n plain_text (bytes): The bytes to be encrypted. Required.\n padding_method (PaddingMethod): An enumeration specifying the\n padding method to use with the asymmetric encryption\n algorithm. Required.\n hashing_algorithm (HashingAlgorithm): An enumeration specifying\n the hashing algorithm to use with the encryption padding\n method. Required, if the padding method is OAEP. 
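The asymmetric path relies on the `cryptography` package for real RSA with OAEP or PKCS#1 v1.5 padding. The underlying trapdoor can be shown with classic textbook numbers; this is a teaching toy only — unpadded ("textbook") RSA is insecure and the tiny primes are purely illustrative (Python 3.8+ is assumed for the three-argument `pow`):

```python
# Classic toy parameters: p = 61, q = 53.
p, q = 61, 53
n = p * q                    # public modulus, 3233
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent (real deployments use 65537)
d = pow(e, -1, phi)          # private exponent via modular inverse, 2753

def toy_rsa_encrypt(m: int) -> int:
    # Encrypt with the public key: c = m^e mod n (no padding -- toy only).
    return pow(m, e, n)

def toy_rsa_decrypt(c: int) -> int:
    # Decrypt with the private key: m = c^d mod n.
    return pow(c, d, n)
```

Real implementations never exponentiate the raw message; OAEP (or at minimum PKCS#1 v1.5) padding is applied first, exactly as the surrounding code enforces.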
Optional\n otherwise, defaults to None.\n\n Returns:\n dict: A dictionary containing the encrypted data, with at least\n the following key/value field:\n * cipher_text - the bytes of the encrypted data\n\n Raises:\n InvalidField: Raised when the algorithm is unsupported or the\n length is incompatible with the algorithm.\n CryptographicFailure: Raised when the key generation process\n fails.\n \"\"\"\n if encryption_algorithm == enums.CryptographicAlgorithm.RSA:\n if padding_method == enums.PaddingMethod.OAEP:\n hash_algorithm = self._encryption_hash_algorithms.get(\n hashing_algorithm\n )\n if hash_algorithm is None:\n raise exceptions.InvalidField(\n \"The hashing algorithm '{0}' is not supported for \"\n \"asymmetric encryption.\".format(hashing_algorithm)\n )\n\n padding_method = asymmetric_padding.OAEP(\n mgf=asymmetric_padding.MGF1(\n algorithm=hash_algorithm()\n ),\n algorithm=hash_algorithm(),\n label=None\n )\n elif padding_method == enums.PaddingMethod.PKCS1v15:\n padding_method = asymmetric_padding.PKCS1v15()\n else:\n raise exceptions.InvalidField(\n \"The padding method '{0}' is not supported for asymmetric \"\n \"encryption.\".format(padding_method)\n )\n\n backend = default_backend()\n\n try:\n public_key = backend.load_der_public_key(encryption_key)\n except Exception:\n try:\n public_key = backend.load_pem_public_key(encryption_key)\n except Exception:\n raise exceptions.CryptographicFailure(\n \"The public key bytes could not be loaded.\"\n )\n cipher_text = public_key.encrypt(\n plain_text,\n padding_method\n )\n return {'cipher_text': cipher_text}\n else:\n raise exceptions.InvalidField(\n \"The cryptographic algorithm '{0}' is not supported for \"\n \"asymmetric encryption.\".format(encryption_algorithm)\n )", "response": "Encrypts the data using asymmetric encryption."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef _decrypt_symmetric(\n self,\n decryption_algorithm,\n 
decryption_key,\n cipher_text,\n cipher_mode=None,\n padding_method=None,\n iv_nonce=None):\n \"\"\"\n Decrypt data using symmetric decryption.\n\n Args:\n decryption_algorithm (CryptographicAlgorithm): An enumeration\n specifying the symmetric decryption algorithm to use for\n decryption.\n decryption_key (bytes): The bytes of the symmetric key to use for\n decryption.\n cipher_text (bytes): The bytes to be decrypted.\n cipher_mode (BlockCipherMode): An enumeration specifying the\n block cipher mode to use with the decryption algorithm.\n Required in the general case. Optional if the decryption\n algorithm is RC4 (aka ARC4). If optional, defaults to None.\n padding_method (PaddingMethod): An enumeration specifying the\n padding method to use on the data after decryption. Required\n if the cipher mode is for block ciphers (e.g., CBC, ECB).\n Optional otherwise, defaults to None.\n iv_nonce (bytes): The IV/nonce value to use to initialize the mode\n of the decryption algorithm. Optional, defaults to None.\n\n Returns:\n bytes: the bytes of the decrypted data\n\n Raises:\n InvalidField: Raised when the algorithm is unsupported or the\n length is incompatible with the algorithm.\n CryptographicFailure: Raised when the key generation process\n fails.\n \"\"\"\n # Set up the algorithm\n algorithm = self._symmetric_key_algorithms.get(\n decryption_algorithm,\n None\n )\n if algorithm is None:\n raise exceptions.InvalidField(\n \"Decryption algorithm '{0}' is not a supported symmetric \"\n \"decryption algorithm.\".format(decryption_algorithm)\n )\n try:\n algorithm = algorithm(decryption_key)\n except Exception as e:\n self.logger.exception(e)\n raise exceptions.CryptographicFailure(\n \"Invalid key bytes for the specified decryption algorithm.\"\n )\n\n # Set up the cipher mode if needed\n if decryption_algorithm == enums.CryptographicAlgorithm.RC4:\n mode = None\n else:\n if cipher_mode is None:\n raise exceptions.InvalidField(\"Cipher mode is required.\")\n mode = 
self._modes.get(cipher_mode, None)\n if mode is None:\n raise exceptions.InvalidField(\n \"Cipher mode '{0}' is not a supported mode.\".format(\n cipher_mode\n )\n )\n if hasattr(mode, 'initialization_vector') or \\\n hasattr(mode, 'nonce'):\n if iv_nonce is None:\n raise exceptions.InvalidField(\n \"IV/nonce is required.\"\n )\n mode = mode(iv_nonce)\n else:\n mode = mode()\n\n # Decrypt the plain text\n cipher = ciphers.Cipher(algorithm, mode, backend=default_backend())\n decryptor = cipher.decryptor()\n plain_text = decryptor.update(cipher_text) + decryptor.finalize()\n\n # Unpad the plain text if needed (separate methods for testing\n # purposes)\n if cipher_mode in [\n enums.BlockCipherMode.CBC,\n enums.BlockCipherMode.ECB\n ]:\n plain_text = self._handle_symmetric_padding(\n self._symmetric_key_algorithms.get(decryption_algorithm),\n plain_text,\n padding_method,\n undo_padding=True\n )\n\n return plain_text", "response": "Decrypt data using the specified symmetric key."} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nencrypt the data using asymmetric decryption.", "response": "def _decrypt_asymmetric(\n self,\n decryption_algorithm,\n decryption_key,\n cipher_text,\n padding_method,\n hashing_algorithm=None):\n \"\"\"\n Encrypt data using asymmetric decryption.\n\n Args:\n decryption_algorithm (CryptographicAlgorithm): An enumeration\n specifying the asymmetric decryption algorithm to use for\n decryption. Required.\n decryption_key (bytes): The bytes of the private key to use for\n decryption. Required.\n cipher_text (bytes): The bytes to be decrypted. Required.\n padding_method (PaddingMethod): An enumeration specifying the\n padding method to use with the asymmetric decryption\n algorithm. Required.\n hashing_algorithm (HashingAlgorithm): An enumeration specifying\n the hashing algorithm to use with the decryption padding\n method. Required, if the padding method is OAEP. 
Optional\n otherwise, defaults to None.\n\n Returns:\n dict: A dictionary containing the decrypted data, with at least\n the following key/value field:\n * plain_text - the bytes of the decrypted data\n\n Raises:\n InvalidField: Raised when the algorithm is unsupported or the\n length is incompatible with the algorithm.\n CryptographicFailure: Raised when the key generation process\n fails.\n \"\"\"\n if decryption_algorithm == enums.CryptographicAlgorithm.RSA:\n if padding_method == enums.PaddingMethod.OAEP:\n hash_algorithm = self._encryption_hash_algorithms.get(\n hashing_algorithm\n )\n if hash_algorithm is None:\n raise exceptions.InvalidField(\n \"The hashing algorithm '{0}' is not supported for \"\n \"asymmetric decryption.\".format(hashing_algorithm)\n )\n\n padding_method = asymmetric_padding.OAEP(\n mgf=asymmetric_padding.MGF1(\n algorithm=hash_algorithm()\n ),\n algorithm=hash_algorithm(),\n label=None\n )\n elif padding_method == enums.PaddingMethod.PKCS1v15:\n padding_method = asymmetric_padding.PKCS1v15()\n else:\n raise exceptions.InvalidField(\n \"The padding method '{0}' is not supported for asymmetric \"\n \"decryption.\".format(padding_method)\n )\n\n backend = default_backend()\n\n try:\n private_key = backend.load_der_private_key(\n decryption_key,\n None\n )\n except Exception:\n try:\n private_key = backend.load_pem_private_key(\n decryption_key,\n None\n )\n except Exception:\n raise exceptions.CryptographicFailure(\n \"The private key bytes could not be loaded.\"\n )\n plain_text = private_key.decrypt(\n cipher_text,\n padding_method\n )\n return plain_text\n else:\n raise exceptions.InvalidField(\n \"The cryptographic algorithm '{0}' is not supported for \"\n \"asymmetric decryption.\".format(decryption_algorithm)\n )"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ncreating a new RSA key pair.", "response": "def _create_rsa_key_pair(self, length, public_exponent=65537):\n \"\"\"\n Create an RSA key pair.\n\n 
Args:\n length(int): The length of the keys to be created. This value must\n be compliant with the constraints of the provided algorithm.\n public_exponent(int): The value of the public exponent needed to\n generate the keys. Usually a small Fermat prime number.\n Optional, defaults to 65537.\n\n Returns:\n dict: A dictionary containing the public key data, with the\n following key/value fields:\n * value - the bytes of the key\n * format - a KeyFormatType enumeration for the bytes format\n * public_exponent - the public exponent integer\n dict: A dictionary containing the private key data, identical in\n structure to the one above.\n\n Raises:\n CryptographicFailure: Raised when the key generation process\n fails.\n \"\"\"\n self.logger.info(\n \"Generating an RSA key pair with length: {0}, and \"\n \"public_exponent: {1}\".format(\n length, public_exponent\n )\n )\n try:\n private_key = rsa.generate_private_key(\n public_exponent=public_exponent,\n key_size=length,\n backend=default_backend())\n public_key = private_key.public_key()\n\n private_bytes = private_key.private_bytes(\n serialization.Encoding.DER,\n serialization.PrivateFormat.PKCS8,\n serialization.NoEncryption())\n public_bytes = public_key.public_bytes(\n serialization.Encoding.DER,\n serialization.PublicFormat.PKCS1)\n except Exception as e:\n self.logger.exception(e)\n raise exceptions.CryptographicFailure(\n \"An error occurred while generating the RSA key pair. 
\"\n \"See the server log for more information.\"\n )\n\n public_key = {\n 'value': public_bytes,\n 'format': enums.KeyFormatType.PKCS_1,\n 'public_exponent': public_exponent\n }\n private_key = {\n 'value': private_bytes,\n 'format': enums.KeyFormatType.PKCS_8,\n 'public_exponent': public_exponent\n }\n\n return public_key, private_key"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nderiving a key from the base64 - encoded bytes of the data.", "response": "def derive_key(self,\n derivation_method,\n derivation_length,\n derivation_data=None,\n key_material=None,\n hash_algorithm=None,\n salt=None,\n iteration_count=None,\n encryption_algorithm=None,\n cipher_mode=None,\n padding_method=None,\n iv_nonce=None):\n \"\"\"\n Derive key data using a variety of key derivation functions.\n\n Args:\n derivation_method (DerivationMethod): An enumeration specifying\n the key derivation method to use. Required.\n derivation_length (int): An integer specifying the size of the\n derived key data in bytes. Required.\n derivation_data (bytes): The non-cryptographic bytes to be used\n in the key derivation process (e.g., the data to be encrypted,\n hashed, HMACed). Required in the general case. Optional if the\n derivation method is Hash and the key material is provided.\n Optional, defaults to None.\n key_material (bytes): The bytes of the key material to use for\n key derivation. Required in the general case. Optional if\n the derivation_method is HASH and derivation_data is provided.\n Optional, defaults to None.\n hash_algorithm (HashingAlgorithm): An enumeration specifying the\n hashing algorithm to use with the key derivation method.\n Required in the general case, optional if the derivation\n method specifies encryption. Optional, defaults to None.\n salt (bytes): Bytes representing a randomly generated salt.\n Required if the derivation method is PBKDF2. 
Optional,\n defaults to None.\n iteration_count (int): An integer representing the number of\n iterations to use when deriving key material. Required if\n the derivation method is PBKDF2. Optional, defaults to None.\n encryption_algorithm (CryptographicAlgorithm): An enumeration\n specifying the symmetric encryption algorithm to use for\n encryption-based key derivation. Required if the derivation\n method specifies encryption. Optional, defaults to None.\n cipher_mode (BlockCipherMode): An enumeration specifying the\n block cipher mode to use with the encryption algorithm.\n Required in in the general case if the derivation method\n specifies encryption and the encryption algorithm is\n specified. Optional if the encryption algorithm is RC4 (aka\n ARC4). Optional, defaults to None.\n padding_method (PaddingMethod): An enumeration specifying the\n padding method to use on the data before encryption. Required\n in in the general case if the derivation method specifies\n encryption and the encryption algorithm is specified. Required\n if the cipher mode is for block ciphers (e.g., CBC, ECB).\n Optional otherwise, defaults to None.\n iv_nonce (bytes): The IV/nonce value to use to initialize the mode\n of the encryption algorithm. Required in the general case if\n the derivation method specifies encryption and the encryption\n algorithm is specified. Optional, defaults to None. If\n required and not provided, it will be autogenerated.\n\n Returns:\n bytes: the bytes of the derived data\n\n Raises:\n InvalidField: Raised when cryptographic data and/or settings are\n unsupported or incompatible with the derivation method.\n\n Example:\n >>> engine = CryptographyEngine()\n >>> result = engine.derive_key(\n ... derivation_method=enums.DerivationMethod.HASH,\n ... derivation_length=16,\n ... derivation_data=b'abc',\n ... hash_algorithm=enums.HashingAlgorithm.MD5\n ... 
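The PBKDF2 branch of `derive_key` uses the `cryptography` package's `PBKDF2HMAC`, requiring both a salt and an iteration count. The standard library exposes the same derivation through `hashlib.pbkdf2_hmac`; the password, length, and iteration values below are illustrative:

```python
import hashlib
import os

def pbkdf2_derive(password: bytes, salt: bytes, length: int = 16,
                  iterations: int = 100_000) -> bytes:
    # Derive `length` bytes from the password with PBKDF2-HMAC-SHA256.
    return hashlib.pbkdf2_hmac("sha256", password, salt, iterations,
                               dklen=length)

# A random salt and a high iteration count are both required in practice.
salt = os.urandom(16)
key = pbkdf2_derive(b"correct horse battery staple", salt)
```

The derivation is deterministic for a fixed password/salt/iteration tuple, which is what makes it usable for key derivation rather than random generation.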
)\n >>> result\n b'\\x90\\x01P\\x98<\\xd2O\\xb0\\xd6\\x96?}(\\xe1\\x7fr'\n \"\"\"\n if derivation_method == enums.DerivationMethod.ENCRYPT:\n result = self.encrypt(\n encryption_algorithm=encryption_algorithm,\n encryption_key=key_material,\n plain_text=derivation_data,\n cipher_mode=cipher_mode,\n padding_method=padding_method,\n iv_nonce=iv_nonce\n )\n return result.get('cipher_text')\n else:\n # Handle key derivation functions that use hash algorithms\n\n # Set up the hashing algorithm\n if hash_algorithm is None:\n raise exceptions.InvalidField(\"Hash algorithm is required.\")\n hashing_algorithm = self._encryption_hash_algorithms.get(\n hash_algorithm,\n None\n )\n if hashing_algorithm is None:\n raise exceptions.InvalidField(\n \"Hash algorithm '{0}' is not a supported hashing \"\n \"algorithm.\".format(hash_algorithm)\n )\n\n if derivation_method == enums.DerivationMethod.HMAC:\n df = hkdf.HKDF(\n algorithm=hashing_algorithm(),\n length=derivation_length,\n salt=salt,\n info=derivation_data,\n backend=default_backend()\n )\n derived_data = df.derive(key_material)\n return derived_data\n elif derivation_method == enums.DerivationMethod.HASH:\n if None not in [derivation_data, key_material]:\n raise exceptions.InvalidField(\n \"For hash-based key derivation, specify only \"\n \"derivation data or key material, not both.\"\n )\n elif derivation_data is not None:\n hashing_data = derivation_data\n elif key_material is not None:\n hashing_data = key_material\n else:\n raise exceptions.InvalidField(\n \"For hash-based key derivation, derivation data or \"\n \"key material must be specified.\"\n )\n\n df = hashes.Hash(\n algorithm=hashing_algorithm(),\n backend=default_backend()\n )\n df.update(hashing_data)\n derived_data = df.finalize()\n return derived_data\n elif derivation_method == enums.DerivationMethod.PBKDF2:\n if salt is None:\n raise exceptions.InvalidField(\n \"For PBKDF2 key derivation, salt must be specified.\"\n )\n if iteration_count is None:\n 
raise exceptions.InvalidField(\n \"For PBKDF2 key derivation, iteration count must be \"\n \"specified.\"\n )\n\n df = pbkdf2.PBKDF2HMAC(\n algorithm=hashing_algorithm(),\n length=derivation_length,\n salt=salt,\n iterations=iteration_count,\n backend=default_backend()\n )\n derived_data = df.derive(key_material)\n return derived_data\n elif derivation_method == enums.DerivationMethod.NIST800_108_C:\n df = kbkdf.KBKDFHMAC(\n algorithm=hashing_algorithm(),\n mode=kbkdf.Mode.CounterMode,\n length=derivation_length,\n rlen=4,\n llen=None,\n location=kbkdf.CounterLocation.BeforeFixed,\n label=None,\n context=None,\n fixed=derivation_data,\n backend=default_backend()\n )\n derived_data = df.derive(key_material)\n return derived_data\n else:\n raise exceptions.InvalidField(\n \"Derivation method '{0}' is not a supported key \"\n \"derivation method.\".format(derivation_method)\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef wrap_key(self,\n key_material,\n wrapping_method,\n key_wrap_algorithm,\n encryption_key):\n \"\"\"\n Args:\n key_material (bytes): The bytes of the key to wrap. Required.\n wrapping_method (WrappingMethod): A WrappingMethod enumeration\n specifying what wrapping technique to use to wrap the key\n material. Required.\n key_wrap_algorithm (BlockCipherMode): A BlockCipherMode\n enumeration specifying the key wrapping algorithm to use to\n wrap the key material. Required.\n encryption_key (bytes): The bytes of the encryption key to use\n to encrypt the key material. Required.\n\n Returns:\n bytes: the bytes of the wrapped key\n\n Raises:\n CryptographicFailure: Raised when an error occurs during key\n wrapping.\n InvalidField: Raised when an unsupported wrapping or encryption\n algorithm is specified.\n\n Example:\n >>> engine = CryptographyEngine()\n >>> result = engine.wrap_key(\n ... key_material=(\n ... b'\\x00\\x11\\x22\\x33\\x44\\x55\\x66\\x77'\n ... 
b'\\x88\\x99\\xAA\\xBB\\xCC\\xDD\\xEE\\xFF'\n ... )\n ... wrapping_method=enums.WrappingMethod.ENCRYPT,\n ... key_wrap_algorithm=enums.BlockCipherMode.NIST_KEY_WRAP,\n ... encryption_key=(\n ... b'\\x00\\x01\\x02\\x03\\x04\\x05\\x06\\x07'\n ... b'\\x08\\x09\\x0A\\x0B\\x0C\\x0D\\x0E\\x0F'\n ... )\n ... )\n >>> result\n b'\\x1f\\xa6\\x8b\\n\\x81\\x12\\xb4G\\xae\\xf3K\\xd8\\xfbZ{\\x82\\x9d>\\x86#q\n \\xd2\\xcf\\xe5'\n \"\"\"\n if wrapping_method == enums.WrappingMethod.ENCRYPT:\n if key_wrap_algorithm == enums.BlockCipherMode.NIST_KEY_WRAP:\n try:\n wrapped_key = keywrap.aes_key_wrap(\n encryption_key,\n key_material,\n default_backend()\n )\n return wrapped_key\n except Exception as e:\n raise exceptions.CryptographicFailure(str(e))\n else:\n raise exceptions.InvalidField(\n \"Encryption algorithm '{0}' is not a supported key \"\n \"wrapping algorithm.\".format(key_wrap_algorithm)\n )\n else:\n raise exceptions.InvalidField(\n \"Wrapping method '{0}' is not a supported key wrapping \"\n \"method.\".format(wrapping_method)\n )", "response": "Wrap a key in the specified language."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef _create_RSA_private_key(self,\n bytes):\n \"\"\"\n Instantiates an RSA key from bytes.\n\n Args:\n bytes (byte string): Bytes of RSA private key.\n Returns:\n private_key\n (cryptography.hazmat.primitives.asymmetric.rsa.RSAPrivateKey):\n RSA private key created from key bytes.\n \"\"\"\n\n try:\n private_key = serialization.load_pem_private_key(\n bytes,\n password=None,\n backend=default_backend()\n )\n return private_key\n except Exception:\n private_key = serialization.load_der_private_key(\n bytes,\n password=None,\n backend=default_backend()\n )\n return private_key", "response": "Creates an RSA private key from the given bytes."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef sign(self,\n digital_signature_algorithm,\n crypto_alg,\n 
hash_algorithm,\n padding,\n signing_key,\n data):\n \"\"\"\n Args:\n digital_signature_algorithm (DigitalSignatureAlgorithm): An\n enumeration specifying the asymmetric cryptographic algorithm\n and hashing algorithm to use for the signature operation. Can\n be None if cryptographic_algorithm and hash_algorithm are set.\n crypto_alg (CryptographicAlgorithm): An enumeration\n specifying the asymmetric cryptographic algorithm to use for\n the signature operation. Can be None if\n digital_signature_algorithm is set.\n hash_algorithm (HashingAlgorithm): An enumeration specifying the\n hash algorithm to use for the signature operation. Can be None\n if digital_signature_algorithm is set.\n padding (PaddingMethod): An enumeration specifying the asymmetric\n padding method to use for the signature operation.\n signing_key (bytes): The bytes of the private key to use for the\n signature operation.\n data (bytes): The data to be signed.\n\n Returns:\n signature (bytes): the bytes of the signature data\n\n Raises:\n CryptographicFailure: Raised when an error occurs during signature\n creation.\n InvalidField: Raised when an unsupported hashing or cryptographic\n algorithm is specified.\n \"\"\"\n\n if digital_signature_algorithm:\n (hash_alg, crypto_alg) = self._digital_signature_algorithms.get(\n digital_signature_algorithm,\n (None, None)\n )\n\n elif crypto_alg and hash_algorithm:\n hash_alg = self._encryption_hash_algorithms.get(\n hash_algorithm, None\n )\n else:\n raise exceptions.InvalidField(\n 'For signing, either a digital signature algorithm or a hash'\n ' algorithm and a cryptographic algorithm must be specified.'\n )\n\n if crypto_alg == enums.CryptographicAlgorithm.RSA:\n try:\n key = self._create_RSA_private_key(signing_key)\n except Exception:\n raise exceptions.InvalidField('Unable to deserialize key '\n 'bytes, unknown format.')\n else:\n raise exceptions.InvalidField(\n 'For signing, an RSA key must be used.'\n )\n\n if padding:\n padding_method = 
self._asymmetric_padding_methods.get(\n padding, None\n )\n else:\n raise exceptions.InvalidField(\n 'For signing, a padding method must be specified.'\n )\n\n if padding == enums.PaddingMethod.PSS:\n signature = key.sign(\n data,\n asymmetric_padding.PSS(\n mgf=asymmetric_padding.MGF1(hash_alg()),\n salt_length=asymmetric_padding.PSS.MAX_LENGTH\n ),\n hash_alg()\n )\n elif padding == enums.PaddingMethod.PKCS1v15:\n signature = key.sign(\n data,\n padding_method(),\n hash_alg()\n )\n else:\n raise exceptions.InvalidField(\n \"Padding method '{0}' is not a supported signature \"\n \"padding method.\".format(padding)\n )\n return signature", "response": "Signs the data of the object with the specified parameters."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef verify_signature(self,\n signing_key,\n message,\n signature,\n padding_method,\n signing_algorithm=None,\n hashing_algorithm=None,\n digital_signature_algorithm=None):\n \"\"\"\n Verify a message signature.\n\n Args:\n signing_key (bytes): The bytes of the signing key to use for\n signature verification. Required.\n message (bytes): The bytes of the message that corresponds with\n the signature. Required.\n signature (bytes): The bytes of the signature to be verified.\n Required.\n padding_method (PaddingMethod): An enumeration specifying the\n padding method to use during signature verification. Required.\n signing_algorithm (CryptographicAlgorithm): An enumeration\n specifying the cryptographic algorithm to use for signature\n verification. Only RSA is supported. Optional, must match the\n algorithm specified by the digital signature algorithm if both\n are provided. Defaults to None.\n hashing_algorithm (HashingAlgorithm): An enumeration specifying\n the hashing algorithm to use with the cryptographic algortihm,\n if needed. Optional, must match the algorithm specified by the\n digital signature algorithm if both are provided. 
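`verify_signature` returns True/False rather than raising on a bad signature. The sign/verify relationship itself — hash the message, apply the private exponent, check with the public exponent — can be sketched with the same textbook modulus. Again a toy: real signing pads the digest (PSS or PKCS#1 v1.5) instead of reducing it, and the parameters are illustrative (Python 3.8+ assumed for the modular-inverse `pow`):

```python
import hashlib

p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

def toy_sign(message: bytes) -> int:
    # Hash first, shrink the digest to fit the toy modulus, then apply
    # the private exponent.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def toy_verify(message: bytes, signature: int) -> bool:
    # Returns True/False like verify_signature, rather than raising.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h
```

Because raising to the public exponent is a permutation of the residues mod n, any altered signature value necessarily fails verification.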
Defaults to\n None.\n digital_signature_algorithm (DigitalSignatureAlgorithm): An\n enumeration specifying both the cryptographic and hashing\n algorithms to use for signature verification. Optional, must\n match the cryptographic and hashing algorithms if both are\n provided. Defaults to None.\n\n Returns:\n boolean: the result of signature verification, True for valid\n signatures, False for invalid signatures\n\n Raises:\n InvalidField: Raised when various settings or values are invalid.\n CryptographicFailure: Raised when the signing key bytes cannot be\n loaded, or when the signature verification process fails\n unexpectedly.\n \"\"\"\n backend = default_backend()\n\n hash_algorithm = None\n dsa_hash_algorithm = None\n dsa_signing_algorithm = None\n\n if hashing_algorithm:\n hash_algorithm = self._encryption_hash_algorithms.get(\n hashing_algorithm\n )\n if digital_signature_algorithm:\n algorithm_pair = self._digital_signature_algorithms.get(\n digital_signature_algorithm\n )\n if algorithm_pair:\n dsa_hash_algorithm = algorithm_pair[0]\n dsa_signing_algorithm = algorithm_pair[1]\n\n if dsa_hash_algorithm and dsa_signing_algorithm:\n if hash_algorithm and (hash_algorithm != dsa_hash_algorithm):\n raise exceptions.InvalidField(\n \"The hashing algorithm does not match the digital \"\n \"signature algorithm.\"\n )\n if (signing_algorithm and\n (signing_algorithm != dsa_signing_algorithm)):\n raise exceptions.InvalidField(\n \"The signing algorithm does not match the digital \"\n \"signature algorithm.\"\n )\n\n signing_algorithm = dsa_signing_algorithm\n hash_algorithm = dsa_hash_algorithm\n\n if signing_algorithm == enums.CryptographicAlgorithm.RSA:\n if padding_method == enums.PaddingMethod.PSS:\n if hash_algorithm:\n padding = asymmetric_padding.PSS(\n mgf=asymmetric_padding.MGF1(hash_algorithm()),\n salt_length=asymmetric_padding.PSS.MAX_LENGTH\n )\n else:\n raise exceptions.InvalidField(\n \"A hashing algorithm must be specified for PSS \"\n \"padding.\"\n 
)\n elif padding_method == enums.PaddingMethod.PKCS1v15:\n padding = asymmetric_padding.PKCS1v15()\n else:\n raise exceptions.InvalidField(\n \"The padding method '{0}' is not supported for signature \"\n \"verification.\".format(padding_method)\n )\n\n try:\n public_key = backend.load_der_public_key(signing_key)\n except Exception:\n try:\n public_key = backend.load_pem_public_key(signing_key)\n except Exception:\n raise exceptions.CryptographicFailure(\n \"The signing key bytes could not be loaded.\"\n )\n\n try:\n public_key.verify(\n signature,\n message,\n padding,\n hash_algorithm()\n )\n return True\n except errors.InvalidSignature:\n return False\n except Exception:\n raise exceptions.CryptographicFailure(\n \"The signature verification process failed.\"\n )\n else:\n raise exceptions.InvalidField(\n \"The signing algorithm '{0}' is not supported for \"\n \"signature verification.\".format(signing_algorithm)\n )", "response": "Verifies a message's signature."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the data encoding the Sign response payload and decodes it.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Sign response payload and decode it.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the unique_identifier or signature attributes\n are missing from the encoded payload.\n \"\"\"\n\n super(SignResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"invalid payload missing the unique identifier attribute\"\n )\n\n if self.is_tag_next(enums.Tags.SIGNATURE_DATA, local_stream):\n self._signature_data = primitives.ByteString(\n tag=enums.Tags.SIGNATURE_DATA\n )\n self._signature_data.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"invalid payload missing the signature data attribute\"\n )"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"invalid payload missing the unique identifier attribute\"\n )\n\n if self._signature_data:\n self._signature_data.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"invalid payload missing the signature attribute\"\n )\n\n self.length = local_stream.length()\n super(SignResponsePayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nreading the data encoding the GetUsageAllocationRequestPayload and decodes 
it into its constituent parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the GetUsageAllocation request payload and\n decode it into its constituent parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is missing from the\n encoded payload.\n \"\"\"\n super(GetUsageAllocationRequestPayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n if self.is_tag_next(enums.Tags.USAGE_LIMITS_COUNT, local_stream):\n self._usage_limits_count = primitives.LongInteger(\n tag=enums.Tags.USAGE_LIMITS_COUNT\n )\n self._usage_limits_count.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef protocol_version_to_kmip_version(value):\n if not isinstance(value, ProtocolVersion):\n return None\n\n if value.major == 1:\n if value.minor == 0:\n return enums.KMIPVersion.KMIP_1_0\n elif value.minor == 1:\n return enums.KMIPVersion.KMIP_1_1\n elif value.minor == 2:\n return enums.KMIPVersion.KMIP_1_2\n elif value.minor == 3:\n return enums.KMIPVersion.KMIP_1_3\n elif value.minor == 4:\n return enums.KMIPVersion.KMIP_1_4\n else:\n return None\n else:\n return None", "response": "Converts a ProtocolVersion struct into its KMIPVersion enumeration 
equivalent."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreading the data encoding the ProtocolVersion struct and decoding it into its constituent parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the ProtocolVersion struct and decode it into\n its constituent parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if either the major or minor protocol versions\n are missing from the encoding.\n \"\"\"\n super(ProtocolVersion, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.PROTOCOL_VERSION_MAJOR, local_stream):\n self._major = primitives.Integer(\n tag=enums.Tags.PROTOCOL_VERSION_MAJOR\n )\n self._major.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Invalid encoding missing the major protocol version number.\"\n )\n\n if self.is_tag_next(enums.Tags.PROTOCOL_VERSION_MINOR, local_stream):\n self._minor = primitives.Integer(\n tag=enums.Tags.PROTOCOL_VERSION_MINOR\n )\n self._minor.read(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Invalid encoding missing the minor protocol version number.\"\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nwrites the data encoding the ProtocolVersion struct to the output stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the ProtocolVersion struct to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n 
data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is not defined.\n \"\"\"\n local_stream = utils.BytearrayStream()\n\n if self._major:\n self._major.write(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Invalid struct missing the major protocol version number.\"\n )\n\n if self._minor:\n self._minor.write(local_stream, kmip_version=kmip_version)\n else:\n raise ValueError(\n \"Invalid struct missing the minor protocol version number.\"\n )\n\n self.length = local_stream.length()\n super(ProtocolVersion, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nreading the data encoding the Authentication struct and decode it into the corresponding parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Authentication struct and decode it into\n its constituent parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(Authentication, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n credentials = []\n while self.is_tag_next(enums.Tags.CREDENTIAL, local_stream):\n credential = objects.Credential()\n credential.read(local_stream, kmip_version=kmip_version)\n credentials.append(credential)\n if len(credentials) == 0:\n raise ValueError(\"Authentication encoding missing credentials.\")\n self._credentials = credentials\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nwrites the data encoding the Authentication struct to a stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Authentication struct to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_stream = utils.BytearrayStream()\n\n if len(self._credentials) == 0:\n raise ValueError(\"Authentication struct missing credentials.\")\n for credential in self._credentials:\n credential.write(local_stream, kmip_version=kmip_version)\n\n self.length = local_stream.length()\n super(Authentication, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreading the data encoding the Poll request payload and decoding it into its constituent parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Poll request payload and decode it into\n its constituent parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is missing from the\n encoded payload.\n \"\"\"\n super(PollRequestPayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(\n enums.Tags.ASYNCHRONOUS_CORRELATION_VALUE,\n local_stream\n ):\n self._asynchronous_correlation_value = primitives.ByteString(\n tag=enums.Tags.ASYNCHRONOUS_CORRELATION_VALUE\n )\n self._asynchronous_correlation_value.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef read(self, istream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(Certificate, self).read(istream, kmip_version=kmip_version)\n tstream = BytearrayStream(istream.read(self.length))\n\n self.certificate_type = CertificateType()\n self.certificate_value = CertificateValue()\n\n self.certificate_type.read(tstream, kmip_version=kmip_version)\n self.certificate_value.read(tstream, kmip_version=kmip_version)\n\n self.is_oversized(tstream)", "response": "Reads the data encoding the Certificate object and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n tstream = BytearrayStream()\n\n self.certificate_type.write(tstream, kmip_version=kmip_version)\n self.certificate_value.write(tstream, kmip_version=kmip_version)\n\n self.length = tstream.length()\n super(Certificate, self).write(ostream, kmip_version=kmip_version)\n ostream.write(tstream.buffer)", "response": "Writes the data encoding the certificate object to the data stream."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef authenticate(self,\n 
connection_certificate=None,\n connection_info=None,\n request_credentials=None):\n \"\"\"\n Query the configured SLUGS service with the provided credentials.\n\n Args:\n connection_certificate (cryptography.x509.Certificate): An X.509\n certificate object obtained from the connection being\n authenticated. Required for SLUGS authentication.\n connection_info (tuple): A tuple of information pertaining to the\n connection being authenticated, including the source IP address\n and a timestamp (e.g., ('127.0.0.1', 1519759267.467451)).\n Optional, defaults to None. Ignored for SLUGS authentication.\n request_credentials (list): A list of KMIP Credential structures\n containing credential information to use for authentication.\n Optional, defaults to None. Ignored for SLUGS authentication.\n \"\"\"\n if (self.users_url is None) or (self.groups_url is None):\n raise exceptions.ConfigurationError(\n \"The SLUGS URL must be specified.\"\n )\n\n user_id = utils.get_client_identity_from_certificate(\n connection_certificate\n )\n\n try:\n response = requests.get(self.users_url.format(user_id))\n except Exception:\n raise exceptions.ConfigurationError(\n \"A connection could not be established using the SLUGS URL.\"\n )\n if response.status_code == 404:\n raise exceptions.PermissionDenied(\n \"Unrecognized user ID: {}\".format(user_id)\n )\n\n response = requests.get(self.groups_url.format(user_id))\n if response.status_code == 404:\n raise exceptions.PermissionDenied(\n \"Group information could not be retrieved for user ID: \"\n \"{}\".format(user_id)\n )\n\n return user_id, response.json().get('groups')", "response": "Authenticate the user with the SLUGS service."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n super(ArchiveResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = 
utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)", "response": "Reads the data encoding the archive response payload and decodes it into its constituent parts."} {"SOURCE": "codesearchnet", "instruction": "Can you create a Python 3 function that\nwrites the data encoding the object to a stream.", "response": "def write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Archive response payload to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the data attribute is not defined.\n \"\"\"\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(ArchiveResponsePayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier is not None:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n if self._offset is not None:\n self._offset.write(local_stream, kmip_version=kmip_version)\n if self._template_attribute is not None:\n self._template_attribute.write(\n local_stream,\n 
kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(RekeyRequestPayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)", "response": "Writes the data encoding the object to the output stream."} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nreads the data encoding the Rekey response payload and decode it into the corresponding parts.", "response": "def read(self, input_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the Rekey response payload and decode it into\n its constituent parts.\n\n Args:\n input_stream (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the unique identifier attribute is missing\n from the encoded payload.\n \"\"\"\n super(RekeyResponsePayload, self).read(\n input_stream,\n kmip_version=kmip_version\n )\n local_stream = utils.BytearrayStream(input_stream.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_stream):\n self._unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n self._unique_identifier.read(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"The Rekey response payload encoding is missing the unique \"\n \"identifier.\"\n )\n\n if self.is_tag_next(enums.Tags.TEMPLATE_ATTRIBUTE, local_stream):\n self._template_attribute = objects.TemplateAttribute()\n self._template_attribute.read(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.is_oversized(local_stream)"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nwrites the data encoding the object to the output stream.", "response": "def 
write(self, output_stream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the Rekey response payload to a stream.\n\n Args:\n output_stream (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n ValueError: Raised if the payload is missing the unique identifier.\n \"\"\"\n local_stream = utils.BytearrayStream()\n\n if self._unique_identifier is not None:\n self._unique_identifier.write(\n local_stream,\n kmip_version=kmip_version\n )\n else:\n raise ValueError(\n \"The Rekey response payload is missing the unique identifier.\"\n )\n if self._template_attribute is not None:\n self._template_attribute.write(\n local_stream,\n kmip_version=kmip_version\n )\n\n self.length = local_stream.length()\n super(RekeyResponsePayload, self).write(\n output_stream,\n kmip_version=kmip_version\n )\n output_stream.write(local_stream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nwrite the contents of the ActivateRequestPayload object to the data stream.", "response": "def write(self, ostream, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the ActivateRequestPayload object to a stream.\n Args:\n ostream (Stream): A data stream in which to encode object data,\n supporting a write method; usually a BytearrayStream object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n tstream = BytearrayStream()\n\n # Write the contents of the request payload\n if self.unique_identifier is not None:\n self.unique_identifier.write(tstream, kmip_version=kmip_version)\n\n # Write the length and value of the request payload\n self.length = tstream.length()\n super(ActivateRequestPayload, self).write(\n ostream,\n kmip_version=kmip_version\n )\n ostream.write(tstream.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nvalidate the attributes of the ActivateRequestPayload object.", "response": "def validate(self):\n \"\"\"\n Error check the attributes of the ActivateRequestPayload object.\n \"\"\"\n if self.unique_identifier is not None:\n if not isinstance(self.unique_identifier,\n attributes.UniqueIdentifier):\n msg = \"invalid unique identifier\"\n raise TypeError(msg)"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nsetting the KMIP version for the client.", "response": "def kmip_version(self, value):\n \"\"\"\n Set the KMIP version for the client.\n\n Args:\n value (KMIPVersion): A KMIPVersion enumeration\n\n Return:\n None\n\n Raises:\n ValueError: if value is not a KMIPVersion enumeration\n\n Example:\n >>> client.kmip_version = enums.KMIPVersion.KMIP_1_1\n >>>\n \"\"\"\n if isinstance(value, enums.KMIPVersion):\n self._kmip_version = value\n else:\n raise ValueError(\"KMIP version must be a KMIPVersion enumeration\")"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nchecking if a profile is supported by the client.", "response": "def is_profile_supported(self, conformance_clause, authentication_suite):\n \"\"\"\n Check if a profile is supported by the client.\n\n Args:\n conformance_clause (ConformanceClause):\n authentication_suite (AuthenticationSuite):\n\n Returns:\n bool: True if the profile is supported, False otherwise.\n\n Example:\n >>> client.is_profile_supported(\n ... 
ConformanceClause.DISCOVER_VERSIONS,\n ... AuthenticationSuite.BASIC)\n True\n \"\"\"\n return (self.is_conformance_clause_supported(conformance_clause) and\n self.is_authentication_suite_supported(authentication_suite))"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nRekeying an existing managed cryptographic object. Args: uuid (string): The unique identifier of a managed cryptographic object that should be rekeyed. 
Optional, defaults to None.\n offset (int): An integer specifying, in seconds, the difference\n between the rekeyed objects initialization date and activation\n date. Optional, defaults to None.\n template_attribute (TemplateAttribute): A TemplateAttribute struct\n containing the attributes to set on the newly rekeyed object.\n Optional, defaults to None.\n credential (Credential): A Credential struct containing a set of\n authorization parameters for the operation. Optional, defaults\n to None.\n\n Returns:\n dict: The results of the check operation, containing the following\n key/value pairs:\n\n Key | Value\n ---------------------------|-----------------------------------\n 'unique_identifier' | (string) The unique ID of the\n | checked cryptographic object.\n 'template_attribute' | (TemplateAttribute) A struct\n | containing attribute set by the\n | server. Optional.\n 'result_status' | (ResultStatus) An enumeration\n | indicating the status of the\n | operation result.\n 'result_reason' | (ResultReason) An enumeration\n | providing context for the result\n | status.\n 'result_message' | (string) A message providing\n | additional context for the\n | operation result.\n \"\"\"\n operation = Operation(OperationEnum.REKEY)\n request_payload = payloads.RekeyRequestPayload(\n unique_identifier=uuid,\n offset=offset,\n template_attribute=template_attribute\n )\n batch_item = messages.RequestBatchItem(\n operation=operation,\n request_payload=request_payload\n )\n\n request = self._build_request_message(credential, [batch_item])\n response = self._send_and_receive_message(request)\n batch_item = response.batch_items[0]\n payload = batch_item.response_payload\n\n result = {}\n\n if payload:\n result['unique_identifier'] = payload.unique_identifier\n\n if payload.template_attribute is not None:\n result['template_attribute'] = payload.template_attribute\n\n result['result_status'] = batch_item.result_status.value\n try:\n result['result_reason'] = 
batch_item.result_reason.value\n except Exception:\n result['result_reason'] = batch_item.result_reason\n try:\n result['result_message'] = batch_item.result_message.value\n except Exception:\n result['result_message'] = batch_item.result_message\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nderiving a new key or secret data from an existing managed object. Args: object_type (ObjectType): An ObjectType enumeration specifying what type of object to create. Required. unique_identifiers (list): A list of strings specifying the unique IDs of the existing managed objects to use for key derivation. Required. derivation_method (DerivationMethod): A DerivationMethod enumeration specifying what key derivation method to use. Required. derivation_parameters (DerivationParameters): A DerivationParameters struct containing the settings and options to use for key derivation. template_attribute (TemplateAttribute): A TemplateAttribute struct containing the attributes to set on the newly derived object. credential (Credential): A Credential struct containing a set of authorization parameters for the operation. Optional, defaults to None. Returns: dict: The results of the derivation operation, containing the following key/value pairs: Key | Value ---------------------|----------------------------------------- 'unique_identifier' | (string) The unique ID of the newly | derived object. 'template_attribute' | (TemplateAttribute) A struct containing | any attributes set on the newly derived | object. 'result_status' | (ResultStatus) An enumeration indicating | the status of the operation result. 'result_reason' | (ResultReason) An enumeration providing | context for the result status. 
'result_message' | (string) A message providing additional | context for the operation result.", "response": "def derive_key(self,\n object_type,\n unique_identifiers,\n derivation_method,\n derivation_parameters,\n template_attribute,\n credential=None):\n \"\"\"\n Derive a new key or secret data from an existing managed object.\n\n Args:\n object_type (ObjectType): An ObjectType enumeration specifying\n what type of object to create. Required.\n unique_identifiers (list): A list of strings specifying the unique\n IDs of the existing managed objects to use for key derivation.\n Required.\n derivation_method (DerivationMethod): A DerivationMethod\n enumeration specifying what key derivation method to use.\n Required.\n derivation_parameters (DerivationParameters): A\n DerivationParameters struct containing the settings and\n options to use for key derivation.\n template_attribute (TemplateAttribute): A TemplateAttribute struct\n containing the attributes to set on the newly derived object.\n credential (Credential): A Credential struct containing a set of\n authorization parameters for the operation. 
Optional, defaults\n to None.\n\n Returns:\n dict: The results of the derivation operation, containing the\n following key/value pairs:\n\n Key | Value\n ---------------------|-----------------------------------------\n 'unique_identifier' | (string) The unique ID of the newly\n | derived object.\n 'template_attribute' | (TemplateAttribute) A struct containing\n | any attributes set on the newly derived\n | object.\n 'result_status' | (ResultStatus) An enumeration indicating\n | the status of the operation result.\n 'result_reason' | (ResultReason) An enumeration providing\n | context for the result status.\n 'result_message' | (string) A message providing additional\n | context for the operation result.\n \"\"\"\n operation = Operation(OperationEnum.DERIVE_KEY)\n request_payload = payloads.DeriveKeyRequestPayload(\n object_type=object_type,\n unique_identifiers=unique_identifiers,\n derivation_method=derivation_method,\n derivation_parameters=derivation_parameters,\n template_attribute=template_attribute\n )\n batch_item = messages.RequestBatchItem(\n operation=operation,\n request_payload=request_payload\n )\n\n request = self._build_request_message(credential, [batch_item])\n response = self._send_and_receive_message(request)\n batch_item = response.batch_items[0]\n payload = batch_item.response_payload\n\n result = {}\n\n if payload:\n result['unique_identifier'] = payload.unique_identifier\n result['template_attribute'] = payload.template_attribute\n\n result['result_status'] = batch_item.result_status.value\n try:\n result['result_reason'] = batch_item.result_reason.value\n except Exception:\n result['result_reason'] = batch_item.result_reason\n try:\n result['result_message'] = batch_item.result_message.value\n except Exception:\n result['result_message'] = batch_item.result_message\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nchecking the cryptographic object usage according to specific constraints.", 
"response": "def check(self,\n uuid=None,\n usage_limits_count=None,\n cryptographic_usage_mask=None,\n lease_time=None,\n credential=None):\n \"\"\"\n Check object usage according to specific constraints.\n\n Args:\n uuid (string): The unique identifier of a managed cryptographic\n object that should be checked. Optional, defaults to None.\n usage_limits_count (int): An integer specifying the number of\n items that can be secured with the specified cryptographic\n object. Optional, defaults to None.\n cryptographic_usage_mask (list): A list of CryptographicUsageMask\n enumerations specifying the operations possible with the\n specified cryptographic object. Optional, defaults to None.\n lease_time (int): The number of seconds that can be leased for the\n specified cryptographic object. Optional, defaults to None.\n credential (Credential): A Credential struct containing a set of\n authorization parameters for the operation. Optional, defaults\n to None.\n\n Returns:\n dict: The results of the check operation, containing the following\n key/value pairs:\n\n Key | Value\n ---------------------------|-----------------------------------\n 'unique_identifier' | (string) The unique ID of the\n | checked cryptographic object.\n 'usage_limits_count' | (int) The value provided as input\n | if the value exceeds server\n | constraints.\n 'cryptographic_usage_mask' | (list) The value provided as input\n | if the value exceeds server\n | constraints.\n 'lease_time' | (int) The value provided as input\n | if the value exceeds server\n | constraints.\n 'result_status' | (ResultStatus) An enumeration\n | indicating the status of the\n | operation result.\n 'result_reason' | (ResultReason) An enumeration\n | providing context for the result\n | status.\n 'result_message' | (string) A message providing\n | additional context for the\n | operation result.\n \"\"\"\n # TODO (peter-hamilton) Push this into the Check request.\n mask = 0\n for m in cryptographic_usage_mask:\n mask |= 
m.value\n\n operation = Operation(OperationEnum.CHECK)\n request_payload = payloads.CheckRequestPayload(\n unique_identifier=uuid,\n usage_limits_count=usage_limits_count,\n cryptographic_usage_mask=mask,\n lease_time=lease_time\n )\n batch_item = messages.RequestBatchItem(\n operation=operation,\n request_payload=request_payload\n )\n\n request = self._build_request_message(credential, [batch_item])\n response = self._send_and_receive_message(request)\n batch_item = response.batch_items[0]\n payload = batch_item.response_payload\n\n result = {}\n\n if payload:\n result['unique_identifier'] = payload.unique_identifier\n if payload.usage_limits_count is not None:\n result['usage_limits_count'] = payload.usage_limits_count\n if payload.cryptographic_usage_mask is not None:\n # TODO (peter-hamilton) Push this into the Check response.\n masks = []\n for enumeration in enums.CryptographicUsageMask:\n if payload.cryptographic_usage_mask & enumeration.value:\n masks.append(enumeration)\n result['cryptographic_usage_mask'] = masks\n if payload.lease_time is not None:\n result['lease_time'] = payload.lease_time\n\n result['result_status'] = batch_item.result_status.value\n try:\n result['result_reason'] = batch_item.result_reason.value\n except Exception:\n result['result_reason'] = batch_item.result_reason\n try:\n result['result_message'] = batch_item.result_message.value\n except Exception:\n result['result_message'] = batch_item.result_message\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nsends a GetAttributes request to the server and returns the response.", "response": "def get_attributes(self, uuid=None, attribute_names=None):\n \"\"\"\n Send a GetAttributes request to the server.\n\n Args:\n uuid (string): The ID of the managed object with which the\n retrieved attributes should be associated. 
Optional, defaults\n to None.\n attribute_names (list): A list of AttributeName values indicating\n what object attributes the client wants from the server.\n Optional, defaults to None.\n\n Returns:\n result (GetAttributesResult): A structure containing the results\n of the operation.\n \"\"\"\n batch_item = self._build_get_attributes_batch_item(\n uuid,\n attribute_names\n )\n\n request = self._build_request_message(None, [batch_item])\n response = self._send_and_receive_message(request)\n results = self._process_batch_items(response)\n return results[0]"} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nsends a GetAttributeList request to the server.", "response": "def get_attribute_list(self, uid=None):\n \"\"\"\n Send a GetAttributeList request to the server.\n\n Args:\n uid (string): The ID of the managed object with which the retrieved\n attribute names should be associated.\n\n Returns:\n result (GetAttributeListResult): A structure containing the results\n of the operation.\n \"\"\"\n batch_item = self._build_get_attribute_list_batch_item(uid)\n\n request = self._build_request_message(None, [batch_item])\n response = self._send_and_receive_message(request)\n results = self._process_batch_items(response)\n return results[0]"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef query(self, batch=False, query_functions=None, credential=None):\n batch_item = self._build_query_batch_item(query_functions)\n\n # TODO (peter-hamilton): Replace this with official client batch mode.\n if batch:\n self.batch_items.append(batch_item)\n else:\n request = self._build_request_message(credential, [batch_item])\n response = self._send_and_receive_message(request)\n results = self._process_batch_items(response)\n return results[0]", "response": "Send a Query request to the server."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the 
following Python 3 function\ndef encrypt(self,\n data,\n unique_identifier=None,\n cryptographic_parameters=None,\n iv_counter_nonce=None,\n credential=None):\n \"\"\"\n Encrypt data using the specified encryption key and parameters.\n\n Args:\n data (bytes): The bytes to encrypt. Required.\n unique_identifier (string): The unique ID of the encryption key\n to use. Optional, defaults to None.\n cryptographic_parameters (CryptographicParameters): A structure\n containing various cryptographic settings to be used for the\n encryption. Optional, defaults to None.\n iv_counter_nonce (bytes): The bytes to use for the IV/counter/\n nonce, if needed by the encryption algorithm and/or cipher\n mode. Optional, defaults to None.\n credential (Credential): A credential object containing a set of\n authorization parameters for the operation. Optional, defaults\n to None.\n\n Returns:\n dict: The results of the encrypt operation, containing the\n following key/value pairs:\n\n Key | Value\n --------------------|-----------------------------------------\n 'unique_identifier' | (string) The unique ID of the encryption\n | key used to encrypt the data.\n 'data' | (bytes) The encrypted data.\n 'iv_counter_nonce' | (bytes) The IV/counter/nonce used for\n | the encryption, if autogenerated.\n 'result_status' | (ResultStatus) An enumeration indicating\n | the status of the operation result.\n 'result_reason' | (ResultReason) An enumeration providing\n | context for the result status.\n 'result_message' | (string) A message providing additional\n | context for the operation result.\n \"\"\"\n operation = Operation(OperationEnum.ENCRYPT)\n\n request_payload = payloads.EncryptRequestPayload(\n unique_identifier=unique_identifier,\n data=data,\n cryptographic_parameters=cryptographic_parameters,\n iv_counter_nonce=iv_counter_nonce\n )\n batch_item = messages.RequestBatchItem(\n operation=operation,\n request_payload=request_payload\n )\n\n request = self._build_request_message(credential, 
[batch_item])\n response = self._send_and_receive_message(request)\n batch_item = response.batch_items[0]\n payload = batch_item.response_payload\n\n result = {}\n\n if payload:\n result['unique_identifier'] = payload.unique_identifier\n result['data'] = payload.data\n result['iv_counter_nonce'] = payload.iv_counter_nonce\n\n result['result_status'] = batch_item.result_status.value\n try:\n result['result_reason'] = batch_item.result_reason.value\n except Exception:\n result['result_reason'] = batch_item.result_reason\n try:\n result['result_message'] = batch_item.result_message.value\n except Exception:\n result['result_message'] = batch_item.result_message\n\n return result", "response": "This function encrypts the specified data using the specified encryption key and parameters."} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef signature_verify(self,\n message,\n signature,\n unique_identifier=None,\n cryptographic_parameters=None,\n credential=None):\n \"\"\"\n Verify a message signature using the specified signing key.\n\n Args:\n message (bytes): The bytes of the signed message. Required.\n signature (bytes): The bytes of the message signature. Required.\n unique_identifier (string): The unique ID of the signing key to\n use. Optional, defaults to None.\n cryptographic_parameters (CryptographicParameters): A structure\n containing various cryptographic settings to be used for\n signature verification. Optional, defaults to None.\n credential (Credential): A credential object containing a set of\n authorization parameters for the operation. 
Optional, defaults\n to None.\n\n Returns:\n dict: The results of the signature verify operation, containing the\n following key/value pairs:\n\n Key | Value\n ---------------------|-----------------------------------------\n 'unique_identifier' | (string) The unique ID of the signing\n | key used to verify the signature.\n 'validity_indicator' | (ValidityIndicator) An enumeration\n | indicating the result of signature\n | verification.\n 'result_status' | (ResultStatus) An enumeration indicating\n | the status of the operation result.\n 'result_reason' | (ResultReason) An enumeration providing\n | context for the result status.\n 'result_message' | (string) A message providing additional\n | context for the operation result.\n \"\"\"\n operation = Operation(OperationEnum.SIGNATURE_VERIFY)\n\n request_payload = payloads.SignatureVerifyRequestPayload(\n unique_identifier=unique_identifier,\n cryptographic_parameters=cryptographic_parameters,\n data=message,\n signature_data=signature\n )\n batch_item = messages.RequestBatchItem(\n operation=operation,\n request_payload=request_payload\n )\n\n request = self._build_request_message(credential, [batch_item])\n response = self._send_and_receive_message(request)\n batch_item = response.batch_items[0]\n payload = batch_item.response_payload\n\n result = {}\n\n if payload:\n result['unique_identifier'] = payload.unique_identifier\n result['validity_indicator'] = payload.validity_indicator\n\n result['result_status'] = batch_item.result_status.value\n try:\n result['result_reason'] = batch_item.result_reason.value\n except Exception:\n result['result_reason'] = batch_item.result_reason\n try:\n result['result_message'] = batch_item.result_message.value\n except Exception:\n result['result_message'] = batch_item.result_message\n\n return result", "response": "This method is used to verify a message signature using the specified signing key."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 
code does\ndef sign(self, data, unique_identifier=None,\n cryptographic_parameters=None, credential=None):\n \"\"\"\n Sign specified data using a specified signing key.\n\n Args:\n data (bytes): Data to be signed. Required.\n unique_identifier (string): The unique ID of the signing\n key to be used. Optional, defaults to None.\n cryptographic_parameters (CryptographicParameters): A structure\n containing various cryptographic settings to be used for\n creating the signature. Optional, defaults to None.\n credential (Credential): A credential object containing a set of\n authorization parameters for the operation. Optional, defaults\n to None.\n Returns:\n dict: The results of the sign operation, containing the\n following key/value pairs:\n\n Key | Value\n ---------------------|-----------------------------------------\n 'unique_identifier' | (string) The unique ID of the signing\n | key used to create the signature\n 'signature' | (bytes) The bytes of the signature\n 'result_status' | (ResultStatus) An enumeration indicating\n | the status of the operation result\n 'result_reason' | (ResultReason) An enumeration providing\n | context for the result status.\n 'result_message' | (string) A message providing additional\n | context for the operation result.\n \"\"\"\n operation = Operation(OperationEnum.SIGN)\n\n request_payload = payloads.SignRequestPayload(\n unique_identifier=unique_identifier,\n cryptographic_parameters=cryptographic_parameters,\n data=data\n )\n\n batch_item = messages.RequestBatchItem(\n operation=operation,\n request_payload=request_payload\n )\n\n request = self._build_request_message(credential, [batch_item])\n response = self._send_and_receive_message(request)\n batch_item = response.batch_items[0]\n payload = batch_item.response_payload\n\n result = {}\n\n if payload:\n result['unique_identifier'] = payload.unique_identifier\n result['signature'] = payload.signature_data\n result['result_status'] = batch_item.result_status.value\n try:\n 
result['result_reason'] = batch_item.result_reason.value\n except Exception:\n result['result_reason'] = batch_item.result_reason\n try:\n result['result_message'] = batch_item.result_message.value\n except Exception:\n result['result_message'] = batch_item.result_message\n\n return result", "response": "Signs the specified data using a specified signing key."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef _build_host_list(self, host_list_str):\n '''\n This internal function takes the host string from the config file\n and turns it into a list\n :return: LIST host list\n '''\n\n host_list = []\n if isinstance(host_list_str, str):\n host_list = host_list_str.replace(' ', '').split(',')\n else:\n raise TypeError(\"Unrecognized variable type provided for host \"\n \"list string. 'String' type expected but '\" +\n str(type(host_list_str)) + \"' received\")\n return host_list", "response": "This internal function takes the host string from the config file\n and turns it into a LIST host list\n "} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nsetting the KMIP version for the client.", "response": "def kmip_version(self, value):\n \"\"\"\n Set the KMIP version for the client.\n\n Args:\n value (KMIPVersion): A KMIPVersion enumeration\n\n Return:\n None\n\n Raises:\n ValueError: if value is not a KMIPVersion enumeration\n\n Example:\n >>> client.kmip_version = enums.KMIPVersion.KMIP_1_1\n >>>\n \"\"\"\n if isinstance(value, enums.KMIPVersion):\n self.proxy.kmip_version = value\n else:\n raise ValueError(\"KMIP version must be a KMIPVersion enumeration\")"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function to\nopen the client connection.", "response": "def open(self):\n \"\"\"\n Open the client connection.\n\n Raises:\n ClientConnectionFailure: if the client connection is already open\n Exception: if an error occurs while trying to open the 
connection\n \"\"\"\n if self._is_open:\n raise exceptions.ClientConnectionFailure(\n \"client connection already open\")\n else:\n try:\n self.proxy.open()\n self._is_open = True\n except Exception as e:\n self.logger.error(\"could not open client connection: %s\", e)\n raise"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef close(self):\n if not self._is_open:\n return\n else:\n try:\n self.proxy.close()\n self._is_open = False\n except Exception as e:\n self.logger.error(\"could not close client connection: %s\", e)\n raise", "response": "Closes the client connection."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\ncreates a new symmetric key on a KMIP appliance.", "response": "def create(self, algorithm, length, operation_policy_name=None, name=None,\n cryptographic_usage_mask=None):\n \"\"\"\n Create a symmetric key on a KMIP appliance.\n\n Args:\n algorithm (CryptographicAlgorithm): An enumeration defining the\n algorithm to use to generate the symmetric key.\n length (int): The length in bits for the symmetric key.\n operation_policy_name (string): The name of the operation policy\n to use for the new symmetric key. Optional, defaults to None\n name (string): The name to give the key. Optional, defaults to None\n cryptographic_usage_mask (list): list of enumerations of crypto\n usage mask passing to the symmetric key. 
Optional, defaults to\n None\n\n Returns:\n string: The uid of the newly created symmetric key.\n\n Raises:\n ClientConnectionNotOpen: if the client connection is unusable\n KmipOperationFailure: if the operation result is a failure\n TypeError: if the input arguments are invalid\n \"\"\"\n # Check inputs\n if not isinstance(algorithm, enums.CryptographicAlgorithm):\n raise TypeError(\n \"algorithm must be a CryptographicAlgorithm enumeration\")\n elif not isinstance(length, six.integer_types) or length <= 0:\n raise TypeError(\"length must be a positive integer\")\n if cryptographic_usage_mask is not None:\n if not isinstance(cryptographic_usage_mask, list) or \\\n all(isinstance(item, enums.CryptographicUsageMask)\n for item in cryptographic_usage_mask) is False:\n raise TypeError(\n \"cryptographic_usage_mask must be a list of \"\n \"CryptographicUsageMask enumerations\")\n\n # Create the template containing the attributes\n common_attributes = self._build_common_attributes(\n operation_policy_name\n )\n key_attributes = self._build_key_attributes(\n algorithm, length, cryptographic_usage_mask)\n key_attributes.extend(common_attributes)\n\n if name:\n key_attributes.extend(self._build_name_attribute(name))\n\n template = cobjects.TemplateAttribute(attributes=key_attributes)\n\n # Create the symmetric key and handle the results\n result = self.proxy.create(enums.ObjectType.SYMMETRIC_KEY, template)\n\n status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n return result.uuid\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\ncreating an asymmetric key pair on a KMIP appliance.", "response": "def create_key_pair(self,\n algorithm,\n length,\n operation_policy_name=None,\n public_name=None,\n public_usage_mask=None,\n private_name=None,\n 
private_usage_mask=None):\n        \"\"\"\n        Create an asymmetric key pair on a KMIP appliance.\n\n        Args:\n            algorithm (CryptographicAlgorithm): An enumeration defining the\n                algorithm to use to generate the key pair.\n            length (int): The length in bits for the key pair.\n            operation_policy_name (string): The name of the operation policy\n                to use for the new key pair. Optional, defaults to None.\n            public_name (string): The name to give the public key. Optional,\n                defaults to None.\n            public_usage_mask (list): A list of CryptographicUsageMask\n                enumerations indicating how the public key should be used.\n                Optional, defaults to None.\n            private_name (string): The name to give the private key. Optional,\n                defaults to None.\n            private_usage_mask (list): A list of CryptographicUsageMask\n                enumerations indicating how the private key should be used.\n                Optional, defaults to None.\n\n        Returns:\n            string: The uid of the newly created public key.\n            string: The uid of the newly created private key.\n\n        Raises:\n            ClientConnectionNotOpen: if the client connection is unusable\n            KmipOperationFailure: if the operation result is a failure\n            TypeError: if the input arguments are invalid\n        \"\"\"\n        # Check inputs\n        if not isinstance(algorithm, enums.CryptographicAlgorithm):\n            raise TypeError(\n                \"algorithm must be a CryptographicAlgorithm enumeration\")\n        elif not isinstance(length, six.integer_types) or length <= 0:\n            raise TypeError(\"length must be a positive integer\")\n\n        # Create the common attributes that are shared\n        common_attributes = self._build_common_attributes(\n            operation_policy_name\n        )\n\n        algorithm_attribute = self.attribute_factory.create_attribute(\n            enums.AttributeType.CRYPTOGRAPHIC_ALGORITHM,\n            algorithm\n        )\n        length_attribute = self.attribute_factory.create_attribute(\n            enums.AttributeType.CRYPTOGRAPHIC_LENGTH,\n            length\n        )\n\n        common_attributes.extend([algorithm_attribute, length_attribute])\n        template = cobjects.TemplateAttribute(\n            attributes=common_attributes,\n            
tag=enums.Tags.COMMON_TEMPLATE_ATTRIBUTE\n )\n\n # Create public / private specific attributes\n public_template = None\n names = None\n if public_name:\n names = self._build_name_attribute(name=public_name)\n attrs = []\n if public_usage_mask:\n attrs = [\n self.attribute_factory.create_attribute(\n enums.AttributeType.CRYPTOGRAPHIC_USAGE_MASK,\n public_usage_mask\n )\n ]\n if names or attrs:\n public_template = cobjects.TemplateAttribute(\n names=names,\n attributes=attrs,\n tag=enums.Tags.PUBLIC_KEY_TEMPLATE_ATTRIBUTE\n )\n\n private_template = None\n names = None\n if private_name:\n names = self._build_name_attribute(name=private_name)\n attrs = []\n if private_usage_mask:\n attrs = [\n self.attribute_factory.create_attribute(\n enums.AttributeType.CRYPTOGRAPHIC_USAGE_MASK,\n private_usage_mask\n )\n ]\n if names or attrs:\n private_template = cobjects.TemplateAttribute(\n names=names,\n attributes=attrs,\n tag=enums.Tags.PRIVATE_KEY_TEMPLATE_ATTRIBUTE\n )\n\n # Create the asymmetric key pair and handle the results\n result = self.proxy.create_key_pair(\n common_template_attribute=template,\n private_key_template_attribute=private_template,\n public_key_template_attribute=public_template)\n\n status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n public_uid = result.public_key_uuid\n private_uid = result.private_key_uuid\n return public_uid, private_uid\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)"} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nregistering a new object with a KMIP appliance.", "response": "def register(self, managed_object):\n \"\"\"\n Register a managed object with a KMIP appliance.\n\n Args:\n managed_object (ManagedObject): A managed object to register. 
An\n instantiatable subclass of ManagedObject from the Pie API.\n\n Returns:\n string: The uid of the newly registered managed object.\n\n Raises:\n ClientConnectionNotOpen: if the client connection is unusable\n KmipOperationFailure: if the operation result is a failure\n TypeError: if the input argument is invalid\n \"\"\"\n # Check input\n if not isinstance(managed_object, pobjects.ManagedObject):\n raise TypeError(\"managed object must be a Pie ManagedObject\")\n\n # Extract and create attributes\n object_attributes = list()\n\n if hasattr(managed_object, 'cryptographic_usage_masks'):\n if managed_object.cryptographic_usage_masks is not None:\n mask_attribute = self.attribute_factory.create_attribute(\n enums.AttributeType.CRYPTOGRAPHIC_USAGE_MASK,\n managed_object.cryptographic_usage_masks\n )\n object_attributes.append(mask_attribute)\n if hasattr(managed_object, 'operation_policy_name'):\n if managed_object.operation_policy_name is not None:\n opn_attribute = self.attribute_factory.create_attribute(\n enums.AttributeType.OPERATION_POLICY_NAME,\n managed_object.operation_policy_name\n )\n object_attributes.append(opn_attribute)\n if hasattr(managed_object, 'names'):\n if managed_object.names:\n for name in managed_object.names:\n name_attribute = self.attribute_factory.create_attribute(\n enums.AttributeType.NAME,\n name\n )\n object_attributes.append(name_attribute)\n\n template = cobjects.TemplateAttribute(attributes=object_attributes)\n object_type = managed_object.object_type\n\n # Register the managed object and handle the results\n secret = self.object_factory.convert(managed_object)\n result = self.proxy.register(object_type, template, secret)\n\n status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n return result.uuid\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)"} {"SOURCE": "codesearchnet", "instruction": "Here you have 
a function in Python 3, explain what it does\ndef derive_key(self,\n object_type,\n unique_identifiers,\n derivation_method,\n derivation_parameters,\n **kwargs):\n \"\"\"\n Derive a new key or secret data from existing managed objects.\n\n Args:\n object_type (ObjectType): An ObjectType enumeration specifying\n what type of object to derive. Only SymmetricKeys and\n SecretData can be specified. Required.\n unique_identifiers (list): A list of strings specifying the\n unique IDs of the existing managed objects to use for\n derivation. Multiple objects can be specified to fit the\n requirements of the given derivation method. Required.\n derivation_method (DerivationMethod): A DerivationMethod\n enumeration specifying how key derivation should be done.\n Required.\n derivation_parameters (dict): A dictionary containing various\n settings for the key derivation process. See Note below.\n Required.\n **kwargs (various): A placeholder for object attributes that\n should be set on the newly derived object. Currently\n supported attributes include:\n cryptographic_algorithm (enums.CryptographicAlgorithm)\n cryptographic_length (int)\n\n Returns:\n string: The unique ID of the newly derived object.\n\n Raises:\n ClientConnectionNotOpen: if the client connection is unusable\n KmipOperationFailure: if the operation result is a failure\n TypeError: if the input arguments are invalid\n\n Notes:\n The derivation_parameters argument is a dictionary that can\n contain the following key/value pairs:\n\n Key | Value\n ---------------------------|---------------------------------------\n 'cryptographic_parameters' | A dictionary containing additional\n | cryptographic settings. 
See the\n                                         | decrypt method for more information.\n            'initialization_vector'    | Bytes to be used to initialize the key\n                                       | derivation function, if needed.\n            'derivation_data'          | Bytes to be used as the basis for the\n                                       | key derivation process (e.g., the\n                                       | bytes to be encrypted, hashed, etc).\n            'salt'                     | Bytes to be used as a salt value for the\n                                       | key derivation function, if needed.\n                                       | Usually used with PBKDF2.\n            'iteration_count'          | An integer defining how many\n                                       | iterations should be used with the key\n                                       | derivation function, if needed.\n                                       | Usually used with PBKDF2.\n        \"\"\"\n        # Check input\n        if not isinstance(object_type, enums.ObjectType):\n            raise TypeError(\"Object type must be an ObjectType enumeration.\")\n        if not isinstance(unique_identifiers, list):\n            raise TypeError(\"Unique identifiers must be a list of strings.\")\n        else:\n            for unique_identifier in unique_identifiers:\n                if not isinstance(unique_identifier, six.string_types):\n                    raise TypeError(\n                        \"Unique identifiers must be a list of strings.\"\n                    )\n        if not isinstance(derivation_method, enums.DerivationMethod):\n            raise TypeError(\n                \"Derivation method must be a DerivationMethod enumeration.\"\n            )\n        if not isinstance(derivation_parameters, dict):\n            raise TypeError(\"Derivation parameters must be a dictionary.\")\n\n        derivation_parameters = DerivationParameters(\n            cryptographic_parameters=self._build_cryptographic_parameters(\n                derivation_parameters.get('cryptographic_parameters')\n            ),\n            initialization_vector=derivation_parameters.get(\n                'initialization_vector'\n            ),\n            derivation_data=derivation_parameters.get('derivation_data'),\n            salt=derivation_parameters.get('salt'),\n            iteration_count=derivation_parameters.get('iteration_count')\n        )\n\n        # Handle object attributes\n        attributes = []\n        if kwargs.get('cryptographic_length'):\n            attributes.append(\n                self.attribute_factory.create_attribute(\n                    enums.AttributeType.CRYPTOGRAPHIC_LENGTH,\n                    kwargs.get('cryptographic_length')\n                )\n            )\n        if 
kwargs.get('cryptographic_algorithm'):\n            attributes.append(\n                self.attribute_factory.create_attribute(\n                    enums.AttributeType.CRYPTOGRAPHIC_ALGORITHM,\n                    kwargs.get('cryptographic_algorithm')\n                )\n            )\n        if kwargs.get('cryptographic_usage_mask'):\n            attributes.append(\n                self.attribute_factory.create_attribute(\n                    enums.AttributeType.CRYPTOGRAPHIC_USAGE_MASK,\n                    kwargs.get('cryptographic_usage_mask')\n                )\n            )\n        template_attribute = cobjects.TemplateAttribute(\n            attributes=attributes\n        )\n\n        # Derive the new key/data and handle the results\n        result = self.proxy.derive_key(\n            object_type,\n            unique_identifiers,\n            derivation_method,\n            derivation_parameters,\n            template_attribute\n        )\n\n        status = result.get('result_status')\n        if status == enums.ResultStatus.SUCCESS:\n            return result.get('unique_identifier')\n        else:\n            raise exceptions.KmipOperationFailure(\n                status,\n                result.get('result_reason'),\n                result.get('result_message')\n            )", "response": "Derive a new key from existing managed objects."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nsearching for managed objects based on the specified attributes and returns the unique identifiers of the objects found.", "response": "def locate(self, maximum_items=None, storage_status_mask=None,\n               object_group_member=None, attributes=None):\n        \"\"\"\n        Search for managed objects, depending on the attributes specified in\n        the request.\n\n        Args:\n            maximum_items (integer): Maximum number of object identifiers the\n                server MAY return.\n            storage_status_mask (integer): A bit mask that indicates whether\n                on-line or archived objects are to be searched.\n            object_group_member (ObjectGroupMember): An enumeration that\n                indicates the object group member type.\n            attributes (list): Attributes that are REQUIRED to match those in a\n                candidate object.\n\n        Returns:\n            list: The Unique Identifiers of the located objects\n\n        Raises:\n            ClientConnectionNotOpen: if the client connection is unusable\n            KmipOperationFailure: if the 
operation result is a failure\n            TypeError: if the input arguments are invalid\n        \"\"\"\n        # Check inputs\n        if maximum_items is not None:\n            if not isinstance(maximum_items, six.integer_types):\n                raise TypeError(\"maximum_items must be an integer\")\n        if storage_status_mask is not None:\n            if not isinstance(storage_status_mask, six.integer_types):\n                raise TypeError(\"storage_status_mask must be an integer\")\n        if object_group_member is not None:\n            if not isinstance(object_group_member, enums.ObjectGroupMember):\n                raise TypeError(\n                    \"object_group_member must be an ObjectGroupMember \"\n                    \"enumeration\")\n        if attributes is not None:\n            if not isinstance(attributes, list) or \\\n                    all(isinstance(item, cobjects.Attribute)\n                        for item in attributes) is False:\n                raise TypeError(\n                    \"attributes must be a list of attributes\")\n\n        # Search for managed objects and handle the results\n        result = self.proxy.locate(\n            maximum_items, storage_status_mask,\n            object_group_member, attributes)\n\n        status = result.result_status.value\n        if status == enums.ResultStatus.SUCCESS:\n            return result.uuids\n        else:\n            reason = result.result_reason.value\n            message = result.result_message.value\n            raise exceptions.KmipOperationFailure(status, reason, message)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef check(self,\n              uid=None,\n              usage_limits_count=None,\n              cryptographic_usage_mask=None,\n              lease_time=None):\n        \"\"\"\n        Check the constraints for a managed object.\n\n        Args:\n            uid (string): The unique ID of the managed object to check.\n                Optional, defaults to None.\n            usage_limits_count (int): The number of items that can be secured\n                with the specified managed object. Optional, defaults to None.\n            cryptographic_usage_mask (list): A list of CryptographicUsageMask\n                enumerations specifying the operations possible with the\n                specified managed object. 
Optional, defaults to None.\n lease_time (int): The number of seconds that can be leased for the\n specified managed object. Optional, defaults to None.\n \"\"\"\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"The unique identifier must be a string.\")\n if usage_limits_count is not None:\n if not isinstance(usage_limits_count, six.integer_types):\n raise TypeError(\"The usage limits count must be an integer.\")\n if cryptographic_usage_mask is not None:\n if not isinstance(cryptographic_usage_mask, list) or \\\n not all(isinstance(\n x,\n enums.CryptographicUsageMask\n ) for x in cryptographic_usage_mask):\n raise TypeError(\n \"The cryptographic usage mask must be a list of \"\n \"CryptographicUsageMask enumerations.\"\n )\n if lease_time is not None:\n if not isinstance(lease_time, six.integer_types):\n raise TypeError(\"The lease time must be an integer.\")\n\n result = self.proxy.check(\n uid,\n usage_limits_count,\n cryptographic_usage_mask,\n lease_time\n )\n\n status = result.get('result_status')\n if status == enums.ResultStatus.SUCCESS:\n return result.get('unique_identifier')\n else:\n raise exceptions.KmipOperationFailure(\n status,\n result.get('result_reason'),\n result.get('result_message')\n )", "response": "Checks the constraints for a managed object."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef get(self, uid=None, key_wrapping_specification=None):\n # Check input\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"uid must be a string\")\n if key_wrapping_specification is not None:\n if not isinstance(key_wrapping_specification, dict):\n raise TypeError(\n \"Key wrapping specification must be a dictionary.\"\n )\n\n spec = self._build_key_wrapping_specification(\n key_wrapping_specification\n )\n\n # Get the managed object and handle the results\n result = self.proxy.get(uid, key_wrapping_specification=spec)\n\n 
status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n managed_object = self.object_factory.convert(result.secret)\n return managed_object\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)", "response": "Get a managed object from a KMIP appliance."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef get_attributes(self, uid=None, attribute_names=None):\n # Check input\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"uid must be a string\")\n if attribute_names is not None:\n if not isinstance(attribute_names, list):\n raise TypeError(\"attribute_names must be a list of strings\")\n else:\n for attribute_name in attribute_names:\n if not isinstance(attribute_name, six.string_types):\n raise TypeError(\n \"attribute_names must be a list of strings\"\n )\n\n # Get the list of attributes for a managed object\n result = self.proxy.get_attributes(uid, attribute_names)\n\n status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n return result.uuid, result.attributes\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)", "response": "Get the attributes associated with a managed object."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nactivates a managed object stored by a KMIP appliance.", "response": "def activate(self, uid=None):\n \"\"\"\n Activate a managed object stored by a KMIP appliance.\n\n Args:\n uid (string): The unique ID of the managed object to activate.\n Optional, defaults to None.\n\n Returns:\n None\n\n Raises:\n ClientConnectionNotOpen: if the client connection is unusable\n KmipOperationFailure: if the operation result is a failure\n 
TypeError: if the input argument is invalid\n \"\"\"\n # Check input\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"uid must be a string\")\n\n # Activate the managed object and handle the results\n result = self.proxy.activate(uid)\n\n status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n return\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef revoke(self, revocation_reason, uid=None, revocation_message=None,\n compromise_occurrence_date=None):\n \"\"\"\n Revoke a managed object stored by a KMIP appliance.\n\n Args:\n revocation_reason (RevocationReasonCode): An enumeration indicating\n the revocation reason.\n uid (string): The unique ID of the managed object to revoke.\n Optional, defaults to None.\n revocation_message (string): A message regarding the revocation.\n Optional, defaults to None.\n compromise_occurrence_date (int): An integer, the number of seconds\n since the epoch, which will be converted to the Datetime when\n the managed object was first believed to be compromised.\n Optional, defaults to None.\n\n Returns:\n None\n\n Raises:\n ClientConnectionNotOpen: if the client connection is unusable\n KmipOperationFailure: if the operation result is a failure\n TypeError: if the input argument is invalid\n \"\"\"\n # Check input\n if not isinstance(revocation_reason, enums.RevocationReasonCode):\n raise TypeError(\n \"revocation_reason must be a RevocationReasonCode enumeration\")\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"uid must be a string\")\n if revocation_message is not None:\n if not isinstance(revocation_message, six.string_types):\n raise TypeError(\"revocation_message must be a string\")\n if 
compromise_occurrence_date is not None:\n if not isinstance(compromise_occurrence_date, six.integer_types):\n raise TypeError(\n \"compromise_occurrence_date must be an integer\")\n compromise_occurrence_date = primitives.DateTime(\n compromise_occurrence_date,\n enums.Tags.COMPROMISE_OCCURRENCE_DATE)\n\n # revoke the managed object and handle the results\n result = self.proxy.revoke(revocation_reason, uid, revocation_message,\n compromise_occurrence_date)\n\n status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n return\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)", "response": "Revoke a managed object stored by a KMIP appliance."} {"SOURCE": "codesearchnet", "instruction": "Implement a Python 3 function for\nencrypting the data using the specified encryption key and parameters.", "response": "def encrypt(self, data, uid=None, cryptographic_parameters=None,\n iv_counter_nonce=None):\n \"\"\"\n Encrypt data using the specified encryption key and parameters.\n\n Args:\n data (bytes): The bytes to encrypt. Required.\n uid (string): The unique ID of the encryption key to use.\n Optional, defaults to None.\n cryptographic_parameters (dict): A dictionary containing various\n cryptographic settings to be used for the encryption.\n Optional, defaults to None.\n iv_counter_nonce (bytes): The bytes to use for the IV/counter/\n nonce, if needed by the encryption algorithm and/or cipher\n mode. 
Optional, defaults to None.\n\n Returns:\n bytes: The encrypted data.\n bytes: The IV/counter/nonce used with the encryption algorithm,\n only if it was autogenerated by the server.\n\n Raises:\n ClientConnectionNotOpen: if the client connection is unusable\n KmipOperationFailure: if the operation result is a failure\n TypeError: if the input arguments are invalid\n\n Notes:\n The cryptographic_parameters argument is a dictionary that can\n contain the following key/value pairs:\n\n Keys | Value\n ------------------------------|-----------------------------------\n 'block_cipher_mode' | A BlockCipherMode enumeration\n | indicating the cipher mode to use\n | with the encryption algorithm.\n 'padding_method' | A PaddingMethod enumeration\n | indicating which padding method to\n | use with the encryption algorithm.\n 'hashing_algorithm' | A HashingAlgorithm enumeration\n | indicating which hashing algorithm\n | to use.\n 'key_role_type' | A KeyRoleType enumeration\n | indicating the intended use of the\n | associated cryptographic key.\n 'digital_signature_algorithm' | A DigitalSignatureAlgorithm\n | enumeration indicating which\n | digital signature algorithm to\n | use.\n 'cryptographic_algorithm' | A CryptographicAlgorithm\n | enumeration indicating which\n | encryption algorithm to use.\n 'random_iv' | A boolean indicating whether the\n | server should autogenerate an IV.\n 'iv_length' | An integer representing the length\n | of the initialization vector (IV)\n | in bits.\n 'tag_length' | An integer representing the length\n | of the authenticator tag in bytes.\n 'fixed_field_length' | An integer representing the length\n | of the fixed field portion of the\n | IV in bits.\n 'invocation_field_length' | An integer representing the length\n | of the invocation field portion of\n | the IV in bits.\n 'counter_length' | An integer representing the length\n | of the counter portion of the IV\n | in bits.\n 'initial_counter_value' | An integer representing the\n | 
starting counter value for CTR\n | mode (typically 1).\n \"\"\"\n # Check input\n if not isinstance(data, six.binary_type):\n raise TypeError(\"data must be bytes\")\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"uid must be a string\")\n if cryptographic_parameters is not None:\n if not isinstance(cryptographic_parameters, dict):\n raise TypeError(\"cryptographic_parameters must be a dict\")\n if iv_counter_nonce is not None:\n if not isinstance(iv_counter_nonce, six.binary_type):\n raise TypeError(\"iv_counter_nonce must be bytes\")\n\n cryptographic_parameters = self._build_cryptographic_parameters(\n cryptographic_parameters\n )\n\n # Encrypt the provided data and handle the results\n result = self.proxy.encrypt(\n data,\n uid,\n cryptographic_parameters,\n iv_counter_nonce\n )\n\n status = result.get('result_status')\n if status == enums.ResultStatus.SUCCESS:\n return result.get('data'), result.get('iv_counter_nonce')\n else:\n raise exceptions.KmipOperationFailure(\n status,\n result.get('result_reason'),\n result.get('result_message')\n )"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef signature_verify(self, message, signature, uid=None,\n cryptographic_parameters=None):\n \"\"\"\n Verify a message signature using the specified signing key.\n\n Args:\n message (bytes): The bytes of the signed message. Required.\n signature (bytes): The bytes of the message signature. Required.\n uid (string): The unique ID of the signing key to use.\n Optional, defaults to None.\n cryptographic_parameters (dict): A dictionary containing various\n cryptographic settings to be used for signature verification\n (e.g., cryptographic algorithm, hashing algorithm, and/or\n digital signature algorithm). 
Optional, defaults to None.\n\n Returns:\n ValidityIndicator: An enumeration indicating whether or not the\n signature was valid.\n\n Raises:\n ClientConnectionNotOpen: if the client connection is unusable\n KmipOperationFailure: if the operation result is a failure\n TypeError: if the input arguments are invalid\n\n Notes:\n The cryptographic_parameters argument is a dictionary that can\n contain various key/value pairs. For a list of allowed pairs,\n see the documentation for encrypt/decrypt.\n \"\"\"\n # Check input\n if not isinstance(message, six.binary_type):\n raise TypeError(\"Message must be bytes.\")\n if not isinstance(signature, six.binary_type):\n raise TypeError(\"Signature must be bytes.\")\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"Unique identifier must be a string.\")\n if cryptographic_parameters is not None:\n if not isinstance(cryptographic_parameters, dict):\n raise TypeError(\n \"Cryptographic parameters must be a dictionary.\"\n )\n\n cryptographic_parameters = self._build_cryptographic_parameters(\n cryptographic_parameters\n )\n\n # Decrypt the provided data and handle the results\n result = self.proxy.signature_verify(\n message,\n signature,\n uid,\n cryptographic_parameters\n )\n\n status = result.get('result_status')\n if status == enums.ResultStatus.SUCCESS:\n return result.get('validity_indicator')\n else:\n raise exceptions.KmipOperationFailure(\n status,\n result.get('result_reason'),\n result.get('result_message')\n )", "response": "Verify a message using the specified signing key."} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef mac(self, data, uid=None, algorithm=None):\n # Check inputs\n if not isinstance(data, six.binary_type):\n raise TypeError(\"data must be bytes\")\n if uid is not None:\n if not isinstance(uid, six.string_types):\n raise TypeError(\"uid must be a string\")\n if algorithm is not None:\n if not 
isinstance(algorithm, enums.CryptographicAlgorithm):\n raise TypeError(\n \"algorithm must be a CryptographicAlgorithm enumeration\")\n\n parameters_attribute = self._build_cryptographic_parameters(\n {'cryptographic_algorithm': algorithm}\n )\n\n # Get the message authentication code and handle the results\n result = self.proxy.mac(data, uid, parameters_attribute)\n\n status = result.result_status.value\n if status == enums.ResultStatus.SUCCESS:\n uid = result.uuid.value\n mac_data = result.mac_data.value\n return uid, mac_data\n else:\n reason = result.result_reason.value\n message = result.result_message.value\n raise exceptions.KmipOperationFailure(status, reason, message)", "response": "This method returns the message authentication code for the data."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _build_cryptographic_parameters(self, value):\n if value is None:\n return None\n elif not isinstance(value, dict):\n raise TypeError(\"Cryptographic parameters must be a dictionary.\")\n\n cryptographic_parameters = CryptographicParameters(\n block_cipher_mode=value.get('block_cipher_mode'),\n padding_method=value.get('padding_method'),\n hashing_algorithm=value.get('hashing_algorithm'),\n key_role_type=value.get('key_role_type'),\n digital_signature_algorithm=value.get(\n 'digital_signature_algorithm'\n ),\n cryptographic_algorithm=value.get('cryptographic_algorithm'),\n random_iv=value.get('random_iv'),\n iv_length=value.get('iv_length'),\n tag_length=value.get('tag_length'),\n fixed_field_length=value.get('fixed_field_length'),\n invocation_field_length=value.get('invocation_field_length'),\n counter_length=value.get('counter_length'),\n initial_counter_value=value.get('initial_counter_value')\n )\n return cryptographic_parameters", "response": "Builds a CryptographicParameters struct from a dictionary containing the key - value pairs."} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 
function, write the documentation\ndef _build_encryption_key_information(self, value):\n if value is None:\n return None\n if not isinstance(value, dict):\n raise TypeError(\"Encryption key information must be a dictionary.\")\n\n cryptographic_parameters = value.get('cryptographic_parameters')\n if cryptographic_parameters:\n cryptographic_parameters = self._build_cryptographic_parameters(\n cryptographic_parameters\n )\n encryption_key_information = cobjects.EncryptionKeyInformation(\n unique_identifier=value.get('unique_identifier'),\n cryptographic_parameters=cryptographic_parameters\n )\n return encryption_key_information", "response": "Builds an EncryptionKeyInformation struct from a dictionary containing the key/value pairs."} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef _build_mac_signature_key_information(self, value):\n if value is None:\n return None\n if not isinstance(value, dict):\n raise TypeError(\n \"MAC/signature key information must be a dictionary.\"\n )\n\n cryptographic_parameters = value.get('cryptographic_parameters')\n if cryptographic_parameters:\n cryptographic_parameters = self._build_cryptographic_parameters(\n cryptographic_parameters\n )\n mac_signature_key_information = cobjects.MACSignatureKeyInformation(\n unique_identifier=value.get('unique_identifier'),\n cryptographic_parameters=cryptographic_parameters\n )\n return mac_signature_key_information", "response": "Builds a MACSignatureKeyInformation struct from a dictionary containing the key/value pairs for the MAC/signature key."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nbuilding a KeyWrappingSpecification struct from a dictionary containing the key/value pairs.", "response": "def _build_key_wrapping_specification(self, value):\n \"\"\"\n Build a KeyWrappingSpecification struct from a dictionary.\n\n Args:\n value (dict): A dictionary 
containing the key/value pairs for a\n KeyWrappingSpecification struct.\n\n Returns:\n KeyWrappingSpecification: a KeyWrappingSpecification struct\n\n Raises:\n TypeError: if the input argument is invalid\n \"\"\"\n if value is None:\n return None\n if not isinstance(value, dict):\n raise TypeError(\"Key wrapping specification must be a dictionary.\")\n\n encryption_key_info = self._build_encryption_key_information(\n value.get('encryption_key_information')\n )\n mac_signature_key_info = self._build_mac_signature_key_information(\n value.get('mac_signature_key_information')\n )\n\n key_wrapping_specification = cobjects.KeyWrappingSpecification(\n wrapping_method=value.get('wrapping_method'),\n encryption_key_information=encryption_key_info,\n mac_signature_key_information=mac_signature_key_info,\n attribute_names=value.get('attribute_names'),\n encoding_option=value.get('encoding_option')\n )\n return key_wrapping_specification"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef _build_common_attributes(self, operation_policy_name=None):\n '''\n Build a list of common attributes that are shared across\n symmetric as well as asymmetric objects\n '''\n common_attributes = []\n\n if operation_policy_name:\n common_attributes.append(\n self.attribute_factory.create_attribute(\n enums.AttributeType.OPERATION_POLICY_NAME,\n operation_policy_name\n )\n )\n\n return common_attributes", "response": "Build a list of common attributes that are shared across the local objects."} {"SOURCE": "codesearchnet", "instruction": "How would you implement a function in Python 3 that\nbuilds a name attribute for ease of use in the caller", "response": "def _build_name_attribute(self, name=None):\n '''\n Build a name attribute, returned in a list for ease\n of use in the caller\n '''\n name_list = []\n if name:\n name_list.append(self.attribute_factory.create_attribute(\n enums.AttributeType.NAME,\n name)\n )\n return 
name_list"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nreading the data encoding the object into the parts of the object.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the QueryRequestPayload object and decode it\n into its constituent parts.\n\n Args:\n input_buffer (Stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n\n Raises:\n InvalidKmipEncoding: Raised if the query functions are missing\n from the encoded payload.\n \"\"\"\n super(QueryRequestPayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n query_functions = []\n while(self.is_tag_next(enums.Tags.QUERY_FUNCTION, local_buffer)):\n query_function = primitives.Enumeration(\n enums.QueryFunction,\n tag=enums.Tags.QUERY_FUNCTION\n )\n query_function.read(local_buffer, kmip_version=kmip_version)\n query_functions.append(query_function)\n\n if query_functions:\n self._query_functions = query_functions\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The Query request payload encoding is missing the query \"\n \"functions.\"\n )\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n local_buffer = utils.BytearrayStream()\n\n if self._query_functions:\n for query_function in self._query_functions:\n query_function.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The Query request payload is missing the query functions \"\n \"field.\"\n )\n\n self.length = local_buffer.length()\n 
super(QueryRequestPayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)", "response": "Writes the data encoding the object to the output_buffer."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nreads the data encoding the QueryResponsePayload object and decodes it into its constituent parts.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the QueryResponsePayload object and decode it\n into its constituent parts.\n\n Args:\n input_buffer (Stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(QueryResponsePayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n operations = []\n while(self.is_tag_next(enums.Tags.OPERATION, local_buffer)):\n operation = primitives.Enumeration(\n enums.Operation,\n tag=enums.Tags.OPERATION\n )\n operation.read(local_buffer, kmip_version=kmip_version)\n operations.append(operation)\n self._operations = operations\n\n object_types = []\n while(self.is_tag_next(enums.Tags.OBJECT_TYPE, local_buffer)):\n object_type = primitives.Enumeration(\n enums.ObjectType,\n tag=enums.Tags.OBJECT_TYPE\n )\n object_type.read(local_buffer, kmip_version=kmip_version)\n object_types.append(object_type)\n self._object_types = object_types\n\n if self.is_tag_next(enums.Tags.VENDOR_IDENTIFICATION, local_buffer):\n vendor_identification = primitives.TextString(\n tag=enums.Tags.VENDOR_IDENTIFICATION\n )\n vendor_identification.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._vendor_identification = vendor_identification\n\n if 
self.is_tag_next(enums.Tags.SERVER_INFORMATION, local_buffer):\n server_information = misc.ServerInformation()\n server_information.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._server_information = server_information\n\n application_namespaces = []\n while(self.is_tag_next(\n enums.Tags.APPLICATION_NAMESPACE,\n local_buffer\n )\n ):\n application_namespace = primitives.TextString(\n tag=enums.Tags.APPLICATION_NAMESPACE\n )\n application_namespace.read(local_buffer, kmip_version=kmip_version)\n application_namespaces.append(application_namespace)\n self._application_namespaces = application_namespaces\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_1:\n extensions_information = []\n while(self.is_tag_next(\n enums.Tags.EXTENSION_INFORMATION,\n local_buffer\n )\n ):\n extension_information = objects.ExtensionInformation()\n extension_information.read(\n local_buffer,\n kmip_version=kmip_version\n )\n extensions_information.append(extension_information)\n self._extension_information = extensions_information\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_2:\n attestation_types = []\n while(self.is_tag_next(enums.Tags.ATTESTATION_TYPE, local_buffer)):\n attestation_type = primitives.Enumeration(\n enums.AttestationType,\n tag=enums.Tags.ATTESTATION_TYPE\n )\n attestation_type.read(local_buffer, kmip_version=kmip_version)\n attestation_types.append(attestation_type)\n self._attestation_types = attestation_types\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_3:\n rngs_parameters = []\n while(self.is_tag_next(enums.Tags.RNG_PARAMETERS, local_buffer)):\n rng_parameters = objects.RNGParameters()\n rng_parameters.read(local_buffer, kmip_version=kmip_version)\n rngs_parameters.append(rng_parameters)\n self._rng_parameters = rngs_parameters\n\n profiles_information = []\n while(self.is_tag_next(\n enums.Tags.PROFILE_INFORMATION,\n local_buffer\n )\n ):\n profile_information = objects.ProfileInformation()\n profile_information.read(\n local_buffer,\n 
kmip_version=kmip_version\n )\n profiles_information.append(profile_information)\n self._profile_information = profiles_information\n\n validations_information = []\n while(self.is_tag_next(\n enums.Tags.VALIDATION_INFORMATION,\n local_buffer\n )\n ):\n validation_information = objects.ValidationInformation()\n validation_information.read(\n local_buffer,\n kmip_version=kmip_version\n )\n validations_information.append(validation_information)\n self._validation_information = validations_information\n\n capabilities_information = []\n while(self.is_tag_next(\n enums.Tags.CAPABILITY_INFORMATION,\n local_buffer\n )\n ):\n capability_information = objects.CapabilityInformation()\n capability_information.read(\n local_buffer,\n kmip_version=kmip_version\n )\n capabilities_information.append(capability_information)\n self._capability_information = capabilities_information\n\n client_registration_methods = []\n while(self.is_tag_next(\n enums.Tags.CLIENT_REGISTRATION_METHOD,\n local_buffer\n )\n ):\n client_registration_method = primitives.Enumeration(\n enums.ClientRegistrationMethod,\n tag=enums.Tags.CLIENT_REGISTRATION_METHOD\n )\n client_registration_method.read(\n local_buffer,\n kmip_version=kmip_version\n )\n client_registration_methods.append(client_registration_method)\n self._client_registration_methods = client_registration_methods\n\n if kmip_version >= enums.KMIPVersion.KMIP_2_0:\n if self.is_tag_next(enums.Tags.DEFAULTS_INFORMATION, local_buffer):\n defaults_information = objects.DefaultsInformation()\n defaults_information.read(\n local_buffer,\n kmip_version=kmip_version\n )\n self._defaults_information = defaults_information\n\n storage_protection_masks = []\n while(self.is_tag_next(\n enums.Tags.PROTECTION_STORAGE_MASK,\n local_buffer\n )\n ):\n storage_protection_mask = primitives.Integer(\n tag=enums.Tags.PROTECTION_STORAGE_MASK\n )\n storage_protection_mask.read(\n local_buffer,\n kmip_version=kmip_version\n )\n 
storage_protection_masks.append(storage_protection_mask)\n self._storage_protection_masks = storage_protection_masks\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nwriting the data encoding the object to the output_buffer.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the QueryResponsePayload object to a stream.\n\n Args:\n output_buffer (Stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if self._operations:\n for operation in self._operations:\n operation.write(local_buffer, kmip_version=kmip_version)\n\n if self._object_types:\n for object_type in self._object_types:\n object_type.write(local_buffer, kmip_version=kmip_version)\n\n if self._vendor_identification:\n self._vendor_identification.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._server_information:\n self._server_information.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if self._application_namespaces:\n for application_namespace in self._application_namespaces:\n application_namespace.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_1:\n if self._extension_information:\n for extension_information in self._extension_information:\n extension_information.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_2:\n if self._attestation_types:\n for attestation_type in self._attestation_types:\n attestation_type.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if kmip_version >= enums.KMIPVersion.KMIP_1_3:\n if self._rng_parameters:\n for rng_parameters in 
self._rng_parameters:\n rng_parameters.write(\n local_buffer,\n kmip_version=kmip_version\n )\n if self._profile_information:\n for profile_information in self._profile_information:\n profile_information.write(\n local_buffer,\n kmip_version=kmip_version\n )\n if self._validation_information:\n for validation_information in self._validation_information:\n validation_information.write(\n local_buffer,\n kmip_version=kmip_version\n )\n if self._capability_information:\n for capability_information in self._capability_information:\n capability_information.write(\n local_buffer,\n kmip_version=kmip_version\n )\n if self._client_registration_methods:\n for client_reg_method in self._client_registration_methods:\n client_reg_method.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n if kmip_version >= enums.KMIPVersion.KMIP_2_0:\n if self._defaults_information:\n self._defaults_information.write(\n local_buffer,\n kmip_version=kmip_version\n )\n if self._storage_protection_masks:\n for storage_protection_mask in self._storage_protection_masks:\n storage_protection_mask.write(\n local_buffer,\n kmip_version=kmip_version\n )\n\n self.length = local_buffer.length()\n super(QueryResponsePayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nread the data encoding the GetAttributes response payload and decode it into parts.", "response": "def read(self, input_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Read the data encoding the GetAttributes response payload and decode\n it into its constituent parts.\n\n Args:\n input_buffer (stream): A data stream containing encoded object\n data, supporting a read method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with which the object will be decoded. 
Optional,\n defaults to KMIP 1.0.\n \"\"\"\n super(GetAttributesResponsePayload, self).read(\n input_buffer,\n kmip_version=kmip_version\n )\n local_buffer = utils.BytearrayStream(input_buffer.read(self.length))\n\n if self.is_tag_next(enums.Tags.UNIQUE_IDENTIFIER, local_buffer):\n unique_identifier = primitives.TextString(\n tag=enums.Tags.UNIQUE_IDENTIFIER\n )\n unique_identifier.read(local_buffer, kmip_version=kmip_version)\n self.unique_identifier = unique_identifier.value\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The GetAttributes response payload encoding is missing the \"\n \"unique identifier.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n self._attributes = list()\n while self.is_tag_next(enums.Tags.ATTRIBUTE, local_buffer):\n attribute = objects.Attribute()\n attribute.read(local_buffer, kmip_version=kmip_version)\n self._attributes.append(attribute)\n else:\n if self.is_tag_next(enums.Tags.ATTRIBUTES, local_buffer):\n attributes = objects.Attributes()\n attributes.read(local_buffer, kmip_version=kmip_version)\n # TODO (ph) Add a new utility to avoid using TemplateAttributes\n temp_attr = objects.convert_attributes_to_template_attribute(\n attributes\n )\n self._attributes = temp_attr.attributes\n else:\n raise exceptions.InvalidKmipEncoding(\n \"The GetAttributes response payload encoding is missing \"\n \"the attributes structure.\"\n )\n\n self.is_oversized(local_buffer)"} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nwrite the data encoding the GetAttributes response payload to the output_buffer.", "response": "def write(self, output_buffer, kmip_version=enums.KMIPVersion.KMIP_1_0):\n \"\"\"\n Write the data encoding the GetAttributes response payload to a\n stream.\n\n Args:\n output_buffer (stream): A data stream in which to encode object\n data, supporting a write method; usually a BytearrayStream\n object.\n kmip_version (KMIPVersion): An enumeration defining the KMIP\n version with 
which the object will be encoded. Optional,\n defaults to KMIP 1.0.\n \"\"\"\n local_buffer = utils.BytearrayStream()\n\n if self._unique_identifier:\n self._unique_identifier.write(\n local_buffer,\n kmip_version=kmip_version\n )\n else:\n raise exceptions.InvalidField(\n \"The GetAttributes response payload is missing the unique \"\n \"identifier field.\"\n )\n\n if kmip_version < enums.KMIPVersion.KMIP_2_0:\n for attribute in self._attributes:\n attribute.write(local_buffer, kmip_version=kmip_version)\n else:\n if self._attributes:\n # TODO (ph) Add a new utility to avoid using TemplateAttributes\n template_attribute = objects.TemplateAttribute(\n attributes=self.attributes\n )\n attributes = objects.convert_template_attribute_to_attributes(\n template_attribute\n )\n attributes.write(local_buffer, kmip_version=kmip_version)\n else:\n raise exceptions.InvalidField(\n \"The GetAttributes response payload is missing the \"\n \"attributes list.\"\n )\n\n self.length = local_buffer.length()\n super(GetAttributesResponsePayload, self).write(\n output_buffer,\n kmip_version=kmip_version\n )\n output_buffer.write(local_buffer.buffer)"} {"SOURCE": "codesearchnet", "instruction": "Can you implement a function in Python 3 that\nfinds a single entry point.", "response": "def get_single(group, name, path=None):\n \"\"\"Find a single entry point.\n\n Returns an :class:`EntryPoint` object, or raises :exc:`NoSuchEntryPoint`\n if no match is found.\n \"\"\"\n for config, distro in iter_files_distros(path=path):\n if (group in config) and (name in config[group]):\n epstr = config[group][name]\n with BadEntryPoint.err_to_warnings():\n return EntryPoint.from_string(epstr, name, distro)\n\n raise NoSuchEntryPoint(group, name)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef get_group_named(group, path=None):\n result = {}\n for ep in get_group_all(group, path=path):\n if ep.name not in result:\n result[ep.name] = ep\n 
return result", "response": "Find a group of entry points with unique names."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nfind all entry points in a group.", "response": "def get_group_all(group, path=None):\n \"\"\"Find all entry points in a group.\n\n Returns a list of :class:`EntryPoint` objects.\n \"\"\"\n result = []\n for config, distro in iter_files_distros(path=path):\n if group in config:\n for name, epstr in config[group].items():\n with BadEntryPoint.err_to_warnings():\n result.append(EntryPoint.from_string(epstr, name, distro))\n\n return result"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script for\nloading the object to which this entry point refers.", "response": "def load(self):\n \"\"\"Load the object to which this entry point refers.\n \"\"\"\n mod = import_module(self.module_name)\n obj = mod\n if self.object_name:\n for attr in self.object_name.split('.'):\n obj = getattr(obj, attr)\n return obj"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 script to\nparse an entry point from the syntax in entry_points. 
txt.", "response": "def from_string(cls, epstr, name, distro=None):\n \"\"\"Parse an entry point from the syntax in entry_points.txt\n\n :param str epstr: The entry point string (not including 'name =')\n :param str name: The name of this entry point\n :param Distribution distro: The distribution in which the entry point was found\n :rtype: EntryPoint\n :raises BadEntryPoint: if *epstr* can't be parsed as an entry point.\n \"\"\"\n m = entry_point_pattern.match(epstr)\n if m:\n mod, obj, extras = m.group('modulename', 'objectname', 'extras')\n if extras is not None:\n extras = re.split(r',\\s*', extras)\n return cls(name, mod, obj, extras, distro)\n else:\n raise BadEntryPoint(epstr)"} {"SOURCE": "codesearchnet", "instruction": "Can you tell what is the following Python 3 function doing\ndef generate_project(args):\n # Project templates path\n src = os.path.join(dirname(abspath(__file__)), 'project')\n\n project_name = args.get('')\n\n if not project_name:\n logger.warning('Project name cannot be empty.')\n return\n\n # Destination project path\n dst = os.path.join(os.getcwd(), project_name)\n\n if os.path.isdir(dst):\n logger.warning('Project directory already exists.')\n return\n\n logger.info('Start generating project files.')\n\n _mkdir_p(dst)\n\n for src_dir, sub_dirs, filenames in os.walk(src):\n # Build and create destination directory path\n relative_path = src_dir.split(src)[1].lstrip(os.path.sep)\n dst_dir = os.path.join(dst, relative_path)\n\n if src != src_dir:\n _mkdir_p(dst_dir)\n\n # Copy, rewrite and move project files\n for filename in filenames:\n if filename in ['development.py', 'production.py']:\n continue\n\n src_file = os.path.join(src_dir, filename)\n dst_file = os.path.join(dst_dir, filename)\n\n if filename.endswith(REWRITE_FILE_EXTS):\n _rewrite_and_copy(src_file, dst_file, project_name)\n else:\n shutil.copy(src_file, dst_file)\n logger.info(\"New: %s\" % dst_file)\n\n if filename in ['development_sample.py', 'production_sample.py']:\n 
dst_file = os.path.join(dst_dir, \"%s.py\" % filename.split('_')[0])\n _rewrite_and_copy(src_file, dst_file, project_name)\n logger.info(\"New: %s\" % dst_file)\n\n logger.info('Finish generating project files.')", "response": "Generate a new project."} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\ngenerating the controller file and include the controller file template & css & js directories.", "response": "def generate_controller(args):\n \"\"\"Generate controller, include the controller file, template & css & js directories.\"\"\"\n controller_template = os.path.join(dirname(abspath(__file__)), 'templates/controller.py')\n test_template = os.path.join(dirname(abspath(__file__)), 'templates/unittest.py')\n controller_name = args.get('')\n current_path = os.getcwd()\n\n logger.info('Start generating controller.')\n\n if not controller_name:\n logger.warning('Controller name cannot be empty.')\n return\n\n # controller file\n with open(controller_template, 'r') as template_file:\n controller_file_path = os.path.join(current_path, 'application/controllers',\n controller_name + '.py')\n with open(controller_file_path, 'w+') as controller_file:\n for line in template_file:\n new_line = line.replace('#{controller}', controller_name)\n controller_file.write(new_line)\n logger.info(\"New: %s\" % _relative_path(controller_file_path))\n\n # test file\n with open(test_template, 'r') as template_file:\n test_file_path = os.path.join(current_path, 'tests',\n 'test_%s.py' % controller_name)\n with open(test_file_path, 'w+') as test_file:\n for line in template_file:\n new_line = line.replace('#{controller}', controller_name) \\\n .replace('#{controller|title}', controller_name.title())\n test_file.write(new_line)\n logger.info(\"New: %s\" % _relative_path(test_file_path))\n\n # assets dir\n assets_dir_path = os.path.join(current_path, 'application/pages/%s' % controller_name)\n _mkdir_p(assets_dir_path)\n\n # form file\n 
_generate_form(controller_name)\n\n logger.info('Finish generating controller.')"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\ngenerate a new macro.", "response": "def generate_macro(args):\n \"\"\"Generate macro.\"\"\"\n macro = args.get('').replace('-', '_')\n category = args.get('')\n\n if not macro:\n logger.warning('Macro name cannot be empty.')\n return\n\n logger.info('Start generating macro.')\n\n current_path = os.getcwd()\n\n if category:\n macro_root_path = os.path.join(current_path, 'application/macros', category, macro)\n else:\n macro_root_path = os.path.join(current_path, 'application/macros', macro)\n\n _mkdir_p(macro_root_path)\n\n macro_html_path = os.path.join(macro_root_path, '_%s.html' % macro)\n macro_css_path = os.path.join(macro_root_path, '_%s.less' % macro)\n macro_js_path = os.path.join(macro_root_path, '_%s.js' % macro)\n\n # html\n macro_html_template_path = os.path.join(dirname(abspath(__file__)), 'templates/macro.html')\n with open(macro_html_template_path, 'r') as template_file:\n with open(macro_html_path, 'w+') as html_file:\n for line in template_file:\n new_line = line.replace('#{macro}', macro)\n html_file.write(new_line)\n logger.info(\"New: %s\" % _relative_path(macro_html_path))\n\n # css\n open(macro_css_path, 'a').close()\n logger.info(\"New: %s\" % _relative_path(macro_css_path))\n\n # js\n open(macro_js_path, 'a').close()\n logger.info(\"New: %s\" % _relative_path(macro_js_path))\n\n logger.info('Finish generating macro.')"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _mkdir_p(path):\n try:\n os.makedirs(path)\n except OSError as exc:\n if exc.errno == errno.EEXIST and os.path.isdir(path):\n pass\n else:\n raise\n else:\n logger.info(\"New: %s%s\", path, os.path.sep)", "response": "mkdir -p path"} {"SOURCE": "codesearchnet", "instruction": "Create a Python 3 function for\nreplacing vars and copy.", "response": "def 
_rewrite_and_copy(src_file, dst_file, project_name):\n \"\"\"Replace vars and copy.\"\"\"\n # Create temp file\n fh, abs_path = mkstemp()\n\n with io.open(abs_path, 'w', encoding='utf-8') as new_file:\n with io.open(src_file, 'r', encoding='utf-8') as old_file:\n for line in old_file:\n new_line = line.replace('#{project}', project_name). \\\n replace('#{project|title}', project_name.title())\n new_file.write(new_line)\n\n # Copy to new file\n shutil.copy(abs_path, dst_file)\n os.close(fh)"} {"SOURCE": "codesearchnet", "instruction": "How would you explain what the following Python 3 function does\ndef encode(something):\n secret_key = current_app.config.get('SECRET_KEY')\n s = URLSafeSerializer(secret_key)\n return s.dumps(something)", "response": "Encode something with SECRET_KEY."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\ndecoding something with SECRET_KEY.", "response": "def decode(something):\n \"\"\"Decode something with SECRET_KEY.\"\"\"\n secret_key = current_app.config.get('SECRET_KEY')\n s = URLSafeSerializer(secret_key)\n try:\n return s.loads(something)\n except BadSignature:\n return None"} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef absolute_url_for(endpoint, **values):\n config = current_app.config\n site_domain = config.get('SITE_DOMAIN')\n relative_url = url_for(endpoint, **values)\n return join_url(site_domain, relative_url)", "response": "Absolute url for endpoint."} {"SOURCE": "codesearchnet", "instruction": "How would you code a function in Python 3 to\nregister jinja filters vars and functions.", "response": "def register_jinja(app):\n \"\"\"Register jinja filters, vars, functions.\"\"\"\n import jinja2\n from .utils import filters, permissions, helpers\n\n if app.debug or app.testing:\n my_loader = jinja2.ChoiceLoader([\n app.jinja_loader,\n jinja2.FileSystemLoader([\n os.path.join(app.config.get('PROJECT_PATH'), 
'application/macros'),\n os.path.join(app.config.get('PROJECT_PATH'), 'application/pages')\n ])\n ])\n else:\n my_loader = jinja2.ChoiceLoader([\n app.jinja_loader,\n jinja2.FileSystemLoader([\n os.path.join(app.config.get('PROJECT_PATH'), 'output/macros'),\n os.path.join(app.config.get('PROJECT_PATH'), 'output/pages')\n ])\n ])\n app.jinja_loader = my_loader\n\n app.jinja_env.filters.update({\n 'timesince': filters.timesince\n })\n\n def url_for_other_page(page):\n \"\"\"Generate url for pagination.\"\"\"\n view_args = request.view_args.copy()\n args = request.args.copy().to_dict()\n combined_args = dict(view_args.items() + args.items())\n combined_args['page'] = page\n return url_for(request.endpoint, **combined_args)\n\n rules = {}\n for endpoint, _rules in iteritems(app.url_map._rules_by_endpoint):\n if any(item in endpoint for item in ['_debug_toolbar', 'debugtoolbar', 'static']):\n continue\n rules[endpoint] = [{'rule': rule.rule} for rule in _rules]\n\n app.jinja_env.globals.update({\n 'absolute_url_for': helpers.absolute_url_for,\n 'url_for_other_page': url_for_other_page,\n 'rules': rules,\n 'permissions': permissions\n })"} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function that can\nregister HTTP error pages.", "response": "def register_error_handle(app):\n \"\"\"Register HTTP error pages.\"\"\"\n\n @app.errorhandler(403)\n def page_403(error):\n return render_template('site/403/403.html'), 403\n\n @app.errorhandler(404)\n def page_404(error):\n return render_template('site/404/404.html'), 404\n\n @app.errorhandler(500)\n def page_500(error):\n return render_template('site/500/500.html'), 500"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef _dataframe_to_csv(writer, dataframe, delimiter, with_header):\n encoding_writer = codecs.getwriter('utf-8')(writer)\n dataframe.to_csv(\n path_or_buf=encoding_writer,\n sep=delimiter,\n header=with_header,\n index=False\n )", "response": 
"serialize the dataframe with different delimiters"} {"SOURCE": "codesearchnet", "instruction": "Can you generate the documentation for the following Python 3 function\ndef _dataframe_from_csv(reader, delimiter, with_header, skipspace):\n sep = delimiter\n header = 0\n if not with_header:\n header = None\n\n return pd.read_csv(\n reader,\n header=header,\n sep=sep,\n skipinitialspace=skipspace,\n encoding='utf-8-sig'\n )", "response": "Returns csv data as a pandas Dataframe object"} {"SOURCE": "codesearchnet", "instruction": "Implement a function in Python 3 to\nserialize a dataframe. Parameters ---------- writer : file File-like object to write to. Must be opened in binary mode. data_type_id : dict Serialization format to use. See the azureml.DataTypeIds class for constants. dataframe: pandas.DataFrame Dataframe to serialize.", "response": "def serialize_dataframe(writer, data_type_id, dataframe):\n \"\"\"\n Serialize a dataframe.\n\n Parameters\n ----------\n writer : file\n File-like object to write to. 
Must be opened in binary mode.\n data_type_id : dict\n Serialization format to use.\n See the azureml.DataTypeIds class for constants.\n dataframe: pandas.DataFrame\n Dataframe to serialize.\n \"\"\"\n _not_none('writer', writer)\n _not_none_or_empty('data_type_id', data_type_id)\n _not_none('dataframe', dataframe)\n\n serializer = _SERIALIZERS.get(data_type_id)\n if serializer is None:\n raise UnsupportedDatasetTypeError(data_type_id)\n serializer[0](writer=writer, dataframe=dataframe)"} {"SOURCE": "codesearchnet", "instruction": "Given the following Python 3 function, write the documentation\ndef deserialize_dataframe(reader, data_type_id):\n _not_none('reader', reader)\n _not_none_or_empty('data_type_id', data_type_id)\n\n serializer = _SERIALIZERS.get(data_type_id)\n if serializer is None:\n raise UnsupportedDatasetTypeError(data_type_id)\n return serializer[1](reader=reader)", "response": "Deserialize a dataframe.\n\n Parameters\n ----------\n reader : file\n File-like object to read from. 
Must be opened in binary mode.\n data_type_id : dict\n Serialization format of the raw data.\n See the azureml.DataTypeIds class for constants.\n\n Returns\n -------\n pandas.DataFrame\n Dataframe object."} {"SOURCE": "codesearchnet", "instruction": "Make a summary of the following Python 3 code\ndef _update_from_dataframe(self, dataframe, data_type_id=None, name=None,\n description=None):\n \"\"\"\n Serialize the specified DataFrame and replace the existing dataset.\n\n Parameters\n ----------\n dataframe : pandas.DataFrame\n Data to serialize.\n data_type_id : str, optional\n Format to serialize to.\n If None, the existing format is preserved.\n Supported formats are:\n 'PlainText'\n 'GenericCSV'\n 'GenericTSV'\n 'GenericCSVNoHeader'\n 'GenericTSVNoHeader'\n See the azureml.DataTypeIds class for constants.\n name : str, optional\n Name for the dataset.\n If None, the name of the existing dataset is used.\n description : str, optional\n Description for the dataset.\n If None, the name of the existing dataset is used.\n \"\"\"\n _not_none('dataframe', dataframe)\n\n if data_type_id is None:\n data_type_id = self.data_type_id\n if name is None:\n name = self.name\n if description is None:\n description = self.description\n\n try:\n output = BytesIO()\n serialize_dataframe(output, data_type_id, dataframe)\n raw_data = output.getvalue()\n finally:\n output.close()\n\n self._upload_and_refresh(raw_data, data_type_id, name, description)", "response": "Serialize the specified DataFrame and update the object with the new data."} {"SOURCE": "codesearchnet", "instruction": "Write a Python 3 function for\nupdating the internal state of the object with the raw data.", "response": "def _update_from_raw_data(self, raw_data, data_type_id=None, name=None,\n description=None):\n \"\"\"\n Upload already serialized raw data and replace the existing dataset.\n\n Parameters\n ----------\n raw_data: bytes\n Dataset contents to upload.\n data_type_id : str\n Serialization format of the 
raw data.\n If None, the format of the existing dataset is used.\n Supported formats are:\n 'PlainText'\n 'GenericCSV'\n 'GenericTSV'\n 'GenericCSVNoHeader'\n 'GenericTSVNoHeader'\n 'ARFF'\n See the azureml.DataTypeIds class for constants.\n name : str, optional\n Name for the dataset.\n If None, the name of the existing dataset is used.\n description : str, optional\n Description for the dataset.\n If None, the name of the existing dataset is used.\n \"\"\"\n _not_none('raw_data', raw_data)\n\n if data_type_id is None:\n data_type_id = self.data_type_id\n if name is None:\n name = self.name\n if description is None:\n description = self.description\n\n self._upload_and_refresh(raw_data, data_type_id, name, description)"} {"SOURCE": "codesearchnet", "instruction": "Here you have a function in Python 3, explain what it does\ndef contents_url(self):\n loc = self.download_location\n return loc.base_uri + loc.location + loc.access_credential", "response": "Full URL to the dataset contents."} {"SOURCE": "codesearchnet", "instruction": "Explain what the following Python 3 code does\ndef add_from_dataframe(self, dataframe, data_type_id, name, description):\n _not_none('dataframe', dataframe)\n _not_none_or_empty('data_type_id', data_type_id)\n _not_none_or_empty('name', name)\n _not_none_or_empty('description', description)\n\n try:\n output = BytesIO()\n serialize_dataframe(output, data_type_id, dataframe)\n raw_data = output.getvalue()\n finally:\n output.close()\n\n return self._upload(raw_data, data_type_id, name, description)", "response": "This method creates a new dataset from a pandas. 
DataFrame and uploads it to AzureML."} {"SOURCE": "codesearchnet", "instruction": "Can you generate a brief explanation for the following Python 3 code\ndef add_from_raw_data(self, raw_data, data_type_id, name, description):\n _not_none('raw_data', raw_data)\n _not_none_or_empty('data_type_id', data_type_id)\n _not_none_or_empty('name', name)\n _not_none_or_empty('description', description)\n\n return self._upload(raw_data, data_type_id, name, description)", "response": "Uploads a new dataset from a raw data file."} {"SOURCE": "codesearchnet", "instruction": "Can you write a function in Python 3 where it\nopens and return a stream for the dataset contents.", "response": "def open(self):\n '''Open and return a stream for the dataset contents.'''\n return self.workspace._rest.open_intermediate_dataset_contents(\n self.workspace.workspace_id,\n self.experiment.experiment_id,\n self.node_id,\n self.port_name\n )"}